4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Senior Associate
Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However, ensuring a streamlined end-to-end (E2E) Oracle Fusion technical landscape that adapts seamlessly to the changing business environment is equally crucial from a process and compliance perspective.
As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models, and best-in-class practices to deliver technology-enabled transformation to our clients.
Responsibilities: Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer. Completed at least 2 full Oracle Cloud (Fusion) implementations. Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion). Extensively worked on BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC).
Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC)
Preferred skill sets: database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: Minimum 4 years of Oracle Fusion experience
Education qualification: BE/BTech, MBA
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration, Bachelor of Technology
Required Skills: Oracle Integration Cloud (OIC)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being
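For orientation on the OIC skill named above: integrations built in Oracle Integration Cloud are commonly exposed as REST endpoints that callers invoke over HTTPS. Below is a minimal, illustrative Python sketch of such a call; the host, integration code/version, payload fields, and credentials are hypothetical placeholders, not details from this posting.

```python
# Illustrative sketch: invoking a hypothetical OIC REST-triggered integration.
# Host, integration code/version, payload shape, and credentials are placeholders.
import requests

OIC_HOST = "https://example-oic.integration.example.com"  # hypothetical host
# REST-triggered OIC integrations are typically exposed under
# /ic/api/integration/v1/flows/rest/{INTEGRATION_CODE}/{version}/...
url = f"{OIC_HOST}/ic/api/integration/v1/flows/rest/SUPPLIER_SYNC/1.0/suppliers"

payload = {"supplierNumber": "100234", "supplierName": "Acme Ltd"}  # hypothetical shape

resp = requests.post(
    url,
    json=payload,
    auth=("integration.user", "app-password"),  # Basic auth shown; OAuth is also common
    timeout=30,
)
resp.raise_for_status()
print(resp.status_code, resp.json())
```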
Posted 1 week ago
3.0 - 9.0 years
0 Lacs
Andhra Pradesh, India
On-site
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. As a Guidewire developer at PwC, you will specialise in developing and customising applications using the Guidewire platform. Guidewire is a software suite that provides insurance companies with tools for policy administration, claims management, and billing. You will be responsible for designing, coding, and testing software solutions that meet the specific needs of insurance organisations.
Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.
Skills: Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to: Apply a learning mindset and take ownership of your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect on, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), and uphold the Firm's code of conduct and independence requirements.
Total Experience: 3 to 9 years
Education Qualification: BTech/BE/MTech/MS/MCA
Preferred Skill Set/Roles and Responsibilities: Hands-on experience in Azure Databricks and ADF (Guidewire). Works with the business to identify detailed analytical and operational reporting/extract requirements. Experience in Python is a must-have. Able to create complex Microsoft SQL / ETL / SSIS queries. Participates in sprint development, test, and integration activities. Creates detailed source-to-target mappings. Creates and validates data dictionaries. Writes and validates data translation and migration scripts. Communicates with the business to gather business requirements. Performs gap analysis between existing (legacy) and new (GW) data-related solutions. Works with Informatica ETL developers.
Posted 1 week ago
0 years
6 - 8 Lacs
Hyderābād
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
Inviting applications for the role of Lead Consultant - Databricks Developer! In this role, the Databricks Developer is responsible for solving real-world, cutting-edge problems to meet both functional and non-functional requirements.
Responsibilities: Maintain close awareness of new and emerging technologies and their potential application for service offerings and products. Work with architects and lead engineers on solutions that meet functional and non-functional requirements. Demonstrate knowledge of relevant industry trends and standards. Demonstrate strong analytical and technical problem-solving skills. Must have experience in the Data Engineering domain.
Qualifications we seek in you!
Minimum qualifications: Bachelor's degree or equivalency (CS, CE, CIS, IS, MIS, or an engineering discipline) or equivalent work experience. Must have excellent coding skills in Python or Scala, preferably Python. Must have experience in the Data Engineering domain. Must have implemented at least 2 projects end-to-end in Databricks. Must have hands-on experience with Databricks components including Delta Lake, dbConnect, DB API 2.0, and Databricks workflow orchestration. Must be well versed with the Databricks Lakehouse concept and its implementation in enterprise environments. Must have a good understanding of how to create complex data pipelines. Must have good knowledge of data structures and algorithms. Must be strong in SQL and Spark SQL. Must have strong performance-optimization skills to improve efficiency and reduce cost. Must have worked on both batch and streaming data pipelines. Must have extensive knowledge of the Spark and Hive data processing frameworks. Must have worked on a cloud platform (Azure, AWS, GCP) and its most common services, such as ADLS/S3, ADF/Lambda, Cosmos DB/DynamoDB, ASB/SQS, and cloud databases. Must be strong in writing unit test cases and integration tests. Must have strong communication skills and have worked in teams of five or more. Must have a great attitude towards learning new skills and upskilling existing ones.
Preferred Qualifications: Good to have Unity Catalog and basic governance knowledge. Good to have an understanding of Databricks SQL Endpoints. Good to have CI/CD experience to build pipelines for Databricks jobs. Good to have worked on a migration project to build a unified data platform. Good to have knowledge of DBT. Good to have knowledge of Docker and Kubernetes.
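As a concrete, purely illustrative example of the Databricks/Delta Lake work this listing describes, here is a minimal PySpark batch pipeline sketch in which raw CSV files are cleaned and written to a Delta table; the mount paths and table names are hypothetical placeholders.

```python
# Minimal Databricks-style batch sketch: raw CSV -> cleaned Delta table.
# Mount paths and table names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a `spark` session already exists; this keeps the sketch standalone.
spark = SparkSession.builder.appName("orders-batch").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("/mnt/landing/orders/"))            # hypothetical ADLS/S3 mount

cleaned = (raw
           .withColumn("amount", F.col("amount").cast("double"))
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .dropDuplicates(["order_id"])        # dedupe on the business key
           .filter(F.col("amount") > 0))        # drop invalid rows

(cleaned.write
 .format("delta")
 .mode("overwrite")
 .saveAsTable("analytics.orders_clean"))        # hypothetical target table
```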
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a "starter kit," paying to apply, or purchasing equipment or training.
Job: Lead Consultant
Primary Location: India-Hyderabad
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 9, 2025, 9:15:16 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
Posted 1 week ago
0 years
0 Lacs
New Delhi, Delhi, India
Remote
We are seeking a proactive and business-oriented Data Functional Consultant with strong experience in Azure Data Factory and Azure Databricks. This role bridges the gap between business stakeholders and technical teams—translating business needs into scalable data solutions, ensuring effective data management, and enabling insights-driven decision-making. The ideal candidate is not a pure developer or data engineer but someone who understands business processes, data flows, and stakeholder priorities, and can help drive value from data platforms using cloud-native Azure services.
What You’ll Do: Collaborate closely with business stakeholders to gather, understand, and document functional data requirements. Translate business needs into high-level data design, data workflows, and process improvements. Work with data engineering teams to define and validate ETL/ELT logic and data pipeline workflows using Azure Data Factory and Databricks. Facilitate functional workshops and stakeholder meetings to align on data needs and business KPIs. Act as a bridge between business teams and data engineers to ensure accurate implementation and delivery of data solutions. Conduct data validation and UAT, and support users in adopting data platforms and self-service analytics. Maintain functional documentation, data dictionaries, and mapping specifications. Assist in defining data governance, data quality, and master data management practices from a business perspective. Monitor data pipeline health and help triage issues from a functional/business-impact standpoint.
What You’ll Bring: Proven exposure to Azure Data Factory (ADF) for orchestrating data workflows. Practical experience with Azure Databricks for data processing (functional understanding, not necessarily coding). Strong understanding of data warehousing, data modeling, and business KPIs. Experience working in agile or hybrid project environments. Excellent communication and stakeholder management skills. Ability to translate complex technical details into business-friendly language. Familiarity with tools like Power BI, Excel, or other reporting solutions is a plus. Background in the banking or finance industries is a bonus.
What We Offer: At Delphi, we are dedicated to creating an environment where you can thrive, both professionally and personally. Our competitive compensation package, performance-based incentives, and health benefits are designed to ensure you're well-supported. We believe in your continuous growth and offer company-sponsored certifications, training programs, and skill-building opportunities to help you succeed. We foster a culture of inclusivity and support, with remote work options and a fully supported work-from-home setup to ensure your comfort and productivity. Our positive and inclusive culture includes team activities, and wellness and mental health programs to ensure you feel supported.
Posted 1 week ago
3.0 years
10 Lacs
Gurgaon
Remote
Data Engineer – Azure
This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: business owners, business analytics, data engineering teams, application development, end users, and management teams.
You Will: Design and build resilient and efficient data pipelines for batch and real-time streaming (an illustrative streaming sketch appears below). Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools. Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches. Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities. Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations. Execute projects with an Agile mindset. Build software frameworks to solve data problems at scale.
Technical Requirements: 3+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse. Prior experience using DBT and Power BI will be a plus. Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services (firewall, storage, Key Vault, etc.) is required. Strong programming/scripting experience using SQL, Python, and Spark. Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket. Experience with Agile development methods in data-oriented projects.
Other Requirements: Highly motivated self-starter and team player with demonstrated success in prior roles. Track record of success working through technical challenges within enterprise organizations. Ability to prioritize deals, training, and initiatives through highly effective time management. Excellent problem-solving, analytical, presentation, and whiteboarding skills. Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems. Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations. Certifications in Azure Data Engineering and related technologies.
"Remote postings are limited to candidates residing within the country specified in the posting location"
About Rackspace Technology: We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.
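Referring to the batch and real-time streaming requirement above, here is a minimal Spark Structured Streaming sketch reading from Kafka and writing to Delta; the broker, topic, schema, and storage paths are hypothetical placeholders, shown only to make the requirement concrete.

```python
# Minimal Spark Structured Streaming sketch: Kafka topic -> Delta table.
# Requires the spark-sql-kafka connector package on the cluster.
# Broker, topic, schema, and storage paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

event_schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
          .option("subscribe", "orders")                      # hypothetical topic
          .load())

# Kafka delivers bytes; parse the JSON value into typed columns.
parsed = (events
          .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

query = (parsed.writeStream
         .format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/orders")  # hypothetical
         .outputMode("append")
         .start("/mnt/silver/orders"))                             # hypothetical sink
query.awaitTermination()
```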
More on Rackspace Technology Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Posted 1 week ago
7.0 years
10 Lacs
Gurgaon
Remote
Senior Data Engineer – Azure
This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: business owners, business analytics, data engineering teams, application development, end users, and management teams.
You Will: Design and build resilient and efficient data pipelines for batch and real-time streaming. Architect and design data infrastructure on cloud using Infrastructure-as-Code tools. Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools. Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches. Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities. Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations. Lead a team of engineers to deliver impactful results at scale. Execute projects with an Agile mindset. Build software frameworks to solve data problems at scale.
Technical Requirements: 7+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse. Prior experience using DBT and Power BI will be a plus. 3+ years of experience architecting solutions for developing data pipelines from structured and unstructured sources for batch and real-time workloads. Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services (firewall, storage, Key Vault, etc.) is required. Strong programming/scripting experience using SQL, Python, and Spark. Strong data modeling and data lakehouse concepts. Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket. Experience with Agile development methods in data-oriented projects.
Other Requirements: Highly motivated self-starter and team player with demonstrated success in prior roles. Track record of success working through technical challenges within enterprise organizations. Ability to prioritize deals, training, and initiatives through highly effective time management. Excellent problem-solving, analytical, presentation, and whiteboarding skills. Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems. Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations. Certifications in Azure Data Engineering and related technologies.
"Remote postings are limited to candidates residing within the country specified in the posting location"
About Rackspace Technology: We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions.
We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future. More on Rackspace Technology Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Posted 1 week ago
0 years
5 - 7 Lacs
Chennai
On-site
Flex is the diversified manufacturing partner of choice that helps market-leading brands design, build and deliver innovative products that improve the world. A career at Flex offers the opportunity to make a difference and invest in your growth in a respectful, inclusive, and collaborative environment. If you are excited about a role but don't meet every bullet point, we encourage you to apply and join us to create the extraordinary.
Job Description: The Business Intelligence Developer will be Chennai-based (with very occasional travel to sites), responsible for maintaining and improving the organisation’s business intelligence systems to ensure that they function reliably and in accordance with user needs.
What a typical day looks like: Support the Senior Business Intelligence Developer to maintain and improve the Power BI suite of reports for the business. Develop, test, review and help deploy automated reports and dashboards using Power BI and other reporting tools. Understand business requirements to set functional specifications for reporting applications. Exhibit an understanding of database concepts such as relational database architecture and multidimensional database design. Design data models that transform raw data into insightful knowledge by understanding business requirements in the context of BI. Develop technical specifications from business needs, and accurately scope the work to help set realistic deadlines for completion. Create charts and data documentation that include descriptions of the techniques, parameters, models, and relationships. Use Power BI Desktop to create dashboards, KPI scorecards, and visual reports. Examine, comprehend, and study business needs as they relate to business intelligence. Design and map data models to transform raw data into insightful information. Create dynamic and eye-catching dashboards and reports using Power BI. Make necessary tactical and technological adjustments to enhance current business intelligence systems. Integrate data, transform data, and connect to data sources for business intelligence.
The experience we’re looking to add to our team: Knowledge of SSRS, T-SQL, Power Query, MDX, Power BI, and DAX, and systems on the MS SQL Server BI Stack. Good communication skills, necessary to work effectively with stakeholders, end users, and all levels of the organisation who request reports. Ability to run a Power BI project end to end through all stages, from requirements gathering to report deployment. Exceptional analytical thinking skills for converting data into illuminating reports and dashboards. Some knowledge of data warehousing, data gateway, and data preparation projects. Good knowledge of Power BI, and desirably knowledge of the SSAS, SSRS, and SSIS components of the Microsoft Business Intelligence Stack. Articulating, representing, and analysing solutions with the team while documenting, creating, and modelling them. Strong understanding of DAX, intermediate knowledge of SQL and M (Power Query), and a basic understanding of Python. A basic understanding of ADF or Fabric pipelines. Comprehensive understanding of data modelling, administration, and visualisation. Capacity to perform in an atmosphere where agility and continual development are prioritised. Awareness of BI technologies (e.g., Microsoft Power BI, Oracle BI). Expertise in SQL queries, SSRS, and SQL Server Integration Services (SSIS). Troubleshooting and problem-solving skills.
Demonstrates basic functional, technical, and people and/or process management skills, as well as customer (external and internal) relationship skills. Demonstrates skills in a functional/technical area. Use of the following tools may be required: office skills (telephones, data entry) and office software, including word processing, spreadsheets, presentation packages, and database systems.
What you'll receive for the great work you provide: Health Insurance, Paid Time Off.
#BB04
Job Category: IT
Flex pays for all costs associated with the application, interview or offer process; a candidate will not be asked for any payment related to these costs. Flex does not accept unsolicited resumes from headhunters, recruitment agencies or fee-based recruitment services. Flex is an Equal Opportunity Employer and employment selection decisions are based on merit, qualifications, and abilities. Flex does not discriminate in employment opportunities or practices based on: age, race, religion, color, sex, national origin, marital status, sexual orientation, gender identity, veteran status, disability, pregnancy status or any other status protected by law. Flex provides reasonable accommodation so that qualified applicants with a disability may participate in the selection process. Please advise us of any accommodations you request to express interest in a position by e-mailing: accessibility@flex.com. Please state your request for assistance in your message. Only reasonable accommodation requests related to applying for a specific position within Flex will be reviewed at the e-mail address. Flex will contact you if it is determined that your background is a match to the required skills for this position. Thank you for considering a career with Flex.
Posted 1 week ago
5.0 years
0 Lacs
Calcutta
On-site
Skill required: Tech for Operations - Microsoft Azure Cloud Services
Designation: App Automation Eng Senior Analyst
Qualifications: Any Graduation/12th/PUC/HSC
Years of Experience: 5 to 8 years
About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song — all powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com.
What would you do? In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover. In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients' supply chains and customer experience. The Senior Azure Data Factory (ADF) Support Engineer II will be a critical member of our Enterprise Applications Team, responsible for designing, supporting and maintaining robust data solutions. The ideal candidate is proficient in ADF and SQL and has extensive experience in troubleshooting Azure Data Factory environments, conducting code reviews, and bug fixing. This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems.
What are we looking for? Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Proven experience (5+ years) as an Azure Data Factory Support Engineer II. Expertise in ADF with a deep understanding of its data-related libraries. Strong experience in Azure cloud services, including troubleshooting and optimizing cloud-based environments. Proficiency in SQL and experience with SQL database design. Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy. Experience with ADF pipelines. Excellent problem-solving and troubleshooting skills. Experience in code review and debugging in a collaborative project setting. Excellent verbal and written communication skills. Ability to work in a fast-paced, team-oriented environment.
Strong understanding of the business and a passion for the mission of Service Supply Chain. Hands-on experience with Jira, DevOps ticketing, and ServiceNow is good to have.
Roles and Responsibilities: Innovate. Collaborate. Build. Create. Solve. Support ADF and associated systems; ensure systems meet business requirements and industry practices. Integrate new data management technologies and software engineering tools into existing structures. Recommend ways to improve data reliability, efficiency, and quality. Use large data sets to address business issues. Use data to discover tasks that can be automated. Fix bugs to ensure a robust and sustainable codebase. Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance. Analyze and comprehend existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively. Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines. Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives. Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure. Coordinate with data architects and other team members to ensure that changes are in line with the overall architecture and data strategy. Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance. Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement. Stay updated with the latest technologies and practices in ADF to continuously improve the support and maintenance of data systems. Flexible work hours to include US time zones; this position may require you to work a rotational on-call schedule, evenings, weekends, and holiday shifts when the need arises. Participate in the Demand Management and Change Management processes. Work in partnership with internal business, external third-party technical teams, and functional teams as a technology partner in communicating and coordinating delivery of technology services from Technology for Operations (TfO).
Any Graduation, 12th/PUC/HSC
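To give a flavour of the programmatic side of the ADF support work described here, below is a hedged Python sketch that triggers a pipeline run and polls its status with the azure-mgmt-datafactory SDK. The subscription, resource group, factory, and pipeline names are placeholders, and the method signatures should be verified against the installed SDK version.

```python
# Sketch: trigger an ADF pipeline run and poll it until it reaches a terminal state.
# All resource names are placeholders; verify calls against your SDK version.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"      # placeholder
RESOURCE_GROUP, FACTORY, PIPELINE = "rg-data", "adf-prod", "pl_ingest_orders"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a run; `parameters` maps to the pipeline's declared parameters.
run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY, PIPELINE, parameters={})

status = "Queued"
while status not in ("Succeeded", "Failed", "Cancelled"):
    time.sleep(30)
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY, run.run_id).status
    print(f"run {run.run_id}: {status}")
```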
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Description: Data Engineer
Responsibilities/Tasks: Work with Technical Leads and Architects to analyse solutions. Translate complex business requirements into tangible data requirements through collaborative work with both business and technical subject matter experts. Develop/modify data models with an eye towards high performance, scalability, flexibility, and usability. Ensure data models are in alignment with the overall architecture standards. Create source-to-target mapping documentation. Serve as the data flow and enrichment “owner” with deep expertise in data dynamics, capable of recognizing and elevating improvement opportunities early in the process. Work with product owners to understand business reporting requirements and deliver appropriate insights on a regular basis. Responsible for system configuration to deliver reports, data visualizations, and other solution components.
Skills Required: More than 5 years of software development experience. Proficient in Azure services: Azure Data Factory, Synapse, Data Lake, Databricks. Nice to have experience in C#.NET. Experience querying, analysing, or managing data required. Experience within the healthcare insurance industry. Experience in data cleansing, data engineering, data enrichment, and data warehousing/business intelligence preferred. Strong analytical, problem-solving and planning skills. Strong organizational and presentation skills. Excellent interpersonal and communication skills. Ability to multi-task in a fast-paced environment. Flexibility to adapt readily to changing business needs in a fast-paced environment. Team player who is delivery-oriented and takes responsibility for the team’s success. Enthusiastic, can-do attitude with the drive to continually learn and improve. Knowledge of Agile, SCRUM and/or related methodologies.
Skills: Data Engineering, Azure, ADB, ADF
Posted 1 week ago
5.0 - 10.0 years
9 - 18 Lacs
Mumbai, Mumbai Suburban, Navi Mumbai
Work from Office
Job description: Hiring for Oracle Fusion Technical - Mumbai (WFO)
Experience: 5+ Years
Work location: Mumbai (WFO)
Notice Period: Immediate to 15 Days Max
About Clover InfoTech: With 30 years of IT excellence, Clover Infotech is a leading global IT services and consulting company. Our 5000+ experts specialize in Oracle, Microsoft, and Open Source technologies, delivering solutions in application and technology modernization, cloud enablement, data management, automation, and assurance services. We help enterprises on their transformation journey by implementing business-critical applications and supporting technology infrastructure through a proven managed services model. Our SLA-based delivery ensures operational efficiency, cost-effectiveness, and enhanced information security. We proudly partner with companies ranging from Fortune 500 companies to emerging enterprises and new-age startups. We offer technology-powered solutions that accelerate growth and drive success.
Job Description: Oracle Fusion Technical. Responsible for providing technical solutioning and implementing the same. Should have worked extensively on Fusion reporting tools like BIP, OTBI and FRS. Should have expertise in Oracle BPM and workflows. Strong knowledge of FBDI, ADF, FBL and web services (SOAP and REST). Ready to join immediately or within 30 days. Ready to work from the Clover office or client locations. Total relevant experience in Oracle ERP (EBS + Fusion) of more than 5 years, with a minimum of 3 years in Fusion. Preference for candidates with an Oracle Fusion certification. Have done a minimum of 2 Fusion implementation projects. Should be ready to work in different time zones. Should have managed a technical team and coordinated with functional teams. Should be ready to travel as and when required and hold a valid passport. Should be able to train and mentor a team in Oracle Fusion.
Thank you for considering this opportunity with Clover InfoTech. We look forward to hearing from you!
Best Regards, Talent Acquisition Team
https://www.cloverinfotech.com
Office Locations India: Mumbai | Navi Mumbai | Pune | Gurugram | Bengaluru | Chennai; International: UAE | USA | Canada | Singapore
Posted 1 week ago
6.0 years
0 Lacs
Vishakhapatnam, Andhra Pradesh, India
On-site
Company Description: Miracle Software Systems is a global IT services company delivering true value to businesses for the past 29 years. We optimize and transform businesses into high-performance platforms, enabling digitization and business growth. With over 2600 employees worldwide, Miracle serves 42 of today’s Fortune 100 companies, with 1000+ satisfied customers and 1400+ successful projects. We provide services across Cloud, Application Development, and Data and Analytics, among others, and are known for our Always-Available, Innovation-First approach, making us a trusted partner in digital journeys. We have alliances with leading IT firms such as SAP, IBM, AWS, RedHat, Microsoft, and UiPath.
Role Description: This is an on-site, full-time role for a Sr. Azure Data Engineer located in Vishakhapatnam. The Sr. Azure Data Engineer (6+ years) will be a highly skilled data engineer with extensive experience in Microsoft Azure, particularly ADF and Fabric pipeline development, and a strong understanding of the Medallion Architecture (Bronze, Silver, Gold layers). The ideal candidate will be responsible for designing and optimizing end-to-end data pipelines across Lakehouses and Warehouses in Microsoft Fabric, and will work closely with business and engineering teams to define scalable, governed data models.
Responsibilities: Develop and manage complex data pipelines using Azure Data Factory (ADF) and Microsoft Fabric. Implement and maintain Medallion Architecture layers (Bronze, Silver, Gold). Design governed, scalable data models tailored to business requirements. Develop and optimize PySpark-based data processing for large-scale data transformations. Integrate with reporting tools such as Power BI for seamless data visualization. Ensure robust data governance, security, and performance in large-scale Fabric deployments.
Required Skills: Strong expertise in Azure Data Factory (ADF) and Microsoft Fabric. Hands-on experience with OneLake, Lakehouse Explorer, and Power BI integration. Solid understanding of data governance, security, and performance tuning. SAP knowledge is required. Proficiency in PySpark is mandatory.
Interested candidates can share an updated resume with skoditala@miraclesoft.com or reach me at 08919401201.
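To make the Medallion layering above concrete, here is a minimal, illustrative PySpark sketch promoting records from a Bronze (raw, as-landed) table to a Silver (validated, conformed) Delta table; the lakehouse and table names are hypothetical placeholders.

```python
# Medallion sketch: Bronze (raw) -> Silver (validated, conformed) Delta table.
# Lakehouse/table names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-silver").getOrCreate()

bronze = spark.read.table("bronze.sap_deliveries_raw")      # hypothetical source

silver = (bronze
          .select("delivery_id", "plant", "qty", "delivered_on")
          .withColumn("qty", F.col("qty").cast("int"))          # enforce types
          .withColumn("delivered_on", F.to_date("delivered_on"))
          .where(F.col("delivery_id").isNotNull())              # basic validation
          .dropDuplicates(["delivery_id"]))                     # conform on key

(silver.write
 .format("delta")
 .mode("overwrite")
 .saveAsTable("silver.sap_deliveries"))                     # hypothetical target
```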
Posted 1 week ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description
Role Proficiency: Act creatively to develop applications by selecting appropriate technical options, optimising application development, maintenance, and performance by employing design patterns and reusing proven solutions. Account for others' developmental activities; assist the Project Manager in day-to-day project execution.
Outcomes: Interpret the application feature and component designs to develop them in accordance with specifications. Code, debug, test, document, and communicate product/component/feature development stages. Validate results with user representatives, integrating and commissioning the overall solution. Select and create appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, while creating own solutions for new contexts. Optimise efficiency, cost, and quality. Influence and improve customer satisfaction. Influence and improve employee engagement within the project teams. Set FAST goals for self/team; provide feedback on FAST goals of team members.
Measures of Outcomes: Adherence to engineering process and standards (coding standards). Adherence to project schedule/timelines. Number of technical issues uncovered during the execution of the project. Number of defects in the code. Number of defects post delivery. Number of non-compliance issues. Percent of voluntary attrition. On-time completion of mandatory compliance trainings.
Outputs Expected:
Code: Code as per the design. Define coding standards, templates, and checklists. Review code for team and peers.
Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents: design documentation, requirements, test cases, and results.
Configure: Define and govern the configuration management plan. Ensure compliance from the team.
Test: Review/create unit test cases, scenarios, and execution. Review the test plan created by the testing team. Provide clarifications to the testing team.
Domain Relevance: Advise software developers on the design and development of features and components with a deeper understanding of the business problem being addressed for the client. Learn more about the customer domain and identify opportunities to provide value additions to customers. Complete relevant domain certifications.
Manage Project: Support the Project Manager with inputs for the projects. Manage delivery of modules. Manage complex user stories.
Manage Defects: Perform defect RCA and mitigation. Identify defect trends and take proactive measures to improve quality.
Estimate: Create and provide input for effort and size estimation and plan resources for projects.
Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review the reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, features, business components, and data models.
Interface with Customer: Clarify requirements and provide guidance to the development team. Present design options to customers. Conduct product demos. Work closely with customer architects to finalise the design.
Manage Team: Set FAST goals and provide feedback. Understand the aspirations of team members and provide guidance, opportunities, etc. Ensure team members are upskilled and engaged in the project. Proactively identify attrition risks and work with BSE on retention measures.
Certifications: Obtain relevant domain and technology certifications.
Skill Examples:
Explain and communicate the design/development to the customer. Perform and evaluate test results against product specifications. Break down complex problems into logical components. Develop user interfaces and business software components. Use data models. Estimate the time, effort, and resources required for developing/debugging features/components. Perform and evaluate tests in the customer or target environment. Make quick decisions on technical/project-related challenges. Manage a team, mentor, and handle people-related issues in the team. Maintain high motivation levels and positive dynamics within the team. Interface with other teams, designers, and other parallel practices. Set goals for self and team; provide feedback to team members. Create and articulate impactful technical presentations. Follow a high level of business etiquette in emails and other business communication. Drive conference calls with customers and answer customer questions. Proactively ask for and offer help. Work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks. Build confidence with customers by meeting deliverables on time with a quality product.
Knowledge Examples: Appropriate software programs/modules. Functional and technical designing. Programming languages (proficient in multiple skill clusters). DBMS. Operating systems and software platforms. Software Development Life Cycle. Agile methods (Scrum or Kanban). Integrated development environments (IDE). Rapid application development (RAD). Modelling technologies and languages. Interface definition languages (IDL). Broad knowledge of the customer domain and deep knowledge of the sub-domain where the problem is solved.
Additional Comments
Key Responsibilities: Design, build, and manage end-to-end ETL/ELT workflows using Azure Data Factory (ADF) to support supply chain data movement and transformation. Integrate data from multiple sources such as ERP systems, logistics platforms, warehouses, APIs, and third-party providers into Azure Data Lake or Synapse Analytics. Ensure high-performance, scalable, and secure data pipelines aligned with business and compliance requirements. Collaborate with business analysts, data architects, and supply chain SMEs to understand data needs and implement effective solutions. Write and optimize complex SQL queries, stored procedures, and data transformation logic. Monitor, troubleshoot, and optimize ADF pipelines for latency, throughput, and reliability. Support data validation, quality assurance, and governance processes. Document data flows, transformation logic, and technical processes. Work in an Agile/Scrum delivery model to support iterative development and rapid delivery.
Required Skills & Experience: 9–10 years of experience in Data Engineering and ETL development, with at least 3–5 years in Azure Data Factory. Strong knowledge of Azure Data Lake, Azure SQL DB, Azure Synapse, Blob Storage, and Data Flows in ADF. Proficiency in SQL, T-SQL, and performance tuning of queries. Experience working with structured, semi-structured (JSON, XML), and unstructured data. Exposure to supply chain data sources like ERP (e.g., SAP, Oracle), TMS, WMS, or inventory/order management systems. Experience with Git, Azure DevOps, or other version control and CI/CD tools. Basic understanding of Databricks, Python, or Spark is a plus.
Familiarity with data quality, metadata management, and lineage tools. Bachelor's degree in Computer Science, Engineering, or a related field.
Preferred Qualifications: Experience in supply chain analytics or operations. Knowledge of forecasting, inventory planning, procurement, logistics, or demand planning data flows. Certification as a Microsoft Azure Data Engineer Associate is preferred.
Soft Skills: Strong problem-solving and analytical skills. Ability to communicate effectively with business and technical stakeholders. Experience working in Agile/Scrum teams. Proactive, self-motivated, and detail-oriented.
Skills: Azure Data Factory, Azure Data Lake, Blob Storage
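As an illustration of the semi-structured (JSON) requirement above, here is a minimal PySpark sketch that flattens nested order events into tabular rows; the input path, field names, and target table are hypothetical placeholders.

```python
# Sketch: flatten semi-structured JSON order events into tabular rows.
# Input path, field names, and target table are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("flatten-json").getOrCreate()

orders = spark.read.json("/mnt/landing/order_events/")      # hypothetical path

# Explode the nested line_items array so each line item becomes one row.
lines = (orders
         .select("order_id", F.explode("line_items").alias("item"))
         .select("order_id",
                 F.col("item.sku").alias("sku"),
                 F.col("item.qty").cast("int").alias("qty")))

(lines.write
 .format("delta")
 .mode("append")
 .saveAsTable("staging.order_lines"))                       # hypothetical target
```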
Posted 1 week ago
5.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Skill required: Tech for Operations - Microsoft Azure Cloud Services Designation: App Automation Eng Senior Analyst Qualifications: Any Graduation/12th/PUC/HSC Years of Experience: 5 to 8 years About Accenture Accenture is a global professional services company with leading capabilities in digital, cloud and security.Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song— all powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities.Visit us at www.accenture.com What would you do? Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song— all powered by the world s largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com. In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover. In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients supply chains and customer experience. The Senior Azure Data factory (ADF) Support Engineer Il will be a critical member of our Enterprise Applications Team, responsible for designing, supporting & maintaining robust data solutions. The ideal candidate is proficient in ADF, SQL and has extensive experience in troubleshooting Azure Data factory environments, conducting code reviews, and bug fixing. This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems. What are we looking for? Bachelor s or Master s degree in Computer Science, Information Technology, or a related field. Proven experience (5+ years) as a Azure Data Factory Support Engineer Il Expertise in ADF with a deep understanding of its data-related libraries. Strong experience in Azure cloud services, including troubleshooting and optimizing cloud-based environments. Proficient in SQL and experience with SQL database design. Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy. Experience with ADF pipelines. Excellent problem-solving and troubleshooting skills. Experience in code review and debugging in a collaborative project setting. Excellent verbal and written communication skills. Ability to work in a fast-paced, team-oriented environment. 
Strong understanding of the business and a passion for the mission of Service Supply Chain. Hands-on experience with Jira, DevOps ticketing, and ServiceNow is good to have.
Roles and Responsibilities: Innovate. Collaborate. Build. Create. Solve. Support ADF and associated systems: ensure systems meet business requirements and industry practices; integrate new data management technologies and software engineering tools into existing structures; recommend ways to improve data reliability, efficiency, and quality; use large data sets to address business issues; use data to discover tasks that can be automated; fix bugs to ensure a robust and sustainable codebase. Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance. Analyze and comprehend existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively. Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines. Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives. Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure (see the sketch after this posting). Coordinate with data architects and other team members to ensure that changes are in line with the overall architecture and data strategy. Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance. Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement. Stay updated with the latest technologies and practices in ADF to continuously improve the support and maintenance of data systems. Flexible working hours, including US time zones; this position may require you to work a rotational on-call schedule, evenings, weekends, and holiday shifts when the need arises. Participate in the Demand Management and Change Management processes. Work in partnership with internal business, external third-party technical teams, and functional teams as a technology partner in communicating and coordinating delivery of technology services from Technology for Operations (TfO).
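For context on the pipeline-monitoring duties above, here is a minimal sketch of querying an ADF factory for recent failed pipeline runs, assuming the azure-identity and azure-mgmt-datafactory Python SDKs; the subscription, resource group, and factory names are placeholders, not details from the posting.

```python
# Minimal sketch: list failed ADF pipeline runs from the last 24 hours.
# Assumes the azure-identity and azure-mgmt-datafactory packages;
# subscription/resource-group/factory names below are placeholders.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters, RunQueryFilter

credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, "<subscription-id>")

now = datetime.now(timezone.utc)
filters = RunFilterParameters(
    last_updated_after=now - timedelta(hours=24),
    last_updated_before=now,
    filters=[RunQueryFilter(operand="Status", operator="Equals", values=["Failed"])],
)

runs = client.pipeline_runs.query_by_factory(
    resource_group_name="<resource-group>",
    factory_name="<factory-name>",
    filter_parameters=filters,
)
for run in runs.value:
    # Surface enough detail to triage: which pipeline failed, which run, and why.
    print(run.pipeline_name, run.run_id, run.message)
```

In practice a check like this would feed the ticketing tools the posting mentions (Jira, DevOps, ServiceNow) rather than printing to stdout.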
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Flex is the diversified manufacturing partner of choice that helps market-leading brands design, build and deliver innovative products that improve the world. We believe in the power of diversity and inclusion and cultivate a workplace culture of belonging that views uniqueness as a competitive edge and builds a community that enables our people to push the limits of innovation to make great products that create value and improve people's lives. A career at Flex offers the opportunity to make a difference and invest in your growth in a respectful, inclusive, and collaborative environment. If you are excited about a role but don't meet every bullet point, we encourage you to apply and join us to create the extraordinary.
The "Business Intelligence Developer" will be remote-based (with very occasional travel to sites), responsible for maintaining and improving the organisation's business intelligence systems to ensure that they function reliably and in accordance with user needs.
What a typical day looks like: Support the Senior Business Intelligence Developer to maintain and improve the Power BI suite of reports for the business. Develop, test, review and help deploy automated reports and dashboards using Power BI and other reporting tools. Understand business requirements to set functional specifications for reporting applications. Exhibit an understanding of database concepts such as relational database architecture and multidimensional database design. Design data models that transform raw data into insightful knowledge by understanding business requirements in the context of BI. Develop technical specifications from business needs, and accurately scope the work to help set realistic deadlines for work completion. Make charts and data documentation that include descriptions of the techniques, parameters, models, and relationships. Use Power BI Desktop to create dashboards, KPI scorecards, and visual reports. Examine, comprehend, and study business needs as they relate to business intelligence. Design and map data models to transform raw data into insightful information. Create dynamic and eye-catching dashboards and reports using Power BI. Make necessary tactical and technological adjustments to enhance current business intelligence systems. Integrate data, alter data, and connect to data sources for business intelligence.
The experience we're looking to add to our team: Knowledge of SSRS and T-SQL, Power Query, MDX, Power BI, and DAX, and systems on the MS SQL Server BI Stack. Good communication skills are necessary to effectively work with stakeholders, end users and all levels of the organisation who request reports. Ability to run a Power BI project end to end through all stages, from requirements gathering to report deployment. Exceptional analytical thinking skills for converting data into illuminating reports and dashboards. Some knowledge of data warehousing, data gateway, and data preparation projects. Good knowledge of Power BI, and desirably knowledge of the SSAS, SSRS, and SSIS components of the Microsoft Business Intelligence Stack. Articulating, representing, and analysing solutions with the team while documenting, creating, and modelling them. Strong understanding of DAX, intermediate knowledge of SQL and M-Query, and a basic understanding of Python. A basic understanding of ADF or Fabric Pipelines.
Comprehensive understanding of data modelling, administration, and visualisation. Capacity to perform in an atmosphere where agility and continual development are prioritised. Awareness of BI technologies (e.g., Microsoft Power BI, Oracle BI). Expertise in SQL queries, SSRS, and SQL Server Integration Services (SSIS). Troubleshooting and problem-solving skills. Demonstrates basic functional, technical and people and/or process management skills, as well as customer (external and internal) relationship skills. Demonstrates skills in the functional/technical area. Use of the following tools may be required: office skills such as telephones, data entry, and office software, including word processing, spreadsheets, presentation packages and database systems.
What you'll receive for the great work you provide: Health Insurance. Paid Time Off.
Flex is an Equal Opportunity Employer and employment selection decisions are based on merit, qualifications, and abilities. We celebrate diversity and do not discriminate based on: age, race, religion, color, sex, national origin, marital status, sexual orientation, gender identity, veteran status, disability, pregnancy status, or any other status protected by law. We're happy to provide reasonable accommodations to those with a disability for assistance in the application process. Please email accessibility@flex.com and we'll discuss your specific situation and next steps (NOTE: this email does not accept or consider resumes or applications. This is only for disability assistance. To be considered for a position at Flex, you must complete the application process first).
Posted 1 week ago
0 years
0 Lacs
India
Remote
Senior Data Functional Consultant - Fully Remote - 6 Month Contract
Role: Senior Data Functional Consultant. Client Location: Dubai. Work Location: Fully Remote. Duration: 6 Months, extendable. Monthly Rate: $2000 USD.
Our Dubai-based client is seeking a proactive and business-oriented Data Functional Consultant with strong experience in Azure Data Factory and Azure Databricks. This role bridges the gap between business stakeholders and technical teams, translating business needs into scalable data solutions, ensuring effective data management, and enabling insights-driven decision-making. The ideal candidate is not a pure developer or data engineer but someone who understands business processes, data flows, and stakeholder priorities, and can help drive value from data platforms using cloud-native Azure services.
Experience Required: • Proven exposure to Azure Data Factory (ADF) for orchestrating data workflows. • Practical experience with Azure Databricks for data processing (functional understanding, not necessarily coding). • Strong understanding of data warehousing, data modeling, and business KPIs. • Experience working in agile or hybrid project environments. • Excellent communication and stakeholder management skills. • Ability to translate complex technical details into business-friendly language. • Familiarity with tools like Power BI, Excel, or other reporting solutions is a plus. • Background in the Banking or Finance industries is a bonus.
Responsibilities: • Collaborate closely with business stakeholders to gather, understand, and document functional data requirements. • Translate business needs into high-level data design, data workflows, and process improvements. • Work with data engineering teams to define and validate ETL/ELT logic and data pipeline workflows using Azure Data Factory and Databricks. • Facilitate functional workshops and stakeholder meetings to align on data needs and business KPIs. • Act as a bridge between business teams and data engineers to ensure accurate implementation and delivery of data solutions. • Conduct data validation, UAT, and support users in adopting data platforms and self-service analytics (see the sketch after this posting). • Maintain functional documentation, data dictionaries, and mapping specifications. • Assist in defining data governance, data quality, and master data management practices from a business perspective. • Monitor data pipeline health and help triage issues from a functional/business impact standpoint.
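To make the data-validation duty concrete, here is a minimal sketch of a functional reconciliation check using pandas; the file names and the order_id key column are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch of a functional data-validation check (row counts and
# key coverage) between a source extract and a curated target extract.
# File names and the "order_id" key column are hypothetical placeholders.
import pandas as pd

source = pd.read_csv("source_extract.csv")
target = pd.read_csv("target_extract.csv")

# Reconcile row counts first: the cheapest signal that a pipeline dropped data.
print(f"source rows={len(source)}, target rows={len(target)}")

# Then check key coverage: which business keys never made it to the target?
missing_keys = set(source["order_id"]) - set(target["order_id"])
if missing_keys:
    print(f"{len(missing_keys)} keys missing from target, e.g. {sorted(missing_keys)[:5]}")
else:
    print("all source keys present in target")
```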
Posted 1 week ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
This role is for one of Weekday's clients.
Salary range: Rs 2200000 - Rs 2400000 (i.e., INR 22-24 LPA). Min Experience: 5 years. Location: Pune, Bengaluru, Chennai, Kolkata, Gurgaon. Job Type: full-time.
We are looking for an experienced Snowflake Developer to join our Data Engineering team. The ideal candidate will possess a deep understanding of Data Warehousing, SQL, ETL tools like Informatica, and visualization platforms such as Power BI. This role involves building scalable data pipelines, optimizing data architectures, and collaborating with cross-functional teams to deliver impactful data solutions.
Key Responsibilities:
- Data Engineering & Warehousing: Leverage over 5 years of hands-on experience in Data Engineering with a focus on Data Warehousing and Business Intelligence.
- Pipeline Development: Design and maintain ELT pipelines using Snowflake, Fivetran, and DBT to ingest and transform data from multiple sources.
- SQL Development: Write and optimize complex SQL queries and stored procedures to support robust data transformations and analytics.
- Data Modeling & ELT: Implement advanced data modeling practices, including SCD Type-2, and build high-performance ELT workflows using DBT (see the sketch after this posting).
- Requirement Analysis: Partner with business stakeholders to capture data needs and convert them into scalable technical solutions.
- Data Quality & Troubleshooting: Conduct root cause analysis on data issues, maintain high data integrity, and ensure reliability across systems.
- Collaboration & Documentation: Collaborate with engineering and business teams; develop and maintain thorough documentation for pipelines, data models, and processes.
Skills & Qualifications: Expertise in Snowflake for large-scale data warehousing and ELT operations. Strong SQL skills with the ability to create and manage complex queries and procedures. Proven experience with Informatica PowerCenter for ETL development. Proficiency with Power BI for data visualization and reporting. Hands-on experience with Fivetran for automated data integration. Familiarity with DBT, Sigma Computing, Tableau, and Oracle. Solid understanding of data analysis, requirement gathering, and source-to-target mapping. Knowledge of cloud ecosystems such as Azure (including ADF and Databricks); experience with AWS or GCP is a plus. Experience with workflow orchestration tools like Airflow, Azkaban, or Luigi. Proficiency in Python for scripting and data processing (Java or Scala is a plus). Bachelor's or Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related field.
Key Tools & Technologies: Snowflake, SnowSQL, Snowpark, SQL, Informatica, Power BI, DBT, Python, Fivetran, Sigma Computing, Tableau, Airflow, Azkaban, Azure, Databricks, ADF
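As an illustration of the SCD Type-2 pattern named in this posting, here is a minimal sketch using the snowflake-connector-python package; the connection parameters, table names, and columns are hypothetical placeholders, not the client's schema.

```python
# Minimal SCD Type-2 sketch: expire changed current rows, then insert new
# versions. Assumes the snowflake-connector-python package; credentials,
# table names, and columns below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<warehouse>", database="<database>", schema="<schema>",
)

EXPIRE_CHANGED = """
    MERGE INTO dim_customer d
    USING stg_customer s
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHEN MATCHED AND d.address <> s.address THEN UPDATE SET
      is_current = FALSE,
      valid_to = CURRENT_TIMESTAMP()
"""

INSERT_NEW_VERSIONS = """
    INSERT INTO dim_customer (customer_id, address, valid_from, valid_to, is_current)
    SELECT s.customer_id, s.address, CURRENT_TIMESTAMP(), NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL
"""

cur = conn.cursor()
try:
    cur.execute(EXPIRE_CHANGED)       # step 1: close out rows whose attributes changed
    cur.execute(INSERT_NEW_VERSIONS)  # step 2: open a fresh current row per new/changed key
finally:
    cur.close()
    conn.close()
```

Running the expiry step before the insert is what keeps exactly one is_current row per business key.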
Posted 1 week ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description
Description of role and key responsibilities: The candidate will be required to deliver to all stages of the data engineering process – data ingestion, transformation, data modelling and data warehousing – and build self-service data products. The role is a mix of Azure cloud delivery and on-prem (SQL) development. Ultimately all on-prem will be migrated to cloud and decommissioned, but we are only part way along that journey. There will be a dual reporting line between the main business technology area (Asset Lending), providing the day-to-day direction and management of work items, and the Head of Data for Corporate Banking Technology, who will provide guidance on overall Data strategy and alignment with the wider bank. The role itself will work closely with our Architect, Engineering lead, Analytics team, DevOps, DBAs, and upstream Application teams in Asset Finance, Working Capital and ABL.
Specifically, the person will: Work closely with end-users and Data Analysts to understand the business and their data requirements. Carry out ad hoc data analysis and 'data wrangling' using Synapse Analytics and Databricks. Build dynamic, metadata-driven data ingestion patterns using Azure Data Factory and Databricks (see the sketch after this posting). Build and maintain the Enterprise Data Warehouse (using the Data Vault 2.0 methodology). Build and maintain business-focused data products and data marts. Build and maintain Azure Analysis Services databases and cubes. Share support and operational duties within the wider engineering and data teams. Work with Architecture and Engineering teams to deliver on these projects, and ensure that supporting code and infrastructure follow best practices outlined by these teams. Help define test criteria to establish clear conditions for success and ensure alignment with business objectives. Manage user stories and acceptance criteria through to production and into day-to-day support. Assist in the testing and validation of new requirements and processes to ensure they meet business needs. Stay up to date with industry trends and best practices in data engineering.
Core skills and knowledge: Excellent data analysis and exploration using T-SQL. Strong SQL programming (stored procedures, functions). Extensive experience with SQL Server and SSIS. Knowledge and experience of data warehouse modelling methodologies (Kimball, dimensional modelling, Data Vault 2.0). Experience in Azure with one or more of the following: Data Factory, Databricks, Synapse Analytics, ADLS Gen2. Experience in building robust and performant ETL processes. Ability to build and maintain Analysis Services databases and cubes (both multidimensional and tabular). Experience in using source control and ADO. Understanding and experience of deployment pipelines. Excellent analytical and problem-solving skills, with the ability to think critically and strategically. Strong communication and interpersonal skills, with the ability to engage and influence stakeholders at all levels. Acts with integrity at all times and embraces the philosophy of treating our customers fairly. Analytical, with the ability to arrive at solutions that fit current and future business processes. Effective writing and verbal communication. Organisational skills: the ability to effectively manage and coordinate themselves. Ownership and self-motivation. Delivery focus. Assertive, resilient and persistent. Team oriented. Deals well with pressure and is highly effective at multi-tasking and juggling priorities.
Any other attributes that would be helpful, but not essential for the role:
Deeper programming ability (C#, .NET Core); building 'infrastructure-as-code' deployment pipelines; Asset Finance knowledge; Vehicle Finance knowledge; ABL and Working Capital knowledge; any financial services and banking experience.
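To illustrate the metadata-driven ingestion pattern this role describes, here is a minimal PySpark sketch; the control rows, paths, and table names are hypothetical placeholders (in practice the control rows would come from a config table or file, not a hard-coded list).

```python
# Minimal sketch of metadata-driven ingestion: a control list describes each
# source, and one generic loop lands every table into the lake as Delta.
# Paths and table names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata_driven_ingest").getOrCreate()

control_rows = [
    {"source_path": "/mnt/raw/crm/customers", "format": "parquet", "target": "bronze.customers"},
    {"source_path": "/mnt/raw/erp/invoices",  "format": "csv",     "target": "bronze.invoices"},
]

for row in control_rows:
    reader = spark.read.format(row["format"])
    if row["format"] == "csv":
        reader = reader.option("header", "true")
    df = reader.load(row["source_path"])
    # Idempotent land: overwrite keeps the loop re-runnable during backfills.
    df.write.format("delta").mode("overwrite").saveAsTable(row["target"])
```

The appeal of the pattern is that onboarding a new source becomes a new control row rather than a new pipeline.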
Posted 1 week ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Overview: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities: 1. Design, develop, test, and maintain scalable ETL data pipelines using Python. 2. Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery. 3. Implement and enforce data quality checks, validation rules, and monitoring. 4. Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions. 5. Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects. 6. Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL. 7. Document pipeline designs, data flow diagrams, and operational support procedures. 8. Work extensively on Google Cloud Platform (GCP) services (see the sketch after this posting), such as: Dataflow for real-time and batch data processing; Cloud Functions for lightweight serverless compute; BigQuery for data warehousing and analytics; Cloud Composer for orchestration of data workflows (Apache Airflow); Google Cloud Storage (GCS) for managing data at scale; IAM for access control and security; Cloud Run for containerized applications.
Required Skills: 4–6 years of hands-on experience in Python for backend or data engineering projects. Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.). Solid understanding of data pipeline architecture, data integration, and transformation techniques. Experience in working with version control systems like GitHub and knowledge of CI/CD practices. Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
Good to Have (Optional Skills): Experience working with the Snowflake cloud data platform. Hands-on knowledge of Databricks for big data processing and analytics. Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
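As a flavor of the BigQuery work listed under responsibility 8, here is a minimal sketch using the google-cloud-bigquery client; the bucket, project, dataset, and table names are placeholders.

```python
# Minimal sketch: load a CSV from GCS into BigQuery, then validate row counts.
# Assumes the google-cloud-bigquery package and application-default credentials;
# bucket, project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the file for this sketch
)
load_job = client.load_table_from_uri(
    "gs://<bucket>/exports/orders.csv",
    "<project>.<dataset>.orders",
    job_config=job_config,
)
load_job.result()  # block until the load completes

# A basic data-quality check: the table should not be empty after the load.
result = client.query("SELECT COUNT(*) AS n FROM `<project>.<dataset>.orders`").result()
print(f"loaded {next(iter(result)).n} rows")
```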
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Data Engineer – Databricks, Delta Live Tables, Data Pipelines. Location: Bhopal / Hyderabad / Pune (On-site). Experience Required: 5+ Years. Employment Type: Full-Time.
Job Summary: We are seeking a skilled and experienced Data Engineer with a strong background in designing and building data pipelines using Databricks and Delta Live Tables. The ideal candidate should have hands-on experience in managing large-scale data engineering workloads and building scalable, reliable data solutions in cloud environments.
Key Responsibilities: Design, develop, and manage scalable and efficient data pipelines using Databricks and Delta Live Tables (a minimal sketch follows this posting). Work with structured and unstructured data to enable analytics and reporting use cases. Implement data ingestion, transformation, and cleansing processes. Collaborate with Data Architects, Analysts, and Data Scientists to ensure data quality and integrity. Monitor data pipelines and troubleshoot issues to ensure high availability and performance. Optimize queries and data flows to reduce costs and increase efficiency. Ensure best practices in data security, governance, and compliance. Document architecture, processes, and standards.
Required Skills: Minimum 5 years of hands-on experience in data engineering. Proficient in Apache Spark, Databricks, Delta Lake, and Delta Live Tables. Strong programming skills in Python or Scala. Experience with cloud platforms such as Azure, AWS, or GCP. Proficient in SQL for data manipulation and analysis. Experience with ETL/ELT pipelines, data wrangling, and workflow orchestration tools (e.g., Airflow, ADF). Understanding of data warehousing, big data ecosystems, and data modeling concepts. Familiarity with CI/CD processes in a data engineering context.
Nice to Have: Experience with real-time data processing using tools like Kafka or Kinesis. Familiarity with machine learning model deployment in data pipelines. Experience working in an Agile environment.
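For flavor, here is a minimal Delta Live Tables pipeline in Python, assuming it runs inside a Databricks DLT pipeline where the dlt module and the spark session are provided by the runtime; the source path and the quality rule are hypothetical.

```python
# Minimal Delta Live Tables sketch (runs inside a Databricks DLT pipeline,
# where the `dlt` module and `spark` session are provided by the runtime).
# The source path and the quality expectation are hypothetical placeholders.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw orders ingested incrementally with Auto Loader.")
def orders_raw():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/raw/orders")
    )

@dlt.table(comment="Orders with basic cleansing applied.")
@dlt.expect_or_drop("valid_amount", "amount >= 0")  # drop rows failing the rule
def orders_clean():
    return dlt.read_stream("orders_raw").withColumn("amount", col("amount").cast("double"))
```

Declaring tables and expectations this way lets the DLT runtime own orchestration, retries, and data-quality tracking rather than hand-written driver code.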
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are seeking an experienced and strategic Data Architect to design, build, and optimize scalable, secure, and high-performance data solutions. You will play a pivotal role in shaping our data infrastructure, working with technologies such as Databricks, Azure Data Factory, SQL, and PySpark.
Key Responsibilities: • Design and develop scalable data pipelines using Databricks and the Medallion architecture (Bronze, Silver, Gold layers) — illustrated in the sketch after this posting. • Write efficient PySpark and SQL code for data transformation, cleansing, and enrichment. • Build and manage data workflows in Azure Data Factory (ADF), including triggers, linked services, and integration runtimes. • Optimize queries and data structures for performance and cost-efficiency. • Develop and maintain CI/CD pipelines using GitHub for automated deployment and version control. • Collaborate with cross-functional teams to define data strategies and drive data quality initiatives. • Implement best practices for DevOps, CI/CD, and infrastructure-as-code in data engineering. • Troubleshoot and resolve performance bottlenecks across Spark, ADF, and Databricks pipelines.
Requirements: • Bachelor's or master's degree in Computer Science, Information Systems, or a related field. • Proven experience as a Data Architect or Senior Data Engineer. • Strong knowledge of Databricks, Azure Data Factory, Spark (PySpark), and SQL. • Hands-on experience with data governance, security frameworks, and catalog management. • Proficiency in cloud platforms (preferably Azure). • Experience with CI/CD tools and version control systems like GitHub. • Strong communication and collaboration skills.
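To illustrate the Medallion flow in the first responsibility, here is a minimal PySpark sketch promoting Bronze records to a cleansed Silver table; the table names and cleansing rules are hypothetical placeholders.

```python
# Minimal Medallion sketch: promote raw Bronze records to a cleansed,
# deduplicated Silver table. Table names and rules are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

bronze = spark.read.table("bronze.orders")

silver = (
    bronze
    .where(col("order_id").isNotNull())               # drop records missing the key
    .withColumn("order_date", to_date("order_date"))  # normalize types
    .dropDuplicates(["order_id"])                     # one row per business key
)

silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```

A Gold layer would typically follow the same shape: read Silver, aggregate to business-ready marts, and write Delta.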
Posted 1 week ago
8.0 years
0 Lacs
Greater Kolkata Area
On-site
Location: PAN India. Duration: 6 Months. Experience Required: 7–8 years.
Job Summary: We are looking for an experienced SSAS Developer with strong expertise in developing both OLAP and Tabular models using SQL Server Analysis Services (SSAS), alongside advanced ETL development skills using tools like SSIS, Informatica, or Azure Data Factory. The ideal candidate will be well-versed in T-SQL, dimensional modeling, and building high-performance, scalable data solutions.
Key Responsibilities: Design, build, and maintain SSAS OLAP cubes and Tabular models. Create complex DAX and MDX queries for analytical use cases. Develop robust ETL workflows and pipelines using SSIS, Informatica, or ADF. Collaborate with cross-functional teams to translate business requirements into BI solutions. Optimize SSAS models for scalability and performance. Implement best practices in data modeling, version control, and deployment automation. Support dashboarding and reporting needs via Power BI, Excel, or Tableau. Maintain and troubleshoot data quality, performance, and integration issues.
Must-Have Skills: Hands-on experience with SSAS (Tabular & Multidimensional). Proficiency in DAX, MDX, and T-SQL. Advanced ETL skills using SSIS / Informatica / Azure Data Factory. Knowledge of dimensional modeling (star and snowflake schemas). Experience with Azure SQL / MS SQL Server. Familiarity with Git and CI/CD pipelines.
Nice to Have: Exposure to cloud data platforms (Azure Synapse, Snowflake, AWS Redshift). Working knowledge of Power BI or similar BI tools. Understanding of Agile/Scrum methodology. Bachelor's degree in Computer Science, Information Systems, or equivalent.
Posted 1 week ago
The job market for ADF (Application Development Framework) professionals in India is witnessing significant growth, with numerous opportunities available for job seekers in this field. ADF is a popular framework used for building enterprise applications, and companies across various industries are actively looking for skilled professionals to join their teams.
Here are 5 major cities in India where there is a high demand for ADF professionals: - Bangalore - Hyderabad - Pune - Chennai - Mumbai
The estimated salary range for ADF professionals in India varies based on experience levels: - Entry-level: INR 4-6 lakhs per annum - Mid-level: INR 8-12 lakhs per annum - Experienced: INR 15-20 lakhs per annum
In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.
In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.
Here are sample interview questions for ADF roles, categorized by difficulty level: - Basic: - What is ADF and what are its key features? - What is the difference between ADF Faces and ADF Task Flows? - Medium: - Explain the lifecycle of an ADF application. - How do you handle exceptions in ADF applications? - Advanced: - Discuss the advantages of using ADF Business Components. - How would you optimize performance in an ADF application?
As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!