Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role: Azure Data Engineer
Location: Gurugram

We’re looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

🧠 What You'll Do
🔹 Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory
🔹 Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable
🔹 Work on modern data lakehouse architectures and contribute to data governance and quality frameworks

🎯 Tech Stack
☁️ Azure | 🧱 Databricks | 🐍 PySpark | 📊 SQL

👤 What We’re Looking For
✅ 3+ years of experience in data engineering or analytics engineering
✅ Hands-on with cloud data platforms and large-scale data processing
✅ Strong problem-solving mindset and a passion for clean, efficient data design

Job Description:
- Minimum 3 years of experience with modern data engineering / data warehousing / data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc. Azure experience is preferred over other cloud platforms.
- 5 years of proven experience with SQL, schema design, and dimensional data modelling
- Solid knowledge of data warehouse best practices, development standards, and methodologies
- Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
- Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL
- An independent self-learner with a “let’s get this done” approach and the ability to work in a fast-paced, dynamic environment
- Excellent communication and teamwork abilities

Nice-to-Have Skills:
- Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Cosmos DB knowledge
- SAP ECC / S/4 and HANA knowledge
- Intermediate knowledge of Power BI
- Azure DevOps and CI/CD deployments, cloud migration methodologies and processes

Best Regards,
Santosh Cherukuri
Email: scherukuri@bayonesolutions.com
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

Job Name: Senior Data Engineer – Azure
Years of Experience: 5

Job Description: We are looking for a skilled and experienced Senior Azure Developer to join our team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!

Primary Skills: ADF, Databricks
Secondary Skills: DBT, Python, Databricks, Airflow, Fivetran, Glue, Snowflake

Role Description: This data engineering role involves creating and managing the technological infrastructure of a data platform; architecting, building, and managing data flows/pipelines; and constructing data storage (NoSQL, SQL), big data tooling (Hadoop, Kafka), and integration tools to connect sources and other databases.

Role Responsibility:
- Translate functional specifications and change requests into technical specifications
- Translate business requirement documents, functional specifications, and technical specifications into related coding
- Develop efficient code with unit testing and code documentation
- Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving
- Set up the development environment and configure the development tools
- Communicate with all project stakeholders on project status
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs
- Contribute to the automation of modules wherever required
- Be proficient in written, verbal, and presentation communication (English)
- Coordinate with the UAT team

Role Requirement:
- Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.)
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
- Knowledgeable in Shell / PowerShell scripting
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores
- Knowledgeable in performance tuning and optimization
- Experience in data profiling and data validation
- Experience in requirements gathering and documentation processes, and in performing unit testing
- Understanding and implementing QA and various testing processes in the project
- Knowledge of any BI tool is an added advantage
- Sound aptitude, outstanding logical reasoning, and analytical skills
- Willingness to learn and take initiative
- Ability to adapt to a fast-paced Agile environment

Additional Requirement:
- Demonstrated expertise as a Data Engineer, specializing in Azure cloud services
- Highly skilled in Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics
- Create and execute efficient, scalable, and dependable data pipelines using Azure Data Factory
- Utilize Azure Databricks for data transformation and processing
- Effectively oversee and enhance data storage solutions, emphasizing Azure Data Lake and other Azure storage services
- Construct and maintain workflows for data orchestration and scheduling using Azure Data Factory or equivalent tools
- Proficient in programming languages like Python and SQL, and conversant with pertinent scripting languages
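The data-warehouse concepts this role names (change data capture, slowly changing dimensions) can be illustrated with a minimal, framework-free sketch of a Type 2 SCD merge. This is not any employer's actual implementation; the table shape and column names (`valid_from`, `valid_to`, `is_current`) are hypothetical, and a real pipeline would express the same logic as a SQL `MERGE` or a Delta Lake merge:

```python
from datetime import date

def scd2_merge(dimension, incoming, key, tracked, today=None):
    """Apply a Type 2 SCD merge: expire changed rows, insert new versions.

    dimension: list of dicts carrying 'valid_from', 'valid_to', 'is_current'.
    incoming:  list of dicts holding the latest source snapshot.
    key:       natural-key column; tracked: columns whose change creates a version.
    """
    today = today or date.today().isoformat()
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        cur = current.get(rec[key])
        if cur and all(cur[c] == rec[c] for c in tracked):
            continue                          # unchanged: keep the current version
        if cur:                               # changed: close out the old version
            cur["valid_to"] = today
            cur["is_current"] = False
        dimension.append({**rec, "valid_from": today,
                          "valid_to": None, "is_current": True})
    return dimension

# A customer moving city yields two rows: the expired old version and a
# current new one, so history is preserved for point-in-time reporting.
dim = [{"cust_id": 1, "city": "Pune", "valid_from": "2024-01-01",
        "valid_to": None, "is_current": True}]
scd2_merge(dim, [{"cust_id": 1, "city": "Chennai"}],
           "cust_id", ["city"], today="2025-01-01")
```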
Posted 1 week ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We have an exciting job opportunity – Lead Data Engineer: Snowflake

Experience: 7 to 10 years only
Mandatory Skills: minimum 3+ years of experience with Snowflake (either migrating from another DB to Snowflake or pulling data into Snowflake); SQL; any ETL tool (SSIS, Talend, Informatica, Databricks, ADF (preferred), etc.); team-handling experience
Location: Hyderabad / Chennai
Mode: Hybrid
Notice Period: Immediate joiner to 15 days

Job Summary:
· We are seeking an experienced Lead Snowflake Data Engineer to join our Data & Analytics team. This role involves designing, implementing, and optimizing Snowflake-based data solutions while providing strategic direction and leadership to a team of junior and mid-level data engineers. The ideal candidate will have deep expertise in Snowflake, cloud data platforms, ETL/ELT processes, and Medallion data architecture best practices. The lead data engineer role has a strong focus on performance optimization, security, scalability, and Snowflake credit control and management. This is a tactical role requiring independent, in-depth data analysis and data discovery to understand our existing source systems and fact and dimension data models, and to implement an enterprise data warehouse solution in Snowflake.

Essential Functions and Tasks:
· Lead the design, development, and maintenance of a scalable Snowflake data solution serving our enterprise data & analytics team.
· Architect and implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies.
· Optimize Snowflake database performance, storage, and security.
· Provide guidance on Snowflake best practices.
· Collaborate with cross-functional teams of data analysts, business analysts, data scientists, and software engineers to define and implement data solutions.
· Ensure data quality, integrity, and governance across the organization.
· Provide technical leadership and mentorship to junior and mid-level data engineers.
· Troubleshoot and resolve data-related issues, ensuring high availability and performance of the data platform.

Education and Experience Requirements:
· Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
· 7+ years of in-depth data engineering experience, with at least 3 years of dedicated experience engineering solutions in a Snowflake environment.
· Tactical expertise in ANSI SQL, performance tuning, and data modeling techniques.
· Strong experience with cloud platforms (preference for Azure) and their data services.
· Proficiency in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran.
· Hands-on experience with scripting languages like Python for data processing.
· Strong understanding of data governance, security, and compliance best practices.
· Snowflake SnowPro certification; preference for the engineering course path.
· Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC).
· Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming.
· Familiarity with BI and visualization tools such as Power BI.

Knowledge, Skills, and Abilities:
· Familiarity working in an agile Scrum team, including sprint planning, daily stand-ups, backlog grooming, and retrospectives.
· Ability to self-manage large, complex deliverables and document user stories and tasks through Azure DevOps.
· Personal accountability for committed sprint user stories and tasks.
· Strong analytical and problem-solving skills, with the ability to handle complex data challenges.
· Ability to read, understand, and apply state/federal laws, regulations, and policies.
· Ability to communicate with diverse personalities in a tactful, mature, and professional manner.
· Ability to remain flexible and work within a collaborative and fast-paced environment.
· Understand and comply with company policies and procedures.
· Strong oral, written, and interpersonal communication skills.
· Strong time management and organizational skills.

About Our Client: Our client is a leading business solutions provider for facility-based physicians practicing anesthesia, emergency medicine, hospital medicine, and now radiology, through the recent combining of forces with Advocate RCM. Focused on Revenue Cycle Management and Advisory services, and having grown consistently every year, they now have over 5,000 employees, headquartered in Dallas, US.

Kindly share your updated resume: ankita.jaiswal@firstwave-tech.com
Posted 1 week ago
3.0 years
0 Lacs
India
Remote
Title: Azure Data Engineer
Location: Remote
Employment type: Full Time with BayOne

We’re looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do
- Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory
- Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable
- Work on modern data lakehouse architectures and contribute to data governance and quality frameworks

Tech Stack
Azure | Databricks | PySpark | SQL

What We’re Looking For
- 3+ years of experience in data engineering or analytics engineering
- Hands-on with cloud data platforms and large-scale data processing
- Strong problem-solving mindset and a passion for clean, efficient data design

Job Description:
- Minimum 3 years of experience with modern data engineering / data warehousing / data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc. Azure experience is preferred over other cloud platforms.
- 5 years of proven experience with SQL, schema design, and dimensional data modelling
- Solid knowledge of data warehouse best practices, development standards, and methodologies
- Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
- Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL
- An independent self-learner with a “let’s get this done” approach and the ability to work in a fast-paced, dynamic environment
- Excellent communication and teamwork abilities

Nice-to-Have Skills:
- Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Cosmos DB knowledge
- SAP ECC / S/4 and HANA knowledge
- Intermediate knowledge of Power BI
- Azure DevOps and CI/CD deployments, cloud migration methodologies and processes

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, disability, or any federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
Posted 1 week ago
12.0 - 15.0 years
0 Lacs
India
On-site
Data Analyst

Experience: 12 to 15 years
Location: Bangalore, Pune, Hyderabad, Chennai, Noida, Kolkata, Mumbai
Interview mode: 1st round virtual; L2 or HR round will be F2F

Mandatory Skills:
- Data analysis – minimum 10 yr experience required
- Data profiling – minimum 5 yr experience required
- SQL – minimum 8 yr experience required
- Data quality tools (IDMC or Alteryx) – minimum 5 yr experience required

JD

Primary Skills:
- Strong proficiency in SQL
- Data profiling and data quality tools (e.g., IDMC, Alteryx)
- Excellent communication and collaboration skills

Secondary Skills:
- Experience with ADF, Snowflake, and Databricks
- Knowledge of Spark and any ETL tools (SSIS, Informatica, etc.)
- Power BI for data analysis and reporting
- Domain knowledge in Claims and Insurance

Job Responsibilities:
- Act as the primary point of contact for platform-related inquiries
- Communicate platform updates and changes to relevant teams and stakeholders
- Collaborate effectively with multiple stakeholders across teams
- Facilitate coordination between development teams and other departments dependent on the platform
- Work within Agile practices to ensure timely and efficient project delivery
- Utilize data profiling and quality tools to ensure integrity and consistency of data

Good to Have: Strong understanding of the Insurance domain and Claims processing
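The data-profiling work called out in this posting boils down to computing summary metrics per column: row counts, null counts, distinct counts, value ranges. A minimal stdlib-only sketch of that idea follows; it makes no claim to match the IDMC or Alteryx feature set, and the `claims`/`amount` names are hypothetical:

```python
def profile_column(rows, column):
    """Compute basic profile metrics for one column of row-dicts:
    row count, null count, distinct count, and min/max when values compare."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    profile = {
        "rows": len(values),
        "nulls": len(values) - len(non_null),   # missing-value count
        "distinct": len(set(non_null)),         # cardinality of observed values
    }
    if non_null:
        profile["min"], profile["max"] = min(non_null), max(non_null)
    return profile

# Profiling a toy claims dataset:
claims = [{"amount": 120}, {"amount": None}, {"amount": 120}, {"amount": 75}]
profile_column(claims, "amount")
# → {'rows': 4, 'nulls': 1, 'distinct': 2, 'min': 75, 'max': 120}
```

A real data-quality tool layers rules on top of such metrics, e.g. "nulls must be 0 for claim IDs" or "amount must be non-negative".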
Posted 1 week ago
40.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

Analyze, design, develop, and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications.

Career Level - IC3

Responsibilities

Preferred Qualifications: Oracle Applications Lab (OAL) has a central role within Oracle. Its role is to work with Product Development and Oracle's internal business to deliver Oracle products for Oracle to use internally. OAL implements Oracle applications, databases, and middleware; supports Oracle applications for Oracle internally; and configures Oracle applications to meet Oracle's specific needs. OAL also provides a showcase for Oracle's products.

The role will involve:
- Working as part of a global team to implement and support new business applications for HR and Payroll
- Debugging and solving sophisticated problems, and working closely with Oracle Product Development and other groups to implement solutions
- Developing and implementing product extensions and customizations
- Testing new releases
- Providing critical production support

Your skills should include:
- Experience in designing and supporting Oracle E-Business Suite and Fusion applications, preferably Oracle HRMS / Fusion HCM
- Strong Oracle technical skills: SQL, PL/SQL, Java, XML, ADF, SOA, etc.
- Communicating confidently with peers and management within technical and business teams

Detailed Description and Job Requirements: Work with Oracle's world-class technology to develop, implement, and support Oracle's global infrastructure. As a member of the IT organization, assist with the analysis of existing complex programs and formulate logic for new complex internal systems. Prepare flowcharts, perform coding, and test/debug programs. Develop conversion and system implementation plans. Recommend changes to development, maintenance, and system standards. Job duties are varied and complex, requiring independent judgment. May have a project lead role.

BS or equivalent experience in programming on enterprise or department servers or systems.

About Us

As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 week ago
12.0 years
0 Lacs
India
On-site
Job Title: Lead / Architect – Azure Data Factory
Location: Mumbai / Pune / Bangalore / Hyderabad / Chennai / Kolkata / Noida
Experience: 12 to 15 years
Interview Mode: 1 virtual / 1 F2F
Mandatory Skills: Azure Data Factory (8+ yrs), Data Migration & Integration (8+ yrs), Azure DevOps (5+ yrs), Lead or Architect experience (4+ yrs)

Job Description
We are looking for a seasoned data engineering professional with deep expertise in Azure Data Factory (ADF) and related Azure services. The ideal candidate will lead end-to-end data integration, orchestration, and migration initiatives across enterprise environments. This role demands strong experience in designing robust ADF pipelines, migrating data across tenants, and implementing best practices in CI/CD and data governance.

Primary Responsibilities

Azure Data Factory (ADF) Expertise
- Design, develop, and manage complex ADF pipelines for large-scale data integration and transformation
- Orchestrate data flows across multiple sources and sinks using ADF
- Optimize data movement and transformation for performance and cost-efficiency
- Troubleshoot and monitor ADF pipelines to ensure data reliability and accuracy

Data Migration & Integration
- Plan and execute data migration across multiple Azure tenants using ADF and tools like AzCopy
- Ensure data integrity, security, and minimal downtime during migrations
- Implement logging and error-handling strategies for large-volume data transfers

CI/CD and Azure DevOps
- Design and implement CI/CD pipelines using Azure Pipelines to automate deployment of ADF and related components
- Collaborate with development and QA teams to integrate CI/CD best practices
- Manage version control and release strategies for ADF assets

Data Architecture & Governance
- Design and maintain scalable data models, databases, and warehouses in Azure
- Align data architecture with business requirements and Azure best practices
- Enforce data governance, compliance, and security standards
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
What You’ll Do
- Handle data: pull, clean, and shape structured & unstructured data.
- Manage pipelines: Airflow / Step Functions / ADF… your call.
- Deploy models: build, tune, and push to production on SageMaker, Azure ML, or Vertex AI.
- Scale: Spark / Databricks for the heavy lifting.
- Automate processes: Docker, Kubernetes, CI/CD, MLflow, Seldon, Kubeflow.
- Collaborate effectively: work with engineers, architects, and business professionals to solve real problems promptly.

What You Bring
- 3+ years hands-on MLOps (4-5 yrs total software experience).
- Proven experience with one hyperscaler (AWS, Azure, or GCP).
- Confidence with Databricks / Spark, Python, SQL, TensorFlow / PyTorch / scikit-learn.
- Extensive experience handling and troubleshooting Kubernetes, and proficiency in Dockerfile management.
- Prototyping with open-source tools, selecting the appropriate solution, and ensuring scalability.
- Analytical thinker, team player, with a proactive attitude.

Nice-to-Haves
- SageMaker, Azure ML, or Vertex AI in production.
- Dedication to clean code, thorough documentation, and precise pull requests.

Skills: mlflow, mlops, scikit-learn, airflow, sql, pytorch, adf, step functions, kubernetes, gcp, kubeflow, python, databricks, tensorflow, aws, azure, docker, seldon, spark
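The model-deployment side of this role (register a trained model, version it, promote one version to production) is the lifecycle that tools like MLflow's Model Registry or SageMaker's registry manage. As a toy illustration only, and with every name below hypothetical, the core bookkeeping can be sketched in a few lines:

```python
class ModelRegistry:
    """Toy in-memory model registry: register versions, promote one to production."""

    def __init__(self):
        self._models = {}                 # model name -> list of version entries

    def register(self, name, artifact_uri):
        """Record a new model version in 'staging' and return its version number."""
        versions = self._models.setdefault(name, [])
        entry = {"version": len(versions) + 1, "uri": artifact_uri, "stage": "staging"}
        versions.append(entry)
        return entry["version"]

    def promote(self, name, version):
        """Move the given version to 'production', archiving any previous one."""
        for entry in self._models[name]:
            if entry["stage"] == "production":
                entry["stage"] = "archived"
        self._models[name][version - 1]["stage"] = "production"

    def production_uri(self, name):
        """Return the artifact URI currently serving production, if any."""
        for entry in self._models[name]:
            if entry["stage"] == "production":
                return entry["uri"]
        return None

# Registering two versions of a (hypothetical) churn model and promoting one:
reg = ModelRegistry()
reg.register("churn", "s3://models/churn/v1")
reg.register("churn", "s3://models/churn/v2")
reg.promote("churn", 2)
```

A serving layer would resolve `production_uri("churn")` at deploy time, which is what makes rollbacks a one-line `promote` call rather than a redeploy.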
Posted 1 week ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Overview
Viraaj HR Solutions is a forward-thinking recruitment agency specializing in connecting talent with opportunities across various industries. Our mission is to empower individuals through meaningful employment, while fostering growth for businesses through innovative talent acquisition strategies. We value integrity, collaboration, and excellence in our operations. As part of our commitment to delivering exceptional HR solutions, we are currently seeking an Azure Data Engineer to join our client on-site in India.

Role Responsibilities
- Design and implement data solutions on Microsoft Azure.
- Develop ETL processes to extract, transform, and load data efficiently.
- Perform data modeling and database design to support analytics and reporting.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.
- Optimize existing data pipelines for performance and reliability.
- Ensure data integrity and consistency through robust validation checks.
- Maintain and troubleshoot data integration processes.
- Implement data governance and security best practices.
- Work with big data technologies to manage large data sets.
- Document all technical processes and data architecture.
- Utilize Azure Data Factory and other Azure services for data management.
- Conduct performance tuning of SQL queries and data flows.
- Participate in design reviews and code reviews.
- Stay current with Azure updates and analyze their potential impact on existing solutions.
- Provide support and training to junior data engineers.

Qualifications
- Bachelor's degree in Computer Science or a related field.
- 3+ years of experience in data engineering or a related role.
- Proficiency in Azure Data Factory and Azure SQL Database.
- Strong knowledge of SQL and relational databases.
- Experience with ETL tools and processes.
- Familiarity with data warehousing concepts.
- Hands-on experience with big data technologies like Hadoop or Spark.
- Knowledge of Python or other scripting languages.
- Understanding of data modeling concepts and techniques.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities.
- Ability to work independently and as part of a team.
- Experience with data governance practices.
- Certifications in Azure or data engineering are a plus.
- Familiarity with Agile methodologies.

Skills: big data technologies, azure data engineer, agile methodologies, relational databases, etl processes, sql server, python scripting, database design, azure databricks, sql, data modeling, microsoft azure, spark, azure data factory, data warehousing, data governance, python, hadoop, adf
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities

Data Lake and Lakehouse Implementation:
- Design, implement, and manage Data Lake and Lakehouse architectures. (Must have)
- Develop and maintain scalable data pipelines and workflows. (Must have)
- Utilize Azure Data Lake Storage (ADLS) for data storage and management. (Must have)
- Knowledge of Medallion architecture and the Delta format. (Must have)

Data Processing and Transformation:
- Use PySpark for data processing and transformations. (Must have)
- Implement Delta Live Tables for real-time data processing and analytics. (Good to have)
- Ensure data quality and consistency across all stages of the data lifecycle. (Must have)

Data Management and Governance:
- Employ Unity Catalog for data governance and metadata management. (Good to have)
- Ensure robust data security and compliance with industry standards. (Must have)

Data Integration:
- Extract, transform, and load (ETL) data from multiple sources (Must have), including SAP (Good to have), Dynamics 365 (Good to have), and other systems.
- Utilize Azure Data Factory (ADF) and Synapse Analytics for data integration and orchestration. (Must have)
- Performance optimization of jobs. (Must have)

Data Storage and Access:
- Implement and manage Azure Data Lake Storage (ADLS) for large-scale data storage. (Must have)
- Optimize data storage and retrieval processes for performance and cost-efficiency. (Must have)

Collaboration and Communication:
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements. (Must have)
- Provide technical guidance and mentorship to junior team members. (Good to have)

Continuous Improvement:
- Stay updated with the latest industry trends and technologies in data engineering and cloud computing. (Good to have)
- Continuously improve data processes and infrastructure for efficiency and scalability. (Must have)

Required Skills and Qualifications

Technical Skills:
- Proficient in PySpark and Python for data processing and analysis.
- Strong experience with Azure Data Lake Storage (ADLS) and Data Lake architecture.
- Hands-on experience with Databricks for data engineering and analytics.
- Knowledge of Unity Catalog for data governance.
- Expertise in Delta Live Tables for real-time data processing.
- Familiarity with Microsoft Fabric for data integration and orchestration.
- Proficient in Azure Data Factory (ADF) and Synapse Analytics for ETL and data warehousing.
- Experience pulling data from multiple sources like SAP, Dynamics 365, and others.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Ability to work independently and as part of a team.
- Attention to detail and commitment to data accuracy and quality.

Certifications Required:
- Certification in Azure Data Engineering or relevant Azure certifications: DP-203. (Must have)
- Databricks Certified Data Engineer Associate. (Must have)
- Databricks Certified Data Engineer Professional. (Good to have)

Mandatory skill sets: Azure DE, PySpark, Databricks
Preferred skill sets: Azure DE, PySpark, Databricks
Years of experience required: 5-10 years
Educational Qualification: BE, B.Tech, MCA, M.Tech
Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering
Required Skills: Databricks Platform, Microsoft Azure
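The Medallion (bronze/silver/gold) layering this posting asks for can be sketched without Spark: bronze holds raw ingested records, silver applies quality gates and standardisation, gold holds business-level aggregates. In a real Databricks pipeline each step below would be a PySpark/Delta transformation; the field names here are hypothetical:

```python
def to_silver(bronze_rows):
    """Bronze -> silver: drop malformed records, standardise types and casing."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None or row.get("amount") is None:
            continue                          # quality gate: reject incomplete records
        silver.append({"order_id": int(row["order_id"]),
                       "country": str(row["country"]).strip().upper(),
                       "amount": float(row["amount"])})
    return silver

def to_gold(silver_rows):
    """Silver -> gold: business-level aggregate (revenue per country)."""
    gold = {}
    for row in silver_rows:
        gold[row["country"]] = gold.get(row["country"], 0.0) + row["amount"]
    return gold

# Raw bronze records with mixed types and one incomplete row:
bronze = [{"order_id": "1", "country": " in ", "amount": "10.5"},
          {"order_id": None, "country": "in", "amount": "3"},
          {"order_id": "2", "country": "IN", "amount": "4.5"}]
to_gold(to_silver(bronze))
# → {'IN': 15.0}
```

The point of the layering is that each hop is replayable: silver can be rebuilt from bronze at any time, and gold from silver, which is what the Delta format's versioned storage supports in practice.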
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Overview Viraaj HR Solutions is a forward-thinking recruitment agency specializing in connecting talent with opportunities across various industries. Our mission is to empower individuals through meaningful employment, while fostering growth for businesses through innovative talent acquisition strategies. We value integrity, collaboration, and excellence in our operations. As part of our commitment to delivering exceptional HR solutions, we are currently seeking an Azure Data Engineer to join our client on-site in India. Role Responsibilities Design and implement data solutions on Microsoft Azure. Develop ETL processes to extract, transform, and load data efficiently. Perform data modeling and database design to support analytics and reporting. Collaborate with stakeholders to gather requirements and translate them into technical specifications. Optimize existing data pipelines for performance and reliability. Ensure data integrity and consistency through robust validation checks. Maintain and troubleshoot data integration processes. Implement data governance and security best practices. Work with big data technologies to manage large data sets. Document all technical processes and data architecture. Utilize Azure Data Factory and other Azure services for data management. Conduct performance tuning of SQL queries and data flows. Participate in design reviews and code reviews. Stay current with Azure updates and analyze their potential impact on existing solutions. Provide support and training to junior data engineers. Qualifications Bachelor's degree in Computer Science or a related field. 3+ years of experience in data engineering or a related role. Proficiency in Azure Data Factory and Azure SQL Database. Strong knowledge of SQL and relational databases. Experience with ETL tools and processes. Familiarity with data warehousing concepts. Hands-on experience with big data technologies like Hadoop or Spark. Knowledge of Python or other scripting languages. 
Understanding of data modeling concepts and techniques. Excellent problem-solving skills and attention to detail. Strong communication and collaboration abilities. Ability to work independently and as part of a team. Experience with data governance practices. Certifications in Azure or data engineering are a plus. Familiarity with Agile methodologies. Skills: big data technologies, Azure Data Engineer, Agile methodologies, relational databases, ETL processes, SQL Server, Python scripting, database design, Azure Databricks, SQL, data modeling, Microsoft Azure, Spark, Azure Data Factory, data warehousing, data governance, Python, Hadoop, ADF
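The "robust validation checks" responsibility above can be pictured, at the smallest scale, as a row-level integrity gate that splits a feed into accepted and rejected records. This is a hedged illustrative sketch only; the field names (`order_id`, `amount`) and rules are invented for the example, not taken from the role.

```python
# Illustrative sketch of a row-level validation check; field names
# and rules are hypothetical, not part of the actual role.
def validate_rows(rows, required=("order_id", "amount")):
    """Split rows into (valid, rejected) using simple integrity rules."""
    valid, rejected = [], []
    for row in rows:
        # Rule 1: every required field must be present and non-empty.
        missing = [f for f in required if row.get(f) in (None, "")]
        # Rule 2: amount must be numeric and non-negative.
        amt = row.get("amount")
        bad_amount = not isinstance(amt, (int, float)) or amt < 0
        (rejected if missing or bad_amount else valid).append(row)
    return valid, rejected

rows = [
    {"order_id": "A1", "amount": 120.0},
    {"order_id": "", "amount": 30.0},    # rejected: empty key field
    {"order_id": "A3", "amount": -5.0},  # rejected: negative amount
]
valid, rejected = validate_rows(rows)
```

In practice the same split would feed a quarantine table or dead-letter path rather than two Python lists, but the accept/reject decision logic is the same idea.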
Posted 1 week ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Oracle Management Level Associate Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. 
To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Job Description & Summary Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However, ensuring a streamlined end-to-end (E2E) Oracle Fusion technical landscape that adapts seamlessly to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients. Responsibilities: Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer Completed at least 2 full Oracle Cloud (Fusion) implementations Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion) Extensively worked on BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC) Mandatory skill sets BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC) Preferred skill sets Database structure for ERP/Oracle Cloud (Fusion) Years of experience required Minimum 2+ years of Oracle Fusion experience Educational Qualification BE/BTech MBA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Chartered Accountant Diploma Degrees/Field Of Study Preferred: Certifications (if blank, certifications not specified) Required Skills Oracle Fusion Applications Optional Skills Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning, 
Teamwork, Well Being Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
Posted 1 week ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 16,700 stores in 31 countries serving more than 9 million customers each day. It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams discover, value, and act on insights from data across the globe. With our strong data pipeline, this position will play a key role in partnering with our Technical Development stakeholders to enable analytics for long-term success. About The Role We are seeking a talented and experienced Data Architect to join our team. The Data Architect will be responsible for designing enterprise data management frameworks, ensuring data security and compliance, implementing data management processes, and building data models and strategies to support various business needs and initiatives. The ideal candidate will have a strong background in data modeling principles, database design, and data management best practices. Responsibilities Collaborate with solution architects, data engineers, business stakeholders, business analysts, and DQ testers to ensure the data management and data governance framework is defined as a critical component. Design and develop data models using industry-standard modeling techniques and tools. Perform data profiling, data lineage, and analysis to understand data quality, structure, and relationships. Optimize data models for performance, scalability, and usability by creating an optimal data storage layer. Define and enforce data modeling standards, best practices, and guidelines. Participate in data governance initiatives to ensure compliance with data management policies and standards. Work closely with database administrators and developers to implement data models in relational and non-relational database systems. 
Conduct data model reviews and provide recommendations for improvements. Stay updated on emerging trends and technologies in data modeling and data management. Qualifications & Required Skills Full-time bachelor’s or master’s degree in engineering/technology, computer science, information technology, or related fields. 10+ years of total experience in data modeling and database design; experience in the Retail domain will be an added advantage. 8+ years of experience in data engineering development and support. 3+ years of experience leading a technical team of data engineers and BI engineers. Proficiency in data modeling tools such as Erwin, ER/Studio, or similar tools. Strong knowledge of Azure cloud infrastructure and development using SQL/Python/PySpark with ADF, Synapse, and Databricks. Hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Python/PySpark, Logic Apps, Key Vault, and Azure Functions. Strong communication, interpersonal, and collaboration skills along with leadership capabilities. Ability to work effectively in a fast-paced, dynamic environment as a cloud SME. Act as the single point of contact for all data management related queries and data decisions. Design and manage centralized, end-to-end data architecture solutions, such as data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems. Conduct continuous audits of data management system performance and refine where necessary. Identify bottlenecks, optimize queries, and implement caching mechanisms to enhance data processing speed. Work to integrate disparate data sources, including internal databases and external application programming interfaces (APIs), enabling the organization to derive insights from a holistic view of the data. Ensure data privacy measures comply with regulatory standards. 
Preferred Azure Data Factory (ADF) and Databricks certification is a plus. Data Architect or Azure Cloud Solution Architect certification is a plus. Technologies we use: Azure Data Factory, Databricks, Azure Synapse, Azure Tabular, Azure Functions, Logic Apps, Key Vault, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
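The "implement caching mechanisms to enhance data processing speed" responsibility above can be sketched in miniature with the standard library's `functools.lru_cache`: repeated lookups for the same key skip the expensive computation entirely. This is an illustrative sketch only; the `store_revenue` function and its values are invented, standing in for a slow warehouse query.

```python
from functools import lru_cache

# Counter so we can observe how many times the "expensive" work runs.
calls = {"n": 0}

@lru_cache(maxsize=256)
def store_revenue(store_id: str) -> float:
    """Hypothetical expensive lookup; memoised by lru_cache."""
    calls["n"] += 1  # stands in for a slow warehouse query
    return 1000.0 + len(store_id)

first = store_revenue("CK-001")
second = store_revenue("CK-001")  # served from cache; no recomputation
```

Real pipelines would use an external cache (e.g. a result table or a distributed cache) rather than in-process memoisation, but the trade-off — spend memory to avoid recomputing hot keys — is the same.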
Posted 1 week ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
Responsibilities: Job title: Data Engineer About The Role: As a Junior/Senior Data Engineer, you'll take the lead in designing and maintaining complex data ecosystems. Your experience will be instrumental in optimizing data processes, ensuring data quality, and driving data-driven decision-making within the organization. Responsibilities: Architecting and designing complex data systems and pipelines. Leading and mentoring junior data engineers and team members. Collaborating with cross-functional teams to define data requirements. Implementing advanced data quality checks and ensuring data integrity. Optimizing data processes for efficiency and scalability. Overseeing data security and compliance measures. Evaluating and recommending new technologies to enhance data infrastructure. Providing technical expertise and guidance for critical data projects. Required Skills & Experience: Proficiency in designing and building complex data pipelines and data processing systems. Leadership and mentorship capabilities to guide junior data engineers and foster skill development. Strong expertise in data modeling and database design for optimal performance. Skill in optimizing data processes and infrastructure for efficiency, scalability, and cost-effectiveness. Knowledge of data governance principles, ensuring data quality, security, and compliance. Familiarity with big data technologies like Hadoop, Spark, or NoSQL. Expertise in implementing robust data security measures and access controls. Effective communication and collaboration skills for cross-functional teamwork and defining data requirements. Skills: Cloud: Azure/GCP/AWS; DE Technologies: ADF, BigQuery, AWS Glue, etc.; Data Lake: Snowflake, Databricks, etc. Mandatory Skill Sets: Cloud: Azure/GCP/AWS; DE Technologies: ADF, BigQuery, AWS Glue, etc.; Data Lake: Snowflake, Databricks, etc. 
Preferred Skill Sets: Cloud: Azure/GCP/AWS; DE Technologies: ADF, BigQuery, AWS Glue, etc.; Data Lake: Snowflake, Databricks, etc. Years Of Experience Required: 2-4 years Education Qualification: BE/BTECH, ME/MTECH, MBA, MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering, Master of Engineering Degrees/Field Of Study Preferred: Certifications (if blank, certifications not specified) Required Skills AWS Glue, Microsoft Azure Optional Skills Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Apache Hadoop, Azure Data Factory, Communication, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation, Data Validation, Data Warehouse, Data Warehouse Indexing {+ 13 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
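The "complex data pipelines" this role describes can be pictured, at the smallest possible scale, as composable extract/transform/load stages wired together as plain functions. This is a hypothetical sketch only; the stage bodies, field names, and values are invented for illustration and do not represent any firm's tooling.

```python
# Minimal ETL pipeline sketch: each stage is a plain function, so the
# pipeline is just function composition. All names/values are invented.
def extract():
    """Stand-in for reading raw records from a source system."""
    return [{"sku": "a", "qty": "3"}, {"sku": "b", "qty": "5"}]

def transform(rows):
    """Normalise types and formats: upper-case SKUs, integer quantities."""
    return [{"sku": r["sku"].upper(), "qty": int(r["qty"])} for r in rows]

def load(rows, sink):
    """Stand-in for writing to a warehouse table; returns row count."""
    sink.extend(rows)
    return len(rows)

sink = []
loaded = load(transform(extract()), sink)
```

Orchestrators such as ADF or AWS Glue schedule and monitor stages like these at scale, but the extract → transform → load decomposition is the same shape.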
Posted 1 week ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Oracle Management Level Manager Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. 
To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Job Description & Summary Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However, ensuring a streamlined end-to-end (E2E) Oracle Fusion technical landscape that adapts seamlessly to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients. Responsibilities: Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer Completed at least 2 full Oracle Cloud (Fusion) implementations Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion) Extensively worked on BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC) Mandatory skill sets BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC) Preferred skill sets Database structure for ERP/Oracle Cloud (Fusion) Years of experience required Minimum 8 years of Oracle Fusion experience Educational Qualification BE/BTech MBA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering Degrees/Field Of Study Preferred: Certifications (if blank, certifications not specified) Required Skills Oracle Fusion Applications Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Coaching and Feedback, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Professional Courage, Relationship Building, Self-Awareness {+ 4 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism SAP Management Level Senior Associate Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. 
To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Job Description & Summary A career within PwC's Oracle Services Practice will provide you with the opportunity to help organizations use enterprise technology to achieve their digital technology goals and capitalize on business opportunities. We help our clients implement and effectively use Oracle offerings to solve their business problems and fuel success in the areas of finance operations, human capital management, supply chain management, reporting and analytics, and governance, risk and compliance. Responsibilities & Role: Experience in implementation, configuration, roll-out, and application maintenance & support. Good functional knowledge and understanding of standard business processes across the Procure-to-Pay (P2P) and Order-to-Cash (O2C) modules of the track. Exposure to requirement gathering, gap analysis, running design workshops, producing proofs of concept, and providing functional solutions (fitments and workarounds) as well as out-of-the-box solutions. Gather localization requirements and conduct a feasibility analysis. Create TO-BE process flows and analyze the impact of changes from AS-IS flows. Ability to work with the client and onsite team to build a global solution for multi-country roll-outs. Prepare Configuration Workbooks for modules, Functional Specifications for RICEF objects, test plans, and detailed test scripts. Configure Oracle Cloud in different environments. Perform unit/string/end-to-end/regression testing for standard and custom features along with RICEF objects. Perform data conversion for all major data objects through FBDI/ADFDI/Web Service. Build OTBI reports as per project requirements. 
A very good team player with the ability to work with the client and onsite team to build a global solution for multi-country roll-outs Excellent English communication skills in all forms Mandatory skill sets Modules – SSP, Purchase Order, Order Management, GOP, Inventory, Sourcing, Procurement Contracts, Supplier Management and Supplier Qualification Management; knowledge of BPM Approval Configuration. Primary Skill: SSP, Purchase Order, Sourcing, Order Management, GOP, Procurement Contracts, Supplier Management and Supplier Qualification Management; knowledge of BPM Approval Configuration Preferred skill sets A secondary skill set in Finance modules (Expenses, Fixed Assets, Payables, Tax) is an added advantage. Years of experience required 4-7 years of experience Education Qualification BE/BTech/MBA/MCA/CAs Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Chartered Accountant Diploma, Bachelor of Engineering Degrees/Field Of Study Preferred: Certifications (if blank, certifications not specified) Required Skills Oracle Supply Chain Management (SCM) Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
Posted 1 week ago
6.0 - 9.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Data Engineer (Azure Data Factory & PostgreSQL) Location: Bangalore Employment Type: Full-time Experience Level: 6-9 Years About the Role: We are seeking a skilled and detail-oriented Data Engineer with hands-on experience in Azure Data Factory, PostgreSQL, and other Azure ecosystem tools. The ideal candidate will be responsible for designing, developing, and maintaining data pipelines and models, ensuring high performance, scalability, and data integrity across the system. Key Responsibilities: Design, develop, and maintain robust data pipelines using Azure Data Factory. Create and manage data models in PostgreSQL, ensuring optimal data storage and retrieval. Optimize query performance and database efficiency in PostgreSQL through indexing, tuning, and performance monitoring. Map and transform data from diverse sources into coherent and efficient data models. Develop and maintain logging and monitoring mechanisms in Azure Data Factory to proactively identify and troubleshoot issues. Handle various file operations within ADF, including reading, writing, and transforming data across multiple file formats. Ensure secure and compliant operations using Azure Key Vault, Azure Data Lake, and other Azure services. Write complex SQL queries, capable of handling diverse scenarios and optimized for performance. Collaborate effectively with business stakeholders, product owners, and data architects to gather requirements and deliver scalable solutions. Implement data validation and quality checks to ensure data integrity and accuracy. (Preferred) Build and configure semantic models and reports in Power BI. Required Skills: Strong experience with Azure Data Factory, PostgreSQL, SQL, and Azure services (Key Vault, Data Lake). Solid understanding of data modeling techniques and ETL/ELT processes. Excellent problem-solving skills and ability to manage complex data scenarios. 
Strong communication and collaboration skills, especially working with cross-functional teams. Preferred Qualifications: Experience with Power BI for report creation and data visualization. Familiarity with DevOps practices and CI/CD in a data engineering context. Education: Bachelor’s or Master’s degree.
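The indexing-and-tuning responsibility in this posting follows a standard workflow: inspect the query plan, add an index, and confirm the plan switches from a full scan to an index search. Below is a miniature demonstration using SQLite from the Python standard library as a stand-in for PostgreSQL (where the equivalent inspection tool is `EXPLAIN`/`EXPLAIN ANALYZE`); the `events` table and columns are invented for the example.

```python
import sqlite3

# Build a small table to tune; schema and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INT, ts TEXT)")
conn.executemany(
    "INSERT INTO events (user_id, ts) VALUES (?, ?)",
    [(i % 50, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

# Before indexing: the plan shows a full table scan for this filter.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 7").fetchall()

# Add an index on the filtered column, then re-check the plan.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 7").fetchall()
```

The `detail` column of the plan rows changes from a `SCAN` of the table to a `SEARCH ... USING INDEX idx_events_user`, which is exactly the signal a tuning pass looks for in PostgreSQL's `EXPLAIN` output as well.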
Posted 1 week ago
3.0 years
0 Lacs
India
Remote
Title: Azure Data Engineer Location: Remote Employment type: Full Time with BayOne We’re looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics. What You'll Do Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable Work on modern data lakehouse architectures and contribute to data governance and quality frameworks Tech Stack Azure | Databricks | PySpark | SQL What We’re Looking For 3+ years of experience in data engineering or analytics engineering Hands-on with cloud data platforms and large-scale data processing Strong problem-solving mindset and a passion for clean, efficient data design Job Description: Minimum 3 years of experience with modern data engineering/data warehousing/data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc.; Azure experience is preferred over other cloud platforms. 5 years of proven experience with SQL, schema design, and dimensional data modelling Solid knowledge of data warehouse best practices, development standards, and methodologies Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc. Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL. An independent self-learner with a “let’s get this done” approach and the ability to work in a fast-paced, dynamic environment. Excellent communication and teamwork abilities. Nice-to-Have Skills: Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Cosmos DB knowledge. SAP ECC/S/4 and HANA knowledge. 
Intermediate knowledge of Power BI; Azure DevOps and CI/CD deployments; cloud migration methodologies and processes.
BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, veteran status, disability, or any other federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
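To make the ETL/ELT requirement above concrete, here is a minimal Python sketch of the watermark-based incremental load pattern that orchestration tools like ADF implement; the record layout, field names, and dates are illustrative assumptions, not any product's API:

```python
from datetime import datetime

# Source rows with a last-modified timestamp (invented sample data).
source = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 2, 1)},
    {"id": 3, "modified": datetime(2024, 3, 1)},
]

def incremental_load(rows, watermark):
    """Return only rows newer than the watermark, plus the new watermark.

    This is the core of an incremental (delta) copy: each run pulls only
    what changed since the last successful run, then advances the marker.
    """
    new_rows = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

rows, wm = incremental_load(source, datetime(2024, 1, 15))
print([r["id"] for r in rows], wm)
```

A second run with the returned watermark would pull nothing until new rows arrive, which is what keeps repeated loads cheap and idempotent.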
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
About the company: With over 2.5 crore customers, over 5,000 distribution points and nearly 2,000 branches, IndusInd Bank is a universal bank with a widespread banking footprint across the country. IndusInd offers a wide array of products and services for individuals and corporates, including microfinance, personal loans, personal and commercial vehicle loans, credit cards, and SME loans. Over the years, IndusInd has grown ceaselessly and dynamically, driven by its zeal to offer customers banking services at par with the highest quality standards in the industry. IndusInd is a pioneer in digital-first solutions, bringing together the power of a next-gen digital product stack, customer excellence, and the trust of an established bank.
Job Purpose:
- Implement data modeling solutions
- Design data flows and structures that reduce data redundancy and improve data movement among systems, defining data lineage
- Work in the Azure Data Warehouse
- Work with large volumes of data integration
Experience: With overall experience between 10 and 15 years, the applicant must have a minimum of 8 to 11 years of core professional experience in data modeling for large data warehouses with multiple sources.
Technical Skills
- Expertise in core data modeling principles and methods, including conceptual, logical, and physical data models
- Ability to use BI tools like Power BI, Tableau, etc. to represent insights
- Experience in translating/mapping relational data models into XML and schemas
- Expert knowledge of metadata management and of relational and data modeling tools like ER/Studio, Erwin, or others
- Hands-on relational, dimensional, and/or analytical experience (using RDBMS, dimensional, NoSQL, ETL, and data ingestion protocols)
- Very strong in SQL queries; expertise in performance tuning of SQL queries
- Ability to analyse source systems and create source-to-target mappings
- Ability to understand the business use case and create data models or joined data in the data warehouse
Preferred: experience in the banking domain and in building data models/marts for various banking functions.
Good to have knowledge of:
- Azure PowerShell scripting or Python scripting for data transformation in ADF
- SSIS, SSAS, and BI tools like Power BI
- Azure PaaS components like Azure Data Factory, Azure Databricks, Azure Data Lake, Azure Synapse (DWH), PolyBase, ExpressRoute tunneling, etc.
- API integration
Responsibility
- Understand the existing data model, existing data warehouse design, and functional domain subject areas of data, documenting the as-is architecture and the proposed one
- Understand the existing ETL process and its various sources; analyze and document the best approach to designing the logical data model where required
- Work with the development team to implement the proposed data model as a physical data model and build data flows
- Work with the development team to optimize the database structure, applying best-practice optimization methods
- Analyze, document, and implement re-use of the data model for new initiatives
- Interact with stakeholders, users, and other IT teams to understand the ecosystem and analyze it for solutions
- Work on user requirements and create queries for consumption views for users from the existing DW data
- Train and lead a small team of data engineers
Qualifications
- Bachelor's in Computer Science or equivalent
- Certification in Data Modeling and Data Analysis
- Good to have: Azure Fundamentals and Azure Engineer certifications (AZ-900 or DP-200/201)
Behavioral Competencies
- Excellent problem-solving and time management skills
- Strong analytical thinking skills
- Excellent communication skills; process-oriented with a flexible execution mindset
- Strategic thinking with a research and development mindset
- Clear and demonstrative communication
- Efficiently identifies and solves issues
- Identifies, tracks, and escalates risks in a timely manner
Selection Process: Interested candidates are required to apply through the listing on Jigya. Only applications received through Jigya will be evaluated further. Shortlisted candidates may need to appear in an online assessment and/or a technical screening interview administered by Jigya on behalf of IndusInd Bank. Candidates selected after the screening rounds will be processed further by IndusInd Bank.
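The dimensional-modeling skills this role calls for can be illustrated with a tiny star schema: one fact table keyed to dimension tables, queried through joins for a consumption view. The sketch below uses SQLite for portability; the banking-flavoured table names and figures are invented for illustration only:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A minimal star schema: a loans fact table keyed to two dimensions.
cur.executescript("""
CREATE TABLE dim_branch (branch_id INTEGER PRIMARY KEY, city TEXT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_loans (
    loan_id INTEGER PRIMARY KEY,
    branch_id INTEGER REFERENCES dim_branch(branch_id),
    product_id INTEGER REFERENCES dim_product(product_id),
    amount REAL
);
INSERT INTO dim_branch VALUES (1, 'Mumbai'), (2, 'Pune');
INSERT INTO dim_product VALUES (1, 'Vehicle Loan'), (2, 'SME Loan');
INSERT INTO fact_loans VALUES (1, 1, 1, 500000), (2, 1, 2, 750000), (3, 2, 1, 300000);
""")

# A typical consumption view: totals grouped by dimension attributes.
rows = cur.execute("""
    SELECT b.city, p.name, SUM(f.amount)
    FROM fact_loans f
    JOIN dim_branch b ON b.branch_id = f.branch_id
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY b.city, p.name
    ORDER BY b.city, p.name
""").fetchall()
print(rows)
```

The same shape scales to real warehouses: facts stay narrow and additive, dimensions carry the descriptive attributes users filter and group by.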
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Dear Candidate, greetings from TCS! TCS has been a great pioneer in feeding the fire of young techies like you. We are a global leader in the technology arena, and there's nothing that can stop us from growing together.
Role: Cloud DevOps Engineer (Azure)
Location: Chennai
Experience Range: 8 to 12 years
Job Description:
- Good experience in Microsoft Fabric
- Strong understanding of DevOps processes, procedures, and tools
- Data lake, data analysis, and data engineering; Power BI
- Experience with Azure DevOps products (work items, wiki, Git, repos, pipelines, release manager)
- Experience with application and infrastructure operation monitoring (such as AppDynamics, Splunk, Azure Portal) and change management (such as ServiceNow)
- Azure cloud experience deploying and using PaaS resources such as ASE, SQL MI, Cosmos DB, Storage Accounts, AKS, ADF, etc.
- Hands-on experience creating build and deployment automation with pipelines-as-code using YAML
- Hands-on experience creating Azure Data Factory pipelines
- Knowledge of Azure infrastructure automation using PowerShell, Runbooks, and Terraform
- NuGet and NPM packaging; containers/Docker; repository managers
- Good communication skills (written and verbal) and ability to present
- Agile Scrum/Kanban experience
- Core experience in Azure services
- CI experience (Git, Jenkins, GitLab), Bash, PowerShell, build automation
- Container experience with Docker; Azure DevOps; CKA and CKAD certifications
- Azure developer who has worked extensively on CI image building with both Linux and Windows containers, with best-practice knowledge of the CI image building process for both
- Significant experience with SaaS and web-based technologies
- Skilled in continuous integration and continuous deployment using Azure DevOps Services
- Skilled with PowerShell for automation; Python or Bash is an added advantage
- Skilled with containerization platforms (Docker and Kubernetes)
- Familiar with architecture/design patterns and re-usability concepts
- Skilled in object-oriented analysis and design (OOA&D) methodology and microservices
- Skilled in SOLID design principles and TDD
- Familiar with application security via the OWASP Top 10 and common mitigation strategies
- Very familiar with source control systems (Git) and Azure DevOps
- Detailed knowledge of database design and object/relational database technology
Azure DevOps Implementation:
- Lead the design and implementation of CI/CD pipelines using Azure DevOps
- Configure and manage build agents, release pipelines, and deployment environments in Azure DevOps
Continuous Integration:
- Establish and maintain robust CI processes to automate code builds, testing, and deployment
- Integrate automated testing into CI pipelines for comprehensive code validation
Infrastructure as Code (IaC):
- Utilize Infrastructure as Code principles to manage and provision infrastructure components on Azure
- Implement and maintain IaC templates (e.g., ARM templates) for infrastructure provisioning
Monitoring and Optimization:
- Implement monitoring and logging solutions to track the performance and reliability of CI/CD pipelines
- Continuously optimize CI/CD processes for efficiency, speed, and resource utilization
Security and Compliance:
- Implement security best practices within CI/CD pipelines
- Ensure compliance with industry standards and regulatory requirements in CI/CD processes
Troubleshooting and Support:
- Provide expert-level support for CI/CD-related issues
- Work with product teams to manage Azure systems deployment and lifecycle maintenance, including requests, action plans, capacity planning, reporting, and advising the parties involved
- Triage and resolve service management system incidents and requests
- Responsible for application monitoring, data manipulation for widgets, report generation, and problem identification and management
- Responsible for system data manipulation: tuning agents and collectors to glean wanted information
- Occasionally consult with individuals inside and outside the team and provide general customer support
- Azure infrastructure management: create and manage check-in policies; installation, configuration, troubleshooting, and maintenance
- Produce scripts for automation and report generation using Terraform, Terragrunt, CloudFormation templates, Ansible, Git, PowerShell, Bash, shell and Python scripting, Linux and Windows operating systems, and Azure Visual Studio Team Services
- Maintain applications within EKS, AKS, Docker, Docker Hub, and Docker Registry
- Manage networking protocols and network security in the cloud
- Manage Cloudflare products as well as other equivalent tools
- Monitor infrastructure: virtual machines, virtual networks, autoscaling, storage, Key Vault, Network Security Groups, Load Balancer, Traffic Manager, route tables, storage accounts, EFS, FSx, NetApp NAS, Recovery Services Vaults, Azure Backup, Lambda, and serverless architecture components
Required Skills: Azure Certified Solutions Architect or SysOps Administrator, or an equivalent Azure certification.
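As a small illustration of the CI/CD pipeline orchestration this role centers on, the sketch below models stage dependencies the way an Azure DevOps YAML pipeline's dependsOn fields do, then derives a valid execution order with a topological sort; the stage names are assumptions for the example:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Each stage maps to the stages it depends on, mirroring the
# `dependsOn` relationships declared in a YAML pipeline.
stages = {
    "build": [],
    "unit_tests": ["build"],
    "package": ["unit_tests"],
    "deploy_dev": ["package"],
    "integration_tests": ["deploy_dev"],
    "deploy_prod": ["integration_tests"],
}

# static_order() raises CycleError on circular dependencies, which is
# exactly the validation a pipeline engine performs before running.
order = list(TopologicalSorter(stages).static_order())
print(order)
```

For this linear chain the order is unique; with fan-out stages (e.g., parallel test jobs) any ordering that respects the dependencies is valid, which is how pipeline engines decide what can run concurrently.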
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments like Azure Clean Rooms. This role requires strong hands-on experience in C# development, Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks, including Adobe Tag Management and Pixel Management. You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance.
Key Responsibilities
- Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking
- Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented
- Architect and develop secure REST APIs in C# to support advanced attribution models and marketing analytics pipelines
- Implement cryptographic hashing (e.g., SHA-256) for secure handling of user identifiers
- Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems
- Integrate with Azure Key Vault to securely manage secrets and sensitive credentials
- Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data
- Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties
- Collaborate with security teams to ensure all data sharing and processing complies with Azure's data security standards and enterprise privacy frameworks
- Utilize Fabric and OCI environments as needed for data integration and marketing intelligence workflows
- Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools
Required Skills
- Strong hands-on experience with C# and building scalable APIs
- Experience implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs
- Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations
- Proficiency with Azure cloud technologies, especially Cosmos DB, Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices
- Familiarity with OCI for hybrid-cloud integration scenarios
- Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256)
- Experience with Adobe Tag Management, specifically pixel governance and lifecycle
- Proven ability to collaborate across functions, especially with marketing and analytics teams
Soft Skills
- Strong communication skills to explain technical concepts to non-technical stakeholders
- Proven ability to collaborate across teams, especially with marketing, product, and data analytics
- Adaptable and proactive in learning and applying evolving technologies and regulatory changes
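The SHA-256 email hashing mentioned above can be sketched in a few lines of Python. Normalizing (trimming and lowercasing) before hashing follows the common practice for matching hashed email identifiers across systems, though each platform documents its own exact normalization rules; the function name here is an illustrative assumption:

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize an email address, then hash it with SHA-256.

    Server-side tracking APIs generally expect identifiers to be
    trimmed and lowercased before hashing so that the same user
    produces the same hash regardless of how the address was typed.
    """
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same address in different casings yields one stable 64-hex-char hash.
print(hash_email(" User@Example.COM "))
print(hash_email("user@example.com"))
```

Because hashing is one-way, the raw address never needs to leave your system; only the digest is shared with the ad platform for matching.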
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments like Azure Clean Rooms. This role requires strong hands-on experience in Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks, including Adobe Tag Management and Pixel Management. You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance.
Key Responsibilities
- Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking
- Utilize OCI environments as needed for data integration and marketing intelligence workflows
- Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented
- Implement cryptographic hashing (e.g., SHA-256) for secure handling of user identifiers
- Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems
- Integrate with Azure Key Vault to securely manage secrets and sensitive credentials
- Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data
- Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties
- Collaborate with security teams to ensure all data sharing and processing complies with Azure's data security standards and enterprise privacy frameworks
- Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools
Required Skills
- Strong hands-on experience with Python and building scalable APIs
- Experience implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs
- Proficiency with Azure cloud technologies, including Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices
- Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations
- Familiarity with OCI for hybrid-cloud integration scenarios
- Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256)
- Experience with Adobe Tag Management, specifically pixel governance and lifecycle
- Proven ability to collaborate across functions, especially with marketing and analytics teams
Soft Skills
- Strong communication skills to explain technical concepts to non-technical stakeholders
- Proven ability to collaborate across teams, especially with marketing, product, and data analytics
- Adaptable and proactive in learning and applying evolving technologies and regulatory changes
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities
Data Lake and Lakehouse Implementation:
- Design, implement, and manage Data Lake and Lakehouse architectures (Must have)
- Develop and maintain scalable data pipelines and workflows (Must have)
- Utilize Azure Data Lake Services (ADLS) for data storage and management (Must have)
- Knowledge of the Medallion Architecture and the Delta format (Must have)
Data Processing and Transformation:
- Use PySpark for data processing and transformations (Must have)
- Implement Delta Live Tables for real-time data processing and analytics (Good to have)
- Ensure data quality and consistency across all stages of the data lifecycle (Must have)
Data Management and Governance:
- Employ Unity Catalog for data governance and metadata management (Good to have)
- Ensure robust data security and compliance with industry standards (Must have)
Data Integration:
- Extract, transform, and load (ETL) data from multiple sources (Must have), including SAP (Good to have), Dynamics 365 (Good to have), and other systems
- Utilize Azure Data Factory (ADF) and Synapse Analytics for data integration and orchestration (Must have)
- Performance optimization of jobs (Must have)
Data Storage and Access:
- Implement and manage Azure Data Lake Storage (ADLS) for large-scale data storage (Must have)
- Optimize data storage and retrieval processes for performance and cost-efficiency (Must have)
Collaboration and Communication:
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements (Must have)
- Provide technical guidance and mentorship to junior team members (Good to have)
Continuous Improvement:
- Stay updated with the latest industry trends and technologies in data engineering and cloud computing (Good to have)
- Continuously improve data processes and infrastructure for efficiency and scalability (Must have)
Required Skills And Qualifications
Technical Skills:
- Proficient in PySpark and Python for data processing and analysis
- Strong experience with Azure Data Lake Services (ADLS) and Data Lake architecture
- Hands-on experience with Databricks for data engineering and analytics
- Knowledge of Unity Catalog for data governance
- Expertise in Delta Live Tables for real-time data processing
- Familiarity with Microsoft Fabric for data integration and orchestration
- Proficient in Azure Data Factory (ADF) and Synapse Analytics for ETL and data warehousing
- Experience pulling data from multiple sources like SAP, Dynamics 365, and others
Soft Skills:
- Excellent problem-solving and analytical skills
- Strong communication and collaboration abilities
- Ability to work independently and as part of a team
- Attention to detail and commitment to data accuracy and quality
Certifications Required
- Certification in Azure Data Engineering or relevant Azure certifications: DP-203 (Must have)
- Databricks Certified Data Engineer Associate (Must have)
- Databricks Certified Data Engineer Professional (Good to have)
Mandatory skill sets: Azure DE, PySpark, Databricks
Preferred skill sets: Azure DE, PySpark, Databricks
Years of experience required: 5-10 years
Educational Qualification: BE, B.Tech, MCA, M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Microsoft Azure
Optional Skills
Desired Languages (if blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
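A plain-Python sketch of the Medallion (bronze/silver/gold) layering named in the posting above; in Databricks these steps would be PySpark DataFrame transforms writing Delta tables, and the records here are invented sample data:

```python
# Bronze: raw ingested records, kept as-is (duplicates and bad rows included).
bronze = [
    {"order_id": 1, "customer": "A", "amount": "100"},
    {"order_id": 1, "customer": "A", "amount": "100"},   # duplicate row
    {"order_id": 2, "customer": "B", "amount": None},    # invalid record
    {"order_id": 3, "customer": "A", "amount": "250"},
]

# Silver: deduplicate on the business key, drop invalid rows, enforce types.
seen, silver = set(), []
for record in bronze:
    if record["order_id"] in seen or record["amount"] is None:
        continue
    seen.add(record["order_id"])
    silver.append({**record, "amount": float(record["amount"])})

# Gold: business-level aggregate ready for reporting (revenue per customer).
gold = {}
for record in silver:
    gold[record["customer"]] = gold.get(record["customer"], 0.0) + record["amount"]

print(gold)
```

The value of the pattern is that each layer has a contract: bronze preserves the raw feed for replay, silver guarantees cleaned and typed rows, and gold exposes only aggregates the business consumes.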
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Greater Kolkata Area
On-site
Responsibilities
- Extensive experience with SQL Server, including performance tuning, optimization, and complex query writing
- Proficient in Azure Data Factory (ADF) for data integration, ETL processes, and data flow management
- Proficient in T-SQL for writing advanced queries, stored procedures, and functions
- Knowledge of PL/SQL is a plus for handling Oracle database interactions
- Good to have experience as a SQL DBA, managing database security, backups, and recovery strategies
- Ability to troubleshoot and resolve database and data pipeline issues
- Excellent communication skills, both verbal and written, with the ability to convey technical concepts to non-technical stakeholders
- Self-motivated and proactive, with a commitment to delivering high-quality solutions
Mandatory Skill Sets: SQL, PL/SQL, T-SQL, SQL DBA
Preferred Skill Sets: SQL, PL/SQL, T-SQL, SQL DBA
Years of Experience Required: 5-10 years
Education Qualification: BE, B.Tech, MCA, M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor Degree, Master of Engineering, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Structured Query Language (SQL)
Optional Skills
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
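The performance-tuning skill this posting asks for follows a workflow that can be demonstrated even in SQLite: inspect the query plan, add an index on the filtered column, and confirm the plan switches from a full scan to an index search. In SQL Server the analogous tools are execution plans and nonclustered indexes; the table and index names below are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
cur.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(10_000)],
)

def plan(sql: str) -> str:
    """Return the query plan text (the 'detail' column of each plan row)."""
    return " ".join(row[3] for row in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"
p1 = plan(query)   # full table scan: no index covers customer_id yet
cur.execute("CREATE INDEX ix_orders_customer ON orders(customer_id)")
p2 = plan(query)   # now an index search on ix_orders_customer
print(p1)
print(p2)
```

The same measure-change-verify loop applies at scale: never add an index on faith, confirm the optimizer actually uses it for the query you are tuning.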
Posted 1 week ago
The job market for ADF (Application Development Framework) professionals in India is witnessing significant growth, with numerous opportunities available for job seekers in this field. ADF is a popular framework used for building enterprise applications, and companies across various industries are actively looking for skilled professionals to join their teams.
Here are five major cities in India with high demand for ADF professionals:
- Bangalore
- Hyderabad
- Pune
- Chennai
- Mumbai
The estimated salary range for ADF professionals in India varies by experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.
In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.
Here are sample interview questions for ADF roles, categorized by difficulty level:
Basic:
- What is ADF, and what are its key features?
- What is the difference between ADF Faces and ADF Task Flows?
Medium:
- Explain the lifecycle of an ADF application.
- How do you handle exceptions in ADF applications?
Advanced:
- Discuss the advantages of using ADF Business Components.
- How would you optimize performance in an ADF application?
As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!