1032 ADF Jobs - Page 8

JobPe aggregates listings for easy access; you apply directly on the original job portal.

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Skill required: Tech for Operations - Microsoft Azure Cloud Services
Designation: App Automation Eng Senior Analyst
Qualifications: Any Graduation/12th/PUC/HSC
Years of Experience: 5 to 8 years

About Accenture
Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com.

What would you do?
In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover. In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients' supply chains and customer experience.

The Senior Azure Data Factory (ADF) Support Engineer II will be a critical member of our Enterprise Applications Team, responsible for designing, supporting, and maintaining robust data solutions. The ideal candidate is proficient in ADF and SQL and has extensive experience troubleshooting Azure Data Factory environments, conducting code reviews, and fixing bugs. This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems.

What are we looking for?
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience (5+ years) as an Azure Data Factory Support Engineer II.
- Expertise in ADF with a deep understanding of its data-related libraries.
- Strong experience with Azure cloud services, including troubleshooting and optimizing cloud-based environments.
- Proficiency in SQL and experience with SQL database design.
- Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Experience with ADF pipelines.
- Excellent problem-solving and troubleshooting skills.
- Experience with code review and debugging in a collaborative project setting.
- Excellent verbal and written communication skills.
- Ability to work in a fast-paced, team-oriented environment.
- Strong understanding of the business and a passion for the mission of Service Supply Chain.
- Hands-on experience with Jira, DevOps ticketing, and ServiceNow is good to have.

Roles and Responsibilities
Innovate. Collaborate. Build. Create. Solve.
- Support ADF and associated systems, ensuring they meet business requirements and industry practices.
- Integrate new data management technologies and software engineering tools into existing structures.
- Recommend ways to improve data reliability, efficiency, and quality.
- Use large data sets to address business issues, and use data to discover tasks that can be automated.
- Fix bugs to ensure a robust and sustainable codebase.
- Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance.
- Analyze existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively.
- Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines.
- Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives.
- Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure.
- Coordinate with data architects and other team members to ensure changes are in line with the overall architecture and data strategy.
- Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance.
- Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement.
- Stay current with the latest ADF technologies and practices to continuously improve the support and maintenance of data systems.
- Flexible work hours, including US time zones; the position may require a rotational on-call schedule with evening, weekend, and holiday shifts when the need arises.
- Participate in the Demand Management and Change Management processes.
- Work in partnership with internal business, external third-party technical teams, and functional teams as a technology partner, communicating and coordinating delivery of technology services from Technology for Operations (TfO).
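
As a rough illustration of the pipeline-monitoring work this posting describes, here is a minimal Python sketch that lists failed ADF pipeline runs from the last 24 hours using the azure-mgmt-datafactory SDK. The subscription, resource group, and factory names are placeholders, and this is a generic sketch under those assumptions, not anything taken from the posting itself.

    from datetime import datetime, timedelta, timezone

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        RunFilterParameters,
        RunQueryFilter,
        RunQueryFilterOperand,
        RunQueryFilterOperator,
    )

    # Placeholder identifiers -- substitute your own environment's values.
    SUBSCRIPTION_ID = "<subscription-id>"
    RESOURCE_GROUP = "<resource-group>"
    FACTORY_NAME = "<factory-name>"

    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    now = datetime.now(timezone.utc)
    params = RunFilterParameters(
        last_updated_after=now - timedelta(hours=24),
        last_updated_before=now,
        filters=[
            RunQueryFilter(
                operand=RunQueryFilterOperand.STATUS,
                operator=RunQueryFilterOperator.EQUALS,
                values=["Failed"],
            )
        ],
    )

    # Query the factory for failed runs and print a short triage summary.
    for run in client.pipeline_runs.query_by_factory(
        RESOURCE_GROUP, FACTORY_NAME, params
    ).value:
        print(run.pipeline_name, run.run_id, run.status, run.message)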

Posted 6 days ago

Apply

10.0 years

0 Lacs

India

Remote

Role: Senior Azure / Data Engineer (ETL / data warehouse background)
Location: Remote, India
Duration: Long-term contract
Looking for candidates with 10+ years of experience.

Must-have skills:
• Minimum 5 years of experience with modern data engineering, data warehousing, and data lake technologies on cloud platforms such as Azure, AWS, GCP, and Databricks; Azure experience is preferred over other cloud platforms.
• 10+ years of proven experience with SQL, schema design, and dimensional data modeling.
• Solid knowledge of data warehouse best practices, development standards, and methodologies.
• Experience with ETL/ELT tools such as ADF, Informatica, and Talend, and with data warehousing technologies such as Azure Synapse, Azure SQL, Amazon Redshift, Snowflake, and Google BigQuery.
• Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL.
• An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced, dynamic environment.
• Excellent communication and teamwork abilities.

Nice-to-have skills:
• Knowledge of Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB.
• SAP ECC, S/4, and HANA knowledge.
• Intermediate knowledge of Power BI.
• Azure DevOps and CI/CD deployments; cloud migration methodologies and processes.
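
A minimal PySpark sketch of the kind of Databricks work this posting describes: reading raw data, applying a simple transformation, and writing a partitioned Delta table. The storage path, table name, and columns are invented for illustration.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Hypothetical raw-zone path; replace with a real storage location.
    raw = spark.read.parquet("abfss://raw@<storage-account>.dfs.core.windows.net/orders/")

    clean = (
        raw.dropDuplicates(["order_id"])              # de-duplicate on the business key
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("amount") > 0)               # drop obviously bad rows
    )

    # Write a partitioned Delta table for downstream dimensional modeling.
    (clean.write.format("delta")
          .mode("overwrite")
          .partitionBy("order_date")
          .saveAsTable("analytics.fact_orders"))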

Posted 6 days ago

Apply

10.0 - 20.0 years

20 - 30 Lacs

Kochi, Kozhikode, Thiruvananthapuram

Work from Office

Expertise in Azure services, including App Services, Functions, DevOps pipelines, and ADF. Expert-level knowledge of the MuleSoft Anypoint Platform, API lifecycle management, and enterprise integration. Experience with unit testing frameworks and integration testing in Java.

Required candidate profile: Proven skills in Java-based web applications with RDBMS or NoSQL backends. Proven skills in Python and JavaScript for backend and full-stack solutions. Strong grounding in object-oriented programming and design patterns.

Posted 6 days ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description
Technologies: SQL, Azure Synapse, Azure Data Factory (ADF), ETL, Excel

We are looking for someone to fulfill the needs of key business functions through the development of SQL code, Azure data pipelines, ETL, and data models. You will develop MS-SQL queries and procedures, create custom reports, and roll data up to the desired level for the client to consume. You will also design databases, extract data from various sources, integrate it, and ensure its stability, reliability, and performance.

What your day would look like:
- 2-3 years of experience as a SQL Developer or in a similar role, including an excellent understanding of SQL Server and 2+ years of hands-on SQL programming experience.
- Experience with SQL Server Integration Services (SSIS); experience with Data Factory pipelines for on-cloud ETL processing is beneficial.
- In-depth skills with Azure Data Factory, Azure Synapse, and ADLS, with the ability to configure and administer all aspects of SQL Server at a Consultant level.
- A sense of ownership and pride in your performance and its impact on the company's success.
- Excellent interpersonal and communication skills (both oral and written), with the ability to communicate at various levels with clarity and precision.
- Critical thinking and problem-solving skills; a team player with good time-management skills.
- Experience working on analytics projects in the pharma domain, deriving actionable insights and implementing them.
- Experience with longitudinal data, retail/CPG, customer-level data sets, pharma data, patient data, forecasting, and/or performance reporting.
- Intermediate to strong MS Excel and PowerPoint knowledge.
- Ability to manage and efficiently manipulate huge data sets (multi-million-record complex relational databases).
- Self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
- An articulate communicator who handles challenging situations with structured thinking and a solution-minded focus, leading communication with internal and external stakeholders with minimal supervision.
- Proactive in identifying potential risks and implementing mitigation strategies to avoid issues downstream.
- Exposure to project management principles: breaking the approach into smaller tasks and planning for them across resources (Consultant).
- Ability to learn quickly in a dynamic environment; successful experience working in a global environment is an advantage; previous experience in healthcare analytics is a plus.

IQVIA is a leading global provider of clinical research services, commercial insights, and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
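
The role centers on MS-SQL development and rolling data up to a desired level. A small hedged sketch of that pattern from Python via pyodbc; the server, database, credentials, and the dbo.sales table and its columns are all invented placeholders.

    import pyodbc

    # Hypothetical Azure SQL connection; all identifiers are placeholders.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=<server>.database.windows.net;DATABASE=<database>;"
        "UID=<user>;PWD=<password>;Encrypt=yes"
    )

    # Roll daily sales up to product/month level (table and columns are invented).
    sql = """
        SELECT product_id,
               YEAR(sale_date)  AS sale_year,
               MONTH(sale_date) AS sale_month,
               SUM(units)       AS total_units
        FROM dbo.sales
        GROUP BY product_id, YEAR(sale_date), MONTH(sale_date)
        ORDER BY product_id, sale_year, sale_month
    """

    cursor = conn.cursor()
    for row in cursor.execute(sql):
        print(row.product_id, row.sale_year, row.sale_month, row.total_units)
    cursor.close()
    conn.close()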

Posted 6 days ago

Apply

4.0 years

0 Lacs

India

On-site

Love turning raw data into powerful insights? Join us! We're partnering with global brands to unlock the full potential of their data. As a Data Engineer, you'll be at the heart of these transformations: building scalable data pipelines, optimizing data flows, and empowering analytics teams to make real-time, data-driven decisions.

We are seeking a highly skilled Data Engineer with hands-on experience in Databricks to support data integration, pipeline development, and large-scale data processing for our retail or healthcare client. The ideal candidate will work closely with cross-functional teams to design robust data solutions that drive business intelligence and operational efficiency.

Key Responsibilities
- Develop and maintain scalable data pipelines using Databricks and Spark.
- Build ETL/ELT workflows to support data ingestion, transformation, and validation.
- Collaborate with data scientists, analysts, and business stakeholders to gather data requirements.
- Optimize data processing workflows for performance and reliability.
- Manage structured and unstructured data across cloud-based data lakes and warehouses (e.g., Delta Lake, Snowflake, Azure Synapse).
- Ensure data quality and compliance with data governance standards.

Required Qualifications
- 4+ years of experience as a Data Engineer.
- Strong expertise in Databricks, Apache Spark, and Delta Lake.
- Proficiency in Python, SQL, and data pipeline orchestration tools (e.g., Airflow, ADF).
- Experience with cloud platforms such as Azure, AWS, or GCP.
- Familiarity with data modeling, version control, and CI/CD practices.
- Experience in the retail or healthcare domain is a plus.

Benefits
Health insurance and accident insurance. The salary will be determined based on several factors including, but not limited to, location, relevant education, qualifications, experience, technical skills, and business needs.

Additional Responsibilities
- Participate in OP monthly team meetings and team-building efforts.
- Contribute to OP technical discussions, peer reviews, etc.
- Contribute content and collaborate via the OP-Wiki/Knowledge Base.
- Provide status reports to OP Account Management as requested.

About Us
OP is a technology consulting and solutions company offering advisory and managed services, innovative platforms, and staffing solutions across a wide range of fields, including AI, cyber security, enterprise architecture, and beyond. Our most valuable asset is our people: dynamic, creative thinkers who are passionate about doing quality work. As a member of the OP team, you will have access to industry-leading consulting practices, strategies, and technologies, plus innovative training and education. An ideal OP team member is a technology leader with a proven track record of technical excellence and a strong focus on process and methodology.
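
The posting asks for Databricks and Delta Lake pipeline experience; a common building block there is an upsert (MERGE) from a daily feed into a curated table. A minimal sketch under invented table, path, and key names:

    from pyspark.sql import SparkSession
    from delta.tables import DeltaTable

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical daily feed and target table names.
    updates = spark.read.json("/mnt/raw/customers_daily/")
    target = DeltaTable.forName(spark, "silver.customers")

    # Upsert: update existing customers, insert new ones.
    (target.alias("t")
           .merge(updates.alias("s"), "t.customer_id = s.customer_id")
           .whenMatchedUpdateAll()
           .whenNotMatchedInsertAll()
           .execute())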

Posted 6 days ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview
The Data Science team develops Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Azure Pipelines. You will be part of a collaborative interdisciplinary team around data, where you will be responsible for our continuous delivery of statistical/ML models. You will work closely with process owners, product owners, and final business users, which will give you the correct visibility into, and understanding of, the criticality of your developments.

Responsibilities
- Deliver key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and the Machine Learning models in scope.
- Contribute actively to code and development in projects and services.
- Partner with data engineers to ensure data access for discovery and that proper data is prepared for model consumption; partner with ML engineers working on industrialization.
- Communicate with business stakeholders during service design, training, and knowledge transfer.
- Support large-scale experimentation and build data-driven models; refine requirements into modelling problems.
- Influence product teams through data-based recommendations; research state-of-the-art methodologies.
- Create documentation for learnings and knowledge transfer; create reusable packages or libraries.
- Ensure on-time, on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards.
- Leverage big data technologies to process data and build scaled data pipelines (batch to real time); implement the end-to-end ML lifecycle with Azure Databricks and Azure Pipelines; automate ML model deployments.

Qualifications
- BE/B.Tech in Computer Science, Maths, or a related technical field.
- 2-4 years of overall experience working as a Data Scientist.
- 2+ years of experience building solutions in the commercial or supply chain space, and 2+ years working in a team to deliver production-level analytic solutions.
- Fluent in Git (version control); understanding of Jenkins and Docker is a plus; fluent in SQL syntax.
- 2+ years of experience applying statistical/ML techniques to solve supervised (regression, classification) and unsupervised problems.
- 2+ years of experience developing business-related statistical/ML models with industry tools, with a primary focus on Python or PySpark development.
- Data science: hands-on experience and strong knowledge of building supervised and unsupervised machine learning models; knowledge of time series/demand forecast models is a plus.
- Programming skills: hands-on experience in statistical programming languages like Python and PySpark, and database query languages like SQL.
- Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators.
- Cloud (Azure): experience in Databricks and ADF is desirable; familiarity with Spark, Hive, and Pig is an added advantage.
- Business storytelling and communicating data insights in a business-consumable format; fluent in one visualization tool.
- Strong communication and organizational skills, with the ability to deal with ambiguity while juggling multiple priorities.
- Experience with Agile methodology for teamwork and analytics "product" creation.

Nice to have
- Experience in reinforcement learning, simulation and optimization problems, Bayesian methods, causal inference, NLP, Responsible AI, or distributed machine learning is a plus.
- Experience in DevOps, with hands-on experience with one or more cloud service providers: AWS, GCP, Azure (preferred); model deployment experience is a plus.
- Experience with version control systems like GitHub and with CI/CD tools; experience in exploratory data analysis.
- Knowledge of MLOps/DevOps and deploying ML models is preferred; experience using MLflow, Kubeflow, etc. is preferred; experience executing and contributing to MLOps automation infrastructure is good to have.
- Exceptional analytical and problem-solving skills; stakeholder engagement (BUs, vendors).
- Experience building statistical models in the retail or supply chain space is a plus.
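
Since the posting prefers MLflow experience, here is a minimal experiment-tracking sketch. It uses synthetic data in place of any real project assets; the run name and logged names are invented for illustration.

    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_regression
    from sklearn.linear_model import Ridge
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    # Synthetic regression data stands in for a real demand-forecasting set.
    X, y = make_regression(n_samples=500, n_features=8, noise=0.2, random_state=42)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

    with mlflow.start_run(run_name="ridge_baseline"):
        model = Ridge(alpha=1.0).fit(X_tr, y_tr)
        mlflow.log_param("alpha", 1.0)
        mlflow.log_metric("mae", mean_absolute_error(y_te, model.predict(X_te)))
        mlflow.sklearn.log_model(model, "model")  # versioned artifact for later deployment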

Posted 6 days ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, or status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. However, ensuring streamlined end-to-end Oracle Fusion technical delivery that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle practice team, we leverage opportunities around digital disruption, new-age operating models, and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities:
- Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer.
- Completed at least 2 full Oracle Cloud (Fusion) implementations.
- Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion).
- Extensive work on BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC).

Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC)
Preferred skill sets: Database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: Minimum 3 years of Oracle Fusion experience
Educational qualification: BE/BTech, MBA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle Fusion Middleware (OFM)
Optional Skills: Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:

Posted 6 days ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

Remote

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting.

Your Key Responsibilities
- Data pipeline management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage.
- ETL operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources.
- Data lake management: Organize and manage structured and unstructured data in Azure Data Lake, following performance and security best practices.
- Data quality and validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity.
- Monitoring and troubleshooting: Use logging and monitoring tools to troubleshoot pipeline failures and address data latency or quality issues.
- Reporting and visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data.
- DevOps and CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation.
- Tool integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow.
- Collaboration and documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency.

Skills and attributes for success
- Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks.
- Solid understanding of ETL/ELT design and implementation principles.
- Strong SQL and PySpark skills for data transformation and validation.
- Exposure to Python for automation and scripting.
- Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred).
- Experience working with Power BI or Tableau for data visualization and reporting support.
- Strong problem-solving skills, attention to detail, and commitment to data quality.
- Excellent communication and documentation skills to interface with technical and business teams.
- Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing.

To qualify for the role, you must have
- 4-6 years of experience in DataOps or Data Engineering roles.
- Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem.
- Experience working with Informatica CDI or similar data integration tools.
- Scripting and automation experience in Python/PySpark.
- The ability to support data pipelines in a rotational on-call or production support environment.
- Comfort working in a remote/hybrid, cross-functional team setup.

Technologies and Tools
Must haves
- Azure Databricks: experience in data transformation and processing using notebooks and Spark.
- Azure Data Lake: experience working with hierarchical data storage in Data Lake.
- Azure Synapse: familiarity with distributed data querying and data warehousing.
- Azure Data Factory: hands-on experience in orchestrating and monitoring data pipelines.
- ETL process understanding: knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques.

Good to have
- Power BI or Tableau for reporting support.
- Monitoring/logging using Azure Monitor or Log Analytics.
- Azure DevOps and Git for CI/CD and version control.
- Python and/or PySpark for scripting and data handling.
- Informatica Cloud Data Integration (CDI) or similar ETL tools.
- Shell scripting or command-line data handling; SQL across distributed and relational databases.

What We Look For
- Enthusiastic learners with a passion for DataOps practices.
- Problem solvers with a proactive approach to troubleshooting and optimization.
- Team players who can collaborate effectively in a remote or hybrid work environment.
- Detail-oriented professionals with strong documentation skills.

What we offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations (Argentina, China, India, the Philippines, Poland and the UK) and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: you'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: we'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: we'll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: you'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
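
The role emphasizes data quality and validation with PySpark. A minimal sketch of pipeline-level quality gates, using an invented curated.holdings table and column names borrowed loosely from the asset-management domain the posting mentions:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical curated holdings table; names are illustrative only.
    df = spark.read.table("curated.holdings")

    checks = {
        "row_count_nonzero": df.count() > 0,
        "no_null_security_ids": df.filter(F.col("security_id").isNull()).count() == 0,
        "no_negative_quantities": df.filter(F.col("quantity") < 0).count() == 0,
    }

    failed = [name for name, ok in checks.items() if not ok]
    if failed:
        # Fail the pipeline run loudly so the on-call engineer is alerted.
        raise ValueError(f"Data quality checks failed: {failed}")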

Posted 6 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Data Engineer - ETL
Bangalore, India

AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality but also actionable, enabling AXA XL's executive leadership team to maximize benefits and facilitate sustained competitive advantage. Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics (IDA) team, is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model, disrupting the insurance market.

As we develop an enterprise-wide data and digital strategy that moves us toward a greater focus on the use of data and data-driven insights, we are seeking a Data Engineer. The role will support the team's efforts toward creating, enhancing, and stabilizing the Enterprise data lake through the development of data pipelines. It requires a team player who can work well with members of other disciplines to deliver data in an efficient and strategic manner.

What You'll Be Doing
What will your essential responsibilities include?
- Act as a data engineering expert and partner to Global Technology and data consumers in controlling the complexity and cost of the data platform, while enabling performance, governance, and maintainability of the estate.
- Understand current and future data consumption patterns and architecture (at a granular level), and partner with Architects to ensure optimal design of data layers.
- Apply best practices in data architecture: for example, the balance between materialization and virtualization, the optimal level of de-normalization, caching and partitioning strategies, choice of storage and querying technology, and performance tuning.
- Lead and execute hands-on research into new technologies, formulating frameworks for assessing new technology versus business benefit and implications for data consumers.
- Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in data inventory and utilization of the Data Science Platform.
- Design prototypes and work in a fast-paced, iterative solution delivery model.
- Design, develop, and maintain ETL pipelines using PySpark in Azure Databricks with Delta tables, and use Harness for the deployment pipeline.
- Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed.
- Diagnose system performance issues related to data processing and implement solutions to address them.
- Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture.
- Maintain integrity and quality across all pipelines and environments.
- Understand and follow secure coding practices to ensure code is not vulnerable.

You will report to the Application Manager.

What You Will Bring
We're looking for someone who has these abilities and skills:

Required skills and abilities
- Effective communication skills.
- Bachelor's degree in Computer Science, Mathematics, Statistics, Finance, a related technical field, or equivalent work experience.
- Relevant years of extensive work experience in various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying.
- Relevant years of programming experience using Databricks.
- Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse, and ADLS).
- Solid knowledge of network and firewall concepts.
- Solid experience writing, optimizing, and analyzing SQL.
- Relevant years of experience with Python.
- Ability to break complex data requirements down and architect solutions into achievable targets.
- Robust familiarity with Software Development Life Cycle (SDLC) processes and workflow, especially Agile.
- Experience using Harness.
- Technical lead responsible for both individual and team deliveries.

Desired skills and abilities
- Experience with big data migration projects and with performance tuning at both the database and big data platform levels.
- Ability to interpret complex data requirements and architect solutions.
- Distinctive problem-solving and analytical skills combined with robust business acumen.
- Excellent grounding in Parquet and Delta file formats.
- Effective knowledge of the Azure cloud computing platform.
- Familiarity with reporting software (Power BI is a plus) and with dbt is a plus.
- Passion for data and experience working within a data-driven organization. You care about what you do, and what we do.

Who We Are
AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals, and even some inspirational individuals, we don't just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business: property, casualty, professional, financial lines, and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What We Offer
Inclusion
AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That's why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential. It's about helping one another, and our business, to move forward and succeed.
- Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion, with 20 chapters around the globe.
- Robust support for flexible working arrangements.
- Enhanced family-friendly leave benefits.
- Named to the Diversity Best Practices Index.
- Signatory to the UK Women in Finance Charter.
Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards
AXA XL's Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides dynamic compensation and personalized, inclusive benefits that evolve as you do. We're committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.

Sustainability
At AXA XL, sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 sustainability strategy, called "Roots of resilience", focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our pillars:
- Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems, the foundation of a sustainable planet and society, are essential to our future. We're committed to protecting and restoring nature, from mangrove forests to the bees in our backyard, by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.
- Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net-zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.
- Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We're training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.
- AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL's "Hearts in Action" programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day, the Global Day of Giving.

For more information, please see axaxl.com/sustainability.
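
One concrete way to approach the ETL performance monitoring this posting describes is inspecting a Delta table's recent commit history. A brief sketch; the table name is a made-up placeholder:

    from pyspark.sql import SparkSession
    from delta.tables import DeltaTable

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical lake table: when a nightly ETL job slows down, look at the
    # last five commits to see operation types and rows/bytes written.
    history = DeltaTable.forName(spark, "enterprise_lake.policy_transactions").history(5)
    history.select("version", "timestamp", "operation", "operationMetrics").show(truncate=False)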

Posted 6 days ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Oracle Management Level Manager Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Job Description & Summary Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However ensuring streamlined E2E Oracle fusion Finance to seamlessly adapt to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting -Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new age operating model and best in class practices to deliver technology enabled transformation to our clients Responsibilities: Experienced Oracle Fusion Finance Consultant to join our finance team. Oracle Fusion Finance Functional : Minimum 2 implementation in Oracle Fusion ERP package - Finance modules as listed. Implementation, configuration, and maintenance of Oracle Fusion Financials modules, such as General Ledger, Accounts Payable, Accounts Receivable, Fixed Assets, lease Accounting, Tax and Cash Management. Ensure that financial systems and processes are designed and maintained in accordance with industry best practices and company policies. - Coordinate with cross-functional teams to ensure that financial systems are integrated with other enterprise systems. - Coordinate with other functional tracks on the accounting/ financial impact of transactions, SLA rules, etc. 
*Mandatory skill sets Modules: AP, AR, GL, FA & Lease Accounting, CM, Tax modules of Fusion *Preferred skill sets - Provide hypercare/AMS support post go-live. - Has the ability to work independently with minimal oversight - Carries a can-do attitude and a mindset of diversity and equality - Proficient in MS Excel *Years of experience required Minimum 8 years of Oracle Fusion experience *Educational Qualification BE/BTech MBA CA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Technology, Master of Business Administration, Bachelor of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Oracle Integration Cloud (OIC) Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Coaching and Feedback, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Professional Courage, Relationship Building, Self-Awareness {+ 4 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 6 days ago

Apply

6.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Deloitte Position Summary Deloitte refers to one or more of Deloitte Touche Tohmatsu Limited, a UK private company limited by guarantee (“DTTL”), its network of member firms, and their related entities. DTTL and each of its member firms are legally separate and independent entities. DTTL (also referred to as “Deloitte Global”) does not provide services to clients. In the United States, Deloitte refers to one or more of the US member firms of DTTL, their related entities that operate using the “Deloitte” name in the United States and their respective affiliates. Certain services may not be available to attest clients under the rules and regulations of public accounting. Please see www.deloitte.com/about to learn more about our global network of member firms. Copyright © 2017 Deloitte Development LLC. All rights reserved. Assistant Manager, Data-Marts & Analytics - Reporting and Analytics – Digital Data Analytics Innovation - Deloitte Support Services India Private Limited Are you a quick learner with a willingness to work with new technologies? The Data-Marts and Reporting team offers you a particular opportunity to be an integral part of the Datamarts & Reporting – CoRe Digital | Data | Analytics | Innovation Group. The principal focus of this group is the research, development, maintenance, and documentation of customized solutions that e-enable the delivery of cutting-edge technology to the firm's business centers. Work you will do As an Assistant Manager, you will be involved in leading people, project management, customer interactions, and researching and developing solutions built on varied technologies like MS SQL Server, MS Azure SQL, the MSBI Suite, Azure Data Factory, Azure Cloud Services, Python, Tableau, and .Net. You will support a team which provides high-quality solutions to the customers by following a streamlined system development methodology. In the process of acquainting yourself with various development tools, testing tools, methodologies and processes, you will be aligned to the following role: Role: Datamart Solution – Assistant Manager As a Datamart Solution Assistant Manager, you will be responsible for leading people, project management, customer interactions, and researching and developing solutions built on varied technologies like MS SQL Server, MS Azure SQL, the MSBI Suite, Azure Data Factory, Azure Cloud Services, Python, Tableau, and .Net. Your key responsibilities include: Interact with end users to gather, document, and interpret requirements. Lead multiple projects from a project management and team management perspective. Coordinate with USI/US Managers on project prioritization, team leverage and utilization, and performance measurement. Work closely with project teams to ensure smooth and timely delivery, including communication and escalations as required. Possess an innovative mindset in exploring new tools, technologies and capabilities. Be able to do hands-on coding/troubleshooting for complex projects. Take responsibility for people management of the team, including mentoring, counseling, and presenting counselees in performance reviews. Be proficient in project management, work planning, and allocation of resources. Review designs for project delivery and facilitate client interaction and effort estimation. Maintain a knowledge base of the functional capabilities and act as a Subject Matter Expert for the projects assigned. Nurture, develop and retain talent by coaching, counseling and mentoring team members. Develop SQL objects and scripts based on design. 
Analyze, debug, and optimize existing stored procedures and views. Leverage indexes, performance tuning techniques, and error handling to improve the performance of SQL scripts. Create, schedule, and monitor SQL jobs. Should be proficient in building pipelines in SSIS and ADF. Should possess relevant experience in working with different sources and destinations. Should have knowledge of sourcing data through API calls. Should possess good knowledge of building applications and automating processes using Python or .Net (a minimal sketch of the API-sourcing pattern follows this listing). Should be capable of building solutions by leveraging different Azure services in a cost-effective and efficient way. Proactively prioritize activities, handle tasks, and deliver quality solutions on time. Communicate clearly and regularly with team leadership and project teams. Assist the Manager in developing/improving the capabilities of the team. Provide appropriate feedback to Managers/leaders on team members whenever required. Participate in solution design sessions with clients/stakeholders, provide them the best solution, and deliver tasks. Should be able to effectively track the time and bandwidth of team members, creating a proper pipeline for them. Take an active part in firm-wide initiatives and encourage team members to do so. Manage ongoing deliverable timelines and own relationships with end clients to understand whether deliverables continue to meet the client’s need. Work collaboratively with other team members and end clients throughout the development life cycle. Research, learn, implement, and share skills on new technologies. Understand customer requirements well and provide status updates to the project lead (US/USI) on calls and emails efficiently. Continuously improve skills in this space by completing certifications and recommended training. Good understanding of MVC .Net, SharePoint, Azure, MSBI. The team CoRe - Digital Data Analytics Innovation (DDAI) sits inside Deloitte’s global shared services organization and serves Deloitte’s 10 largest member firms worldwide. The Reporting and Analytics team creates user experiences that deliver relevant content and connections, enable the discovery of impactful knowledge, and delight users by delivering innovative applications that use emerging technologies. Qualifications and experience Required: Educational Qualification: B.E/B.Tech or MTech (60% or 6.5 GPA and above) Should be proficient in one or more of the following technologies: extensive knowledge of DBMS concepts, and exposure to querying on any relational database, preferably MS SQL Server, MS Azure SQL, SSIS and Tableau. 
Should have good experience in automating and generating insights using Python. Should possess good knowledge of Azure services. Knowledge of any coding language like C#.NET or VB.NET would be an added advantage. Should have good experience in leading people. Good experience in project management. Understands development methodology and lifecycle. Excellent analytical skills and communication skills (written, verbal, and presentation). Ability to chart one’s own career and build networks within the organization. Ability to work both independently and as part of a team with professionals at all levels. Ability to prioritize tasks, work on multiple assignments, and raise concerns/questions where appropriate. Seek information/ideas and establish relationships with customers to assess any future opportunities. Total Experience: 6 to 9 years Skill set: Required: SQL Server, MS Azure SQL, SSIS, Azure Data Factory, Python, Data warehouse and BI Preferred: Tableau, Azure Services. Good to have: MVC.Net (full stack suite) Location: Hyderabad Work hours: 2 p.m. – 11 p.m. How you will grow At Deloitte, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning. Deloitte University (DU): The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad office, is an extension of the DU in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Explore DU: The Leadership Center in India. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. 
We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte. Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world. Disclaimer: Please note that this description is subject to change based on business/project requirements and at the discretion of management. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 304415
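The listing above calls for sourcing data through API calls and automating loads with Python. A minimal sketch of that pattern, assuming a hypothetical REST endpoint, staging table, and connection string (none of these names come from the posting):

```python
# Sketch: pull records from a REST API and stage them in SQL Server.
# API_URL, CONN_STR, and the stg.orders table are illustrative placeholders.
import pyodbc
import requests

API_URL = "https://api.example.com/v1/orders"   # hypothetical endpoint
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=staging;"
    "UID=etl_user;PWD=..."                       # placeholder credentials
)

def load_orders() -> int:
    # Fetch one page of records from the API, failing fast on HTTP errors.
    resp = requests.get(API_URL, params={"page_size": 500}, timeout=30)
    resp.raise_for_status()
    rows = resp.json()["items"]  # assumed payload shape

    # Stage the records with a parameterized bulk insert.
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        cur.fast_executemany = True
        cur.executemany(
            "INSERT INTO stg.orders (order_id, amount) VALUES (?, ?)",
            [(r["order_id"], r["amount"]) for r in rows],
        )
        conn.commit()
    return len(rows)

if __name__ == "__main__":
    print(f"Loaded {load_orders()} rows")
```

In practice a job like this would be wrapped in a scheduled SQL Agent job or an ADF pipeline activity, with paging and retry logic added around the API call.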

Posted 6 days ago

Apply

0 years

0 Lacs

Ahmedabad

On-site

Key Responsibilities: Optimize scalable data pipelines for ingestion, transformation, and loading of large datasets from diverse sources. Utilize Azure Data Factory for orchestration and automation of ETL/ELT processes, ensuring efficient data flow. Work extensively with Azure services such as Azure Blob Storage and Azure Data Lake Storage (ADLS Gen2) for efficient data storage and management. Integrate and manage data flows using Azure Logic Apps for event-driven automation and workflow orchestration. Develop and maintain robust SQL queries, stored procedures, and data models to support reporting and analytical requirements. Implement and enforce data governance, quality, and security best practices. Troubleshoot, debug, and optimize existing data pipelines and data infrastructure. Key Skills: Proven experience as a Senior Data Engineer, with a strong emphasis on designing and implementing solutions, production support, maintenance projects, and data engineering applications on the Azure and Snowflake cloud platforms. Mandatory: Expert-level proficiency with Snowflake for data warehousing and data platform development. Mandatory: In-depth knowledge and practical experience with the Azure data ecosystem, including: Azure Data Factory (ADF) for pipeline orchestration. Azure Blob Storage and Azure Data Lake Storage (ADLS Gen2) for data storage. Azure Logic Apps for workflow automation. Exceptional command of SQL for complex querying, data manipulation, and performance tuning. Familiarity with version control systems (e.g., Git). Excellent problem-solving, analytical, and communication skills. Ability to work independently and as part of a collaborative team in a fast-paced environment. Nice to Have (Added Advantage): Experience with dbt (data build tool) for data transformation and modeling. Knowledge of other programming languages such as Python. Familiarity with DevOps practices for data engineering (CI/CD).
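Since the role above centers on ADF pipeline orchestration, here is a minimal sketch of programmatically triggering and monitoring a pipeline run with the Azure SDK for Python; the subscription, resource group, factory, and pipeline names are hypothetical placeholders:

```python
# Sketch: start an ADF pipeline run and poll it to completion.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
RESOURCE_GROUP = "rg-data-platform"                       # hypothetical
FACTORY_NAME = "adf-ingestion"                            # hypothetical
PIPELINE_NAME = "pl_load_sales"                           # hypothetical

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a run, passing a runtime parameter to the pipeline.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"load_date": "2024-01-01"},
)

# Poll until the run reaches a terminal state (Succeeded/Failed/Cancelled).
while True:
    status = client.pipeline_runs.get(
        RESOURCE_GROUP, FACTORY_NAME, run.run_id
    ).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline finished with status: {status}")
```

The same pattern is useful in production-support work: a small script like this can re-run a failed load or verify a fix without opening the ADF portal.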

Posted 6 days ago

Apply

6.0 years

0 Lacs

Kanayannur, Kerala, India

Remote


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The opportunity We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting. Your Key Responsibilities Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage. ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources. Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices. Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity (a minimal PySpark sketch follows this listing). Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot failures in pipelines and address data latency or quality issues. Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data. DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation. Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow. Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency. Skills and attributes for success Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks Solid understanding of ETL/ELT design and implementation principles Strong SQL and PySpark skills for data transformation and validation Exposure to Python for automation and scripting Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred) Experience in working with Power BI or Tableau for data visualization and reporting support Strong problem-solving skills, attention to detail, and commitment to data quality Excellent communication and documentation skills to interface with technical and business teams. Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing. 
To qualify for the role, you must have 4–6 years of experience in DataOps or Data Engineering roles Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem Experience working with Informatica CDI or similar data integration tools Scripting and automation experience in Python/PySpark Ability to support data pipelines in a rotational on-call or production support environment Comfortable working in a remote/hybrid and cross-functional team setup Technologies and Tools Must haves Azure Databricks: Experience in data transformation and processing using notebooks and Spark. Azure Data Lake: Experience working with hierarchical data storage in Data Lake. Azure Synapse: Familiarity with distributed data querying and data warehousing. Azure Data Factory: Hands-on experience in orchestrating and monitoring data pipelines. ETL Process Understanding: Knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques. Good to have Power BI or Tableau for reporting support Monitoring/logging using Azure Monitor or Log Analytics Azure DevOps and Git for CI/CD and version control Python and/or PySpark for scripting and data handling Informatica Cloud Data Integration (CDI) or similar ETL tools Shell scripting or command-line data handling; SQL (across distributed and relational databases) What We Look For Enthusiastic learners with a passion for DataOps practices. Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills. What we offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
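As a companion to the data-quality duties above, a minimal PySpark profiling-and-validation sketch; the lake path, column names, and rules are hypothetical illustrations, not from the posting:

```python
# Sketch: profile null rates and enforce simple integrity rules in PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical holdings extract landed in the data lake.
df = spark.read.parquet("abfss://raw@lake.dfs.core.windows.net/holdings/")

# Profile null rates per column in a single pass over the data.
null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]
).first().asDict()
total = max(df.count(), 1)  # guard against an empty extract
for col, nulls in null_counts.items():
    print(f"{col}: {nulls / total:.2%} null")

# Fail fast on basic integrity rules before publishing downstream.
dupes = df.groupBy("security_id", "as_of_date").count().filter("count > 1")
assert dupes.count() == 0, "Duplicate security/date keys found"
assert df.filter(F.col("market_value") < 0).count() == 0, "Negative market values"
```

In a DataOps setup, checks like these typically run as a Databricks notebook task between ingestion and publication, so a failed assertion stops the pipeline rather than propagating bad data to reports.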

Posted 6 days ago

Apply

6.0 years

0 Lacs

Trivandrum, Kerala, India

Remote


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The opportunity We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting. Your Key Responsibilities Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage. ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources. Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices. Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity. Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot failures in pipelines and address data latency or quality issues. Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data. DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation. Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow. Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency. Skills and attributes for success Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks Solid understanding of ETL/ELT design and implementation principles Strong SQL and PySpark skills for data transformation and validation Exposure to Python for automation and scripting Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred) Experience in working with Power BI or Tableau for data visualization and reporting support Strong problem-solving skills, attention to detail, and commitment to data quality Excellent communication and documentation skills to interface with technical and business teams. Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing. 
To qualify for the role, you must have 4–6 years of experience in DataOps or Data Engineering roles Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem Experience working with Informatica CDI or similar data integration tools Scripting and automation experience in Python/PySpark Ability to support data pipelines in a rotational on-call or production support environment Comfortable working in a remote/hybrid and cross-functional team setup Technologies and Tools Must haves Azure Databricks: Experience in data transformation and processing using notebooks and Spark. Azure Data Lake: Experience working with hierarchical data storage in Data Lake. Azure Synapse: Familiarity with distributed data querying and data warehousing. Azure Data Factory: Hands-on experience in orchestrating and monitoring data pipelines. ETL Process Understanding: Knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques. Good to have Power BI or Tableau for reporting support Monitoring/logging using Azure Monitor or Log Analytics Azure DevOps and Git for CI/CD and version control Python and/or PySpark for scripting and data handling Informatica Cloud Data Integration (CDI) or similar ETL tools Shell scripting or command-line data handling; SQL (across distributed and relational databases) What We Look For Enthusiastic learners with a passion for DataOps practices. Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills. What we offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 6 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description Title: Business Analyst and Business Intelligence Developer (Digital Solution Team) - Husky (India), Chennai Id: 20036 Type: Full-Time Location: Chennai, India Job Description Job Purpose The DST Business Analyst and Business Intelligence Developer for Husky will be responsible for building the business intelligence system for the company, based on the internal and external data structures. Responsible for leading the design and support of enterprise-wide business intelligence applications and architecture. Works with enterprise-wide business and IT senior management to understand and prioritize data and information requirements. Solves complex technical problems. Optimizes the performance of enterprise business intelligence tools by defining data elements which contribute to data insights that add value to the user. Creates testing methodology and criteria. Designs and coordinates a curriculum for coaching and training customers in the use of business intelligence tools to enhance business decision-making capability. Develops standards, policies, and procedures for the form, structure, and attributes of the business intelligence tools and systems. Develops data/information quality metrics. Researches new technology and develops business cases to support enterprise-wide business intelligence solutions. Key Responsibilities & Key Success Metrics Leading BI software development, deployment and maintenance Perform data profiling and data analysis activities to understand data sources Report curation, template definition and analytical data modeling Work with cross-functional teams to gather and document reporting requirements Translate business requirements into specifications that will be used to implement the required reports and dashboards, created from potentially multiple data sources Identify and resolve data reporting issues in a timely fashion, while looking for continuous improvement opportunities Build solutions that create value and resolve business problems Provide technical guidance to designers and other stakeholders Work effectively with members of the Digital Solutions Team Troubleshoot analytics tool problems and tune for performance Develop the semantic layer and analytics query objects for end users Translate business questions and requirements into reports, views, and analytics query objects Ensure that quality standards are met Support the Master Data Management Strategy Qualifications Understanding of ERP and operational systems databases, knowledge of database programming Highly skilled at writing SQL queries with large-scale, complex datasets Experience in data visualization and data storytelling Experience designing, debugging and deploying software in an ADO (Azure DevOps) development environment Experience with the Microsoft BI stack - Power BI and SQL Server Analysis Services Experience working in an international business environment Experience with Azure Data Platform resources (ADLS, ADF, Azure Synapse, Power BI Services) Basic manufacturing and sales business process knowledge Strong communication & presentation skills Ability to moderate meetings and constructive design sessions for effective decision making English language skills are a requirement; German & French are considered an asset
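A minimal sketch of the semantic-layer duty described above: publishing one curated SQL view that report authors can bind to. The schema, table, and connection names are hypothetical placeholders:

```python
# Sketch: publish a semantic-layer view for Power BI report authors.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=bi-sql.example.com;DATABASE=dw;Trusted_Connection=yes;"
)

# One curated view hides join logic and grain decisions from end users.
VIEW_DDL = """
CREATE OR ALTER VIEW semantic.v_sales_summary AS
SELECT
    d.fiscal_year,
    d.fiscal_month,
    p.product_line,
    SUM(f.net_amount) AS net_sales,
    SUM(f.order_qty)  AS units_sold
FROM fact.sales f
JOIN dim.date    d ON d.date_key    = f.order_date_key
JOIN dim.product p ON p.product_key = f.product_key
GROUP BY d.fiscal_year, d.fiscal_month, p.product_line;
"""

with pyodbc.connect(CONN_STR, autocommit=True) as conn:
    conn.execute(VIEW_DDL)
    print("semantic.v_sales_summary published")
```

Keeping join logic and grain in one governed view is a common way to give Power BI or SSAS models a stable contract while the underlying warehouse evolves.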

Posted 6 days ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The opportunity We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting. Your Key Responsibilities Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage. ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources. Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices. Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity. Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot failures in pipelines and address data latency or quality issues. Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data. DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation. Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow. Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency. Skills and attributes for success Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks Solid understanding of ETL/ELT design and implementation principles Strong SQL and PySpark skills for data transformation and validation Exposure to Python for automation and scripting Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred) Experience in working with Power BI or Tableau for data visualization and reporting support Strong problem-solving skills, attention to detail, and commitment to data quality Excellent communication and documentation skills to interface with technical and business teams. Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing. 
To qualify for the role, you must have 4–6 years of experience in DataOps or Data Engineering roles Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem Experience working with Informatica CDI or similar data integration tools Scripting and automation experience in Python/PySpark Ability to support data pipelines in a rotational on-call or production support environment Comfortable working in a remote/hybrid and cross-functional team setup Technologies and Tools Must haves Azure Databricks: Experience in data transformation and processing using notebooks and Spark. Azure Data Lake: Experience working with hierarchical data storage in Data Lake. Azure Synapse: Familiarity with distributed data querying and data warehousing. Azure Data Factory: Hands-on experience in orchestrating and monitoring data pipelines. ETL Process Understanding: Knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques. Good to have Power BI or Tableau for reporting support Monitoring/logging using Azure Monitor or Log Analytics Azure DevOps and Git for CI/CD and version control Python and/or PySpark for scripting and data handling Informatica Cloud Data Integration (CDI) or similar ETL tools Shell scripting or command-line data handling; SQL (across distributed and relational databases) What We Look For Enthusiastic learners with a passion for DataOps practices. Problem solvers with a proactive approach to troubleshooting and optimization. Team players who can collaborate effectively in a remote or hybrid work environment. Detail-oriented professionals with strong documentation skills. What we offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 6 days ago

Apply

0.0 - 1.0 years

0 Lacs

Kochi, Kerala

On-site


Work Location: Kochi/Trivandrum 10-15+ Yrs Experience ▪ Expertise in Azure services including App Services, Functions, DevOps pipelines and ADF ▪ Expert-level knowledge of MuleSoft Anypoint Platform, API lifecycle management, and enterprise integration ▪ Proven skills in Java-based web applications with RDBMS or NoSQL backends ▪ Proven skills in Python and JavaScript for backend and full-stack solutions ▪ Deep understanding of object-oriented programming and design patterns ▪ Solid hands-on experience with RESTful API development and microservices architecture ▪ Familiarity with unit testing frameworks (e.g., TestNG) and integration testing in Java ▪ Experience with code review processes, test coverage validation, and CI/CD pipelines ▪ Proficiency in Git, SVN, and other version control systems ▪ Comfortable working with static code analysis tools and agile methodologies ▪ Good knowledge of JIRA, Confluence, and project collaboration tools ▪ Strong communication skills and ability to mentor team members ▪ Ability to prepare detailed technical documentation and design specifications ▪ Passion for clean code, best practices, and scalable architecture ▪ Nice to have: experience with identity providers like Auth0, Keycloak, IdentityServer. ▪ Take ownership of tasks and user stories; provide accurate estimations ▪ Provide technically sound advice/decisions on how to develop functionality to implement the client’s business logic in software systems ▪ Lead sprint activities, including task management and code reviews ▪ Design and implement secure, scalable, and maintainable solutions ▪ Conduct technical discovery and performance analysis of critical systems ▪ Write low-level design and as-built documents ▪ Translate business requirements into technical specifications and working code ▪ Develop and maintain unit, integration, and regression test cases ▪ Ensure adherence to TDD and promote high code coverage ▪ Integrate multiple data sources and systems with dissimilar structures ▪ Contribute to overall system design with a focus on security and resilience ▪ Use static code analysis tools and set up CI/CD pipelines ▪ Participate in Agile ceremonies: grooming, planning, stand-ups, and retrospectives ▪ Collaborate with technical architects on solution design ▪ Mentor junior team members on coding standards, sprint tasks, and technology ▪ Troubleshoot, test, and optimize core software and databases ▪ Stay updated with industry trends and promote best practices within the team. ▪ Identify challenges, initiate PoCs, and perform feasibility studies. • Participate in the full product development cycle, including brainstorming, release planning and estimation, implementing and iterating on code, coordinating with internal and external clients, internal code and design reviews, MVP and production releases, quality assurance, and product support. • Highly effective, thriving in a dynamic environment. • Comfortable with proactive outward communication and technical leadership, and positive about accepting challenges. • Adhere to ISMS policies and procedures. Job Type: Full-time Pay: From ₹2,500,000.00 per year Benefits: Health insurance Provident Fund Schedule: Day shift Ability to commute/relocate: Kochi, Kerala: Reliably commute or planning to relocate before starting work (Preferred) Experience: Mulesoft: 1 year (Preferred) Java: 1 year (Preferred) Python: 1 year (Preferred) JavaScript: 1 year (Preferred) Azure: 1 year (Preferred) Work Location: In person
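The role above asks for RESTful API development in Python with unit-testable design; a minimal sketch using FastAPI, with a hypothetical orders resource standing in for real business entities (none of these names come from the posting):

```python
# Sketch: a small RESTful microservice endpoint with typed request models.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="orders-service")

class Order(BaseModel):
    order_id: int
    customer: str
    amount: float

# An in-memory store stands in for the RDBMS/NoSQL backend named in the listing.
ORDERS: dict[int, Order] = {}

@app.post("/orders", status_code=201)
def create_order(order: Order) -> Order:
    # Reject duplicates with a conflict status rather than silently overwriting.
    if order.order_id in ORDERS:
        raise HTTPException(status_code=409, detail="order already exists")
    ORDERS[order.order_id] = order
    return order

@app.get("/orders/{order_id}")
def get_order(order_id: int) -> Order:
    try:
        return ORDERS[order_id]
    except KeyError:
        raise HTTPException(status_code=404, detail="order not found")
```

Handlers structured this way can be exercised directly in pytest via FastAPI's TestClient, which lines up with the listing's emphasis on unit testing, TDD, and code coverage.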

Posted 6 days ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Position Summary Senior Analyst, Data-Marts & Reporting - Reporting and Analytics – Digital Data Analytics Innovation - Deloitte Support Services India Private Limited Are you a quick learner with a willingness to work with new technologies? The Data-Marts and Reporting team offers you a particular opportunity to be an integral part of the Datamarts & Reporting – CoRe Digital | Data | Analytics | Innovation Group. The principal focus of this group is the research, development, maintenance, and documentation of customized solutions that e-enable the delivery of cutting-edge technology to the firm's business centers. Work you will do As a Senior Analyst, you will research and develop solutions built on varied technologies like Microsoft SQL Server, the MSBI Suite, MS Azure SQL, Tableau, and .Net. You will support a team which provides high-quality solutions to the customers by following a streamlined system development methodology. In the process of acquainting yourself with various development tools, testing tools, methodologies and processes, you will be aligned to the following role: Role: Datamart Solution Senior Analyst As a Datamart Solution Senior Analyst, you will be responsible for delivering technical solutions, building high-performing datamarts and reporting using tools/technologies like Microsoft SQL Server, the MSBI Suite, MS Azure SQL, Tableau, and .Net. Your key responsibilities include: Interact with end users to gather, document, and interpret requirements. Leverage requirements to design technical solutions. Develop SQL objects and scripts based on design. Analyze, debug, and optimize existing stored procedures and views. Leverage indexes, performance tuning techniques, and error handling to improve the performance of SQL scripts. Create and modify SSIS packages and ADF pipelines for transferring data between various systems across cloud and on-premises environments. Should be able to work seamlessly with different Azure services. Improve performance and find opportunities to refine processes to bring efficiency to SQL, SSIS, and ADF. Create, schedule and monitor SQL jobs. Build interactive visualizations in Tableau for leadership reporting. Proactively prioritize activities, handle tasks and deliver quality solutions on time. Communicate clearly and regularly with team leadership and project teams. Manage ongoing deliverable timelines and own relationships with end clients to understand whether deliverables continue to meet the client’s need. Work collaboratively with other team members and end clients throughout the development life cycle. Research, learn, implement, and share skills on new technologies. Understand customer requirements well and provide status updates to the project lead (US/USI) on calls and emails efficiently. Guide junior team members to get them up to speed on the domains, tools, and technologies we work on. Continuously improve skills in this space by completing certifications and recommended training. Obtain and maintain a thorough understanding of the MDM data model, Global Systems & Client attributes. Good understanding of MVC .Net and SharePoint front-end solutions. Good to have knowledge of full-stack development. 
The team CoRe - Digital Data Analytics Innovation (DDAI) sits inside Deloitte’s global shared services organization and serves Deloitte’s 10 largest member firms worldwide. Qualifications and experience Required: Educational Qualification: B.E/B.Tech or MTech (60% or 6.5 GPA and above) Should be proficient in one or more of the following technologies: knowledge of DBMS concepts, and exposure to querying on any relational database, preferably MS SQL Server, MS Azure SQL, SSIS, Tableau. Knowledge of any coding language like C#.NET or VB.NET would be an added advantage. Understands development methodology and lifecycle. Excellent analytical skills and communication skills (written, verbal, and presentation). Ability to chart one’s own career and build networks within the organization. Ability to work both independently and as part of a team with professionals at all levels. Ability to prioritize tasks, work on multiple assignments, and raise concerns/questions where appropriate. Seek information/ideas and establish relationships with customers to assess any future opportunities. Total Experience: 4-6 years of overall experience. At least 3 years of experience in database development, ETL and reporting. Skill set Required: SQL Server, MS Azure SQL, Azure Data Factory, SSIS, Azure Synapse, Data warehousing & BI Preferred: Tableau, .Net Good to have: MVC .Net, SharePoint front-end solutions. Location: Hyderabad Work hours: 2 p.m. – 11 p.m. How you will grow At Deloitte, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning. Deloitte University (DU): The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad office, is an extension of the DU in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Explore DU: The Leadership Center in India. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte. Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world. 
Disclaimer: Please note that this description is subject to change based on business/project requirements and at the discretion of management. #EAG-Core Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 300780

Posted 6 days ago

Apply

10.0 - 15.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


About the company: With over 2.5 crore customers, over 5,000 distribution points and nearly 2,000 branches, IndusInd Bank is a universal bank with a widespread banking footprint across the country. IndusInd offers a wide array of products and services for individuals and corporates, including microfinance, personal loans, personal and commercial vehicle loans, credit cards, and SME loans. Over the years, IndusInd has grown ceaselessly and dynamically, driven by a zeal to offer our customers banking services at par with the highest quality standards in the industry. IndusInd is a pioneer in digital-first solutions that bring together the power of a next-gen digital product stack, customer excellence and the trust of an established bank. Job Purpose: To work on implementing data modeling solutions To design data flow and structure to reduce data redundancy and improve data movement among systems, defining a data lineage To work in the Azure Data Warehouse To work with large volumes of data integration Experience With overall experience between 10 and 15 years, the applicant must have a minimum of 8 to 11 years of hardcore professional experience in data modeling for a large data warehouse with multiple sources. Technical Skills Expertise in the core skill of data modeling principles/methods, including conceptual, logical and physical data models (a minimal star-schema sketch follows this listing) Ability to utilize BI tools like Power BI, Tableau, etc. to represent insights Experience in translating/mapping relational data models into XML and schemas Expert knowledge of metadata management and relational and data modeling tools like ER Studio, Erwin or others. Hands-on relational, dimensional and/or analytical experience (using RDBMS, dimensional, NoSQL, ETL and data ingestion protocols). Very strong in SQL queries Expertise in performance tuning of SQL queries. Ability to analyse source systems and create source-to-target mappings. Ability to understand the business use case and create data models or joined data in the data warehouse. Preferred experience in the banking domain and experience in building data models/marts for various banking functions. Good to have knowledge of: Azure PowerShell scripting or Python scripting for data transformation in ADF; SSIS, SSAS, and BI tools like Power BI; Azure PaaS components like Azure Data Factory, Azure Databricks, Azure Data Lake, Azure Synapse (DWH), PolyBase, ExpressRoute tunneling, etc.; API integration Responsibility Understanding the existing data model, existing data warehouse design and functional domain subject areas of data, and documenting the same with the as-is architecture and the proposed one. Understanding the existing ETL process and various sources, then analyzing and documenting the best approach to design the logical data model where required Work with the development team to implement the proposed data model into the physical data model and build data flows Work with the development team to optimize the database structure with best practices, applying optimization methods. Analyze, document and implement re-use of the data model for new initiatives. Will interact with stakeholders, users and other IT teams to understand the ecosystem and analyze it for solutions Work on user requirements and create queries for consumption views for users from the existing DW data. Will train and lead a small team of data engineers. Qualifications Bachelor's in Computer Science or equivalent Should hold certifications in Data Modeling and Data Analysis. 
Good to have Azure Fundamentals (AZ-900) or Azure Data Engineer (DP-200/DP-201) certifications Behavioral Competencies Should have excellent problem-solving and time management skills Strong analytical thinking skills Applicants should have excellent communication skills and be process-oriented with a flexible execution mindset. Strategic thinking with a research and development mindset. Clear and demonstrative communication Efficiently identifies and solves issues Identifies, tracks and escalates risks in a timely manner Selection Process: Interested candidates are mandatorily required to apply through the listing on Jigya. Only applications received through Jigya will be evaluated further. Shortlisted candidates may need to appear in an Online Assessment and/or a Technical Screening interview administered by Jigya, on behalf of IndusInd Bank. Candidates selected after the screening rounds will be processed further by IndusInd Bank.
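As referenced in the listing above, a minimal star-schema sketch showing how a conceptual model might land as physical DDL; the banking entities, keys, and connection string are hypothetical illustrations, not the bank's actual model:

```python
# Sketch: deploy a small star schema (one dimension, one fact) to Azure SQL.
from sqlalchemy import create_engine, text

DDL = """
-- Dimension: one row per customer version, with a surrogate key.
CREATE TABLE dim_customer (
    customer_key  INT IDENTITY PRIMARY KEY,   -- surrogate key
    customer_id   VARCHAR(20) NOT NULL,       -- natural/business key
    segment       VARCHAR(30),
    valid_from    DATE NOT NULL,
    valid_to      DATE                        -- NULL marks the current row (SCD Type 2)
);

-- Fact: grain is one row per loan transaction.
CREATE TABLE fact_loan_txn (
    txn_id        BIGINT PRIMARY KEY,
    customer_key  INT NOT NULL REFERENCES dim_customer(customer_key),
    date_key      INT NOT NULL,               -- FK to a dim_date table
    product_code  VARCHAR(10) NOT NULL,
    txn_amount    DECIMAL(18, 2) NOT NULL
);
"""

# Hypothetical Azure SQL connection; substitute real credentials to run.
engine = create_engine(
    "mssql+pyodbc://user:pass@dw-server/dw?driver=ODBC+Driver+18+for+SQL+Server"
)
with engine.begin() as conn:
    for stmt in DDL.split(";"):
        if stmt.strip():
            conn.execute(text(stmt))
```

Modeling tools like Erwin or ER Studio would normally forward-engineer this DDL from the logical model; the point of the sketch is the surrogate-keyed dimension plus an explicitly declared fact grain.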

Posted 6 days ago

Apply

15.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Data Project/Program/Delivery Manager Technical skills: Primary Skill: SQL, Data warehouse, Azure tech stack – Azure Data Factory (ADF), Databricks, Synapse Secondary Skills: Knowledge of any BI reporting tool like Power BI or Tableau Good to have skills: Candidates having these extra skills on their resumes have an added advantage while we assess the profile. Some of them are as follows: Expertise in the insurance domain EXPERIENCE/SKILLS: Minimum 15 years of work experience in the successful delivery of complex data-related projects end to end. Experience in Agile or DataOps delivery, and in quality practices, techniques, and tools at all layers of data engineering Tech-savvy, with a good understanding of recent technologies including Azure cloud APIs, inclusion of unstructured data, and business intelligence tools. Familiarity with JIRA and other prioritization tools Knowledge of and experience with project management methodologies (Agile/Waterfall) to work with intricate, multifaceted projects. Excellent communication and coordination skills Comfortable with changing and flexible requirements from business owners Customer-oriented attitude High degree of self-motivation Experience managing third-party relationships in the successful achievement of customer deliveries. Demonstrated track record of delivering high-quality projects and programs for medium to large-sized accounts. Demonstrated experience in the successful delivery of complex data-related projects end to end. Ability to communicate clearly to all levels and present to senior leadership. Ability to lead, motivate and direct medium-to-large engineering delivery teams. Ability to help define delivery management core processes and improvement opportunities. Demonstrated attentiveness to quality and productivity as outcomes. Advanced analytical, problem-solving, negotiation and organizational skills. Ability to manage significant delivery budgets and minimize program variances. Strong ability to lead teams across multiple shores. Strong ETL skills and working experience with SSIS and related functions. Knowledge of data warehouse and data lake frameworks

Posted 6 days ago

Apply

10.0 years

0 Lacs

India

Remote


🚀 We’re Hiring: Senior Data Engineer (Remote – India | Full-Time or Contract)

We are helping our client hire a Senior Data Engineer with over 10 years of experience in modern data platforms. This is a remote role open across India, available on both a full-time and a contract basis.

💼 Position: Senior Data Engineer
🌍 Location: Remote (India)
📅 Type: Full-Time / Contract
📊 Experience: 10+ Years

🔧 Must-Have Skills:
- Data engineering, data warehousing, ETL
- Azure Databricks & Azure Data Factory (ADF)
- PySpark, SparkSQL
- Python, SQL

👀 What We’re Looking For:
- A strong background in building and managing data pipelines
- Hands-on experience with cloud platforms, especially Azure
- Ability to work independently and to collaborate in distributed teams

📩 How to Apply: Please send your resume to [your email] with the subject line "Senior Data Engineer – Remote India".

⚠️ Along with your resume, kindly include the following details:
- Full name
- Mobile number
- Total experience
- Relevant experience
- Current CTC
- Expected CTC
- Notice period
- Current location
- Are you open to contract, full-time, or both?
- Willing to work IST/US overlapping hours: Yes/No
- Do you have a PF account? (Yes/No)

🔔 Follow our company page to stay updated on future job openings!

#DataEngineer #AzureDatabricks #ADF #PySpark #SQL #RemoteJobsIndia #HiringNow #Strive4X #ContractJobs #FullTimeJobs #IndiaJobs
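For candidates weighing the PySpark and ADF requirements above, here is a minimal sketch of the kind of pipeline step such a role involves: an incremental load on Databricks. It is an assumption-laden illustration; the storage paths, watermark column, and table layout are invented for the example, not taken from the posting.

```python
# Minimal incremental-ETL sketch in PySpark. All paths and column names
# (abfss://... URLs, updated_at, order_amount) are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("incremental-etl-sketch").getOrCreate()

# Read only records newer than the last processed watermark.
last_watermark = "2024-01-01"  # in practice, fetched from a control table
incoming = (
    spark.read.format("parquet")
    .load("abfss://raw@examplelake.dfs.core.windows.net/orders/")
    .where(F.col("updated_at") > F.lit(last_watermark))
)

# Light transformation: standardize types and derive a partition column.
cleaned = (
    incoming
    .withColumn("order_amount", F.col("order_amount").cast("decimal(18,2)"))
    .withColumn("order_date", F.to_date("updated_at"))
)

# Append to the curated zone, partitioned for downstream SparkSQL consumers.
(cleaned.write.mode("append")
    .partitionBy("order_date")
    .parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```

In a real deployment this step would typically be orchestrated by an ADF pipeline that triggers the Databricks job and persists the watermark between runs.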

Posted 6 days ago

Apply

0 years

0 Lacs

Karnataka, India

On-site


Please update your details here: https://forms.cloud.microsoft/r/92GNAVdBWW

Power BI Developer (5-8 Yrs): PAN India

Skills required:
- Primary skills: Power BI + DAX + SQL experience
- Knowledge of Azure Data Factory and ADLS
- A strong technical background in SQL, SQL Server, data engineering, BI, Kusto, and data analysis, with strong knowledge of data warehousing concepts
- Customer-facing experience, code development, and review of development work
- T-SQL queries and performance tuning
- Proven experience in strong data analysis over large volumes of datasets
- Proven experience in the design and implementation of end-to-end data pipelines
- Very good experience with BI and reporting tools like Power BI
- Microsoft big data technologies
- Knowledge of Azure VMs and Azure Storage roles

Role and responsibilities:
- Write complex and efficient T-SQL queries
- Design and implement end-to-end data pipelines using Cosmos and ADF
- Analyze problems and troubleshoot complex data issues in large datasets
- Communicate complex information clearly to technical and non-technical stakeholders
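As a flavor of the T-SQL performance work this listing emphasizes, here is a small, illustrative Python sketch that runs a set-based, index-friendly aggregation through pyodbc. The connection string, table, and columns (SalesDW, dbo.FactOrders, order_date, order_amount) are hypothetical examples, not details from the posting.

```python
# Illustrative only: a sargable, set-based T-SQL aggregation run from Python.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example.database.windows.net;DATABASE=SalesDW;"
    "UID=report_user;PWD=***"
)

# A half-open date range keeps the predicate sargable (index-friendly),
# unlike wrapping the column in a function such as YEAR(order_date).
sql = """
SELECT region, SUM(order_amount) AS total_amount, COUNT(*) AS order_count
FROM dbo.FactOrders
WHERE order_date >= ? AND order_date < ?
GROUP BY region
ORDER BY total_amount DESC;
"""
cursor = conn.cursor()
for row in cursor.execute(sql, "2024-01-01", "2025-01-01"):
    print(row.region, row.total_amount, row.order_count)
conn.close()
```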

Posted 6 days ago

Apply

7.0 years

0 Lacs

India

Remote


Job Title: Senior Data Engineer – Azure & API Development
Location: Remote
Experience Required: 7+ Years

Job Summary: We are looking for an experienced Data Engineer with strong expertise in Azure cloud architecture, API development, and modern data engineering tools. The ideal candidate will have in-depth experience in building and maintaining scalable data pipelines and API integrations using Azure services like Azure Data Factory (ADF), Databricks, Azure Functions, and Service Bus, along with infrastructure provisioning using Terraform.

Key Responsibilities:
- Design and implement scalable, secure, and high-performance data solutions on Azure
- Develop, deploy, and manage RESTful APIs to support data access and integration
- Build and maintain ETL/ELT data pipelines using Azure Data Factory, Databricks, and Azure Functions
- Integrate data workflows with Azure Service Bus and other messaging services
- Define and implement cloud infrastructure using Terraform and Infrastructure-as-Code (IaC) best practices
- Collaborate with stakeholders to understand data requirements and develop technical solutions
- Ensure best practices for data governance, security, monitoring, and performance optimization
- Work closely with DevOps and Data Architects to implement CI/CD pipelines and production-grade deployments

Must-Have Skills:
- 7+ years of professional experience in Data Engineering or related roles
- Strong hands-on experience with Azure services, particularly Azure Data Factory (ADF), Databricks (Spark-based processing), Azure Functions, and Azure Service Bus
- Proficiency in API development (RESTful APIs using Python, .NET, or Node.js)
- Good command of SQL, Spark SQL, and data transformation techniques
- Experience with Terraform for IaC and provisioning Azure resources
- Excellent understanding of data architecture, cloud security, and governance models
- Strong problem-solving skills and experience working in Agile environments

Preferred Skills:
- Familiarity with CI/CD tools like Azure DevOps, GitHub Actions, or Jenkins
- Exposure to event-driven architecture and real-time data streaming
- Knowledge of containerization (Docker/Kubernetes) is a plus
- Experience in performance tuning and cost optimization in Azure environments
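Since this role pairs data engineering with RESTful API development in Python, here is a minimal sketch, assuming FastAPI, of a data-access endpoint of the kind described. The endpoint path, model fields, and in-memory store are invented stand-ins; a real service would sit in front of Synapse, Databricks SQL, or Cosmos DB.

```python
# Minimal REST data-access API sketch with FastAPI. All names are hypothetical.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="data-access-api-sketch")

class Order(BaseModel):
    order_id: int
    region: str
    amount: float

# Stand-in for a real lookup against a warehouse or Cosmos DB container.
FAKE_STORE = {1: Order(order_id=1, region="south", amount=1250.50)}

@app.get("/orders/{order_id}", response_model=Order)
def get_order(order_id: int) -> Order:
    """Return a single order, or 404 if it does not exist."""
    order = FAKE_STORE.get(order_id)
    if order is None:
        raise HTTPException(status_code=404, detail="order not found")
    return order

# Run locally with: uvicorn main:app --reload
```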

Posted 6 days ago

Apply

Exploring ADF Jobs in India

The job market for ADF (Application Development Framework) professionals in India is witnessing significant growth, with numerous opportunities available for job seekers in this field. ADF is a popular framework used for building enterprise applications, and companies across various industries are actively looking for skilled professionals to join their teams.

Top Hiring Locations in India

Here are 5 major cities in India with high demand for ADF professionals:
- Bangalore
- Hyderabad
- Pune
- Chennai
- Mumbai

Average Salary Range

The estimated salary range for ADF professionals in India varies by experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum

Career Path

In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.

Related Skills

In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.

Interview Questions

Here are sample interview questions for ADF roles, categorized by difficulty level:
- Basic:
  - What is ADF, and what are its key features?
  - What is the difference between ADF Faces and ADF Task Flows?
- Medium:
  - Explain the lifecycle of an ADF application.
  - How do you handle exceptions in ADF applications?
- Advanced:
  - Discuss the advantages of using ADF Business Components.
  - How would you optimize performance in an ADF application?

Closing Remark

As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies