30.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Who Is sa.global

sa.global addresses industry challenges through vertical-focussed solutions. Leveraging modern technologies like AI and Copilot, we empower organizations to make intelligent decisions and act faster. Our solutions and services are 100% based on Microsoft business applications and the Microsoft business cloud, and benefit advertising and marketing, accounting, architecture and engineering, consulting, homebuilding, legal, and IT services companies. Through our industry-first approach, we want to put solutions in the hands of the people closest to the problem to enable organizations to act faster and make intelligent decisions. Over 800,000 users in 80 countries around the world rely on sa.global's industry-focused expertise to gain value faster, adapt quickly to changes, and build for the future. We have 30+ years of real-world experience, we are an 11-time winner of the Microsoft Dynamics Partner of the Year Award, and we've been a part of Microsoft's elite Inner Circle for 11 years. Our global organization has a 1,000-member team across 25 countries. For more information, visit www.saglobal.com.

Why Choose sa.global

Open, flexible, vibrant, collaborative, and diverse – these are just some of the terms our employees use to describe the culture at sa.global. We believe in and encourage innovative and dynamic thinking. Our culture and values give us the extra edge to help us scale greater heights. Led by our core values – Agile, Capable, and Committed – which form an integral part of who we are, we constantly strive to provide an inclusive work environment. Our employees come from varied cultural and social backgrounds, and we strive each day to make sa.global a great place to work.

Values of sa.global
• Contribute towards a working environment that represents "one sa.global", where everyone is seen as an equal and equality and diversity are championed
• Interact with a wide variety of colleagues, customers, and stakeholders at all levels with respect, courtesy, and professionalism
• Come as you are, make work fun and others successful, and foster an always-learning mentality

About The Role

We offer a career with growth opportunities in a dynamic, collaborative, and supportive organization. We also have a strong and ethical working culture. If you'd like to work with a team that is passionate about its work while also having a good sense of fun, you might have just found what you are looking for! sa.global is looking for a motivated, driven, and skilled Power BI Contractor to join our dynamic consulting team in India!

Skills And Experience
• 7+ years of experience with Microsoft BI tools, preferably Power BI
• MS SQL Server, ADF (Azure Data Factory), Microsoft Fabric Data Pipelines, Azure Synapse, Power BI
• Strong knowledge of T-SQL joins, views, and stored procedures
• Strong knowledge of data transformation, data modelling, and performance tuning
• ERP experience is good to have
• Candidates with Microsoft certifications are preferred
• Excellent business communication skills with effective presentation and demonstration skills
• Ability to guide junior team members and coordinate with them to deliver above and beyond commitments
• Strong learning orientation with regard to new technologies and implementing them
• Self-starter and initiative taker
• Strong analytical skills and detail-oriented
• Bachelor in Engineering/Bachelor in Technology, BCA graduate, or Diploma in Engineering

Contact Us!
If this is a promising opportunity for you and you possess the desired skills and experience, please apply for the role. We will be in touch! If you're not looking for a job change but know someone who is, please share the details of this open position with them!
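The T-SQL fundamentals this role calls for (joins, views, stored procedures) are easy to illustrate. Below is a minimal sketch of driving that kind of T-SQL from Python via pyodbc; the server, credentials, and object names (dbo.Projects, dbo.TimeEntries, dbo.usp_ProjectUtilization) are hypothetical placeholders, not anything from the posting.

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=reporting;"
    "UID=report_user;PWD=...;Encrypt=yes"   # placeholder credentials
)
cursor = conn.cursor()

# A join plus aggregate, the bread and butter of report datasets
cursor.execute("""
    SELECT p.ProjectName, SUM(t.Hours) AS TotalHours
    FROM dbo.Projects AS p
    INNER JOIN dbo.TimeEntries AS t ON t.ProjectId = p.ProjectId
    GROUP BY p.ProjectName
    ORDER BY TotalHours DESC;
""")
for project_name, total_hours in cursor.fetchall():
    print(project_name, total_hours)

# Calling a (hypothetical) stored procedure with a parameter
cursor.execute("EXEC dbo.usp_ProjectUtilization @Year = ?", 2024)
```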
Posted 3 weeks ago
6.0 - 10.0 years
10 - 18 Lacs
Gurugram
Work from Office
Job Title: Azure Data Engineer (SQL & Azure Data Services)
Location: Gurgaon ODC

Job Description:
The Azure Data Engineer will be responsible for developing, maintaining, and optimizing data pipelines and SQL databases using Azure Data Factory (ADF), Microsoft Fabric, and other Azure services. The role requires expertise in SQL Server, ETL/ELT processes, and data modeling to support business intelligence and operational applications. The ideal candidate will collaborate with cross-functional teams to deliver reliable, scalable, and high-performing data solutions.

Key Responsibilities:
• Design, develop, and manage SQL databases, tables, stored procedures, and T-SQL queries.
• Develop and maintain Azure Data Factory (ADF) pipelines to automate data ingestion, transformation, and integration.
• Build and optimize scalable ETL/ELT processes to move and transform data across Azure Data Lake, SQL Server, and external data sources.
• Design and implement Microsoft Fabric lakehouses for structured and unstructured data storage.
• Develop and implement data modeling strategies using star schema, snowflake schema, and dimensional models to support analytics use cases.
• Integrate Azure Data Lake Storage (ADLS) with Microsoft Fabric for scalable, secure, and cost-effective data storage.
• Monitor, troubleshoot, and optimize data pipelines using Azure Monitor, Log Analytics, Application Insights, and Fabric monitoring capabilities.
• Ensure data integrity, consistency, and security, following data governance frameworks such as Azure Purview.
• Collaborate with DevOps teams to implement CI/CD pipelines for automated data pipeline deployment.
• Stay updated on Azure Data Services and Microsoft Fabric innovations, recommending enhancements for performance and scalability.

Requirements:
• 4+ years of experience in data engineering with strong expertise in SQL development.
• Proficiency in SQL Server, T-SQL, and query optimization techniques.
• Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, and Azure SQL Database.
• Solid understanding of ETL/ELT processes, data integration patterns, and data transformation.
• Practical experience with Microsoft Fabric components: Fabric Dataflows for self-service data preparation, Fabric Lakehouses for unified data storage, Fabric Synapse Real-Time Analytics for streaming data insights, and Fabric Direct Lake mode with Power BI for optimized performance.
• Strong understanding of Azure Data Lake Storage (ADLS) for efficient data management.
• Proficiency in Python or Scala for data transformation tasks.
• Experience with Azure DevOps, Git, and CI/CD pipeline automation.
• Knowledge of data governance practices, including data lineage, sensitivity labels, and RBAC.
• Experience with Infrastructure-as-Code (IaC) using Terraform or ARM templates.
• Understanding of data security protocols like data encryption and network security groups (NSGs).
• Familiarity with streaming services like Azure Event Hub or Kafka is a plus.
• Excellent problem-solving, communication, and team collaboration skills.
• Azure Data Engineer Associate (DP-203) and Microsoft Fabric Analytics certifications are desirable.

What We Offer:
• Opportunity to work with modern data architectures and Microsoft Fabric innovations.
• Competitive salary and benefits package, tailored to experience and qualifications.
• Opportunities for professional growth and development in a supportive and collaborative environment.
• A culture that values diversity, creativity, and a commitment to excellence.
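Since the posting leans on star-schema modeling and ETL/ELT across ADLS and SQL Server, here is a minimal PySpark sketch of a dimensional load; the lake paths, column names, and surrogate-key strategy are illustrative assumptions, not the employer's actual design.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star-schema-load").getOrCreate()

orders = spark.read.parquet("abfss://raw@datalake.dfs.core.windows.net/orders/")

# Dimension: one row per customer, with a generated surrogate key
dim_customer = (
    orders.select("customer_id", "customer_name", "region")
          .dropDuplicates(["customer_id"])
          .withColumn("customer_sk", F.monotonically_increasing_id())
)

# Fact: measures kept at order grain, keyed by the dimension's surrogate key
fact_orders = (
    orders.join(dim_customer.select("customer_id", "customer_sk"), "customer_id")
          .select("customer_sk", "order_id", "order_date", "amount")
)

base = "abfss://curated@datalake.dfs.core.windows.net"
dim_customer.write.mode("overwrite").parquet(f"{base}/dim_customer/")
fact_orders.write.mode("overwrite").parquet(f"{base}/fact_orders/")
```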
Posted 3 weeks ago
10.0 - 13.0 years
0 Lacs
Andhra Pradesh, India
On-site
At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.

Position Title: Full Stack Lead Developer
Experience: 10-13 Years

Job Overview
We are seeking a highly skilled and versatile polyglot Full Stack Developer with expertise in modern front-end and back-end technologies, cloud-based solutions, AI/ML, and Gen AI. The ideal candidate will have a strong foundation in full-stack development, cloud platforms (preferably Azure), and hands-on experience with Gen AI, AI, and machine learning technologies.

Key Responsibilities
• Develop and maintain web applications using Angular/React.js, .NET, and Python.
• Design, deploy, and optimize Azure-native PaaS and SaaS services, including but not limited to Function Apps, Service Bus, Storage Accounts, SQL Databases, Key Vault, ADF, Databricks, and REST APIs with OpenAPI specifications.
• Implement security best practices for data in transit and at rest, and authentication best practices: SSO, OAuth 2.0, and Auth0.
• Utilize Python for developing data processing and advanced AI/ML models using libraries like pandas, NumPy, scikit-learn, LangChain, LlamaIndex, and the Azure OpenAI SDK.
• Leverage agentic frameworks like Crew AI, AutoGen, etc.; be well versed in RAG and agentic architecture.
• Strong in design patterns: architectural, data, and object-oriented.
• Leverage Azure serverless components to build highly scalable and efficient solutions.
• Create, integrate, and manage workflows using Power Platform, including Power Automate, Power Pages, and SharePoint.
• Apply expertise in machine learning, deep learning, and Generative AI to solve complex problems.

Primary Skills
• Proficiency in React.js, .NET, and Python.
• Strong knowledge of Azure Cloud Services, including serverless architectures and data security.
• Experience with Python data analytics libraries: pandas, NumPy, scikit-learn, Matplotlib, Seaborn.
• Experience with Python Generative AI frameworks: LangChain, LlamaIndex, Crew AI, AutoGen.
• Familiarity with REST API design, Swagger documentation, and authentication best practices.

Secondary Skills
• Experience with Power Platform tools such as Power Automate, Power Pages, and SharePoint integration.
• Knowledge of Power BI for data visualization (preferred).

Preferred Knowledge Areas – Nice To Have
• In-depth understanding of machine learning, deep learning, and supervised and unsupervised algorithms.

Qualifications
• Bachelor's or master's degree in computer science, engineering, or a related field.
• 6-12 years of hands-on experience in full-stack development and cloud-based solutions.
• Strong problem-solving skills and ability to design scalable, maintainable solutions.
• Excellent communication and collaboration skills.
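The role pairs full-stack work with RAG and agentic architecture. As a rough illustration of the retrieval step in RAG, here is a minimal sketch using the open-source sentence-transformers package plus NumPy; the model checkpoint is a common public one, and call_llm is a stub standing in for whatever LLM client (e.g. the Azure OpenAI SDK) a project would actually use.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

docs = [
    "Function Apps run event-driven code without managing servers.",
    "Service Bus provides reliable enterprise messaging between services.",
    "Key Vault stores application secrets, keys, and certificates.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")        # small public checkpoint
doc_vecs = model.encode(docs, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (cosine similarity)."""
    q_vec = model.encode([query], normalize_embeddings=True)[0]
    top = np.argsort(doc_vecs @ q_vec)[::-1][:k]
    return [docs[i] for i in top]

def call_llm(prompt: str) -> str:
    # Placeholder: swap in the real client (e.g. an Azure OpenAI chat completion).
    return f"[model answer grounded in {len(prompt)} chars of context]"

context = "\n".join(retrieve("Where should I keep connection secrets?"))
print(call_llm(f"Answer using only this context:\n{context}\n\nQuestion: ..."))
```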
Posted 3 weeks ago
4.0 years
0 Lacs
Thane, Maharashtra, India
On-site
Description

Experience: 4+ years of experience

Key Responsibilities
• Help define and improve actionable, decision-driving management information (MI)
• Ensure streamlining, consistency, and standardization of MI within the handled domain
• Build and operate flexible processes/reports that meet changing business needs
• Prepare detailed documentation of schemas
• Any other duties commensurate with position or level of responsibility

Desired Profile
• Prior experience in insurance companies/the insurance sector would be an added advantage
• Experience with Azure technologies (including SSAS, SQL Server, Azure Data Lake, Synapse)
• Hands-on experience with SQL and Power BI
• Excellent understanding of developing stored procedures, functions, views, and T-SQL programs
• Experience developing and maintaining ETL (data extraction, transformation, and loading) mappings using ADF to extract data from multiple source systems
• Ability to analyze existing SQL queries for performance improvements
• Excellent written and verbal communication skills
• Ability to create and design MI for the team
• Expected to handle multiple projects with stringent timelines
• Good interpersonal skills
• Actively influences strategy through creative and unique ideas
• Exposure to documentation activities

Key Competencies
• Technical Learning - can learn new skills and knowledge as per business requirements
• Action Oriented - enjoys working; is action-oriented and full of energy for the things he/she sees as challenging; not fearful of acting with a minimum of planning; seizes more opportunities than others
• Decision Quality - makes good decisions based on a mixture of analysis, wisdom, experience, and judgment
• Detail Orientation - high attention to detail, especially in data quality, documentation, and reporting
• Communication & Collaboration - strong interpersonal skills, effective in team discussions and stakeholder interactions

Qualifications
B.Com/BE/BTech/MCA from a reputed college/institute
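One bullet above, analyzing existing SQL queries for performance, has a standard starting point on SQL Server: the query-stats DMVs. Below is a minimal sketch pulling the most expensive statements via pyodbc, assuming a configured ODBC DSN; the DMVs themselves (sys.dm_exec_query_stats, sys.dm_exec_sql_text) are standard SQL Server.

```python
import pyodbc

conn = pyodbc.connect("DSN=warehouse")  # assumes a configured ODBC DSN

rows = conn.cursor().execute("""
    SELECT TOP 10
        qs.execution_count,
        qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
        SUBSTRING(st.text, 1, 200)                 AS query_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY avg_elapsed_us DESC;
""").fetchall()

for exec_count, avg_us, text in rows:
    print(f"{exec_count:>8} runs  {avg_us:>12} us avg  {text!r}")
```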
Posted 3 weeks ago
12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About The Company

iLink Digital is a global software solution provider and systems integrator that delivers next-generation technology solutions to help clients solve complex business challenges, improve organizational effectiveness, increase business productivity, realize sustainable enterprise value, and transform their business inside out. iLink integrates software systems and develops custom applications, components, and frameworks on the latest platforms for IT departments, commercial accounts, application service providers (ASPs), and independent software vendors (ISVs). iLink solutions are used in a broad range of industries and functions, including healthcare, telecom, government, oil and gas, education, and life sciences. iLink's expertise includes cloud computing & application modernization, data management & analytics, enterprise mobility, portal, collaboration & social employee engagement, embedded systems, and user experience design.

What makes iLink's offerings unique is the fact that we use pre-created frameworks designed to accelerate software development and the implementation of business processes for our clients. iLink has over 60 frameworks (solution accelerators), both industry-specific and horizontal, that can be easily customized and enhanced to meet your current business challenges.

Requirements
• At least 12+ years of overall experience, mainly in the field of data warehousing or data analytics
• Solid experience with data modeling using various methodologies/techniques such as dimensional modeling, data vault, third normal form (3NF), etc.
• Experience in designing common/reusable domain-specific data sets for data lakes
• Experience with Power BI tabular models for self-service
• Experience working with Azure data platform tools like ADF, Synapse, Databricks, and ADLS Gen2 storage
• Solid understanding of various data structures/data formats and the best usage for each in a Big Data/Hadoop/data lake environment (relational vs Parquet vs ORC, etc.)
• Good experience with data in the supply chain and manufacturing domains; working experience with data from SAP ERP systems is preferable
• Should be an expert in SQL queries and data exploration methods
• Other technical skills include Spark SQL, Python, PySpark, and data modeling tools like ERwin

Benefits
• Competitive salaries
• Medical insurance
• Employee referral bonuses
• Performance-based bonuses
• Flexible work options & fun culture
• Robust learning & development programs
• In-house technology training
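The "relational vs Parquet vs ORC" point above usually comes down to landing raw files in a columnar, partitioned format for analytics. A minimal PySpark sketch, with illustrative paths and columns:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("csv-to-parquet").getOrCreate()

raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("abfss://raw@lake.dfs.core.windows.net/supply/shipments.csv"))

# Columnar + partitioned: analytic queries scan only the columns and
# partitions they touch, which is the usual reason to prefer Parquet/ORC
# over row-oriented formats in a data lake.
(raw.write
    .mode("overwrite")
    .partitionBy("plant_code")
    .parquet("abfss://curated@lake.dfs.core.windows.net/supply/shipments/"))
```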
Posted 3 weeks ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Senior Associate

Job Description & Summary

At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. Ensuring a streamlined end-to-end Oracle PPM that seamlessly adapts to the changing business environment is equally crucial from a process and compliance perspective.

As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models, and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities
• Worked on more than 2 end-to-end implementations as an Oracle PPM Solution Architect/Expert
• Good understanding of the latest industry trends in PPM-related business processes and practices
• Deep product understanding of all Fusion PPM modules (Project Financial Management, Project Execution Management)
• Good understanding of accounting flows in Oracle PPM
• Strong conceptual knowledge of P2P, O2C, R2R, and A2R cycles
• Must be ready to travel to client sites regularly for project deliveries

Mandatory Skill Sets
• Experience working with domestic customers
• Oracle PPM certification
• Rapid prototyping experience

Preferred Skill Sets
• Oracle Fusion PPM (Projects module - Project Management, Costing & Billing)

Years Of Experience Required
Minimum 4 to 10 years of Oracle Fusion experience

Education Qualification
Graduate/Post Graduate

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)

Required Skills
Oracle Cloud Integration

Optional Skills
Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being

Desired Languages (if blank, desired languages not specified)

Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No

Job Posting End Date
Posted 3 weeks ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Primary skills: Technology->Artificial Intelligence->Artificial Intelligence - ALL, Technology->Machine Learning->Python, Technology->OpenSystem->Python - OpenSystem

• Should have hands-on experience with the Azure data engineering stack
• Should be proficient in writing SQL queries/stored procedures/triggers etc., preferably in SQL Server 2012 onwards or the Azure cloud
• Should be proficient in Azure Synapse or ADF pipelines
• Good to have Azure Databricks exposure using PySpark
• Must know Python scripting/PySpark and be fully hands-on
• Predictive modelling techniques like regression and random forest (or any model) are preferred
• Self-motivated, having executed at least two complete projects as an individual contributor
• Exposure to Agile-Scrum is preferred
• Good business communication skills are expected
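For the predictive-modelling bullet above (regression, random forest), a minimal scikit-learn sketch on synthetic data; real work would of course use project data and proper feature engineering.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for real project data
X, y = make_regression(n_samples=1000, n_features=10, noise=0.3, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = RandomForestRegressor(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("R^2 on held-out data:", r2_score(y_test, model.predict(X_test)))
```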
Posted 3 weeks ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Salesforce
Management Level: Manager

Job Description & Summary

We are seeking a developer to design, develop, and maintain data ingestion processes for a data platform built on Microsoft technologies, ensuring data quality and integrity. The role involves collaborating with data architects and business analysts to implement solutions using tools like ADF and Azure Databricks, and requires strong SQL skills.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities

The Business Systems Analyst plays a key role as subject matter expert and technical consultant throughout the lifecycle of Salesforce.com applications used by business units and administrative functions across the company. Business Systems Analysts are responsible for supporting the end-to-end implementation of projects and solutions: analysing and documenting business and system requirements, preparing solution proposals, translating requirements into functional specifications, supporting build deliverables, planning and performing test activities, and providing end-user training and documentation, through to deployment, stabilization, and steady-state support. The Business Systems Analyst will possess a unique blend of technical, business, application, and people skills, the ability to effectively prioritize work assignments and work in a team environment, and excellent problem-solving skills.

Desired Skills
• Strong knowledge of and experience in Salesforce Sales or Service business processes and use cases
• Good understanding of AppExchange products/solutions, with the ability to suggest them appropriately for desired use cases
• Salesforce Marketing Cloud/Pardot and integration skills are an added advantage
• Excellent knowledge of the Salesforce.com application (both Sales and Service Cloud preferred)
• Formal Salesforce.com certifications strongly preferred
• Understanding of Salesforce Lightning and Lightning components, and a basic understanding of Salesforce objects and relationships, Apex, Visualforce pages, triggers, workflow rules, page layouts, record types, Process Builder, etc.
• Detail-oriented; able to translate business and user requirements into detailed system requirements that are actionable by the development team
• Ability to influence others to achieve results through strong interpersonal skills
• Excellent organization and problem-solving skills
• Clear spoken, business writing, and communication skills across a global client base and teams
• Strong sense of urgency

Key Responsibilities
• Document and design current and future Salesforce.com-enabled solutions and drive all relevant business analysis to ensure the most effective recommendations are made for successful solution and project plan completion.
• Exceptional interpersonal and written communication skills to frequently interact with all levels of the organization as well as global cultures; ability to interpret technical documentation to meet business needs.
• Correctly identifies system/functional interdependencies.
• Demonstrates the ability to work effectively both independently and within cross-functional project teams; seen as a cross-functional process and subject matter expert.
• Accurately translates business requirements into detailed Salesforce.com functionality requirements.
• High aptitude for interpreting technical documentation as well as authoring or updating documents as needed (functional designs, business process designs).
• Ability to self-manage multiple deliverables within tight timeframes and dynamic priorities.
• Based on experience, can accurately estimate the cost and time to implement complex enterprise-level solutions.
• Extensive experience interpreting user needs and writing or editing functional specifications for new systems, system changes, and/or system enhancements; presents ideas in a focused and concise manner.
• Ability to create a compelling business justification for the recommended direction and design.
• Demonstrates leadership skills by mentoring more junior staff and leading meetings.
• Creates and executes complex test scenarios and scripts, provides training to end users, and responds to critical production support requests.
• Assists project managers in developing project plans, executing project tasks, and communicating issues and status in a timely manner.

Mandatory Skill Sets
Salesforce Sales/Business Analyst

Preferred Skill Sets
Experience with the banking sector in the lending domain

Years Of Experience Required
3+

Education Qualification
B.Tech/B.E./MCA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)

Required Skills
Business Analytics, Salesforce

Optional Skills
Banking, Banking Industry, Lending

Desired Languages (if blank, desired languages not specified)

Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No

Job Posting End Date
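Much of this role is analysis rather than code, but a BSA validating the data behind a requirement might still query Salesforce programmatically. A minimal sketch using the third-party simple-salesforce package; the credentials and the SOQL report are placeholders.

```python
from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",   # placeholder credentials
    password="...",
    security_token="...",
)

# SOQL: open opportunities already past their close date
result = sf.query(
    "SELECT Id, Name, StageName, CloseDate FROM Opportunity "
    "WHERE IsClosed = false AND CloseDate < TODAY LIMIT 50"
)
for record in result["records"]:
    print(record["Name"], record["StageName"], record["CloseDate"])
```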
Posted 3 weeks ago
0 years
0 Lacs
Delhi, India
On-site
This job is based in Australia.

• Develop and enhance Natural Language Processing systems to support AI-driven products and services
• Work with Large Language Models (LLMs), including fine-tuning and Retrieval-Augmented Generation (RAG) approaches, to optimise performance
• Design and implement robust evaluation frameworks
• Level A or B appointment - $88,290 - $145,730 (min. $110,059 with PhD) + 17% superannuation
• Full-time, fixed-term contracts to May 2027

Why Your Role Matters

The Defence Trailblazer for Concept to Sovereign Capability is a $250 million enterprise powered by UNSW and UoA, with funding from the Australian Government through the Trailblazer Universities Program, as well as university and industry partners. Our mission is to accelerate the commercialisation of research for the Australian Defence Force (ADF), develop education training pathways, fast-track entrepreneurs' ideas to commercialisation, and enhance collaboration between industry, government, and academia.

The School of Computer Science and Engineering is one of the largest and most prestigious schools of computing in Australia. It offers undergraduate programmes in Software Engineering, Computer Engineering, Computer Science, and Bioinformatics, as well as a number of combined degrees with other disciplines. Our research and teaching staff are world leading and world building as they advance knowledge and learning. For more information on our school, go to https://www.unsw.edu.au/engineering/our-schools/computer-science-and-engineering

Reporting to Dr. Aditya Joshi, School of Computer Science and Engineering, the Postdoctoral Fellows (Level A/Level B) are expected to carry out independent and/or team research within the field in which they are appointed and to carry out activities to develop their research expertise relevant to their particular field of research. This position is funded by the ASIC Defence Trailblazer Grant.

About UNSW

UNSW isn't like other places you've worked. Yes, we're a large organisation with a diverse and talented community; a community doing extraordinary things. But what makes us different isn't only what we do, it's how we do it. Together, we are driven to be thoughtful, practical, and purposeful in all we do. Taking this combined approach is what makes our work matter. It's the reason we're Australia's number one university for impact and among the nation's Top 20 employers. And it's why we come to campus each day.

Benefits And Culture

UNSW offers a competitive salary and access to UNSW benefits including:
• Hybrid/flexible working arrangements
• An additional 3 days of leave over the Christmas period
• Access to lifelong learning and career development
• Progressive HR practices
• Discounts and entitlements
• Affordable on-campus parking

Who You Are

Level A: You have a PhD in natural language processing or a related discipline, and can demonstrate the following skills and experience:
• Proven commitment to proactively keeping up to date with discipline knowledge and developments
• Research with outcomes of high-quality outputs (e.g. ACL, NeurIPS, ICML, ICLR, AAAI) and high impact, with clear evidence of the desire and ability to achieve research excellence and capacity for leadership
• Strong experience in fine-tuning LLMs and developing LLM-based tools and techniques, such as RAG and agentic workflows
• A track record of significant involvement with the profession and/or industry
• High-level communication skills and the ability to network effectively and interact with a diverse range of students and staff
• Ability to work in a team, collaborate across disciplines, and build effective relationships
• Highly developed interpersonal and organisational skills
• An understanding of and commitment to UNSW's aims, objectives, and values, together with relevant policies and guidelines
• Knowledge of health and safety responsibilities and commitment to attending relevant health and safety training
• Retrieval-augmented generation for conversation understanding or question answering is desirable
• Web-based deployment using commercial cloud platforms is desirable

Level B - Additional Skills And Experience Required
• Two years of post-PhD experience
• Ability to build effective networks with colleagues
• Ability to generate alternative funding and/or industry projects through liaison with external local and international researchers, industry, and government

Pre-employment Checks Required For These Positions
• Verification of qualifications
• The successful candidates may be required to hold or attain an Australian government security clearance

Apply: If developing your NLP research expertise is of interest to you, please submit your CV, cover letter, and responses to the skills and experience required in the position description.

Please note: Visa sponsorship is not available for this position. Candidates must hold Australian working rights to be considered.

Applications close: 11.55pm, Sunday 25th May 2025

Contact: Shiree Thomas - Talent Acquisition Consultant
e: shiree.thomas@unsw.edu.au

Please apply online - applications will not be accepted if sent to the contact listed.

UNSW is committed to evolving a culture that embraces equity and supports a diverse and inclusive community where everyone can participate fairly, in a safe and respectful environment. We welcome candidates from all backgrounds and encourage applications from people of diverse gender, sexual orientation, cultural and linguistic backgrounds, Aboriginal and Torres Strait Islander background, people with disability, and those with caring and family responsibilities. UNSW provides workplace adjustments for people with disability, and access to flexible work options for eligible staff. The University reserves the right not to proceed with any appointment.
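On the "robust evaluation frameworks" bullet: for QA-style NLP systems this usually starts with exact match and token-overlap F1 over a held-out set. A minimal sketch in plain Python, where generate is a stand-in for the model or RAG system under test and the one-row gold set is purely illustrative.

```python
def generate(question: str) -> str:
    # Placeholder: wire the fine-tuned LLM or RAG pipeline under test in here.
    return "Australian Defence Force"

def token_f1(pred: str, gold: str) -> float:
    """Token-overlap F1, the classic extractive-QA metric."""
    p, g = pred.lower().split(), gold.lower().split()
    common = sum(min(p.count(t), g.count(t)) for t in set(p) & set(g))
    if common == 0:
        return 0.0
    precision, recall = common / len(p), common / len(g)
    return 2 * precision * recall / (precision + recall)

# Illustrative one-row gold set; a real harness would load a curated file.
dataset = [("What does ADF stand for in this ad?", "Australian Defence Force")]

preds = [(generate(q), gold) for q, gold in dataset]
exact = sum(p.strip().lower() == gold.lower() for p, gold in preds) / len(preds)
f1 = sum(token_f1(p, gold) for p, gold in preds) / len(preds)
print(f"exact match: {exact:.2%}  token F1: {f1:.2%}")
```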
Posted 3 weeks ago
7.0 - 10.0 years
20 - 27 Lacs
Pune, Chennai, Coimbatore
Hybrid
Proficient working knowledge in: ADB, ADF, PySpark, and SQL
79-year-old reputed MNC company
Posted 3 weeks ago
3.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
This role is for one of Weekday's clients.
Salary range: Rs 600000 - Rs 1200000 (i.e. INR 6-12 LPA)
Min experience: 3 years
Location: Mumbai
Job type: Full-time

Requirements

About the Role:
We are seeking a talented and detail-oriented Junior Visualizer to join our data and analytics team. In this role, you will be responsible for designing and delivering insightful visualizations and dashboards that empower stakeholders to make data-driven decisions. You will work closely with data engineers, business analysts, and product teams to transform raw data into actionable business intelligence using industry-standard tools such as Power BI, Tableau, and Azure Databricks. As a Junior Visualizer, your expertise in ETL processes, data transformation, and data mining, combined with your strong visualization and storytelling abilities, will be instrumental in identifying trends, patterns, and key performance indicators across various business functions.

Key Responsibilities:

Data Visualization & Reporting:
• Design, develop, and maintain interactive dashboards and reports using Power BI and Tableau.
• Translate business requirements into visualizations that clearly communicate complex data in a user-friendly format.
• Build intuitive interfaces that showcase KPIs and metrics aligned with business goals.

ETL & Data Preparation:
• Support the extraction, transformation, and loading (ETL) of data using tools like ADF (Azure Data Factory).
• Collaborate with data engineers to prepare and cleanse data for accurate visualization and analysis.

Big Data & Cloud Integration:
• Leverage Azure Databricks, PySpark, and Python to process large datasets and perform data mining operations.
• Work with Microsoft Azure cloud services to integrate and manage data pipelines.

Performance Monitoring & Optimization:
• Monitor data quality and visualization performance to ensure accuracy and responsiveness.
• Optimize queries and dashboards for faster rendering and usability.

Collaboration & Stakeholder Management:
• Engage with business users to gather requirements and refine visualization goals.
• Assist in the interpretation of visual analytics to support strategic decision-making.

Required Skills & Qualifications:
• Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field.
• Minimum of 3 years of hands-on experience in data visualization and reporting roles.
• Proficiency in Power BI and Tableau for building reports and dashboards.
• Experience with ETL tools, particularly ADF (Azure Data Factory).
• Strong programming skills in Python and PySpark for data manipulation.
• Hands-on experience with Azure Databricks and the Microsoft Azure ecosystem.
• Deep understanding of data mining, KPI development, and data storytelling.
• Strong analytical skills and attention to detail.

Preferred Attributes:
• Ability to communicate technical insights to non-technical stakeholders.
• A proactive, curious mindset with a passion for learning and growing in the data space.
• Strong team player with a collaborative attitude and good problem-solving skills.
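Behind every dashboard in this role sits a KPI-preparation step. A minimal pandas sketch of aggregating raw records into the measures a Power BI or Tableau visual would bind to; the columns and data are illustrative.

```python
import pandas as pd

orders = pd.DataFrame({
    "region":   ["North", "North", "South", "South", "South"],
    "amount":   [120.0, 80.0, 200.0, 50.0, 90.0],
    "returned": [False, True, False, False, True],
})

kpis = orders.groupby("region").agg(
    revenue=("amount", "sum"),
    avg_order_value=("amount", "mean"),
    return_rate=("returned", "mean"),
)
print(kpis)  # one row per region, ready to bind to a dashboard visual
```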
Posted 3 weeks ago
2.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Notice Period: Immediate joiner to 30 days
Location: Mumbai/Pune/Bangalore/Delhi/Hyderabad/Chennai

Responsibilities:
• 2+ years of experience with Microsoft Azure and the Microsoft BI stack
• Experience developing with Azure Data Factory, SQL, Databricks (PySpark, Scala, SQL), and Stream Analytics; experience working on data lake & DW solutions on Azure
• Experience in configuring and managing Azure DevOps pipelines
• Lead and implement data pipelines for software solutions and ensure that technology deliverables are met based on business requirements
• ETL architecture design and implementation at the enterprise level
• Work with subject matter experts and data/business analysts to understand business data and related processes
• Using Agile and DevOps methods, build platform architecture using a variety of sources (on-premises such as SQL Server, cloud IaaS/SaaS such as Azure)
• Implement rules and automate data cleansing, mapping, transformation, logging, and exception handling
• Experience in designing and implementing BI solutions on Azure (ADLS, Azure Databricks, ADF, Azure Synapse, Azure SQL, etc.)
• Design, build, and deploy databases and data stores
• Participate in cross-functional teams to promote technology strategies, analyze and test products, perform proofs-of-concept, and pilot new technologies and/or methods
• Establish and document standards, guidelines, and best practices for teams utilizing the solutions
• Implement/operationalize analytics by developing representations and visualizations in BI
• Work with clients (internal or external) to determine business requirements and priorities, and define key performance indicators (KPIs)
• Data management experience, strong Excel skills (pivot tables, VLOOKUPs, etc.), and a strong hold on Power BI
• Design and document dashboards, alerts, and reports, either on a regular recurring basis or as needed

Primary Skillsets: Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Synapse, SQL Server, Azure Event Hub, Azure Functions, Azure App Services

Secondary Skillsets: Power Apps, Microsoft SharePoint, Power Automate, Dynamics 365, understanding of DevOps practices, teamwork and collaboration, passion for new technologies; certification in DP-200, DP-201, DP-203, DA-100, PL-300, or AZ-900

Desired Qualifications
• Education: bachelor's degree in B.Tech, MCA, Software Engineering, or a related field. Advanced degrees are a plus.
• Experience: 2+ years of experience in software development.
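Among the primary skillsets above is Azure Event Hub. A minimal sketch of publishing an event with the azure-eventhub SDK (v5); the connection string, hub name, and payload are placeholders.

```python
import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=...",  # placeholder
    eventhub_name="telemetry",
)
batch = producer.create_batch()
batch.add(EventData(json.dumps({"device": "line-3", "temp_c": 71.5})))
producer.send_batch(batch)  # events land in the hub for downstream consumers
producer.close()
```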
Posted 3 weeks ago
5.0 years
0 Lacs
India
On-site
Years of experience: 5+ years

JD:
• D365: understanding of the data model, core functionalities, and customization capabilities
• D365: experience in extending functionality through custom code
• ADF: understanding of writing SQL scripts, stored procedures, and ADF ETL pipelines
• SQL: knowledge of database performance and tuning, troubleshooting, and query optimization
• Generic: ability to understand complex business processes, strong team skills
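On the ADF line above: pipelines are authored in the ADF UI/JSON, but runs are often triggered programmatically. A minimal sketch using the ARM REST API with azure-identity; the subscription, resource group, factory, pipeline, and the LoadDate parameter are all placeholders, and the api-version shown is the commonly documented one for Data Factory.

```python
import requests
from azure.identity import DefaultAzureCredential

SUB, RG, FACTORY, PIPELINE = "<sub-id>", "<rg>", "<factory>", "<pipeline>"
token = DefaultAzureCredential().get_token(
    "https://management.azure.com/.default"
).token

url = (
    f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
    f"/providers/Microsoft.DataFactory/factories/{FACTORY}"
    f"/pipelines/{PIPELINE}/createRun?api-version=2018-06-01"
)
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {token}"},
    json={"LoadDate": "2024-01-31"},  # pipeline parameters, if the pipeline takes any
)
resp.raise_for_status()
print("run id:", resp.json()["runId"])
```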
Posted 3 weeks ago
7.0 years
0 Lacs
India
On-site
Company Description

At Beyond Key, we specialize in driving digital transformation and enterprise modernization. By leveraging our deep technical expertise and AI capabilities, our services - D365, BI, M365, Cloud, AI, and Custom Development - enable businesses to excel in a fast-paced tech world. We bring industrial expertise to sectors such as Insurance, Financial Services, Healthcare, and Manufacturing, focusing on customized growth and efficiency. Our commitment to delivering the right solutions has earned us prestigious awards and recognitions, including Great Place to Work certifications. We are dedicated to fostering a collaborative, innovative, and inclusive culture.

Key Responsibilities:
• Design, develop, and maintain interactive Power BI dashboards and reports with advanced DAX, Power Query, and custom visuals.
• Build and optimize end-to-end data solutions using Microsoft Fabric (OneLake, Lakehouse, Data Warehouse).
• Develop and automate ETL/ELT pipelines using Azure Data Factory (ADF) and Fabric Data Pipelines.
• Architect and manage modern data warehousing solutions (star/snowflake schema) using Fabric Warehouse, Azure Synapse, or SQL Server.
• Implement data modeling, performance tuning, and optimization for large-scale datasets.
• Collaborate with business teams to translate requirements into scalable Fabric-based analytics solutions.
• Ensure data governance, security, and compliance across BI platforms.
• Mentor junior team members on Fabric, Power BI, and cloud data best practices.

Required Skills & Qualifications:
• 7+ years of hands-on experience in Power BI, SQL, data warehousing, and ETL/ELT.
• Strong expertise in Microsoft Fabric (Lakehouse, Warehouse, ETL workflows, Delta Lake).
• Proficiency in Azure Data Factory (ADF) for orchestration and data integration.
• Advanced SQL (query optimization, stored procedures, partitioning).
• Experience with data warehousing (dimensional modeling, SCD, fact/dimension tables).
• Knowledge of Power BI Premium/Fabric capacity, deployment pipelines, and DAX patterns.
• Familiarity with Databricks, PySpark, or Python (for advanced analytics) is a plus.
• Strong problem-solving and stakeholder management skills.

Preferred Qualifications:
• Microsoft certifications (PL-300: Power BI, DP-600: Fabric Analytics Engineer).
• Experience with Azure DevOps (CI/CD for Fabric/Power BI deployments).
• Domain knowledge in BFSI, Retail, or Manufacturing.
Posted 3 weeks ago
0 years
0 Lacs
Trivandrum, Kerala, India
On-site
The Senior Data Engineer will be responsible for designing, implementing, and maintaining data solutions on the Microsoft Azure data platform and SQL Server (SSIS, SSAS, UC4 Automic), collaborating with various stakeholders, and ensuring the efficient processing, storage, and retrieval of large volumes of data.

Technical Expertise and Responsibilities
• Design, build, and maintain scalable and reliable data pipelines.
• Design and build solutions in Azure Data Factory and Databricks to extract, transform, and load data between different source and target systems.
• Design and build solutions in SSIS.
• Analyze and understand the existing data landscape and provide recommendations/innovative ideas for rearchitecting/optimizing/streamlining to bring efficiency and scalability.
• Collaborate and communicate effectively with onshore counterparts to address technical gaps, requirement challenges, and other complex scenarios.
• Monitor and troubleshoot data systems to ensure high performance and reliability.
• Be highly analytical and detail-oriented, with extensive familiarity with database management principles.
• Optimize data processes for speed and efficiency.
• Ensure the data architecture supports business requirements and data governance policies.
• Define and execute the data engineering strategy in alignment with the company's goals.
• Integrate data from various sources, ensuring data quality and consistency.
• Stay updated with emerging technologies and industry trends.
• Understand the big-picture business process using deep knowledge of the banking industry, and translate it into data requirements.
• Enable and run data migrations across different databases and different servers.
• Perform thorough testing and validation to support the accuracy of data transformations and data verification used in machine learning models.
• Analyze data and different systems to define data requirements.
• Be well versed in data structures and algorithms.
• Define data mapping, working with the business, digital, and data teams.
• Maintain data pipelines and validate their testing and performance.
• Assemble large, complex data sets that meet functional and non-functional business requirements.
• Analyze and identify gaps in data needs and work with business and IT to align on them.
• Troubleshoot and resolve technical issues as they arise.
• Optimize data flow and collection for cross-functional teams.
• Work closely with onshore data counterparts, product owners, and business stakeholders to understand data needs and strategies.
• Collaborate with IT and DevOps teams to ensure data infrastructure aligns with the overall IT architecture.
• Implement best practices for data security and privacy.
• Drive continuous improvement initiatives within the data engineering function.
• Understand the impact of data conversions as they pertain to servicing operations.
• Manage higher-volume and more complex cases with accuracy and efficiency.

Role Expectations
• Design and develop warehouse solutions using Azure Synapse Analytics, ADLS, ADF, Databricks, Power BI, and Azure Analysis Services.
• Proficiency in SSIS, SQL, and query optimization.
• Experience working in an onshore-offshore model, managing challenging scenarios.
• Expertise in working with large amounts of data (structured and unstructured), building data pipelines for ETL workloads, and generating insights using data science and analytics.
• Expertise in Azure and AWS cloud services, and DevOps/CI/CD frameworks.
• Ability to work with ambiguity and vague requirements and transform them into deliverables.
• Good combination of technical and interpersonal skills, with strong written and verbal communication; detail-oriented with the ability to work independently.
• Drive automation efforts across the data analytics team utilizing Infrastructure as Code (IaC) using Terraform, configuration management, and Continuous Integration (CI)/Continuous Delivery (CD) tools such as Jenkins.
• Help define architecture frameworks, best practices, and processes.
• Collaborate on data warehouse architecture and technical design discussions.
• Expertise in Azure Data Factory and familiarity with building pipelines for ETL projects.
• Expertise in SQL and experience working with relational databases.
• Expertise in Python and ETL projects; experience in Databricks is an added advantage.
• Expertise in the data life cycle: data ingestion, transformation, loading, validation, and performance tuning.

Skillsets Required
SQL, PL/SQL, SSIS, SSAS, TFS, Azure Data Factory, Azure Databricks, Azure Synapse, ADLS, Lakehouse architecture, Python, SCD concepts and implementation, UC4, Power BI, DevOps, CI/CD, Banking domain
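The skill list pairs Databricks with SCD concepts and implementation. A minimal sketch of a Type 2 slowly changing dimension upsert using the Delta Lake Python API; the table names, the tracked address column, and the assumption that the dimension schema is the staging schema plus the SCD columns are all illustrative.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

updates = spark.table("staging.customers")                    # today's snapshot
dim = DeltaTable.forName(spark, "dw.dim_customer")
current = spark.table("dw.dim_customer").where("is_current = true")

# New customers, or existing ones whose tracked attribute changed
changed = (updates.alias("u")
           .join(current.alias("d"),
                 F.col("u.customer_id") == F.col("d.customer_id"), "left")
           .where("d.customer_id IS NULL OR d.address <> u.address")
           .select("u.*"))

# Step 1: close out the superseded current rows
(dim.alias("d")
 .merge(changed.alias("c"),
        "d.customer_id = c.customer_id AND d.is_current = true")
 .whenMatchedUpdate(set={"is_current": "false", "end_date": "current_date()"})
 .execute())

# Step 2: append the new versions as open, current rows
(changed
 .withColumn("is_current", F.lit(True))
 .withColumn("start_date", F.current_date())
 .withColumn("end_date", F.lit(None).cast("date"))
 .write.format("delta").mode("append").saveAsTable("dw.dim_customer"))
```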
Posted 3 weeks ago
4.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
About Beyond Key

We are a Microsoft Gold Partner and a Great Place to Work-certified company. "Happy Team Members, Happy Clients" is a principle we hold dear. We are an international IT consulting and software services firm committed to providing cutting-edge services and products that satisfy our clients' global needs. Our company was established in 2005, and since then we've expanded our team to more than 350 talented, skilled software professionals. Our clients come from the United States, Canada, Europe, Australia, the Middle East, and India, and we create and design IT solutions for them. For more details, visit https://www.beyondkey.com/about.

Job Summary

We are looking for a Data Engineer with strong expertise in Databricks, PySpark, SQL, and Azure Data Factory (ADF) to design and optimize scalable data pipelines. Experience with Snowflake and DBT is a plus. The ideal candidate will have a proven track record in building efficient ETL/ELT processes, data warehousing, and cloud-based data solutions.

Key Responsibilities
• Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Databricks.
• Process and transform large datasets using PySpark (DataFrames, Spark SQL, optimizations) in Databricks.
• Write and optimize complex SQL queries for data extraction, transformation, and loading.
• Implement data lakehouse architectures using Delta Lake in Databricks.
• (Optional) Manage and optimize Snowflake data warehouses (table structures, performance tuning).
• (Optional) Use DBT (Data Build Tool) for modular and scalable data transformations.
• Ensure data quality, governance, and monitoring across pipelines.
• Collaborate with data analysts and business teams to deliver actionable insights.

Qualifications
• 4+ years of hands-on experience in Databricks, PySpark, SQL, and ADF.
• Strong expertise in Databricks (Spark, Delta Lake, notebooks, job scheduling).
• Proficiency in Azure Data Factory (ADF) for pipeline orchestration.
• Advanced SQL skills (query optimization, stored procedures, partitioning).
• Experience with data warehousing concepts (ETL vs. ELT, dimensional modeling).
• (Good to have) Familiarity with Snowflake (warehouse management, Snowpipe).
• (Good to have) Knowledge of DBT (Data Build Tool) for transformations.
• (Bonus) Python scripting for automation and data processing.

Preferred Qualifications
• Certifications (Databricks Certified Data Engineer, Azure Data Engineer).
• Experience with CI/CD pipelines (Azure DevOps, GitHub Actions).
• Knowledge of streaming data (Kafka, Event Hubs) is a plus.
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Indore, Hyderabad, Ahmedabad
Work from Office
Qualification: B.Tech/MCA/BCA/B.Sc (Computer Science or related field)
Notice Period: Immediate joiners preferred

We have an immediate requirement for Azure Data Engineers at senior and lead levels to work on cutting-edge data projects in the Azure ecosystem.

Primary Skills:
• Azure Data Factory (ADF)
• Azure Databricks
• SQL development & optimization
• Power BI (reports, dashboards, DAX)
• Data integration & ETL pipelines
• Data modeling and transformation

Secondary Skills:
• Azure Synapse Analytics
• Python/PySpark in Databricks
• DevOps exposure for data pipelines
• Version control using Git
• Exposure to Agile & CI/CD environments
• Strong communication and client-handling skills

Key Responsibilities:
• Design and implement data pipelines using ADF and Databricks
• Transform and process large-scale datasets efficiently
• Develop interactive reports and dashboards in Power BI
• Write and optimize SQL queries and stored procedures
• Work closely with architects and business teams to deliver high-quality data solutions
• Lead and mentor junior data engineers in best practices
Posted 3 weeks ago
5.0 - 8.0 years
9 - 18 Lacs
Hyderabad
Hybrid
We are hiring an ETL QA Developer for one of our Big 4 clients. Interested candidates, kindly share your resume with k.arpitha@dynpro.in or reach out on 7975510903 (WhatsApp only).

Experience: 5-9 years
Immediate joiners only

Tools/Technology Skills:
• Demonstrates understanding and working experience of SQL databases and ETL testing
• Able to write queries to validate table mappings and structures
• Able to perform schema validations
• Good understanding of SCD types
• Strong knowledge of database methodology
• In-depth understanding of data warehousing/business intelligence concepts
• Working experience with cloud-based (Azure) services
• Working experience in testing BI reports
• Able to write queries to validate data quality during migration projects
• Demonstrates an understanding of any of the peripheral technologies utilized in SDC, including PeopleSoft, SAP, and Aderant
• Demonstrates a working understanding of tools like UFT and TFS; experience with Microsoft tools is highly desired
• Understands enterprise-wide networks and software implementations
• Must have previous experience creating complex SQL queries for data validation
• Must have testing experience in an Enterprise Data Warehouse (EDW)
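The validation work this role describes (table mappings, data quality during migration) largely reduces to comparing counts and aggregates between source and target. A minimal pyodbc sketch; the DSNs, table, and column are placeholders.

```python
import pyodbc

def one_value(dsn: str, sql: str):
    """Run a single-value query against the given ODBC DSN."""
    return pyodbc.connect(dsn).cursor().execute(sql).fetchone()[0]

checks = {
    "row_count":  "SELECT COUNT(*) FROM dbo.FactClaims",
    "amount_sum": "SELECT SUM(ClaimAmount) FROM dbo.FactClaims",
}
for name, sql in checks.items():
    src = one_value("DSN=legacy_dw", sql)   # source system
    tgt = one_value("DSN=azure_dw", sql)    # migrated target
    status = "PASS" if src == tgt else "FAIL"
    print(f"{name}: source={src} target={tgt} -> {status}")
```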
Posted 3 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description Blend is a premier AI services provider, committed to co-creating meaningful impact for its clients through the power of data science, AI, technology, and people. With a mission to fuel bold visions, Blend tackles significant challenges by seamlessly aligning human expertise with artificial intelligence. The company is dedicated to unlocking value and fostering innovation for its clients by harnessing world-class people and data-driven strategy. We believe that the power of people and AI can have a meaningful impact on your world, creating more fulfilling work and projects for our people and clients. For more information, visit www.blend360.com Job Description As a Data Engineer, your role is to spearhead the data engineering teams and elevate the team to the next level! You will be responsible for laying out the architecture of the new project as well as selecting the tech stack associated with it. You will plan out the development cycles deploying AGILE if possible and create the foundations for good data stewardship with our new data products! You will also set up a solid code framework that needs to be built to purpose yet have enough flexibility to adapt to new business use cases tough but rewarding challenge! Responsibilities Collaborate with several stakeholders to deeply understand the needs of data practitioners to deliver at scale Lead Data Engineers to define, build and maintain Data Platform Work on building Data Lake in Azure Fabric processing data from multiple sources Migrating existing data store from Azure Synapse to Azure Fabric Implement data governance and access control Drive development effort End-to-End for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards. Present technical solutions, capabilities, considerations, and features in business terms. Effectively communicate status, issues, and risks in a precise and timely manner. Further develop critical initiatives, such as Data Discovery, Data Lineage and Data Quality Leading team and Mentor junior resources Help your team members grow in their role and achieve their career aspirations Build data systems, pipelines, analytical tools and programs Conduct complex data analysis and report on results Qualifications 3+ Years of Experience as a data engineer or similar role in Azure Synapses, ADF or relevant exp in Azure Fabric Degree in Computer Science, Data Science, Mathematics, IT, or similar field Must have experience executing projects end to end. At least one data engineering project should have worked in Azure Synapse, ADF or Azure Fabric Should be experienced in handling multiple data sources Technical expertise with data models, data mining, and segmentation techniques Deep understanding, both conceptually and in practice of at least one object orientated library (Python, pySpark) Strong SQL skills and a good understanding of existing SQL warehouses and relational databases. Strong Spark, PySpark, Spark SQL skills and good understanding of distributed processing frameworks. Build large-scale batches and real-time data pipelines. Ability to work independently and mentor junior resources. Desire to lead and develop a team of Data Engineers across multiple levels Experience or knowledge in Should have experience or knowledge in Data Governance Azure Cloud experience with Data modeling, CI\CD, Agile Methodologies, Docker\Kubernetes Additional Information What do you get in return? 
Competitive Salary: Your skills and contributions are highly valued here, and we make sure your salary reflects that, rewarding you fairly for the knowledge and experience you bring to the table.
Dynamic Career Growth: Our vibrant environment offers you the opportunity to grow rapidly, providing the right tools, mentorship, and experiences to fast-track your career.
Idea Tanks: Innovation lives here. Our "Idea Tanks" are your playground to pitch, experiment, and collaborate on ideas that can shape the future.
Growth Chats: Dive into our casual "Growth Chats" where you can learn from the best, whether it's over lunch or during a laid-back session with peers; it's the perfect space to grow your skills.
Snack Zone: Stay fueled and inspired! In our Snack Zone, you'll find a variety of snacks to keep your energy high and ideas flowing.
Recognition & Rewards: We believe great work deserves to be recognized. Expect regular Hive-Fives, shoutouts, and the chance to see your ideas come to life as part of our reward program.
Fuel Your Growth Journey with Certifications: We're all about your growth groove! Level up your skills with our support as we cover the cost of your certifications.
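For candidates newer to Microsoft Fabric, here is a minimal PySpark sketch of the kind of lakehouse ingestion this role describes. The landing path, column names, and table name are hypothetical, and the "Files/" relative path assumes a notebook attached to a Fabric lakehouse:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lakehouse-ingest").getOrCreate()

# Read raw files from the lakehouse Files area (hypothetical landing path)
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("Files/landing/orders/*.csv"))

# Basic stewardship: deduplicate on a key and stamp the ingestion time
cleaned = (raw
           .dropDuplicates(["order_id"])
           .withColumn("ingested_at", F.current_timestamp()))

# Delta is the default table format in Fabric lakehouses
cleaned.write.format("delta").mode("append").saveAsTable("orders_bronze")
```

In a Synapse-to-Fabric migration, a pattern like this typically replaces dedicated SQL pool loads with Delta tables that both Spark and the SQL endpoint can query.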
Posted 3 weeks ago
16.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibility
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications
B.E / MCA / B.Tech / M.Tech / MS graduation (minimum 16 years of formal education; correspondence courses are not relevant)
2+ years of experience with Azure database offerings such as SQL DB and PostgreSQL, constructing data pipelines using Azure Data Factory, and designing and developing analytics using Azure Databricks and Snowpark
3+ years of experience constructing large and complex SQL queries on terabyte-scale warehouse database systems
2+ years of experience with cloud-based DWs: Snowflake, Azure SQL DW
2+ years of experience in data engineering and working on large data warehouses, including design and development of ETL / ELT
Good knowledge of Agile practices: Scrum, Kanban
Knowledge of Kubernetes, Jenkins, CI / CD pipelines, SonarQube, Artifactory, Git, unit testing
Main tech experience: Docker, Kubernetes, and Kafka
Database: Azure SQL databases
Knowledge of Apache Kafka and data streaming
Main tech experience: Terraform and Azure
Ability to identify system changes and verify that technical system specifications meet the business requirements
Solid problem-solving and analytical skills
Proven good communication and presentation skills
Proven good attitude and self-motivated
Preferred Qualifications
2+ years of experience working with cloud-native monitoring and logging tools such as Log Analytics
2+ years of experience with scheduling tools on cloud, using Apache Airflow, Logic Apps, or any native/third-party scheduling tool
Exposure to ATDD, Fortify, SonarQube
Unix scripting, DW concepts, ETL frameworks: Scala / Spark, DataStage
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes.
We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
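As a hedged illustration of the Azure Data Factory pipeline work this posting calls for, the sketch below triggers and polls a pipeline run with the azure-mgmt-datafactory SDK; the subscription, resource group, factory, pipeline, and parameter names are placeholders, not details from the posting:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder identifiers -- substitute real values
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-analytics"
FACTORY_NAME = "adf-warehouse-loads"
PIPELINE_NAME = "pl_daily_load"

credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

# Kick off a run, passing a pipeline parameter
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"load_date": "2024-01-31"},
)

# Poll the run status by run id
status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
print(status.status)  # e.g. Queued / InProgress / Succeeded / Failed
```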
Posted 3 weeks ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Overview
We are PepsiCo
PepsiCo is one of the world's leading food and beverage companies with more than $79 billion in net revenue and a global portfolio of diverse and beloved brands. We have a complementary food and beverage portfolio that includes 22 brands that each generate more than $1 billion in annual retail sales. PepsiCo's products are sold in more than 200 countries and territories around the world. PepsiCo's strength is its people. We are over 250,000 game changers, mountain movers and history makers, located around the world, and united by a shared set of values and goals. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this Winning with pep+ (PepsiCo Positive). For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.
PepsiCo Data Analytics & AI Overview
With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise.
The Data Science Pillar in DA&AI is the organization to which Data Scientists and ML Engineers report within the broader D+A organization. DS will also lead, facilitate, and collaborate with the larger DS community in PepsiCo, provide the talent for the development and support of DS components and their life cycle within DA&AI products, and support "pre-engagement" activities as requested and validated by the prioritization framework of DA&AI.
Data Scientist - Gurugram and Hyderabad
This role involves developing Machine Learning (ML) and Artificial Intelligence (AI) projects. Its specific scope is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Machine Learning Services and Pipelines.
Responsibilities
Deliver key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and the Machine Learning models in scope
Collaborate with data engineers and ML engineers to understand data and models and leverage various advanced analytics capabilities
Ensure on-time and on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards
Use big data technologies to help process data and build scaled data pipelines (batch to real time)
Automate the end-to-end ML lifecycle with Azure Machine Learning and Azure/AWS/GCP pipelines
Set up cloud alerts, monitors, dashboards, and logging, and troubleshoot machine learning infrastructure
Automate ML model deployments
Qualifications
Minimum 3 years of hands-on work experience in data science / machine learning
Minimum 3 years of SQL experience
Experience in DevOps and Machine Learning (ML) with hands-on experience with one or more cloud service providers
BE/BS in Computer Science, Math, Physics, or other technical fields
Data Science: hands-on experience and strong knowledge of building machine learning models, both supervised and unsupervised
Programming Skills: hands-on experience in statistical programming languages like Python and database query languages like SQL
Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators
Any Cloud: experience in Databricks and ADF is desirable
Familiarity with Spark, Hive, and Pig is an added advantage
Model deployment experience is a plus
Experience with version control systems like GitHub and CI/CD tools
Experience in Exploratory Data Analysis
Knowledge of MLOps / DevOps and deploying ML models is required
Experience using MLflow, Kubeflow, etc. is preferred
Experience executing and contributing to MLOps automation infrastructure is good to have
Exceptional analytical and problem-solving skills
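Since the role calls out MLflow for experiment tracking, here is a minimal, self-contained sketch of logging a supervised model run; the experiment name, synthetic data, and hyperparameters are arbitrary examples, not details from the posting:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data stands in for a real feature table
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demand-forecast-poc")  # hypothetical experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)   # track the hyperparameter
    mlflow.log_metric("accuracy", acc)       # track the evaluation metric
    mlflow.sklearn.log_model(model, "model") # log the artifact for deployment
```

Logged runs like this are what MLOps automation later promotes through registries and CI/CD into deployment.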
Posted 3 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview
We are PepsiCo
PepsiCo is one of the world's leading food and beverage companies with more than $79 billion in net revenue and a global portfolio of diverse and beloved brands. We have a complementary food and beverage portfolio that includes 22 brands that each generate more than $1 billion in annual retail sales. PepsiCo's products are sold in more than 200 countries and territories around the world. PepsiCo's strength is its people. We are over 250,000 game changers, mountain movers and history makers, located around the world, and united by a shared set of values and goals. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this Winning with pep+ (PepsiCo Positive). For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.
PepsiCo Data Analytics & AI Overview
With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise.
The Data Science Pillar in DA&AI is the organization to which Data Scientists and ML Engineers report within the broader D+A organization. DS will also lead, facilitate, and collaborate with the larger DS community in PepsiCo, provide the talent for the development and support of DS components and their life cycle within DA&AI products, and support "pre-engagement" activities as requested and validated by the prioritization framework of DA&AI.
Data Scientist - Gurugram and Hyderabad
This role involves developing Machine Learning (ML) and Artificial Intelligence (AI) projects. Its specific scope is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Machine Learning Services and Pipelines.
Responsibilities
Deliver key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and the Machine Learning models in scope
Collaborate with data engineers and ML engineers to understand data and models and leverage various advanced analytics capabilities
Ensure on-time and on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards
Use big data technologies to help process data and build scaled data pipelines (batch to real time)
Automate the end-to-end ML lifecycle with Azure Machine Learning and Azure/AWS/GCP pipelines
Set up cloud alerts, monitors, dashboards, and logging, and troubleshoot machine learning infrastructure
Automate ML model deployments
Qualifications
Minimum 3 years of hands-on work experience in data science / machine learning
Minimum 3 years of SQL experience
Experience in DevOps and Machine Learning (ML) with hands-on experience with one or more cloud service providers
BE/BS in Computer Science, Math, Physics, or other technical fields
Data Science: hands-on experience and strong knowledge of building machine learning models, both supervised and unsupervised
Programming Skills: hands-on experience in statistical programming languages like Python and database query languages like SQL
Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators
Any Cloud: experience in Databricks and ADF is desirable
Familiarity with Spark, Hive, and Pig is an added advantage
Model deployment experience is a plus
Experience with version control systems like GitHub and CI/CD tools
Experience in Exploratory Data Analysis
Knowledge of MLOps / DevOps and deploying ML models is required
Experience using MLflow, Kubeflow, etc. is preferred
Experience executing and contributing to MLOps automation infrastructure is good to have
Exceptional analytical and problem-solving skills
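To illustrate the "batch to real time" pipeline requirement, here is a small Spark Structured Streaming sketch using the built-in rate source so it needs no external systems; in a real pipeline the source would more likely be Kafka or Event Hubs:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("streaming-demo").getOrCreate()

# The built-in 'rate' source emits (timestamp, value) rows for testing
events = (spark.readStream
          .format("rate")
          .option("rowsPerSecond", 10)
          .load())

# Windowed aggregation with a watermark to bound streaming state
counts = (events
          .withWatermark("timestamp", "30 seconds")
          .groupBy(F.window("timestamp", "10 seconds"))
          .count())

query = (counts.writeStream
         .outputMode("update")
         .format("console")
         .option("truncate", "false")
         .start())

query.awaitTermination(60)  # run briefly for the demo, then stop
query.stop()
```

The same readStream/writeStream API shape applies when swapping the console sink for a Delta table or a message bus, which is what makes the batch-to-streaming migration incremental.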
Posted 3 weeks ago
0 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate
Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. In Oracle enterprise performance management at PwC, you will specialise in providing consulting services for enterprise performance management solutions using Oracle technologies. You will collaborate with clients to assess their performance management needs, and design and implement Oracle-based solutions for budgeting, forecasting, financial consolidation, and reporting. Working in this area, you will also provide training and support for seamless integration and utilisation of Oracle enterprise performance management tools, helping clients improve their financial planning and analysis processes and achieve their performance objectives.
Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description
Experience with different databases such as Synapse, SQL DB, Snowflake, etc.
Design and implement data pipelines using Azure Data Factory, Databricks, and Synapse
Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases
Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage
Implement data security and governance measures
Monitor and optimize data pipelines for performance and efficiency
Troubleshoot and resolve data engineering issues
Provide optimized solutions for problems related to data engineering
Ability to work with a variety of sources such as relational DBs, APIs, file systems, real-time streams, CDC, etc.
Strong knowledge of Databricks and Delta tables
Technology: SQL, ADF, ADLS, Synapse, PySpark, Databricks
Mandatory Skill Sets: Azure DE
Preferred Skill Sets: Azure DE
Years of Experience Required: 4-8
Education Qualification: B.Tech/MBA/MCA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study Required: Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study Preferred:
Certifications (if blank, certifications not specified)
Required Skills: Microsoft Azure DevOps
Optional Skills: Accepting Feedback, Account Reconciliation, Active Listening, Business Process Analysis, Business Rules Development, Communication, Cost Management, Creating Budgets, Emotional Regulation, Empathy, Enterprise Integration, Finance Industry, Financial Accounting, Financial Advising, Financial Forecasting, Financial Planning, Financial Review, Growth Management, Inclusion, Intellectual Curiosity, Key Performance Indicators (KPIs), Operational Performance Management (OPM), Optimism, Optimization Models {+ 15 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
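As a sketch of the Delta-table ETL into ADLS described above, the following upserts a batch of records into a Delta table using the delta-spark library; the storage account, container names, and key column are hypothetical, and the snippet assumes a Delta-enabled Spark environment such as Databricks:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls-upsert").getOrCreate()

# Hypothetical ADLS Gen2 paths (abfss:// is the ADLS Gen2 driver scheme)
source_path = "abfss://raw@examplestorage.dfs.core.windows.net/customers/"
target_path = "abfss://curated@examplestorage.dfs.core.windows.net/customers_delta/"

updates = spark.read.format("parquet").load(source_path)
target = DeltaTable.forPath(spark, target_path)

# Upsert: update rows that match on the key, insert the rest
(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```

A MERGE like this is the usual way to make incremental or CDC loads idempotent, since re-running the batch updates rather than duplicates existing rows.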
Posted 3 weeks ago
The job market for ADF (Application Development Framework) professionals in India is seeing significant growth, with numerous opportunities for job seekers in this field. ADF, Oracle's Java-based framework for building enterprise applications, is widely used, and companies across various industries are actively looking for skilled professionals to join their teams.
Here are 5 major cities in India where there is a high demand for ADF professionals: - Bangalore - Hyderabad - Pune - Chennai - Mumbai
The estimated salary range for ADF professionals in India varies based on experience levels: - Entry-level: INR 4-6 lakhs per annum - Mid-level: INR 8-12 lakhs per annum - Experienced: INR 15-20 lakhs per annum
In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.
In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.
Here are sample interview questions for ADF roles, categorized by difficulty level: - Basic: - What is ADF, and what are its key features? - What is the difference between ADF Faces and ADF Task Flows? - Medium: - Explain the lifecycle of an ADF application. - How do you handle exceptions in ADF applications? - Advanced: - Discuss the advantages of using ADF Business Components. - How would you optimize performance in an ADF application?
As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!