
996 ADF Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 years

1 - 3 Lacs

Cochin

On-site

Introduction

Candidates should have 8+ years of experience in the IT industry, with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. The position serves a US client, and the resource should be hands-on in coding and Azure Cloud. Working hours are 8 hours, with a mandatory 4-hour overlap with the EST time zone (12 PM - 9 PM), as meetings happen during this window.

Responsibilities include:
- Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery
- Integrate and support third-party APIs and external services
- Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack
- Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC)
- Participate in Agile/Scrum ceremonies and manage tasks using Jira
- Understand technical priorities, architectural dependencies, risks, and implementation challenges
- Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability

Certifications:
- Microsoft Certified: Azure Fundamentals
- Microsoft Certified: Azure Developer Associate
- Other relevant certifications in Azure, .NET, or Cloud technologies

Primary Skills:
- 8+ years of hands-on development experience with C#, .NET Core 6/8+, Entity Framework / EF Core, JavaScript, jQuery, and REST APIs
- Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
- Skilled in unit testing with XUnit and MSTest
- Strong in software design patterns, system architecture, and scalable solution design
- Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
- Strong problem-solving and debugging capabilities
- Ability to write reusable, testable, and efficient code
- Experience developing and maintaining frameworks and shared libraries to support large-scale applications
- Excellent technical documentation, communication, and leadership skills
- Microservices and Service-Oriented Architecture (SOA)
- Experience in API integrations
- 2+ years of hands-on experience with Azure Cloud Services, including Azure Functions, Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, Application Insights, and Azure Monitoring

Secondary Skills:
- Familiarity with AngularJS, ReactJS, and other front-end frameworks
- Experience with Azure API Management (APIM)
- Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
- Experience with Azure Data Factory (ADF) and Logic Apps
- Exposure to application support and operational monitoring
- Azure DevOps CI/CD pipelines (Classic/YAML)

Posted 16 hours ago

Apply

7.0 years

0 Lacs

Hyderābād

On-site

Digital Solutions Consultant I - HYD015Q

Company: Worley
Primary Location: IND-AP-Hyderabad
Job: Digital Solutions
Schedule: Full-time
Employment Type: Agency Contractor
Job Level: Experienced
Job Posting: Jun 16, 2025
Unposting Date: Jul 16, 2025
Reporting Manager Title: Senior General Manager

We deliver the world’s most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied & challenging role.

Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we’re bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals, and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.

The Role

As a Digital Solutions Consultant with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. We are looking for a skilled Data Engineer to join our Digital Customer Solutions team. The ideal candidate should have experience in cloud computing and big data technologies. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data solutions that can handle large volumes of data. You will work closely with stakeholders to ensure that the data is accurate, reliable, and easily accessible.

Responsibilities:
- Design, build, and maintain scalable data pipelines that can handle large volumes of data.
- Document the design of proposed solutions, including structuring data (data modelling applying different techniques, including 3NF and dimensional modelling) and optimising data for further consumption (working closely with Data Visualization Engineers, Front-end Developers, Data Scientists and ML Engineers).
- Develop and maintain ETL processes to extract data from various sources (including sensor, semi-structured and unstructured data, as well as structured data stored in traditional databases or file stores, or from SOAP and REST data interfaces).
- Develop data integration patterns for batch and streaming processes, including implementation of incremental loads (see the sketch after this listing).
- Build quick prototypes and proofs-of-concept to validate assumptions and prove the value of proposed solutions or new cloud-based services.
- Define data engineering standards and develop data ingestion/integration frameworks.
- Participate in code reviews and ensure all solutions are aligned to architectural and requirement specifications.
- Develop and maintain cloud-based infrastructure to support data processing using Azure Data Services (ADF, ADLS, Synapse, Azure SQL DB, Cosmos DB).
- Develop and maintain automated data quality pipelines.
- Collaborate with cross-functional teams to identify opportunities for process improvement.
- Manage a team of Data Engineers.

About You

To be considered for this role it is envisaged you will possess the following attributes:
- Bachelor’s degree in Computer Science or a related field.
- 7+ years of experience in big data technologies such as Hadoop, Spark, Hive and Delta Lake.
- 7+ years of experience in cloud computing platforms such as Azure, AWS or GCP.
- Experience working in cloud data platforms, including a deep understanding of scaled data solutions.
- Experience working with different data integration patterns (batch and streaming), implementing incremental data loads.
- Proficient in scripting in Java, Windows and PowerShell.
- Proficient in at least one programming language such as Python or Scala. Expert in SQL.
- Proficient in working with data services like ADLS, Azure SQL DB, Azure Synapse, Snowflake, NoSQL (e.g. Cosmos DB, MongoDB), Azure Data Factory, Databricks or similar on AWS/GCP.
- Experience using ETL tools (like Informatica IICS Data Integration) is an advantage.
- Strong understanding of data quality principles and experience implementing them.

Moving forward together

We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. We’re building a diverse, inclusive and respectful workplace. Creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there’s a path for you here. And there’s no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice here. Please note: if you are being represented by a recruitment agency you will not be considered; to be considered you will need to apply directly to Worley.
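The incremental-load pattern this listing emphasises can be illustrated briefly. Below is a minimal, hedged PySpark sketch of a high-water-mark batch load; the JDBC connection string, table, paths, and column names are hypothetical placeholders, not details from the posting, and the delta-spark package is assumed to be on the cluster.

```python
# Minimal sketch of an incremental (high-water-mark) batch load in PySpark.
# All connection details, paths, and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("incremental-load").getOrCreate()

# 1. Read the last high-water mark persisted by the previous run.
try:
    last_mark = (spark.read.format("delta").load("/lake/_watermarks/orders")
                 .agg(F.max("updated_at")).first()[0])
except Exception:  # first run: no watermark table exists yet
    last_mark = None

# 2. Pull only new or changed rows from the source system.
source = spark.read.format("jdbc").options(
    url="jdbc:sqlserver://source-host;databaseName=source_db",  # placeholder
    dbtable="dbo.orders",
    user="etl_user", password="***",
).load()
incremental = (source if last_mark is None
               else source.filter(F.col("updated_at") > F.lit(last_mark)))

# 3. Append to the target table and advance the watermark.
incremental.write.format("delta").mode("append").save("/lake/bronze/orders")
(incremental.agg(F.max("updated_at").alias("updated_at"))
 .write.format("delta").mode("append").save("/lake/_watermarks/orders"))
```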

Posted 17 hours ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


We deliver the world’s most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied & challenging role.

Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we’re bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals, and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.

The Role

As a Digital Solutions Consultant with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. We are looking for a skilled Data Engineer to join our Digital Customer Solutions team. The ideal candidate should have experience in cloud computing and big data technologies. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data solutions that can handle large volumes of data. You will work closely with stakeholders to ensure that the data is accurate, reliable, and easily accessible.

Responsibilities:
- Design, build, and maintain scalable data pipelines that can handle large volumes of data.
- Document the design of proposed solutions, including structuring data (data modelling applying different techniques, including 3NF and dimensional modelling) and optimising data for further consumption (working closely with Data Visualization Engineers, Front-end Developers, Data Scientists and ML Engineers).
- Develop and maintain ETL processes to extract data from various sources (including sensor, semi-structured and unstructured data, as well as structured data stored in traditional databases or file stores, or from SOAP and REST data interfaces).
- Develop data integration patterns for batch and streaming processes, including implementation of incremental loads.
- Build quick prototypes and proofs-of-concept to validate assumptions and prove the value of proposed solutions or new cloud-based services.
- Define data engineering standards and develop data ingestion/integration frameworks.
- Participate in code reviews and ensure all solutions are aligned to architectural and requirement specifications.
- Develop and maintain cloud-based infrastructure to support data processing using Azure Data Services (ADF, ADLS, Synapse, Azure SQL DB, Cosmos DB).
- Develop and maintain automated data quality pipelines.
- Collaborate with cross-functional teams to identify opportunities for process improvement.
- Manage a team of Data Engineers.

About You

To be considered for this role it is envisaged you will possess the following attributes:
- Bachelor’s degree in Computer Science or a related field.
- 7+ years of experience in big data technologies such as Hadoop, Spark, Hive and Delta Lake.
- 7+ years of experience in cloud computing platforms such as Azure, AWS or GCP.
- Experience working in cloud data platforms, including a deep understanding of scaled data solutions.
- Experience working with different data integration patterns (batch and streaming), implementing incremental data loads.
- Proficient in scripting in Java, Windows and PowerShell.
- Proficient in at least one programming language such as Python or Scala. Expert in SQL.
- Proficient in working with data services like ADLS, Azure SQL DB, Azure Synapse, Snowflake, NoSQL (e.g. Cosmos DB, MongoDB), Azure Data Factory, Databricks or similar on AWS/GCP.
- Experience using ETL tools (like Informatica IICS Data Integration) is an advantage.
- Strong understanding of data quality principles and experience implementing them.

Moving forward together

We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. We’re building a diverse, inclusive and respectful workplace. Creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there’s a path for you here. And there’s no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice here. Please note: if you are being represented by a recruitment agency you will not be considered; to be considered you will need to apply directly to Worley.

Company: Worley
Primary Location: IND-AP-Hyderabad
Job: Digital Solutions
Schedule: Full-time
Employment Type: Agency Contractor
Job Level: Experienced
Job Posting: Jun 16, 2025
Unposting Date: Jul 16, 2025
Reporting Manager Title: Senior General Manager

Posted 17 hours ago

Apply

3.0 years

0 Lacs

Calcutta

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Senior Associate

Job Description & Summary

A career in our Microsoft Dynamics team will provide the opportunity to help our clients transform their technology landscape across Front, Back and Mid-Office functions leveraging Microsoft Dynamics. We focus on contributing to PwC’s value proposition of “strategy led and technology enabled”, by aligning our Consulting Solutions’ industry focus with Microsoft technologies such as Dynamics 365, Azure, Power Platform and Power BI.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
· Experience as a Data Analyst with high proficiency;
· Expertise in writing and optimizing SQL queries in SQL Server and/or Oracle;
· Experience in Data Extraction, Transformation and Loading (ETL) using SSIS and ADF;
· Experience in Power BI and/or Tableau for visualizing and analyzing data;
· Knowledge of database normalization for optimum performance;
· Excellence in MS Excel, with proficiency in VLOOKUPs, pivot tables and VBA macros;
· Knowledge of data warehousing concepts;
· Performance optimization and troubleshooting capabilities;
· Good project management skills: client meetings, stakeholder engagement;
· Familiarity with Agile methodology;
· Strong knowledge of Azure DevOps Boards, Sprints, Queries, Pipelines (CI/CD), etc.

Mandatory skill sets: ADF, Power BI
Preferred skill sets: DevOps/CI/CD
Years of experience required: 3-7 years
Education qualification: B.Tech/B.E

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Power BI
Optional Skills: Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Analytical Thinking, Android, API Management, Appian (Platform), Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C#.NET, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration/Continuous Delivery (CI/CD), Creativity {+ 46 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date

Posted 17 hours ago

Apply

5.0 - 12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Greetings from Kellton Tech!

Job Title: Java & ADF Developer / Java with Spring Boot
Location: Hyderabad (Onsite – Client Location)
Experience: 5-12 years
Employment Type: Full-time / Contract (as applicable)
Joining: Immediate to 30 days preferred

About Kellton:

We are a global IT services and digital product design and development company with subsidiaries that serve startup, mid-market, and enterprise clients across diverse industries, including Finance, Healthcare, Manufacturing, Retail, Government, and Nonprofits. At Kellton, we believe that our people are our greatest asset. We are committed to fostering a culture of collaboration, innovation, and continuous learning. Our core values include integrity, customer focus, teamwork, and excellence. To learn more about our organization, please visit us at www.kellton.com

Are you craving a dynamic and autonomous work environment? If so, this opportunity may be just what you're looking for. At our company, we value your critical thinking skills and encourage your input and creative ideas to supply the best talent available. To boost your productivity, we provide a comprehensive suite of IT tools and practices backed by an experienced team to work with.

Req 1: Java with Spring Boot

Technical Skills:
- Java (should also be able to work on older versions – 7 and 8)
- Spring Boot, Spring JPA, Spring Security
- MySQL
- IDEs: primarily NetBeans, also Eclipse
- Jasper Reports
- Application servers: Tomcat, JBoss (WildFly)
- Basic knowledge of Linux

Day-to-Day Responsibilities:
- Handling API-related issues and bug fixes
- Developing new APIs and features as per business requirements
- Coordinating and deploying builds in UAT environments
- Collaborating with the QA and product teams to ensure smooth releases

Additional skillset info: Java, Spring Boot, Hibernate, JUnit, JWT, OAuth, Redis, Docker, Kafka (optional), OpenAPI standards, Jenkins/Git pipeline, etc.

Req 2: Java & Oracle ADF Developer

About the Role:

We are looking for a skilled Java and Oracle ADF Developer to join our team for an on-site deployment at our client’s location in Hyderabad. The ideal candidate should have a solid background in Java development, Oracle ADF, and associated tools and technologies, strong problem-solving abilities, and experience working in a Linux-based environment.

Key Responsibilities:
- Develop and maintain enterprise-grade applications using Oracle ADF and Java 7/8.
- Design and implement reports using Jasper Reports and iReport.
- Manage deployments and configurations on the JBoss application server.
- Work with development tools such as NetBeans, Eclipse, or JDeveloper.
- Perform data management tasks using MySQL.
- Write and maintain shell scripts and configure cron jobs for scheduled tasks.
- Administer and monitor systems in a Linux environment.
- Utilize Apache Superset for data visualization and dashboard reporting.
- Collaborate with cross-functional teams to deliver high-quality solutions on time.
- Troubleshoot issues and provide timely resolutions.

Required Skills:
- Proficiency in Java 7/8 and object-oriented programming
- Strong hands-on experience with Oracle ADF
- Expertise in Jasper Reports, iReport, and report generation
- Experience with JBoss server setup and application deployment
- Familiarity with NetBeans, Eclipse, or JDeveloper IDEs
- Good understanding of MySQL database design and queries
- Experience with Linux OS and shell scripting
- Ability to set up and manage cron jobs
- Knowledge of Apache Superset or similar BI tools
- Strong problem-solving and debugging skills

Good to Have:
- Exposure to Agile development practices
- Familiarity with REST APIs and web services
- Knowledge of version control tools (e.g., Git)

Education: Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.

What we offer you:
· Existing clients in multiple domains to work with.
· A strong and efficient team committed to quality output.
· The chance to enhance your knowledge and gain industry domain expertise by working in varied roles.
· A team of experienced, fun, and collaborative colleagues.
· Hybrid work arrangement for flexibility and work-life balance (if the client/project allows).
· Competitive base salary and job satisfaction.

Join our team and become part of an exciting company where your expertise and ideas are valued, and where you can make a significant impact in the IT industry. Apply today! Interested applicants, please submit your detailed resume stating your current and expected compensation and notice period to srahaman@kellton.com

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Title: Azure Data Engineer
Experience: 5-8 years
Location: Pune and Gurgaon (Hybrid)

Key Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering roles using Azure and Snowflake.
- Strong problem-solving, communication, and collaboration skills.

Posted 17 hours ago

Apply

7.0 years

0 Lacs

Mohali district, India

On-site


Job Summary:

We are seeking a skilled Database & ETL Developer with strong expertise in SQL, data modeling, cloud integration, and reporting tools. The ideal candidate will be responsible for designing scalable database architectures, optimizing complex queries, working with modern ETL tools, and delivering insightful dashboards and reports. This role requires collaboration across multiple teams and the ability to manage tasks in a dynamic, agile environment.

Experience: 7+ years
Location: Mohali (Work from Office Only)

Key Responsibilities:
- Design and implement normalized and scalable database schemas using ER modeling techniques.
- Develop, maintain, and optimize stored procedures, triggers, views, and advanced SQL queries (e.g., joins, subqueries, indexing).
- Execute database backup, restore, and recovery operations, ensuring data integrity and high availability.
- Optimize SQL performance through indexing strategies, execution plans, and query refactoring.
- Lead and support cloud migration and integration projects involving platforms such as AWS and Azure.
- Implement and manage data lake architectures such as AWS HealthLake and AWS Glue pipelines.
- Create and manage interactive dashboards and business reports using Power BI, Amazon QuickSight, or Tableau.
- Collaborate with cross-functional teams and use tools like JIRA, Azure Boards, ClickUp, or Trello for task and project tracking.

Required Skills:
- Strong experience with SQL Server or PostgreSQL, including advanced T-SQL programming.
- In-depth knowledge of ER modeling and relational database design principles.
- Proficiency in query optimization, indexing, joins, and subqueries.
- Hands-on experience with modern ETL and data integration platforms such as Airbyte, Apache Airflow, Azure Data Factory (ADF), and AWS Glue (see the orchestration sketch after this listing).
- Understanding of Data Lake / HealthLake architectures and their role in cloud data ecosystems.
- Proficiency with reporting tools like Power BI, Amazon QuickSight, or Tableau.
- Experience with database backup, restore, and high-availability strategies.
- Familiarity with project/task tracking tools such as JIRA, Azure Boards, ClickUp, or Trello.

Soft Skills:
- Strong verbal and written communication skills.
- Excellent problem-solving and troubleshooting abilities.
- Self-motivated, with the ability to manage priorities and work independently across multiple projects.

Nice to Have:
- Certification in cloud platforms (AWS, Azure).
- Exposure to healthcare data standards and compliance (e.g., HIPAA, FHIR).

Company overview:

smartData is a leader in the global software business space when it comes to business consulting and technology integrations, making business easier, accessible, secure and meaningful for its target segment of startups to small and medium enterprises. As your technology partner, we provide both domain and technology consulting, and our in-house products and unique productized service approach help us act as business integrators, saving substantial time to market for our esteemed customers. With 8000+ projects and vast experience of 20+ years, backed by offices in the US, Australia, and India providing next-door assistance and round-the-clock connectivity, we ensure continual business growth for all our customers. Our business consulting and integrator services via software solutions focus on the important industries of healthcare, B2B, B2C and B2B2C platforms, online delivery services, video platform services, and IT services.

Strong expertise in Microsoft, the LAMP stack, and MEAN/MERN stacks with a mobility-first approach via native (iOS, Android, Tizen) or hybrid (React Native, Flutter, Ionic, Cordova, PhoneGap) mobility stacks, mixed with AI & ML, helps us deliver on the ongoing needs of customers continuously. For more information, visit http://www.smartdatainc.com
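As a hedged illustration of the orchestration tooling this role names, here is a minimal Apache Airflow sketch (Airflow 2.x assumed) of a daily extract-then-load chain; the DAG id and the task callables are invented placeholders, not anything from the posting.

```python
# Minimal Apache Airflow 2.x DAG sketch: a daily extract -> load chain.
# DAG id, schedule, and the callables are illustrative placeholders only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull rows from the source system")  # placeholder logic

def load():
    print("write rows to the warehouse")  # placeholder logic

with DAG(
    dag_id="daily_etl_example",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # `schedule` replaced `schedule_interval` in Airflow 2.4+
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # load runs only after extract succeeds
```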

Posted 18 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Position: Data Engineer - Azure Synapse
Location: Mumbai/Pune/Nagpur
Type of Employment: TPC

Key Result Areas and Activities:
• Design, develop and deploy ETL/ELT solutions on premise or in the cloud
• Transformation of data with stored procedures
• Report development (MicroStrategy/Power BI)
• Create and maintain comprehensive documentation for data pipelines, configurations, and processes
• Ensure data quality and integrity through effective data management practices
• Monitor and optimize data pipeline performance
• Troubleshoot and resolve data-related issues

Technical Experience:

Must Have
• Good experience in Azure Synapse
• Good experience in ADF
• Good experience in Snowflake and stored procedures
• Experience with ETL/ELT processes, data warehousing, and data modelling
• Experience with data quality frameworks, monitoring tools, and job scheduling
• Knowledge of data formats like JSON, XML, CSV, and Parquet
• Fluent English (strong written, verbal, and presentation skills)
• Agile methodology and tools like JIRA
• Good communication and formal skills

Good To Have
• Good experience in MicroStrategy and Power BI
• Experience in scripting languages such as Python, Java, or shell scripting
• Familiarity with Azure cloud platforms and cloud data services

Posted 19 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Title: Data Governance & MDM Specialist – Azure Purview and Profisee (POC Implementation)

Job Overview:

We are seeking a skilled Data Governance and Master Data Management (MDM) Specialist to lead the setup and validation of a POC environment using Azure Purview and Profisee. The goal is to establish an integrated framework that ensures high standards in data quality, lineage tracking, security, and master data management across source systems.

Key Responsibilities:
- Set up and configure Azure Purview based on the defined architecture
- Deploy and configure the Profisee MDM platform
- Provision necessary Azure resources, including storage, compute, and access controls
- Configure the data glossary, data classifications, domains and metadata schemas, and security and access policies
- Set up data lineage tracking across integrated systems
- Define and implement match/merge rules, workflow processes, and data quality logic in Profisee (a sketch of this logic follows the listing)
- Integrate Azure Purview and Profisee with multiple source systems for the POC
- Build data ingestion and transformation pipelines using tools such as Azure Data Factory (ADF)
- Ensure accurate lineage visualization, data quality validation, and matching logic verification
- Provide support for orchestration, testing, and ongoing validation during the POC

Required Skills & Experience:
- Hands-on experience with Azure Purview configuration and integration
- Strong expertise in Profisee or similar MDM platforms
- Experience with data cataloging, metadata management, and data lineage
- Familiarity with data governance frameworks and data stewardship
- Proficient in Azure Data Factory (ADF) or other data pipeline tools
- Good understanding of Azure cloud architecture, RBAC, and resource provisioning
- Strong SQL skills for data profiling and validation
- Experience with match/merge logic, workflow configuration, and business rules in MDM tools
- Ability to troubleshoot and resolve integration issues across systems

Nice to Have:
- Familiarity with Databricks, Azure Functions, or Logic Apps
- Knowledge of Infrastructure as Code (IaC) – ARM, Bicep, or Terraform
- Prior experience with API-based integrations for real-time data sync
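Match/merge and survivorship rules are configured declaratively in tools like Profisee, but the underlying idea can be sketched in code. The following hedged PySpark example shows a deterministic match on a normalised email plus a most-recent-record survivorship rule; all column names and sample rows are hypothetical, not from the posting.

```python
# Hedged sketch of deterministic match/merge (survivorship) logic in PySpark,
# analogous to what an MDM tool configures declaratively. Data is invented.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("mdm-match-merge").getOrCreate()

customers = spark.createDataFrame(
    [("1", "ann@x.com ", "Ann",     "2024-01-01"),
     ("2", "ANN@X.COM",  "Ann Lee", "2024-06-01"),
     ("3", "bob@y.com",  "Bob",     "2024-03-01")],
    ["id", "email", "name", "updated_at"],
)

# Match rule: records agree on a normalised (lower-cased, trimmed) email.
matched = customers.withColumn("match_key", F.lower(F.trim("email")))

# Survivorship rule: within each match group, the most recent record wins.
w = Window.partitionBy("match_key").orderBy(F.col("updated_at").desc())
golden = (matched.withColumn("rn", F.row_number().over(w))
          .filter("rn = 1")
          .drop("rn", "match_key"))
golden.show()  # one "golden record" per matched customer
```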

Posted 20 hours ago

Apply

0 years

0 Lacs

Ganganagar, Rajasthan, India

On-site


33471BR | Pune

Job Description

Mandatory Skills:
1. Core Java 8
2. Spring Core, Spring Boot, Spring MVC, Spring REST
3. JPA, Spring JDBC template or Hibernate
4. REST web services
5. Security (Basic, JWT, OAuth, API Key)
6. Angular
7. Oracle Database, SQL queries
8. Jenkins
9. Agile

Good to have skills:
1. Spring Batch
2. JSP
3. Oracle ADF
4. PostgreSQL
5. AWS Developer basics
6. SQL tuning, PL/SQL

Additional Requirements:
1. AWS Cloud Developer skills know-how
2. Troubleshooting production and performance issues

Qualifications: Engineering
Range of Year Experience - Min Year: 9
Range of Year Experience - Max Year: 14

Posted 21 hours ago

Apply

0 years

0 Lacs

Ganganagar, Rajasthan, India

On-site


33472BR | Pune

Job Description

Mandatory Skills:
1. Core Java 8
2. Spring Core, Spring Boot, Spring MVC, Spring REST
3. JPA, Spring JDBC template or Hibernate
4. REST web services
5. Security (Basic, JWT, OAuth, API Key)
6. Angular
7. Oracle Database, SQL queries
8. Jenkins
9. Agile

Good to have skills:
1. Spring Batch
2. JSP
3. Oracle ADF
4. PostgreSQL
5. AWS Developer basics
6. SQL tuning, PL/SQL

Additional Requirements:
1. AWS Cloud Developer skills know-how
2. Troubleshooting production and performance issues

Qualifications: Engineering
Range of Year Experience - Min Year: 9
Range of Year Experience - Max Year: 14

Posted 21 hours ago

Apply

6.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Designation: Sr. Consultant
Experience: 6 to 7 years
Location: Bengaluru
Skills required: Python, SQL, Databricks, ADF; within Databricks: DLT, PySpark, Structured Streaming, performance and cost optimization (a hedged DLT sketch follows this listing).

Roles and Responsibilities:
- Capture business problems, value drivers, and functional/non-functional requirements, and translate them into functionality.
- Assess the risks, feasibility, opportunities, and business impact.
- Assess and model processes, data flows, and technology to understand the current value and issues, and identify opportunities for improvement.
- Create/update clear documentation of requirements to align with the solution over the project lifecycle.
- Ensure traceability of requirements from business needs through testing and scope changes to the final solution.
- Interact with software suppliers, designers and developers to understand software limitations, deliver elements of system and database design, and ensure that business requirements and use cases are handled.
- Configure and document software and processes, using agreed standards and tools.
- Create acceptance criteria and validate that solutions meet business needs, through defining and coordinating testing.
- Create and present compelling business cases to justify solution value and establish approval, funding and prioritization.
- Initiate, plan, execute, monitor, and control Business Analysis activities on projects within agreed parameters of cost, time and quality.
- Lead stakeholder management activities and large design sessions.
- Lead teams to complete business analysis on projects.

Mandatory skills: Agile project experience; understands Agile frameworks and tools and has worked in Agile; has educated stakeholders, including Product Owners and business partners, in Agile ways of working; understands systems engineering concepts, data/process analysis and modeling, and products and solutions. Degree. 4-7 years in IT.

Optional skills: Agile certifications/trainings preferred. CBAP (Certified Business Analysis Professional) or PMI-PBA certification preferred. Lean Practitioner training and experience are an asset.
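For the DLT and Structured Streaming skills named above, a minimal hedged sketch follows. It runs only inside a Databricks Delta Live Tables pipeline, which supplies the `dlt` module and the implicit `spark` session; the landing path, table names, and data-quality expectation are placeholders.

```python
# Hedged Delta Live Tables (DLT) sketch: streaming bronze -> silver tables.
# Only runnable inside a Databricks DLT pipeline; paths/names are placeholders.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events ingested incrementally with Auto Loader.")
def bronze_events():
    # `spark` is provided implicitly by the DLT runtime.
    return (spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/Volumes/raw/events/"))  # placeholder landing path

@dlt.table(comment="Cleaned events; rows without an id are dropped.")
@dlt.expect_or_drop("valid_id", "id IS NOT NULL")  # declarative quality rule
def silver_events():
    return (dlt.read_stream("bronze_events")
            .withColumn("ingested_at", F.current_timestamp()))
```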

Posted 21 hours ago

Apply

7.0 years

0 Lacs

India

Remote


Job Title: Senior Data Engineer – Azure & API Development
Location: Remote
Experience Required: 7+ years

Job Summary:

We are looking for an experienced Data Engineer with strong expertise in Azure cloud architecture, API development, and modern data engineering tools. The ideal candidate will have in-depth experience in building and maintaining scalable data pipelines and API integrations using Azure services like Azure Data Factory (ADF), Databricks, Azure Functions, and Service Bus, along with infrastructure provisioning using Terraform.

Key Responsibilities:
- Design and implement scalable, secure, and high-performance data solutions on Azure.
- Develop, deploy, and manage RESTful APIs to support data access and integration (a hedged sketch follows this listing).
- Build and maintain ETL/ELT data pipelines using Azure Data Factory, Databricks, and Azure Functions.
- Integrate data workflows with Azure Service Bus and other messaging services.
- Define and implement cloud infrastructure using Terraform and Infrastructure-as-Code (IaC) best practices.
- Collaborate with stakeholders to understand data requirements and develop technical solutions.
- Ensure best practices for data governance, security, monitoring, and performance optimization.
- Work closely with DevOps and Data Architects to implement CI/CD pipelines and production-grade deployments.

Must-Have Skills:
- 7+ years of professional experience in Data Engineering or related roles.
- Strong hands-on experience with Azure services, particularly Azure Data Factory (ADF), Databricks (Spark-based processing), Azure Functions, and Azure Service Bus.
- Proficient in API development (RESTful APIs using Python, .NET, or Node.js).
- Good command of SQL, Spark SQL, and data transformation techniques.
- Experience with Terraform for IaC and provisioning Azure resources.
- Excellent understanding of data architecture, cloud security, and governance models.
- Strong problem-solving skills and experience working in Agile environments.

Preferred Skills:
- Familiarity with CI/CD tools like Azure DevOps, GitHub Actions, or Jenkins.
- Exposure to event-driven architecture and real-time data streaming.
- Knowledge of containerization (Docker/Kubernetes) is a plus.
- Experience in performance tuning and cost optimization in Azure environments.
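As one concrete illustration of the API-development side of this role, below is a hedged sketch of an HTTP-triggered Azure Function using the Python v2 programming model; the route and response payload are invented placeholders, not the client's actual API.

```python
# Hedged sketch: HTTP-triggered Azure Function (Python v2 programming model)
# exposing a small data-access endpoint. Route and payload are placeholders.
import json

import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

@app.route(route="orders/{order_id}")
def get_order(req: func.HttpRequest) -> func.HttpResponse:
    order_id = req.route_params.get("order_id")
    # In a real pipeline this would query Azure SQL/ADLS; here we echo a stub.
    body = {"order_id": order_id, "status": "sample"}
    return func.HttpResponse(json.dumps(body), mimetype="application/json")
```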

Posted 22 hours ago

Apply

0 years

0 Lacs

India

Remote


Years of experience: 8+
Mode of work: Remote

Design, develop, modify, and test software applications for the healthcare industry in an agile environment. Duties include:
- Develop, support/maintain and deploy software to support a variety of business needs
- Provide technical leadership in the design, development, testing, deployment and maintenance of software solutions
- Design and implement platform and application security for applications
- Perform advanced query analysis and performance troubleshooting
- Coordinate with senior-level stakeholders to ensure the development of innovative software solutions to complex technical and creative issues
- Re-design software applications to improve maintenance cost, testing functionality, platform independence and performance
- Manage user stories and project commitments in an agile framework to rapidly deliver value to customers
- Deploy and operate software solutions using a DevOps model

Required skills: Azure Delta Lake, ADF, Databricks, PySpark, Oozie, Airflow, big data technologies (HBase, Hive), CI/CD (GitHub/Jenkins)

Posted 22 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


The role involves building and managing data pipelines, troubleshooting issues, and ensuring data accuracy across various platforms such as Azure Synapse Analytics, Azure Data Lake Gen2, and SQL environments. This position requires extensive SQL experience and a strong background in PySpark development.

Responsibilities

Data Engineering:
- Work with Azure Synapse Pipelines and PySpark for data transformation and pipeline management.
- Perform data integration and schema updates in Delta Lake environments, ensuring smooth data flow and accurate reporting (a hedged upsert sketch follows this listing).
- Work with our Azure DevOps team on CI/CD processes for deployment of Infrastructure as Code (IaC) and workspace artifacts.
- Develop custom solutions for our customers as defined by our Data Architect, and assist in improving our data solution patterns over time.

Documentation:
- Document ticket resolutions, testing protocols, and data validation processes.
- Collaborate with other stakeholders to provide specifications and quotations for enhancements requested by customers.

Ticket Management:
- Monitor the Jira ticket queue and respond to tickets as they are raised.
- Understand ticket issues, using extensive SQL, Synapse Analytics, and other tools to troubleshoot them.
- Communicate effectively with the customer users who raised the tickets and collaborate with other teams (e.g., FinOps, Databricks) as needed to resolve issues.

Troubleshooting and Support:
- Handle issues related to ETL pipeline failures, Delta Lake processing, or data inconsistencies in Synapse Analytics.
- Provide prompt resolution of data pipeline and validation issues, ensuring data integrity and performance.

Desired Skills & Requirements

We are seeking a candidate with 5+ years of Dynamics 365 ecosystem experience and a strong PySpark development background. While various profiles may apply, we highly value a strong person-organization fit. Our ideal candidate possesses the following attributes and qualifications:
- Extensive experience with SQL, including query writing and troubleshooting in Azure SQL, Synapse Analytics, and Delta Lake environments.
- Strong understanding of and experience in implementing and supporting ETL processes, data lakes, and data engineering solutions.
- Proficiency in using Azure Synapse Analytics, including workspace management, pipeline creation, and data flow management.
- Hands-on experience with PySpark for data processing and automation.
- Ability to use VPNs, MFA, RDP, jump boxes/jump hosts, etc., to operate within the customers' secure environments.
- Some experience with Azure DevOps CI/CD IaC and release pipelines.
- Ability to communicate effectively both verbally and in writing, with strong problem-solving and analytical skills.
- Understanding of the operation and underlying data structure of D365 Finance and Operations, Business Central, and Customer Engagement.
- Experience with data engineering in Microsoft Fabric.
- Experience with Delta Lake and Azure data engineering concepts (e.g., ADLS, ADF, Synapse, AAD, Databricks).
- Certifications in Azure Data Engineering.

Why Join Us?
- Work with innovative technologies in a dynamic environment: a progressive work culture with a global perspective, where your ideas truly matter and growth opportunities are endless.
- Work with the latest Microsoft technologies alongside Dynamics professionals committed to driving customer success.
- Enjoy the flexibility to work from anywhere, with a work-life balance that suits your lifestyle.
- Competitive salary and comprehensive benefits package.
- Career growth and professional development opportunities.
- A collaborative and inclusive work culture.
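Applying incremental changes into Delta Lake, as this listing describes, is typically done with a MERGE (upsert). Here is a minimal hedged PySpark sketch, assuming the delta-spark package is available; the paths and key column are hypothetical placeholders.

```python
# Hedged sketch of a Delta Lake upsert (MERGE), a common way to apply
# incremental changes in Synapse/Databricks Delta environments.
# Paths and the join key are placeholders; requires delta-spark.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-upsert").getOrCreate()

updates = spark.read.parquet("/staging/customers/")        # placeholder source
target = DeltaTable.forPath(spark, "/lake/silver/customers")  # placeholder target

(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()      # update existing rows
 .whenNotMatchedInsertAll()   # insert new rows
 .execute())
```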

Posted 23 hours ago

Apply

4.0 years

0 Lacs

India

On-site


Job Title: Azure Databricks Engineer
Experience: 4+ years

Required Skills:
- 4+ years of experience in Data Engineering.
- Strong hands-on experience with Azure Databricks and PySpark.
- Good understanding of Azure Data Factory (ADF), Azure Data Lake (ADLS), and Azure Synapse.
- Strong SQL skills and experience with large-scale data processing.
- Experience with version control systems (Git), CI/CD pipelines, and Agile methodology.
- Knowledge of Delta Lake, Lakehouse architecture, and distributed computing concepts.

Preferred Skills:
- Experience with Airflow, Power BI, or machine learning pipelines.
- Familiarity with DevOps tools for automation and deployment in Azure.
- Azure certifications (e.g., DP-203) are a plus.

Posted 23 hours ago

Apply

3.0 years

0 Lacs

India

Remote


Title: Data Engineer
Location: Remote
Employment type: Full Time with BayOne

We’re looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do:
- Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory
- Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable
- Work on modern data lakehouse architectures and contribute to data governance and quality frameworks

Tech Stack: Azure | Databricks | PySpark | SQL

What We’re Looking For:
- 3+ years of experience in data engineering or analytics engineering
- Hands-on experience with cloud data platforms and large-scale data processing
- Strong problem-solving mindset and a passion for clean, efficient data design

Job Description:
- Minimum 3 years of experience in modern data engineering/data warehousing/data lakes technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc. Azure experience is preferred over other cloud platforms.
- 5 years of proven experience with SQL, schema design and dimensional data modelling (a hedged sketch follows this listing).
- Solid knowledge of data warehouse best practices, development standards and methodologies.
- Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
- Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL.
- An independent self-learner with a “let’s get this done” approach and the ability to work in a fast-paced, dynamic environment.
- Excellent communication and teamwork abilities.

Nice-to-Have Skills:
- Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB knowledge.
- SAP ECC/S/4 and HANA knowledge.
- Intermediate knowledge of Power BI.
- Azure DevOps and CI/CD deployments, cloud migration methodologies and processes.

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, or basis of disability or any federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
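Since the posting emphasises schema design and dimensional data modelling, the following hedged Spark SQL sketch derives a toy dimension and fact from a flat extract; all table and column names are hypothetical placeholders.

```python
# Hedged sketch: deriving a simple star-schema dimension and fact from a
# flat sales extract with Spark SQL. Table/column names are invented.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dim-model").getOrCreate()

spark.read.parquet("/lake/raw/sales/").createOrReplaceTempView("raw_sales")

# Dimension: one row per customer, deduplicated from the flat extract.
spark.sql("""
    CREATE OR REPLACE TEMP VIEW dim_customer AS
    SELECT DISTINCT customer_id, customer_name, country
    FROM raw_sales
""")

# Fact: transactional grain, carrying only keys and measures.
spark.sql("""
    CREATE OR REPLACE TEMP VIEW fact_sales AS
    SELECT order_id, customer_id, order_date, amount
    FROM raw_sales
""")

# Typical analytical query joining fact to dimension.
spark.sql("""
    SELECT d.country, SUM(f.amount) AS revenue
    FROM fact_sales f JOIN dim_customer d USING (customer_id)
    GROUP BY d.country
""").show()
```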

Posted 23 hours ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site


Introduction

Candidates should have 8+ years of experience in the IT industry, with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. The position serves a US client, and the resource should be hands-on in coding and Azure Cloud. Working hours are 8 hours, with a mandatory 4-hour overlap with the EST time zone (12 PM - 9 PM), as meetings happen during this window.

Responsibilities include:
- Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery
- Integrate and support third-party APIs and external services
- Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack
- Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC)
- Participate in Agile/Scrum ceremonies and manage tasks using Jira
- Understand technical priorities, architectural dependencies, risks, and implementation challenges
- Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability

Certifications:
- Microsoft Certified: Azure Fundamentals
- Microsoft Certified: Azure Developer Associate
- Other relevant certifications in Azure, .NET, or Cloud technologies

Primary Skills:
- 8+ years of hands-on development experience with C#, .NET Core 6/8+, Entity Framework / EF Core, JavaScript, jQuery, and REST APIs
- Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
- Skilled in unit testing with XUnit and MSTest
- Strong in software design patterns, system architecture, and scalable solution design
- Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
- Strong problem-solving and debugging capabilities
- Ability to write reusable, testable, and efficient code
- Experience developing and maintaining frameworks and shared libraries to support large-scale applications
- Excellent technical documentation, communication, and leadership skills
- Microservices and Service-Oriented Architecture (SOA)
- Experience in API integrations
- 2+ years of hands-on experience with Azure Cloud Services, including Azure Functions, Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, Application Insights, and Azure Monitoring

Secondary Skills:
- Familiarity with AngularJS, ReactJS, and other front-end frameworks
- Experience with Azure API Management (APIM)
- Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
- Experience with Azure Data Factory (ADF) and Logic Apps
- Exposure to application support and operational monitoring
- Azure DevOps CI/CD pipelines (Classic/YAML)

Posted 23 hours ago

Apply

8.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site


Introduction

Candidates should have 8+ years of experience in the IT industry, with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. The position serves a US client, and the resource should be hands-on in coding and Azure Cloud. Working hours are 8 hours, with a mandatory 4-hour overlap with the EST time zone (12 PM - 9 PM), as meetings happen during this window.

Responsibilities include:
- Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery
- Integrate and support third-party APIs and external services
- Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack
- Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC)
- Participate in Agile/Scrum ceremonies and manage tasks using Jira
- Understand technical priorities, architectural dependencies, risks, and implementation challenges
- Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability

Certifications:
- Microsoft Certified: Azure Fundamentals
- Microsoft Certified: Azure Developer Associate
- Other relevant certifications in Azure, .NET, or Cloud technologies

Primary Skills:
- 8+ years of hands-on development experience with C#, .NET Core 6/8+, Entity Framework / EF Core, JavaScript, jQuery, and REST APIs
- Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
- Skilled in unit testing with XUnit and MSTest
- Strong in software design patterns, system architecture, and scalable solution design
- Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
- Strong problem-solving and debugging capabilities
- Ability to write reusable, testable, and efficient code
- Experience developing and maintaining frameworks and shared libraries to support large-scale applications
- Excellent technical documentation, communication, and leadership skills
- Microservices and Service-Oriented Architecture (SOA)
- Experience in API integrations
- 2+ years of hands-on experience with Azure Cloud Services, including Azure Functions, Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, Application Insights, and Azure Monitoring

Secondary Skills:
- Familiarity with AngularJS, ReactJS, and other front-end frameworks
- Experience with Azure API Management (APIM)
- Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
- Experience with Azure Data Factory (ADF) and Logic Apps
- Exposure to application support and operational monitoring
- Azure DevOps CI/CD pipelines (Classic/YAML)

Posted 23 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Eviden, part of the Atos Group, with an annual revenue of circa €5 billion, is a global leader in data-driven, trusted and sustainable digital transformation. As a next-generation digital business with worldwide leading positions in digital, cloud, data, advanced computing and security, it brings deep expertise for all industries in more than 47 countries. By uniting unique high-end technologies across the full digital continuum with 47,000 world-class talents, Eviden expands the possibilities of data and technology, now and for generations to come.

Roles & Responsibilities:
- Design end-to-end data code development using PySpark, Python, SQL and Kafka, leveraging Microsoft Fabric's capabilities (a hedged streaming sketch follows this listing).

Requirements:
- Hands-on experience with Microsoft Fabric, including Lakehouse, Data Factory, and Synapse.
- Strong expertise in PySpark and Python for large-scale data processing and transformation.
- Deep knowledge of Azure data services (ADLS Gen2, Azure Databricks, Synapse, ADF, Azure SQL, etc.).
- Experience in designing, implementing, and optimizing end-to-end data pipelines on Azure.
- Understanding of Azure infrastructure setup (networking, security, and access management) is good to have.
- Healthcare domain knowledge is a plus but not mandatory.

Our Offering:
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
- Wellbeing programs and work-life balance: integration and passion-sharing events.
- Attractive salary and company initiative benefits.
- Courses and conferences.
- Hybrid work culture.

Let’s grow together.
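For the PySpark-plus-Kafka stack mentioned above, a minimal hedged Structured Streaming sketch follows; the broker address, topic, schema, and Delta paths are placeholders, and the spark-sql-kafka connector is assumed to be available on the cluster.

```python
# Hedged sketch: PySpark Structured Streaming read from Kafka into Delta.
# Broker, topic, schema, and paths are placeholders; needs spark-sql-kafka.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_time", TimestampType()),
])

events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
          .option("subscribe", "events")                     # placeholder topic
          .load()
          # Kafka delivers bytes; parse the JSON value into typed columns.
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

query = (events.writeStream.format("delta")
         .option("checkpointLocation", "/lake/_chk/events")  # exactly-once bookkeeping
         .outputMode("append")
         .start("/lake/bronze/events"))
```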

Posted 1 day ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Senior Associate

Job Description & Summary

A career in our Microsoft Dynamics team will provide the opportunity to help our clients transform their technology landscape across Front, Back and Mid-Office functions leveraging Microsoft Dynamics. We focus on contributing to PwC’s value proposition of “strategy led and technology enabled”, by aligning our Consulting Solutions’ industry focus with Microsoft technologies such as Dynamics 365, Azure, Power Platform and Power BI.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
· Experience as a Data Analyst with high proficiency;
· Expertise in writing and optimizing SQL queries in SQL Server and/or Oracle;
· Experience in Data Extraction, Transformation and Loading (ETL) using SSIS and ADF;
· Experience in Power BI and/or Tableau for visualizing and analyzing data;
· Knowledge of database normalization for optimum performance;
· Excellence in MS Excel, with proficiency in VLOOKUPs, pivot tables and VBA macros;
· Knowledge of data warehousing concepts;
· Performance optimization and troubleshooting capabilities;
· Good project management skills: client meetings, stakeholder engagement;
· Familiarity with Agile methodology;
· Strong knowledge of Azure DevOps Boards, Sprints, Queries, Pipelines (CI/CD), etc.

Mandatory skill sets: ADF, Power BI
Preferred skill sets: DevOps/CI/CD
Years of experience required: 3-7 years
Education qualification: B.Tech/B.E

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Power BI
Optional Skills: Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Analytical Thinking, Android, API Management, Appian (Platform), Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C#.NET, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration/Continuous Delivery (CI/CD), Creativity {+ 46 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date

Posted 1 day ago

Apply

6.0 years

0 Lacs

Goregaon, Maharashtra, India

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities:
Designs, implements and maintains reliable and scalable data infrastructure
Writes, deploys and maintains software to build, integrate, manage, maintain, and quality-assure data
Develops and delivers large-scale data ingestion, data processing, and data transformation projects on the Azure cloud
Mentors and shares knowledge with the team to provide design reviews, discussions and prototypes
Works with customers to deploy, manage, and audit standard processes for cloud products
Adheres to and advocates for software and data engineering standard processes (e.g. data engineering pipelines, unit testing, monitoring, alerting, source control, code review and documentation)
Deploys secure and well-tested software that meets privacy and compliance requirements; develops, maintains and improves CI/CD pipelines
Follows site-reliability engineering standard processes for service reliability: joins on-call rotations for the services they maintain and is responsible for defining and maintaining SLAs
Designs, builds, deploys and maintains infrastructure as code; containerizes server deployments
Works as part of a cross-disciplinary team, collaborating closely with other data engineers, architects, software engineers, data scientists, data managers and business partners in a Scrum/Agile setup

Mandatory skill sets (‘must have’ knowledge, skills and experience): Synapse, ADF, Spark, SQL, PySpark, Spark SQL
Preferred skill sets (‘good to have’ knowledge, skills and experience): Cosmos DB, data modeling, Databricks, Power BI; experience building an analytics solution with SAP as the data source for ingestion pipelines
Depth: The candidate should have in-depth, hands-on experience with end-to-end solution design in Azure Data Lake, ADF pipeline development and debugging, various file formats, Synapse and Databricks, with excellent coding skills in PySpark and SQL and strong logic-building capabilities. He/she should have sound knowledge of workload optimization.
Years of experience required: 6 to 9 years of relevant experience
Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% or above)
Expected joining: 3 weeks

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering, Bachelor of Technology, Master of Business Administration
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Structured Query Language (SQL)
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
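As a further illustration of the ingestion and transformation work described above, here is a minimal PySpark sketch of one ADF-orchestrated pipeline step: read raw files from a landing zone, apply light cleansing, and write a partitioned curated output. The storage paths, column names, and schema are hypothetical placeholders, not a definitive implementation.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

# Read raw CSV drops from a (hypothetical) lake landing zone.
raw = (spark.read
       .option("header", True)
       .csv("abfss://landing@account.dfs.core.windows.net/orders/"))

# Light cleansing: type casts, a derived partition column,
# de-duplication so re-runs stay idempotent, and a validity filter.
cleaned = (raw
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .withColumn("order_date", F.to_date("order_ts"))
           .dropDuplicates(["order_id"])
           .filter(F.col("amount") > 0))

# Write a partitioned, query-friendly curated layer for Synapse or
# Databricks to consume.
(cleaned.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("abfss://curated@account.dfs.core.windows.net/orders/"))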

Posted 1 day ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Job Title: Senior .NET Developer – Azure Cloud (Client-Facing Role)
Location: Trivandrum / Kochi
Experience: 8+ years
Notice Period: Immediate to 15 days
Working Hours: 12 PM – 9 PM IST (a 4-hour overlap with the EST time zone is mandatory)

About the Role:
We are looking for a highly skilled, hands-on Senior .NET Developer with strong expertise in Azure Cloud Services to join our growing team. This is a client-facing role with a U.S.-based client, requiring excellent communication skills and a proactive approach. The ideal candidate should have deep technical knowledge and the ability to lead by example in coding and architectural decisions.

Key Responsibilities:
Work directly with U.S. clients in a client-facing capacity
Design and develop scalable .NET Core applications with clean architecture
Build and maintain Azure Cloud-based solutions
Participate in system design, code reviews, and technical mentoring
Ensure high code quality through test-driven development and CI/CD practices

Primary Skills:
8+ years of hands-on development experience with:
C#, .NET Core 6/8+, Entity Framework / EF Core
JavaScript, jQuery, and REST APIs
MS SQL Server: complex queries, stored procedures, views, functions
Unit testing using XUnit / MSTest
Strong understanding of software design patterns and system architecture

Azure Cloud Experience (2+ years):
Azure Functions & Durable Functions
Azure Service Bus, Event Grid, Storage Queues
Blob Storage, Azure Key Vault, SQL Azure
Application Insights, Azure Monitoring
Azure DevOps: CI/CD pipelines (Classic/YAML)

Nice to Have:
Familiarity with AngularJS / ReactJS
Experience with Azure API Management (APIM)
Knowledge of AKS / Kubernetes containerization
Exposure to Azure Data Factory (ADF) and Logic Apps
Understanding of application support and monitoring best practices

What We’re Looking For:
Excellent verbal and written communication skills
Ability to work independently and to act as a technical mentor
Strong problem-solving and decision-making skills
Willingness to align with client time zones

Interview Process:
2 Technical Rounds
1 HR Discussion

If you’re passionate about building enterprise-grade applications using .NET and Azure, and are ready to work with a dynamic global team, apply now or DM us to learn more!
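This role is C#/.NET-centric, but the queue-triggered pattern behind the Azure Functions and Service Bus items above is language-agnostic. The following minimal sketch uses the Azure Functions Python v2 programming model to show the shape of such a handler; the queue name and connection setting are hypothetical.

import logging
import azure.functions as func

app = func.FunctionApp()

# Fires once per message on the (hypothetical) "orders" queue.
# "ServiceBusConnection" names an app setting holding the Service Bus
# namespace connection string.
@app.service_bus_queue_trigger(arg_name="msg",
                               queue_name="orders",
                               connection="ServiceBusConnection")
def process_order(msg: func.ServiceBusMessage) -> None:
    body = msg.get_body().decode("utf-8")
    logging.info("Processing order message: %s", body)
    # Deserialize, validate, and hand off to domain logic here.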

Posted 1 day ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies