Jobs
Interviews

1647 ADF Jobs - Page 14

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 years

20 Lacs

Hyderābād

On-site

Job Description: We are seeking a skilled and dynamic Azure Data Engineer to join our growing data engineering team. The ideal candidate will have a strong background in building and maintaining data pipelines and working with large datasets on the Azure cloud platform. The Azure Data Engineer will be responsible for developing and implementing efficient ETL processes, working with data warehouses, and leveraging cloud technologies such as Azure Data Factory (ADF), Azure Databricks, PySpark, and SQL to process and transform data for analytical purposes.

Key Responsibilities:
- Data Pipeline Development: Design, develop, and implement scalable, reliable, and high-performance data pipelines using Azure Data Factory (ADF), Azure Databricks, and PySpark.
- Data Processing: Develop complex data transformations, aggregations, and cleansing processes using PySpark and Databricks for big data workloads.
- Data Integration: Integrate and process data from various sources such as databases, APIs, cloud storage (e.g., Blob Storage, Data Lake), and third-party services into Azure Data Services.
- Optimization: Optimize data workflows and ETL processes for efficient data loading, transformation, and retrieval while ensuring data integrity and high performance.
- SQL Development: Write complex SQL queries for data extraction, aggregation, and transformation. Maintain and optimize relational databases and data warehouses.
- Collaboration: Work closely with data scientists, analysts, and other engineering teams to understand data requirements and design solutions that meet business and analytical needs.
- Automation & Monitoring: Implement automation for data pipeline deployment and ensure monitoring, logging, and alerting mechanisms are in place for pipeline health.
- Cloud Infrastructure Management: Work with cloud technologies (e.g., Azure Data Lake, Blob Storage) to store, manage, and process large datasets.
- Documentation & Best Practices: Maintain thorough documentation of data pipelines, workflows, and best practices for data engineering solutions.

Job Type: Full-time
Pay: Up to ₹2,000,000.00 per year
Experience: Azure: 4 years (Required); Python: 4 years (Required); SQL: 4 years (Required)
Work Location: In person
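The extract-cleanse-aggregate flow described in the responsibilities above can be sketched in plain Python. The record fields and the `clean_record` helper are hypothetical illustrations; in an actual ADF/Databricks pipeline these steps would typically run as PySpark transformations over Spark DataFrames, not Python lists.

```python
# Minimal ETL sketch: extract -> cleanse -> aggregate -> load.
from collections import defaultdict

def clean_record(rec):
    """Cleansing step: drop rows with missing keys, normalize casing and types."""
    if not rec.get("city") or rec.get("amount") is None:
        return None
    return {"city": rec["city"].strip().title(), "amount": float(rec["amount"])}

def aggregate_by_city(records):
    """Aggregation step, mirroring a groupBy().sum() in PySpark."""
    totals = defaultdict(float)
    for rec in records:
        cleaned = clean_record(rec)
        if cleaned:
            totals[cleaned["city"]] += cleaned["amount"]
    return dict(totals)

raw = [
    {"city": " hyderabad ", "amount": "100.5"},
    {"city": "Hyderabad", "amount": "49.5"},
    {"city": None, "amount": "10"},  # dropped by the cleansing step
]
print(aggregate_by_city(raw))  # {'Hyderabad': 150.0}
```

The same shape (validate, transform, aggregate) carries over to Spark; only the execution engine and data structures change.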

Posted 2 weeks ago

Apply

3.0 years

3 - 6 Lacs

Gurgaon

On-site

#freepost
Designation: Middleware Administrator
Experience: 3+ Years
Qualification: B.E. / B.Tech / BCA
Location: Gurugram, Haryana

Roles and Responsibilities:
· Monitor application response times from the end-user perspective in real time and alert teams when performance is unacceptable. By alerting the user to problems and intelligently segmenting response times, the tooling should quickly expose problem sources and minimize the time needed for resolution.
· Allow specific application transactions to be captured and monitored separately, so administrators can select the most important operations within business-critical applications to be measured and tracked individually.
· Use baseline-oriented thresholds to raise alerts when application response times deviate from acceptable levels, so IT administrators can respond quickly to problems and minimize the impact on service delivery.
· Shutdown and start-up of applications, generation of MIS reports, monitoring of application load, user account management, script execution, analysing system events, monitoring of error logs, etc.
· Monitoring of applications, including Oracle Forms 10g, Oracle SSO 10g, OID 10g, Oracle Portal 10g, Oracle Reports 10g, Internet Application Server (OAS) 10.1.2.2.0, Oracle Web Server (OWS) 10.1.2.2.0, Oracle WebCenter Portal 12.2.1.3, Oracle Access Manager 12.2.1.3, Oracle Internet Directory 12.2.1.3, Oracle WebLogic Server 12.2.1.3, Oracle HTTP Server 12.2.1.3, Oracle ADF 12.2.1.3 (Fusion Middleware), Oracle Forms 12.2.1.3, Oracle Reports 12.2.1.3, mobile apps, Windows IIS, portal, web cache, BizTalk applications, DNS applications, Tomcat, etc.

Job Type: Full-time
Benefits: Health insurance, Provident Fund
Work Location: In person
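The baseline-oriented thresholding described above can be illustrated with a minimal sketch. The window size, deviation factor, and sample latencies below are made-up values for illustration, not parameters of any specific monitoring product:

```python
# Sketch of baseline-oriented alerting: flag response times that deviate
# from a rolling baseline by more than a tolerance factor.
from statistics import mean

def alerts(samples, window=5, factor=2.0):
    """Return indices of samples exceeding `factor` times the rolling baseline mean."""
    flagged = []
    for i in range(window, len(samples)):
        baseline = mean(samples[i - window:i])  # baseline from the previous `window` samples
        if samples[i] > factor * baseline:
            flagged.append(i)
    return flagged

response_ms = [110, 95, 120, 100, 105, 115, 480, 98]
print(alerts(response_ms))  # [6]  -> the 480 ms spike is flagged
```

Real tools refine this idea with percentile baselines, seasonality, and per-transaction segmentation, but the deviation-from-baseline principle is the same.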

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

As a Data Engineer, you are required to:
Design, build, and maintain data pipelines that efficiently process and transport data from various sources to storage systems or processing environments, while ensuring data integrity, consistency, and accuracy across the entire pipeline.
Integrate data from different systems, often involving data cleaning, transformation (ETL), and validation.
Design the structure of databases and data storage systems, including schemas, tables, and relationships between datasets, to enable efficient querying.
Work closely with data scientists, analysts, and other stakeholders to understand their data needs and ensure that the data is structured in a way that makes it accessible and usable.
Stay up to date with the latest trends and technologies in the data engineering space, such as new data storage solutions, processing frameworks, and cloud technologies. Evaluate and implement new tools to improve data engineering processes.

Qualification: Bachelor's or Master's in Computer Science & Engineering, or equivalent. A professional degree in Data Science or Engineering is desirable.

Experience level: At least 3-5 years of hands-on experience in Data Engineering.

Desired Knowledge & Experience:
Spark: Spark 3.x, RDD/DataFrames/SQL, Batch/Structured Streaming; knowledge of Spark internals (Catalyst/Tungsten/Photon)
Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity Catalog, Autoloader
IDE: IntelliJ/PyCharm, Git, Azure DevOps, GitHub Copilot
Testing: pytest, Great Expectations
CI/CD: YAML Azure Pipelines, Continuous Delivery, Acceptance Testing
Big Data Design: Lakehouse/Medallion Architecture, Parquet/Delta, Partitioning, Distribution, Data Skew, Compaction
Languages: Python / Functional Programming (FP)
SQL: T-SQL / Spark SQL / HiveQL
Storage: Data Lake and Big Data storage design

Additionally, it is helpful to know the basics of:
Data Pipelines: ADF / Synapse Pipelines / Oozie / Airflow
Languages: Scala, Java
NoSQL: Cosmos, Mongo, Cassandra
Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model
SQL Server: T-SQL, Stored Procedures
Hadoop: HDInsight / MapReduce / HDFS / YARN / Oozie / Hive / HBase / Ambari / Ranger / Atlas / Kafka
Data Catalog: Azure Purview, Apache Atlas, Informatica

Required Soft Skills & Other Capabilities:
Great attention to detail and good analytical abilities.
Good planning and organizational skills.
Collaborative approach to sharing ideas and finding solutions.
Ability to work independently as well as in a global team environment.
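As a rough illustration of the Lakehouse/Medallion layering listed above, the sketch below moves hypothetical rows from a bronze (raw) layer through silver (validated) to gold (aggregated). The field names and quality rules are illustrative assumptions; in practice these layers would be Delta tables processed with Spark, not Python lists.

```python
# Hypothetical medallion-architecture sketch: bronze -> silver -> gold.
bronze = [
    {"order_id": 1, "qty": "2", "price": "10.0"},
    {"order_id": 2, "qty": "-1", "price": "5.0"},   # fails validation
    {"order_id": 3, "qty": "4", "price": "2.5"},
]

def to_silver(rows):
    """Silver layer: keep only rows passing basic quality rules; cast types."""
    silver = []
    for r in rows:
        qty, price = int(r["qty"]), float(r["price"])
        if qty > 0 and price >= 0:
            silver.append({"order_id": r["order_id"], "qty": qty, "price": price})
    return silver

def to_gold(silver):
    """Gold layer: business-level aggregate over validated rows."""
    return {"orders": len(silver),
            "revenue": sum(r["qty"] * r["price"] for r in silver)}

print(to_gold(to_silver(bronze)))  # {'orders': 2, 'revenue': 30.0}
```

Frameworks like Great Expectations or DLT expectations formalize the silver-layer rules; the layering itself is the architectural idea.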

Posted 2 weeks ago

Apply

3.0 - 6.0 years

0 Lacs

Andhra Pradesh

On-site

We are seeking a Data Engineer with strong expertise in SQL and ETL processes to support banking data pipelines, regulatory reporting, and data quality initiatives. The role involves building and optimizing data structures, implementing validation rules, and collaborating with governance and compliance teams. Experience in the banking domain and with tools like Informatica and Azure Data Factory is essential.

Required skills:
Strong proficiency in SQL for writing complex queries, joins, data transformations, and aggregations
Proven experience in building tables, views, and data structures within enterprise Data Warehouses and Data Lakes
Strong understanding of data warehousing concepts such as Slowly Changing Dimensions (SCDs), data normalization, and star/snowflake schemas
Practical experience with Azure Data Factory (ADF) for orchestrating data pipelines and managing ingestion workflows
Exposure to data cataloging, metadata management, and lineage tracking using Informatica EDC or Axon
Experience implementing Data Quality rules for banking use cases such as completeness, consistency, uniqueness, and validity
Familiarity with banking systems and data domains such as Flexcube, HRMS, CRM, Risk, Compliance, and IBG reporting
Understanding of regulatory and audit readiness needs for the Central Bank and internal governance forums

Responsibilities:
Write optimized SQL scripts to extract, transform, and load (ETL) data from multiple banking source systems
Design and implement staging and reporting layer structures aligned to business requirements and regulatory frameworks
Apply data validation logic based on predefined business rules and data governance requirements
Collaborate with Data Governance, Risk, and Compliance teams to embed lineage, ownership, and metadata into datasets
Monitor scheduled jobs and resolve ETL failures to ensure SLA adherence for reporting and operational dashboards
Support production deployment, UAT sign-off, and issue resolution for data products across business units

Experience and qualifications:
3 to 6 years in banking-focused data engineering roles with hands-on SQL, ETL, and DQ rule implementation
Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or related fields
Banking domain experience is mandatory, especially in areas related to regulatory reporting, compliance, and enterprise data governance

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status, or any other basis covered by applicable law. All employment is decided on the basis of qualifications, merit, and business need.
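The data-quality rule types this posting names (completeness, uniqueness, validity) can be sketched as simple dataset-level metrics. The sample account records and the IBAN-style pattern below are illustrative assumptions, not actual banking data or a real validation standard:

```python
# Sketch of three DQ rule types as 0..1 scores over a dataset.
import re

accounts = [
    {"account_id": "AC001", "iban": "AE070331234567890123456"},
    {"account_id": "AC002", "iban": ""},                          # completeness failure
    {"account_id": "AC001", "iban": "AE070331234567890123457"},   # uniqueness failure
]

def completeness(rows, field):
    """Fraction of rows where `field` is present and non-empty."""
    return sum(1 for r in rows if r.get(field)) / len(rows)

def uniqueness(rows, field):
    """Fraction of distinct values among all values of `field`."""
    values = [r[field] for r in rows]
    return len(set(values)) / len(values)

def validity(rows, field, pattern):
    """Fraction of rows where `field` matches the given regex pattern."""
    return sum(1 for r in rows if re.fullmatch(pattern, r.get(field, "") or "")) / len(rows)

print(round(completeness(accounts, "iban"), 2))             # 0.67
print(round(uniqueness(accounts, "account_id"), 2))         # 0.67
print(round(validity(accounts, "iban", r"AE\d{21}"), 2))    # 0.67
```

Tools like Informatica DQ express the same checks declaratively and attach them to lineage and ownership metadata; the scoring logic is conceptually this simple.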

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

Remote

Job Title: Senior .NET Developer
Experience: 8+ Years
Job Type: Contract
Contract Duration: 6 months
Location: Remote
Budget: 1L per month
Working Hours: 12:00 PM to 09:00 PM

Job Description:
Candidates should have 8+ years of experience in the IT industry with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud. Working hours are 8 hours, with a 4-hour overlap with the EST time zone (12 PM - 9 PM). This overlap is mandatory, as meetings happen during these hours.

Responsibilities:
• Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery.
• Integrate and support third-party APIs and external services.
• Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack.
• Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC).
• Participate in Agile/Scrum ceremonies and manage tasks using Jira.
• Understand technical priorities, architectural dependencies, risks, and implementation challenges.
• Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability.

Primary Skills:
8+ years of hands-on development experience with:
☑ C#, .NET Core 6/8+, Entity Framework / EF Core
☑ JavaScript, jQuery, REST APIs
☑ MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
☑ Unit testing with xUnit, MSTest
☑ Software design patterns, system architecture, and scalable solution design
☑ Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
☑ Strong problem-solving and debugging capabilities
☑ Ability to write reusable, testable, and efficient code
☑ Developing and maintaining frameworks and shared libraries to support large-scale applications
☑ Technical documentation, communication, and leadership
☑ Microservices and Service-Oriented Architecture (SOA)
☑ API integrations

2+ years of hands-on experience with Azure Cloud Services, including:
☑ Azure Functions
☑ Azure Durable Functions
☑ Azure Service Bus, Event Grid, Storage Queues
☑ Blob Storage, Azure Key Vault, SQL Azure
☑ Application Insights, Azure Monitoring

Secondary Skills (good to have):
☑ Familiarity with AngularJS, ReactJS, and other front-end frameworks
☑ Experience with Azure API Management (APIM)
☑ Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
☑ Experience with Azure Data Factory (ADF) and Logic Apps
☑ Exposure to application support and operational monitoring
☑ Azure DevOps - CI/CD pipelines (Classic / YAML)

Certifications Required (if any):
☑ Microsoft Certified: Azure Fundamentals
☑ Microsoft Certified: Azure Developer Associate
☑ Other relevant certifications in Azure, .NET, or Cloud technologies

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

Job Role: Senior .NET Developer
Experience: 8+ years
Max Notice Period: Immediate
Location: Trivandrum / Kochi

Introduction:
Candidates should have 8+ years of experience in the IT industry with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud. Working hours are 8 hours, with a 4-hour overlap with the EST time zone (12 PM - 9 PM). This overlap is mandatory, as meetings happen during these hours.

Responsibilities include:
• Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery
• Integrate and support third-party APIs and external services
• Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack
• Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC)
• Participate in Agile/Scrum ceremonies and manage tasks using Jira
• Understand technical priorities, architectural dependencies, risks, and implementation challenges
• Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability

Certifications:
• Microsoft Certified: Azure Fundamentals
• Microsoft Certified: Azure Developer Associate
• Other relevant certifications in Azure, .NET, or Cloud technologies

Primary Skills:
8+ years of hands-on development experience with:
• C#, .NET Core 6/8+, Entity Framework / EF Core
• JavaScript, jQuery, REST APIs
• MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
• Unit testing with xUnit, MSTest
• Software design patterns, system architecture, and scalable solution design
• Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
• Strong problem-solving and debugging capabilities
• Ability to write reusable, testable, and efficient code
• Developing and maintaining frameworks and shared libraries to support large-scale applications
• Technical documentation, communication, and leadership
• Microservices and Service-Oriented Architecture (SOA)
• API integrations

2+ years of hands-on experience with Azure Cloud Services, including:
• Azure Functions
• Azure Durable Functions
• Azure Service Bus, Event Grid, Storage Queues
• Blob Storage, Azure Key Vault, SQL Azure
• Application Insights, Azure Monitoring

Secondary Skills:
• Familiarity with AngularJS, ReactJS, and other front-end frameworks
• Experience with Azure API Management (APIM)
• Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
• Experience with Azure Data Factory (ADF) and Logic Apps
• Exposure to application support and operational monitoring
• Azure DevOps - CI/CD pipelines (Classic / YAML)

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

Remote

Ready to embark on a journey where your growth is intertwined with our commitment to making a positive impact? Join the Delphi family, where Growth Meets Values.

At Delphi Consulting Pvt. Ltd., we foster a thriving environment with a hybrid work model that lets you prioritize what matters most. Interviews and onboarding are conducted virtually, reflecting our digital-first mindset. We specialize in Data, Advanced Analytics, AI, Infrastructure, Cloud Security, and Application Modernization, delivering impactful solutions that drive smarter, more efficient futures for our clients.

About the Role:
We are looking for a Lead Consultant – Data Functional (Healthcare) to join our growing Data & AI team. The ideal candidate will bring deep functional expertise in Cerner and Dynamics 365, with a proven track record in healthcare data management, integration, analytics, and stakeholder engagement. This is a client-facing role where you will lead the design and implementation of data solutions, ensure compliance with healthcare regulations, and translate complex business needs into scalable, efficient data architectures. You will work closely with both internal teams and external healthcare clients across the data lifecycle, from requirement gathering and ETL design to insights generation and reporting.

What you'll do:
Collaborate with stakeholders to gather business requirements and convert them into functional data design documents.
Lead data integration efforts across Cerner, Dynamics 365, and other healthcare platforms like Epic.
Perform and lead functional data activities, including source-to-target mapping, master data management, data validation, and documentation.
Design and validate scalable data architectures in collaboration with engineering teams using Azure Data Factory, Databricks, Synapse Analytics, and SQL Server.
Conduct data analysis and generate insights for healthcare-specific use cases such as clinical operations, patient engagement, and safety.
Develop functional assets such as data dictionaries, mapping sheets, validation checklists, and test cases.
Ensure adherence to healthcare data compliance standards like HIPAA.
Build dashboards and reports using Cerner, Dynamics 365, and other sources; exposure to Power BI or Microsoft Fabric is an added advantage.
Work closely with clients to identify gaps, propose solutions, and ensure successful implementation of data initiatives.
Create artifacts and collateral for healthcare proposals, and play a key role in establishing a Healthcare Center of Excellence (CoE) from the ground up.
Contribute to pre-sales activities, including RFIs/RFPs, solution design, and client presentations.

What you'll bring:
• Domain expertise across Clinical Operations, Patient Safety, and Patient Experience.
• Extensive functional knowledge and hands-on experience with Cerner and Dynamics 365, and exposure to Epic.
• Proven experience in ETL design, data mapping, validation, and master data management.
• Proficiency with Azure Data Services: ADF, Synapse, Databricks, SQL Server.
• Familiarity with healthcare standards, workflows, and regulatory compliance.
• Demonstrated experience in building functional documentation and reusable artifacts.
• Strong client engagement skills; ability to translate requirements into data design and ensure end-to-end delivery.
• Experience contributing to business proposals and setting up data/healthcare-focused CoEs.
• Good to have: working knowledge of Power BI, Microsoft Fabric, or other Microsoft data tools.
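Source-to-target mapping, one of the functional data activities this role covers, can be sketched as a simple field-rename step. The source field names below (e.g. `pat_name`) are hypothetical placeholders for illustration, not actual Cerner or Dynamics 365 schema:

```python
# Illustrative source-to-target mapping for a healthcare ingestion step.
# Keys are hypothetical source fields; values are target warehouse columns.
MAPPING = {
    "pat_name": "patient_full_name",
    "dob": "date_of_birth",
    "enc_id": "encounter_id",
}

def map_record(source_row):
    """Rename mapped source fields to target fields; drop unmapped ones."""
    return {target: source_row[src]
            for src, target in MAPPING.items() if src in source_row}

row = {"pat_name": "A. Kumar", "dob": "1990-01-01", "enc_id": "E42", "junk": "x"}
print(map_record(row))
```

In practice the mapping sheet lives in a functional design document and drives ADF/Databricks transformations; keeping it as data (rather than hard-coded renames) is what makes it reviewable by governance teams.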
What We Offer:
At Delphi, we are dedicated to creating an environment where you can thrive, both professionally and personally. Our competitive compensation package, performance-based incentives, and health benefits are designed to ensure you're well supported. We believe in your continuous growth and offer company-sponsored certifications, training programs, and skill-building opportunities to help you succeed. We foster a culture of inclusivity and support, with remote work options and a fully supported work-from-home setup to ensure your comfort and productivity. Our positive and inclusive culture includes team activities, as well as wellness and mental health programs, to ensure you feel supported.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Thiruvananthapuram Taluk, India

On-site

Job Description:
Candidates should have 8+ years of experience in the IT industry with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud. Working hours are 8 hours, with a 4-hour overlap with the EST time zone (12 PM - 9 PM). This overlap is mandatory, as meetings happen during these hours.

Responsibilities:
• Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery.
• Integrate and support third-party APIs and external services.
• Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack.
• Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC).
• Participate in Agile/Scrum ceremonies and manage tasks using Jira.
• Understand technical priorities, architectural dependencies, risks, and implementation challenges.
• Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability.

Primary Skills:
8+ years of hands-on development experience with:
☑ C#, .NET Core 6/8+, Entity Framework / EF Core
☑ JavaScript, jQuery, REST APIs
☑ MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
☑ Unit testing with xUnit, MSTest
☑ Software design patterns, system architecture, and scalable solution design
☑ Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
☑ Strong problem-solving and debugging capabilities
☑ Ability to write reusable, testable, and efficient code
☑ Developing and maintaining frameworks and shared libraries to support large-scale applications
☑ Technical documentation, communication, and leadership
☑ Microservices and Service-Oriented Architecture (SOA)
☑ API integrations

2+ years of hands-on experience with Azure Cloud Services, including:
☑ Azure Functions
☑ Azure Durable Functions
☑ Azure Service Bus, Event Grid, Storage Queues
☑ Blob Storage, Azure Key Vault, SQL Azure
☑ Application Insights, Azure Monitoring

Secondary Skills (good to have):
☑ Familiarity with AngularJS, ReactJS, and other front-end frameworks
☑ Experience with Azure API Management (APIM)
☑ Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
☑ Experience with Azure Data Factory (ADF) and Logic Apps
☑ Exposure to application support and operational monitoring
☑ Azure DevOps - CI/CD pipelines (Classic / YAML)

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

**** Read the JD carefully, fill in the form below, and send your resume to the following mail ID ****

Role: Azure DevOps Engineer with Databricks
Primary Skills: Azure DevOps, CI/CD, Databricks
Note: Must have integrated applications through CI/CD using data-related tooling.
Experience: 5-10 years
Location: Hyderabad, Pune, Bangalore
Mode of Hire: Contractor / Permanent (Full Time)

JD:
5+ years in DevOps with strong data pipeline experience.
The requirement is for Azure DevOps with Databricks, specifically using CI/CD pipelines for data implementation into data tools.
Build and maintain CI/CD pipelines for Azure Data Factory and Databricks notebooks.
The role demands deep expertise in Databricks, including the automation of unit, integration, and QA testing workflows.
Strong data architecture skills are essential, as the position involves implementing CI/CD pipelines for schema updates.
Strong experience with Azure DevOps Pipelines, YAML builds, and release workflows.
Proficiency in scripting languages such as Python, PowerShell, and Terraform.
Working knowledge of Azure services: ADF, Databricks, DABs, ADLS Gen2, Key Vault, ADO.
Maintain infrastructure-as-code practices.
Collaborate with Data Engineers and Platform teams to maintain development, staging, and production environments.
Monitor and troubleshoot pipeline failures and deployment inconsistencies.

Fill in this form: https://docs.google.com/forms/d/e/1FAIpQLSfU0Jf2DmetEIfXLjrVm5pbZRtBNazIiSDJbXwB6BlN0uWHhw/viewform?usp=header
Share your resume at: aman.tyagi@firstwave-tech.com
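As a rough sketch of the YAML-based CI/CD flow this role describes, the fragment below runs tests and then deploys notebooks to a Databricks workspace. The stage names, paths, and the legacy `databricks-cli` step are illustrative assumptions, not a reference implementation:

```yaml
# Hypothetical azure-pipelines.yml sketch for a Databricks notebook deployment.
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

stages:
  - stage: Test
    jobs:
      - job: UnitTests
        steps:
          - script: pip install -r requirements.txt && pytest tests/
            displayName: Run unit tests
  - stage: Deploy
    dependsOn: Test          # deploy only when tests pass
    jobs:
      - job: DeployNotebooks
        steps:
          - script: |
              pip install databricks-cli
              databricks workspace import_dir notebooks /Shared/etl --overwrite
            displayName: Deploy notebooks to workspace
```

A production setup would typically add service-connection authentication, environment-specific variable groups, and approval gates between staging and production.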

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Company Description
UNIFY Dots is a global technology and software solutions company specializing in Microsoft Dynamics 365-based solutions. We are seeking a Business Intelligence Technical Consultant with experience designing and implementing end-to-end business intelligence solutions using Microsoft Power BI and Azure Synapse Analytics. This is a full-time, work-from-home position.

Job Description
Your duties include the following tasks and responsibilities:
Understanding and documenting reporting, business intelligence, and dashboard requirements.
Performing data mapping between source systems such as Microsoft Dynamics 365 Supply Chain and Finance ERP ("D365"), CRM, Dataverse, Fabric, Azure Synapse, ADLS, and Power BI.
Developing ADF/Synapse/Fabric pipelines for ETL/data warehousing.
Performing data ingestion.
Configuring and developing Power BI reports to generate standardized and on-demand executive dashboards, reports, metrics, and KPIs.
Enhancing and modifying existing Power BI reports, dashboards, and workspaces, including embedded Power BI reports in D365.
Analyzing data discrepancies and performing root cause analysis for reports showing different data than expected in the production environment.
Writing business intelligence and analytics technical blog articles for the Unify Dots blog.

Qualifications
Bachelor's degree
2-4 years of experience using Power BI and at least one year of experience with Azure Synapse/Fabric Analytics
Microsoft Certification: Power BI Data Analyst Associate (PL-300)
Conversational skills in English
Experience with Power BI Desktop, Power BI Report Builder, Power BI Service, DAX, SQL, ETL, and Azure DevOps
Exposure to Azure Data Factory

Additional Information
Benefits:
Market-competitive compensation.
Medical insurance for team member + spouse + children + parents.
Flexibility to work from home for the majority of the time.
Laptop for work from home while working at Unify Dots.
People-before-profit culture that values team members over financial numbers.

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Good day, We have immediate opportunity for Azure Data Engineer. Job Role: Azure Data Engineer Job Location: Kharadi , Pune Experience- 6 Years - 12 Years Notice Period: Immediate to 30 days. About Company: At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver industry-leading digital solutions. Synechron’s progressive technologies and optimization strategies span end-to-end Artificial Intelligence, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering, servicing an array of noteworthy financial services and technology firms. Through research and development initiatives in our FinLabs we develop solutions for modernization, from Artificial Intelligence and Blockchain to Data Science models, Digital Underwriting, mobile-first applications and more. Over the last 20+ years, our company has been honoured with multiple employer awards, recognizing our commitment to our talented teams. With top clients to boast about, Synechron has a global workforce of 14,700+ and has 55 offices in 20 countries within key global markets. For more information on the company, please visit our website or LinkedIn community. Diversity, Equity, and Inclusion: Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative-action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. 
We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Job Description: As an Azure Data Engineer, you will be responsible for designing, implementing, and maintaining data management systems on the Azure cloud platform. You will create and manage scalable data pipelines, assure data quality, and maximize data processing performance. You will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions. We are looking for candidates with 8+ years of overall experience and a minimum of 4 years’ experience in Azure.
Technical Skills (Must Have): Azure Databricks, Spark, ADF (Azure Data Factory)
Optional Skills (Good to Have): Spark Structured Streaming, SQL, GitLab
Responsibilities:
- Designing and implementing data storage solutions on Azure
- Building and maintaining data pipelines for data integration and processing
- Ensuring data quality and accuracy through testing and validation
- Developing and maintaining data models and schemas
- Collaborating with other teams to provide data for analytics and reporting
- Ensuring data security and privacy requirements are followed
Primary Skills: We are looking for a professional whose career reflects technical ability coupled with hands-on experience across a diverse range of software projects:
- Strong exposure to Databricks, Azure Data Factory, and ADLS
- Strong exposure to Spark and Structured Streaming
- Exposure to cloud integration and container services on Azure
- Oracle and MS SQL experience; Terraform will be an asset
- Expertise in managing repositories (GitLab)
- Clean coding and refactoring skills, and Test-Driven Development (TDD)
- Performance optimization and scalability
- Know-how of Agile development practices (Scrum, XP, Kanban, etc.)
- Adaptable, able to work across teams, functions, and applications
- Enthusiastic, self-motivated, and client-focused
- Prior financial/banking experience is desirable
Secondary Skills:
- Familiarity with data processing frameworks such as Apache Spark and Hadoop
- Understanding of data modeling and schema design principles
- Ability to work with large datasets and perform data analysis
- Strong problem-solving and troubleshooting skills
If you find this opportunity interesting, kindly share the following details (mandatory):
Total experience -
Experience in Azure -
Experience in Power BI -
Experience in Databricks -
Current CTC -
Expected CTC -
Notice period -
Current location -
Have you interviewed with Synechron before? If yes, when?
Regards, Recruitment Team, Pune
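Roles like this center on pipelines that validate and cleanse records before loading. As a minimal sketch of that kind of quality gate, written in plain Python for brevity rather than the PySpark/Databricks stack the posting names (the field names and rules are hypothetical, not Synechron's):

```python
from datetime import datetime

# Hypothetical quality rules for incoming transaction records:
# required fields must be present and event_date must be ISO-formatted.
REQUIRED_FIELDS = ("customer_id", "amount", "event_date")

def validate_record(record: dict) -> list[str]:
    """Return rule violations for one record; an empty list means clean."""
    errors = [f"missing {f}" for f in REQUIRED_FIELDS if record.get(f) in (None, "")]
    if record.get("event_date"):
        try:
            datetime.strptime(record["event_date"], "%Y-%m-%d")
        except ValueError:
            errors.append("bad event_date format")
    return errors

def split_clean_and_rejected(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Partition a batch into loadable rows and rejects for a quarantine table."""
    clean, rejected = [], []
    for r in records:
        (clean if not validate_record(r) else rejected).append(r)
    return clean, rejected

batch = [
    {"customer_id": "C1", "amount": 120.0, "event_date": "2024-05-01"},
    {"customer_id": "", "amount": 80.0, "event_date": "2024-05-02"},
    {"customer_id": "C3", "amount": 50.0, "event_date": "05/03/2024"},
]
clean, rejected = split_clean_and_rejected(batch)
print(len(clean), len(rejected))  # 1 clean record, 2 quarantined
```

On Databricks the same rules would typically be expressed as column expressions over a Spark DataFrame, but keeping the logic in small pure functions like this is what makes it unit-testable, as the posting's TDD requirement implies.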

Posted 2 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Telangana, India

On-site

About Chubb
Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance, and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength, and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com.
About Chubb India
At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb, where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2,500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.
Position Details
Role: MLOps Engineer
Experience: 5-10 years
Mandatory Skills: Python, MLOps, Docker and Kubernetes, FastAPI or Flask, CI/CD, Jenkins, Spark, SQL, RDBMS, Cosmos DB, Kafka, ADLS, APIs, Databricks
Other Skills: Azure, LLMOps, ADF, ETL
Location: Bangalore
Notice Period: Less than 60 days
Job Description: We are seeking a talented and passionate Machine Learning Engineer to join our team and play a pivotal role in developing and deploying cutting-edge machine learning solutions. You will work closely with other engineers and data scientists to bring machine learning models from proof of concept to production, ensuring they deliver real-world impact and solve critical business challenges.
Responsibilities:
- Collaborate with data scientists, model developers, software engineers, and other stakeholders to translate business needs into technical solutions.
- Deploy ML models to production (prior hands-on deployment experience required).
- Create high-performance real-time inferencing APIs and batch inferencing pipelines to serve ML models to stakeholders.
- Integrate machine learning models seamlessly into existing production systems.
- Continuously monitor and evaluate model performance, and retrain models automatically or periodically.
- Streamline existing ML pipelines to increase throughput.
- Proactively identify and address security vulnerabilities in existing applications.
- Design, develop, and implement machine learning models, preferably for insurance-related applications.
- Be well versed in the Azure ecosystem.
- Knowledge of NLP and generative AI techniques, and relevant experience, is a plus.
- Knowledge of machine learning algorithms and libraries (e.g., TensorFlow, PyTorch) is a plus.
- Stay up to date on the latest advancements in machine learning and contribute to ongoing innovation within the team.
Why Chubb? Join Chubb to be part of a leading global insurance company! Our constant focus on employee experience, along with a start-up-like culture, empowers you to achieve impactful results.
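The monitor-and-retrain responsibility above can be illustrated with a small sketch: track rolling live accuracy against a baseline and flag when retraining should be triggered. The class, thresholds, and window size here are hypothetical illustrations, not Chubb's actual tooling:

```python
from collections import deque

class ModelMonitor:
    """Track rolling prediction accuracy and flag when a model needs retraining."""

    def __init__(self, baseline_accuracy: float, window: int = 100, tolerance: float = 0.05):
        self.baseline = baseline_accuracy
        self.tolerance = tolerance
        self.outcomes = deque(maxlen=window)  # 1 = correct prediction, 0 = wrong

    def record(self, prediction, actual) -> None:
        self.outcomes.append(1 if prediction == actual else 0)

    @property
    def rolling_accuracy(self) -> float:
        return sum(self.outcomes) / len(self.outcomes) if self.outcomes else 1.0

    def needs_retraining(self) -> bool:
        # Retrain when live accuracy drifts below baseline minus tolerance.
        return self.rolling_accuracy < self.baseline - self.tolerance

monitor = ModelMonitor(baseline_accuracy=0.90, window=50)
for prediction, actual in [(1, 1)] * 40 + [(1, 0)] * 10:  # accuracy degrades to 0.80
    monitor.record(prediction, actual)
print(monitor.rolling_accuracy, monitor.needs_retraining())  # 0.8 True
```

In a production setup this check would typically run inside the serving path or a scheduled job, with the "needs retraining" signal kicking off a pipeline rather than being read from a print statement.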
- Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence.
- A Great Place to Work: Chubb India has been recognized as a Great Place to Work® for the years 2023-2024, 2024-2025 and 2025-2026.
- Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results.
- Start-up culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter.
- Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment.
Employee Benefits
Our company offers a comprehensive benefits package designed to support our employees’ health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include:
- Savings and investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), Retiral Benefits, and Car Lease that help employees optimally plan their finances.
- Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling, like Education Reimbursement Programs, Certification programs, and access to global learning programs.
- Health and welfare benefits: We care about our employees’ well-being in and out of work, with benefits like an Employee Assistance Program (EAP), yearly free health campaigns, and comprehensive insurance benefits.
Application Process
Our recruitment process is designed to be transparent and inclusive.
Step 1: Submit your application via the Chubb Careers Portal.
Step 2: Engage with our recruitment team for an initial discussion.
Step 3: Participate in HackerRank assessments and technical/functional interviews (if applicable).
Step 4: Final interaction with Chubb leadership.
Join Us
With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India’s journey.
Apply Now: Chubb External Careers

Posted 2 weeks ago

Apply


4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Senior Associate
Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth.
To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities:
- Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer
- Completed at least 2 full Oracle Cloud (Fusion) implementations
- Extensive knowledge of database structure for ERP/Oracle Cloud (Fusion)
- Extensive work on BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC)
Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC)
Preferred skill sets: Database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: Minimum 4 years of Oracle Fusion experience
Education Qualification: Any Graduate or Post Graduate
Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Technology, Master Degree, Bachelor Degree Degrees/Field of Study preferred: Certifications (if blank, certifications not specified)
Required Skills: Oracle Integration Cloud (OIC)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (if blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Associate
Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth.
To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However, ensuring a streamlined end-to-end Oracle Fusion technical landscape that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.
Responsibilities:
- Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer
- Completed at least 2 full Oracle Cloud (Fusion) implementations
- Extensive knowledge of database structure for ERP/Oracle Cloud (Fusion)
- Extensive work on Oracle Integration Cloud (OIC) and PL/SQL
Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud, Oracle Integration Cloud (OIC), and PL/SQL
Preferred skill sets: Database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: Minimum 2+ years of Oracle Fusion experience
Education Qualification: BE/BTech, MBA
Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Bachelor Degree Degrees/Field of Study preferred: Certifications (if blank, certifications not specified)
Required Skills: Oracle
Optional Skills: Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Location: HYDERABAD OFFICE INDIA
Job Description: Are you looking to take your career to the next level? We’re looking for a DevOps Engineer to join our Data & Analytics Core Data Lake Platform engineering team. We are searching for self-motivated candidates who will leverage modern Agile and DevOps practices to design, develop, test and deploy IT systems and applications, delivering global projects in multinational teams. The P&G Core Data Lake Platform is a central component of the P&G data and analytics ecosystem. The CDL Platform is used to deliver a broad scope of digital products and frameworks used by data engineers and business analysts. In this role you will have an opportunity to leverage your data engineering skillset to deliver solutions enriching data cataloging and data discoverability for our users. With our approach to building solutions that fit the scale at which the P&G business operates, we combine data engineering best practices (Databricks) with modern software engineering standards (Azure, DevOps, SRE) to deliver value for P&G.
RESPONSIBILITIES:
- Writing and testing code for Data & Analytics platform applications and building end-to-end cloud-native (Azure) solutions
- Engineering applications throughout their entire lifecycle, from development and deployment through upgrade and replacement/termination
- Ensuring that development and architecture adhere to established standards, including modern software engineering practices (CI/CD, Agile, DevOps)
- Collaborating with internal technical specialists and vendors to develop the final product, improve overall performance and efficiency, and/or enable adoption of new business processes
Qualifications: Bachelor’s degree in computer science or a related technical field.
8+ years of experience working as a Software/Data Engineer (with a focus on developing in Python, PySpark, Databricks, and ADF). Experience leveraging modern software engineering practices (code standards, Gitflow, automated testing, CI/CD, DevOps). Experience working with cloud infrastructure (Azure preferred). Strong verbal, written, and interpersonal communication skills. A strong desire to produce high-quality software through cross-functional collaboration, testing, code reviews, and other best practices.
YOU ALSO SHOULD HAVE: Strong written and verbal English communication skills to influence others. Demonstrated use of data and tools. Ability to handle multiple priorities. Ability to work collaboratively across different functions and geographies.
Job Schedule: Full time
Job Number: R000134774
Job Segmentation: Experienced Professionals
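The automated-testing and CI/CD practices this role emphasizes usually come down to one habit: keep transformation logic in small pure functions so a pipeline stage can be unit-tested in CI before the surrounding Databricks/ADF job ever runs. A sketch under that assumption (the function and field names are illustrative, not P&G code):

```python
# A transformation kept as a pure function: easy to unit-test in a CI step
# (e.g. pytest in a build pipeline) independently of any cluster or pipeline.
def normalize_catalog_entry(entry: dict) -> dict:
    """Standardize a data-catalog record: trim names, dedupe and lowercase tags,
    and default a missing or blank owner to "unknown"."""
    return {
        "name": entry["name"].strip(),
        "owner": entry.get("owner", "unknown").strip() or "unknown",
        "tags": sorted({t.strip().lower() for t in entry.get("tags", []) if t.strip()}),
    }

raw = {"name": "  Sales Orders ", "owner": "", "tags": ["Finance", " finance", "GOLD", ""]}
print(normalize_catalog_entry(raw))
# {'name': 'Sales Orders', 'owner': 'unknown', 'tags': ['finance', 'gold']}
```

In a pipeline, the PySpark job would apply the same rules over a DataFrame; the CI suite asserts on functions like this one, while integration tests exercise the deployed job.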

Posted 2 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Summary
Position Summary: Data Engineer, DT US PxE
The Data Engineer is an integral part of the technical application development team and is primarily responsible for analyzing, planning, designing, developing, and implementing Azure data engineering solutions to meet the strategic, usability, performance, reliability, control, and security requirements of data science processes. The role requires demonstrable knowledge in data engineering, AI/ML, data warehousing, and reporting applications. Must be innovative.
Work you will do
A unique opportunity to be a part of a growing team that works on a premier unified data science analytics platform within Deloitte. You will be responsible for implementing, delivering, and supporting data engineering and AI/ML solutions to support the Deloitte US Member Firm.
Outcome-Driven Accountability: Collaborate with business and IT leaders to develop and refine ideas for integrating predictive and prescriptive analytics within business processes, ensuring measurable customer and business outcomes. Decompose complex business problems into manageable components, facilitating the use of multiple analytic modeling methods for holistic and valuable solutions. Develop and refine prototypes and proofs of concept, presenting results to business and IT leaders, and demonstrating the impact on customer needs and business outcomes.
Technical Leadership and Advocacy: Engage in data analysis, generating and testing hypotheses, preparing and analyzing historical data, identifying patterns, and applying statistical methods to formulate solutions that deliver high-quality outcomes. Develop project plans, including resource needs and task dependencies, to meet project deliverables with a focus on incremental and iterative delivery.
Engineering Craftsmanship: Participate in defining project scope, objectives, and quality controls for new projects, ensuring alignment with customer-centric engineering principles.
Present and communicate project deliverable results, emphasizing the value delivered to customers and the business.
Customer-Centric Engineering: Assist in recruiting and mentoring team members, fostering a culture of engineering craftsmanship and continuous learning.
Incremental and Iterative Delivery: Stay abreast of changes in technology, leading new technology evaluations for predictive and statistical analytics, and advocating for innovative, lean, and feasible solutions.
Education: Bachelor’s degree in Computer Science or Business Information Systems, MCA, or an equivalent degree.
Qualifications:
- 2 to 5 years of advanced experience in Azure data engineering
- Expertise in development, deployment, and monitoring of ADF pipelines (using Visual Studio and browsers)
- Expertise in Azure Databricks programming using PySpark, SparkR, and SparkSQL, or in Amazon EMR (Elastic MapReduce)
- Expertise in managing Azure storage (Azure Data Lake Storage Gen2, Azure Blob Storage, Azure SQL Database), along with Azure Synapse Analytics and Azure Data Factory
- Advanced programming skills in Python, R, and SQL (SQL for HANA, MS SQL)
- Hands-on experience with visualization tools (Tableau/Power BI)
- Hands-on experience with data science studios (Dataiku, Azure ML Studio, Amazon SageMaker)
The Team
Information Technology Services (ITS) helps power Deloitte’s success. ITS drives Deloitte, which serves many of the world’s largest, most respected organizations. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence.
The ~3,000 professionals in ITS deliver services including: security, risk & compliance; technology support; infrastructure; applications; relationship management; strategy; deployment; PMO; financials; and communications.
Product Engineering (PxE)
Product Engineering (PxE) is the internal software and applications development team responsible for delivering leading-edge technologies to Deloitte professionals. Their broad portfolio includes web and mobile productivity tools that empower our people to log expenses, enter timesheets, book travel and more, anywhere, anytime. PxE enables our client service professionals through a comprehensive suite of applications across the business lines. In addition to application delivery, PxE offers full-scale design services, a robust mobile portfolio, cutting-edge analytics, and innovative custom development.
How you will grow
At Deloitte, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities, including exposure to leaders, sponsors, coaches, and challenging assignments, to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning. Deloitte University (DU): The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad office, is an extension of the DU in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Explore DU: The Leadership Center in India.
Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.
Deloitte’s culture: Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte.

Corporate citizenship: Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world.

Disclaimer: Please note that this description is subject to change based on business/engagement requirements and at the discretion of management.

Our purpose: Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development: At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive: At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 302720

Posted 2 weeks ago

Apply

7.0 - 12.0 years

15 - 25 Lacs

Pune, Chennai, Bengaluru

Work from Office

Dear Candidate, greetings from Sigma! Job Title: Azure Migration Consultant. Experience Required: 7+ years. Location: PAN India. Employment Type: Full-Time.

Must-Have Skills: Azure Migrate, Azure infrastructure, Kubernetes/Azure Kubernetes Service (AKS), Terraform, Azure, CI/CD (Tekton and ArgoCD), Git, shell scripting/Python/PowerShell/Ansible. A strong DevOps specialist with Azure Kubernetes experience is required, with experience on the application side of DevOps, creating and maintaining pipelines. Experience in creating and managing templates, Bicep templates, and YAML pipelines is a must. The candidate should have experience in Function Apps, ADF deployment, and PowerShell scripts.

Linux Consultant (preferably with Azure Migrate + Linux experience). Secondary skills: AWS experience, EKS, Bitbucket, Azure Landing Zone, Helm, Azure Functions/Lambda functions. Good-to-Have Skills: SharePoint on-prem SME, Windows Failover Cluster SME.

If you are interested, please send a copy of your resume along with the following details to sumati@sigmacareers.in: 1. Notice period (LWD) 2. Current CTC 3. Expected CTC 4. Current company 5. Total years of experience 6. Azure infrastructure experience 7. Do you have any offers? 8. Current location 9. Preferred location

Posted 2 weeks ago

Apply

1.0 - 3.0 years

0 Lacs

Gurgaon, Haryana, India

Remote

About Milliman MedInsight Leading with our core values of Quality, Integrity, and Opportunity, MedInsight is one of the healthcare industry’s most trusted solutions for healthcare intelligence. Our company aims to empower easy, data-driven decision-making on important healthcare questions. Through our products, education, and services, MedInsight is impacting healthcare by helping to drive better patient outcomes while reducing waste. Over 300 leading healthcare organizations have come to rely on MedInsight analytic solutions for healthcare cost and care management. MedInsight is a subsidiary of Milliman, a global, employee-owned consultancy that provides actuarial consulting, retirement funding, healthcare financing, enterprise risk management and regulatory compliance, data analytics, business transformation, and a range of other consulting and technology solutions. Role Overview As an Azure Cloud Support Specialist, this professional will provide first-line triage and service desk support to internal users and clients, focusing on Azure cloud services. The right candidate will be highly motivated, a self-starter, and capable of working independently. Strong written and verbal communication, attention to detail, and troubleshooting skills are necessary to be successful in this position. Key responsibilities will include the following: Primary Responsibilities Respond to and resolve help desk requests on ServiceDesk Plus ticketing tool related to Azure services. Assist in monitoring Azure systems daily and respond immediately to security or usability concerns as they relate to cloud operations. Manage Azure resources, including virtual machines, Azure Entra ID, Azure Data Factory (ADF), Azure DevOps, and Azure Databricks. Identify, research, and resolve Azure-related technical problems in a timely manner. Apply updates, patches, and configuration changes to VMs as required. Manage and maintain in-house IT operations applications. 
Follow and enforce security compliance standards and company policies, with a focus on Azure security best practices. Ensure compliance with SOC 2 and HITRUST audits through Azure governance and monitoring tools. The shift timing for this position is 9 PM to 5 AM IST, but candidates should be flexible about changes in shift timings. This is a work-from-home opportunity with frequent office visits as required.

Skills And Requirements: Candidates must be team players with strong interpersonal skills and proven experience in a cloud support or help desk role. This position requires the following minimum qualifications: 1-3 years’ experience in a help desk or cloud support role. Experience managing users, groups, and roles in Azure Entra ID. Knowledge and hands-on experience with Azure Data Factory (ADF), Azure DevOps, and Azure Databricks. Microsoft Azure Fundamentals (AZ-900) certification. Excellent troubleshooting skills for Azure services and resources. Experience managing Azure virtual machines and related services. Experience working with Azure monitoring tools (e.g., Azure Monitor, Log Analytics). Ability to communicate clearly, verbally and in writing. Reliable high-speed internet connection for remote work.

Preferred Skills: Microsoft Azure Administrator Associate (AZ-104) certification. Experience working in a regulated environment (e.g., HIPAA). Knowledge of PowerShell scripting for Azure automation. Familiarity with Azure governance and cost management tools.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Job Title: SQL, DevOps & Cloud Data Engineer. Location: Mumbai. Job Type: Full-time. Experience Level: 5-7 years.

Job Summary: We are seeking a skilled SQL & Cloud Data Engineer with expertise in SQL development and cloud data integration. The ideal candidate will have hands-on experience with SQL Server, MySQL, and PostgreSQL, as well as cloud platforms such as Azure and AWS. This role involves working with Azure Data Factory (ADF), Azure Blob Storage, AWS S3, and various integration tools to streamline data processing and management.

Key Responsibilities: Develop, optimize, and maintain SQL queries and stored procedures across different database platforms (SQL Server, MySQL, PostgreSQL). Manage and maintain databases, ensuring high availability, performance tuning, and security best practices. Work with Azure Data Factory (ADF), Azure Blob Storage, and Azure integration tools to support data ingestion, transformation, and orchestration. Handle AWS S3 and AWS integration tools for data processing, storage, and migration. Implement ETL processes and data pipelines to support analytics and business intelligence needs. Troubleshoot and optimize database performance issues across cloud and on-prem environments. Collaborate with cross-functional teams to design and implement scalable data solutions.

Required Skills & Qualifications: Strong expertise in SQL queries, stored procedures, and performance optimization. Hands-on experience with SQL Server, MySQL, and PostgreSQL. Experience working with Azure Data Factory (ADF), Azure Blob Storage, and Azure integration tools. Knowledge of AWS services including S3 and AWS integration tools. Familiarity with ETL processes and data pipeline development. Strong problem-solving and analytical skills. Ability to work independently and in a team environment.
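As a generic illustration of the parameterized-query work this role describes, here is a minimal sketch using Python’s built-in sqlite3 module as a stand-in for SQL Server/MySQL/PostgreSQL; the table and column names are hypothetical, not from the posting.

```python
import sqlite3

# sqlite3 stands in for a production RDBMS; the schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL);
    INSERT INTO orders (region, amount) VALUES
        ('north', 120.0), ('north', 80.0), ('south', 200.0);
""")

# Parameterized query: the ? placeholder keeps user input out of the SQL
# string, preventing injection and letting the engine reuse the plan.
region = "north"
total = conn.execute(
    "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE region = ?",
    (region,),
).fetchone()[0]
print(total)  # 200.0
conn.close()
```

The same placeholder pattern (with `%s` or named parameters) carries over to the pyodbc, mysql-connector, and psycopg2 drivers for the databases named above.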

Posted 2 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems, ensuring secure, efficient data storage and retrieval, enabling self-service data exploration, and supporting stakeholders with insightful reporting and analysis. Grade: T5. Please note that the job will close at 12 am on the posting close date, so please submit your application prior to the close date.

Accountabilities: Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity. Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization; data pre-processing including collecting, parsing, managing, analyzing, and visualizing large sets of data. Data Quality Management - Cleanse data and improve data quality and readiness for analysis; drive standards, define and implement/improve data governance strategies, and enforce best practices to scale data analysis across platforms. Data Transformation - Process data by cleansing it and transforming it into a proper storage structure for querying and analysis using ETL and ELT processes. Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations.

Qualifications & Specifications: Master’s/Bachelor’s degree in Engineering, Computer Science, Math, Statistics, or equivalent. Strong programming skills in Python/PySpark/SAS. Proven experience with large data sets and related technologies: Hadoop, Hive, distributed computing systems, Spark optimization. Experience on cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps. Hands-on experience with Databricks, Delta Lake, and Workflows.
Should have knowledge of DevOps processes and tools like Docker, CI/CD, Kubernetes, Terraform, Octopus. Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs. Experience with a BI tool like Power BI (good to have). Cloud migration experience (good to have). Cloud and data engineering certification (good to have). Experience working in an Agile environment. 4-6 years of relevant work experience is required. Experience with stakeholder management is an added advantage.

What We Are Looking For: Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred. Knowledge, Skills And Abilities: Fluency in English, analytical skills, accuracy and attention to detail, numerical skills, planning and organizing skills, presentation skills, data modeling and database design, ETL (Extract, Transform, Load) skills, programming skills.

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company: FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe.
We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding. Our Philosophy The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company. Our Culture Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970’s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
Responsibilities: · Design, develop, and maintain scalable data pipelines using Azure data services such as Azure Data Factory and Apache Spark. · Implement efficient Extract, Transform, Load (ETL) processes to move and transform data across various sources. · Design, develop, and maintain data solutions using Azure Synapse Analytics. · Implement data ingestion, transformation, and extraction processes using Azure Synapse Pipelines. · Knowledge of data warehousing concepts. · Utilize Azure SQL Database, Azure Blob Storage, Azure Data Lake Storage, and other Azure data services to store and retrieve data. · Performance optimization and troubleshooting capabilities. · Advanced SQL knowledge, capable of writing optimized queries for faster data workflows. · Proven work experience in Spark, Python, SQL, and any RDBMS. · Experience in designing solutions for multiple large data warehouses with a good understanding of cluster and parallel architecture as well as high-scale or distributed RDBMS. · Must be extremely well versed in handling large-volume data and working with different tools to derive the required solution.

Mandatory skill sets: Azure Databricks, Azure Data Factory (ADF), or Azure Synapse Analytics, along with Python and SQL expertise.

Preferred skill sets: · Experience in Delta Lake, Power BI, or Azure DevOps. · Knowledge of Databricks is a plus. · Knowledge of Spark, Scala, or other distributed processing frameworks. · Exposure to BI tools like Power BI, Tableau, or Looker. · Familiarity with data security and compliance in the cloud. · Experience in leading a development team.
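The cleansing-and-transformation step this role describes can be sketched in plain Python (a stand-in for the PySpark/ADF code the job actually involves); the record fields and parsing rules here are hypothetical.

```python
from datetime import datetime

# Hypothetical raw records as they might arrive from a source system.
raw = [
    {"id": "1", "amount": " 10.5 ", "ts": "2024-01-03"},
    {"id": "2", "amount": "bad",    "ts": "2024-01-04"},   # unparseable amount
    {"id": "1", "amount": " 10.5 ", "ts": "2024-01-03"},   # exact duplicate
]

def cleanse(records):
    """Trim, type-cast, drop unparseable rows, and de-duplicate on (id, ts)."""
    seen, out = set(), []
    for r in records:
        try:
            amount = float(r["amount"].strip())
            ts = datetime.strptime(r["ts"], "%Y-%m-%d").date()
        except ValueError:
            continue  # in a real pipeline, route rejects to a quarantine store
        key = (r["id"], ts)
        if key in seen:
            continue  # skip duplicates already emitted
        seen.add(key)
        out.append({"id": r["id"], "amount": amount, "ts": ts})
    return out

clean = cleanse(raw)
print(len(clean))  # 1: one valid, de-duplicated record survives
```

In Spark the same steps map onto `withColumn` casts, a `filter` on parse success, and `dropDuplicates` over the key columns.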
Years of experience required: 4 – 7 yrs Education qualification: B.tech/MBA/MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Azure Synapse Analytics, Databricks Platform Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Oracle Management Level Senior Associate Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions. *Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us . At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. 
To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities: Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer. Completed at least 2 full Oracle Cloud (Fusion) implementations. Extensive knowledge of database structure for ERP/Oracle Cloud (Fusion). Extensive work on BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC).

Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC). Preferred skill sets: database structure for ERP/Oracle Cloud (Fusion). Years of experience required: minimum 4 years of Oracle Fusion experience. Education Qualification: Any graduate or postgraduate.

Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Technology, Master Degree, Bachelor Degree Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Oracle Integration Cloud (OIC) Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Hyderābād

On-site

Job requisition ID: 77965. Date: Jul 14, 2025. Location: Hyderabad. Designation: Senior Consultant. Entity: Deloitte Touche Tohmatsu India LLP.

Education: Bachelor’s degree in a relevant field (e.g., Engineering, Analytics or Data Science, Computer Science, Statistics) or equivalent experience.

Experience: At least 6 years of experience with big data technologies like Azure Data Lake, Synapse, PySpark, Azure Data Factory (ADF), AWS Redshift, S3, SQL Server, MLOps, or their equivalents. Experience in implementing complex ETL pipelines and their day-to-day operations. Experience with knowledge graphs is a plus. 3+ years of experience in Agile development, code deployment, and CI/CD pipelines. 2+ years of experience in job orchestration using Airflow or equivalent. 2+ years in AI/ML, especially Gen AI concepts such as RAG patterns and chunking techniques. Build, design, and deliver enterprise data programs. Proficiency in implementing data quality rules. Proficiency in analytical tools like Tableau, Power BI, or equivalent. Experience with security models and development on large data sets. Experience with data quality management tools. Work closely with different stakeholders (business owners, users, product managers, program managers, architects, engineering managers, developers, etc.) to translate business needs and product requirements into well-documented engineering solutions. Ensure data quality and consistency across various sources. Strong working knowledge of Python. Design and contribute to best practices in Enterprise Data Warehouse (EDW) architecture.
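The "data quality rules" qualification above typically means predicates applied per row with failures reported rather than raised. A minimal Python sketch, with hypothetical rule names and fields (real implementations would sit behind a framework such as Great Expectations or Databricks expectations):

```python
# Each rule is a (name, predicate) pair applied row by row; failures are
# collected, not raised, so a pipeline can report them without halting.
rules = [
    ("amount_non_negative", lambda row: row["amount"] >= 0),
    ("id_present",          lambda row: bool(row.get("id"))),
]

def check(rows, rules):
    """Return (row_index, rule_name) for every rule a row violates."""
    failures = []
    for i, row in enumerate(rows):
        for name, predicate in rules:
            if not predicate(row):
                failures.append((i, name))
    return failures

rows = [
    {"id": "a", "amount": 5.0},
    {"id": "",  "amount": -1.0},  # violates both rules
]
print(check(rows, rules))  # [(1, 'amount_non_negative'), (1, 'id_present')]
```

The collected failures can then feed the reject handling, alerting, or quality dashboards the posting alludes to.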
Additional Desired Preferences Experience with scientific chemistry nomenclature or prior work experience in life sciences, chemistry, or hard sciences or degree in sciences Experience with pharmaceutical datasets and nomenclature Experience working with knowledge graphs Ability to explain complex technical issues to a non-technical audience Self-directed and able to handle multiple concurrent projects and prioritize tasks independently Able to make tough decisions when trade-offs are required to deliver results Strong communication skills required: Verbal, written, and interpersonal

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
