
1640 ADF Jobs - Page 3

JobPe aggregates these listings for easy access; you apply directly on the original job portal.

0.0 - 2.0 years

3 - 10 Lacs

Niranjanpur, Indore, Madhya Pradesh

Remote

Job Title: Sr. Data Engineer
Experience: 2+ years
Location: Indore (onsite)
Industry: IT
Job Type: Full time

Roles and Responsibilities:
1. Design and develop scalable data pipelines and workflows for data ingestion, transformation, and integration.
2. Build and maintain data storage systems, including data warehouses, data lakes, and relational databases.
3. Ensure data accuracy, integrity, and consistency through validation and quality assurance processes.
4. Collaborate with data scientists, analysts, and business teams to understand data needs and deliver tailored solutions.
5. Optimize database performance and manage large-scale datasets for efficient processing.
6. Leverage cloud platforms (AWS, Azure, or GCP) and big data technologies (Hadoop, Spark, Kafka) for building robust data solutions.
7. Automate and monitor data workflows using orchestration frameworks such as Apache Airflow.
8. Implement and enforce data governance policies to ensure compliance and data security.
9. Troubleshoot and resolve data-related issues to maintain seamless operations.
10. Stay updated on emerging tools, technologies, and trends in data engineering.

Skills and Knowledge:
1. Core Skills:
● Proficient in Python (libraries: Pandas, NumPy) and SQL.
● Knowledge of data modeling techniques, including:
○ Entity-Relationship (ER) Diagrams
○ Dimensional Modeling
○ Data Normalization
● Familiarity with ETL processes and tools like:
○ Azure Data Factory (ADF)
○ SSIS (SQL Server Integration Services)
2. Cloud Expertise:
● AWS Services: Glue, Redshift, Lambda, EKS, RDS, Athena
● Azure Services: Databricks, Key Vault, ADLS Gen2, ADF, Azure SQL
● Snowflake
3. Big Data and Workflow Automation:
● Hands-on experience with big data technologies like Hadoop, Spark, and Kafka.
● Experience with workflow automation tools like Apache Airflow (or similar).

Qualifications and Requirements:
● Education:
○ Bachelor's degree (or equivalent) in Computer Science, Information Technology, Engineering, or a related field.
● Experience:
○ Freshers with a strong understanding, internships, and relevant academic projects are welcome.
○ 2+ years of experience working with Python, SQL, and data integration or visualization tools is preferred.
● Other Skills:
○ Strong communication skills, especially the ability to explain technical concepts to non-technical stakeholders.
○ Ability to work in a dynamic, research-oriented team with concurrent projects.

Job Types: Full-time, Permanent
Pay: ₹300,000.00 - ₹1,000,000.00 per year
Benefits: Paid sick time, Provident Fund, Work from home
Schedule: Day shift, Monday to Friday, Weekend availability
Supplemental Pay: Performance bonus
Ability to commute/relocate: Niranjanpur, Indore, Madhya Pradesh: Reliably commute or planning to relocate before starting work (Preferred)
Experience: Data Engineer: 2 years (Preferred)
Work Location: In person
Application Deadline: 31/08/2025
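For readers weighing these requirements against their own experience, here is a minimal sketch of the Airflow orchestration pattern the responsibilities describe (an extract step feeding a transform step). The DAG id, task names, file paths, and logic are hypothetical placeholders, not part of this listing:

```python
# Illustrative only: a minimal Airflow DAG of the ingest -> transform shape
# described above. All names and paths are hypothetical.
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a source system (stubbed with an in-memory frame).
    df = pd.DataFrame({"id": [1, 2], "amount": [10.0, 20.5]})
    df.to_csv("/tmp/raw.csv", index=False)


def transform(**context):
    # Validate and reshape the raw extract before loading.
    df = pd.read_csv("/tmp/raw.csv")
    df = df.dropna(subset=["id"])          # basic data-quality gate
    df["amount"] = df["amount"].round(2)   # normalize precision
    df.to_csv("/tmp/clean.csv", index=False)


with DAG(
    dag_id="daily_sales_pipeline",         # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2                               # extract runs before transform
```

In a real pipeline the extract would read from a source database or API and the load step would target a warehouse; the DAG structure stays the same.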

Posted 3 days ago

Apply

7.0 - 10.0 years

5 - 9 Lacs

Hyderābād

On-site

Position: Technical Team Lead (MT710FT RM 3428)

Job Summary
We are seeking a highly skilled and experienced Technical Team Lead to join our dynamic team. The ideal candidate will have a strong background in software development, particularly in C# and ASP.NET, and will be responsible for leading a team of developers to deliver high-quality software solutions. The candidate should possess a deep understanding of cloud technologies, particularly Azure, and have experience with modern development practices.

Responsibilities
- Lead and mentor a team of software developers, ensuring best practices in coding and design.
- Design, develop, and maintain scalable web applications using C#, ASP.NET, and Blazor.
- Implement and manage RESTful APIs to support front-end and back-end integration.
- Utilize Azure services, including Azure Storage and Azure Kubernetes Service (AKS), to deploy and manage applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Conduct code reviews and ensure adherence to coding standards and best practices.
- Stay updated with emerging technologies and industry trends to drive innovation within the team.

Mandatory Skills
- Proficient in C# and ASP.NET.
- Experience with Blazor for building interactive web UIs.
- Strong understanding of REST API development and integration.
- Hands-on experience with Azure services, including Azure Storage and Azure Kubernetes Service.
- Familiarity with AI technologies and their application in software development, particularly in enhancing user experience and automating processes.

Preferred Skills
- Knowledge of Azure Data Factory (ADF) for data integration.
- Experience with Selenium for automated testing.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 7-10 years of experience in software development, with at least 3 years in a leadership role.
- Strong problem-solving skills and the ability to work under pressure.
- Excellent communication and interpersonal skills.
- Proven track record of delivering high-quality software solutions on time.

If you are a passionate leader with a strong technical background and a desire to drive innovation, we encourage you to apply for this exciting opportunity.

Job Category: Digital_Cloud_Web Technologies
Job Type: Full Time
Job Location: Hyderabad
Experience: 7-10 Years
Notice period: 0-30 days

Posted 3 days ago

Apply

4.0 years

5 - 6 Lacs

Hyderābād

On-site

Technical Delivery Manager – Level 5

Technical Delivery Manager (Manager), in Solutions Delivery – Data Team. As a Delivery Manager in the SD, EDH team, the individual needs experience in data-related technologies and in project implementation and maintenance, along with the people-management and resource-management experience to manage a team of 7-8 resources.

Work you'll do
- Complete delivery responsibility for projects allocated to the Data team.
- Allocate teams in a timely manner and support them to fix production issues and user problems.
- Complete responsibility for utilizing the team effectively and gaining productivity from the team on the projects allocated.
- Provide direction to the team on technical issues and design challenges.
- Manage task allocation and track the work of 10-15 team members.
- Complete responsibility for performance management of 7-8 professionals.
- As a coach, mentor team members in technical and professional areas.
- Provide daily updates to the group lead in USI on the project, challenges, and risks.
- Be a self-driven individual who looks for future opportunities for the team and motivates them with the right tasks and opportunities.
- Gather team members' needs on training and operational items and act appropriately.
- Communicate regularly with onsite counterparts, share status, and discuss challenges on the projects they manage; resolve issues in a timely manner and guide teams toward success.

The team
Solutions Delivery-Canada is an integral part of the Information Technology Services group. The principal focus of this organization is the development and maintenance of technology solutions that e-enable the delivery of Function and Marketplace Services and Management Information Systems. Solutions Delivery-Canada develops and maintains solutions built on varied technologies such as Siebel, PeopleSoft, Microsoft technologies, and Lotus Notes. It has various groups which provide best-of-breed solutions to clients by following a streamlined system development methodology, and comprises groups such as Usability, Application Architecture, Development, and Quality Assurance and Performance.

Qualifications
Required:
- Computer Science university degree or equivalent work experience.
- At least 4 years' experience in Azure Data Factory and related Azure services.
- Bachelor's degree in business administration, information technology, computer science, or a related field.
- Overall 9-10 years of IT experience in data technologies such as DataStage or ADF, or having worked on data hubs.
- Excellent organizational and communication skills. Must have strong interpersonal and presentation skills and the ability to work productively with all levels in a global organization.
- Expert problem solver, finding simple answers to complex questions or problems.
- Demonstrated project management capabilities; the ability to effectively manage multiple assignments and responsibilities in a fast-paced environment.
- At least 6.5 years' experience leading projects and operations on data projects.
- Ability to support operational and project delivery while dealing with competing priorities from a management perspective.
- Strong and proven analytical skills in data-related technologies.
- Strong technical knowledge in data technologies, with a skill set including but not limited to SQL, Unix, ETL tools, and Azure cloud; should be able to code requirements in these technical areas whenever required.
- Managing support activities for data projects; collaborating with clients and the team to resolve tickets within deadlines.
- 5 years of people management experience, with expertise in coaching and mentoring team members; should have managed a minimum of 7 resources in the previous organization.
- Experience and expertise in core project management activities is a plus; vast experience in Azure-based technologies is an added advantage.

Work Location: Hyderabad

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 306750

Posted 3 days ago

Apply

8.0 years

0 Lacs

Delhi

Remote

Role: Data Engineer
Experience: 8+ years
Location: Remote
Skills: ADF, Azure Databricks, PySpark
Budget: 1.1 LPM
Note: Aadhaar, PAN, education documents, previous companies' experience letters, and a LinkedIn profile are needed for the BGV after the first technical round to proceed further.
Job Type: Full-time
Pay: ₹50,000.00 - ₹110,000.00 per year
Schedule: Day shift
Work Location: In person

Posted 3 days ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

📌 Job Title: Data Engineering Specialists – Multiple Roles
📍 Location: Hyderabad, Chennai, Indore
💼 Employment Type: Full-Time
🧠 Experience: 5 to 7 Years
📊 Domain: Data Engineering | Data Warehousing | Analytics
🌐 Work Type: Hybrid/Onsite (as per project need)
🎯 Notice Period: Immediate to 30 days preferred

🔍 Overview
InfoBeans is hiring Data Engineering Professionals for two critical roles in our growing Data Management & Analytics division. We're looking for skilled individuals with strong technical depth in Snowflake, DataVault 2.0, WhereScape, and more. We have two different positions open: one focused on Data Acquisition Engineering and the other on Data Use Case Implementation. Read on to see which one suits your expertise!

💼 Open Positions

1️⃣ Data Use Case Specialist
🔧 Experience Required: 5+ years
Ideal for candidates with a strong analytical and visualization background.
Must-Have Skills:
- Snowflake (expert level)
- Power BI (or equivalent BI tool)
- WhereScape RED and 3D
- DataVault 2.0
- Excellent communication and client-facing skills
- Strong experience in Requirements Engineering
Key Responsibilities:
- Collaborate with business stakeholders to understand data requirements
- Build and optimize dashboards and data models using Snowflake & Power BI
- Support end-to-end implementation of data use cases
- Participate in requirements gathering, validation, and delivery tracking

2️⃣ Data Acquisition Engineer
🔧 Experience Required: 5+ years
Ideal for candidates with strong ETL pipeline and scripting experience.
Must-Have Skills:
- Python (scripting & automation)
- Azure Data Factory (ADF)
- Databricks
- WhereScape RED and 3D
- DataVault 2.0
- Snowflake
- SQL (intermediate to advanced)
- Strong experience in Requirements Engineering
Key Responsibilities:
- Develop scalable and reliable data pipelines
- Integrate multiple data sources into the Snowflake platform
- Collaborate with analytics and BI teams for data provisioning
- Build automation scripts and manage data flows using Databricks and ADF

🌟 Why InfoBeans?
At InfoBeans, we value innovation, collaboration, and a passion for technology. You'll get the chance to work with industry-leading platforms and grow with a team of seasoned professionals in the data domain.

📩 Apply Now
Interested candidates can reach out directly or apply. Feel free to refer someone from your network!

#DataEngineering #Snowflake #PowerBI #Databricks #WhereScape #DataVault #ADF #SQL #Python #HyderabadJobs #InfoBeans #WeAreHiring #DataJobs #Analytics #BIJobs #ETL
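As an illustration of the Data Acquisition Engineer's core task (landing extracts into Snowflake), here is a minimal sketch using the official Snowflake Python connector. The account, credentials, file path, and table name are hypothetical placeholders:

```python
# Illustrative only: staging a local extract and copying it into a Snowflake
# landing table. All identifiers below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account locator
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # Upload the file to the table's internal stage, then COPY it in.
    cur.execute("PUT file:///tmp/clean.csv @%CUSTOMER_RAW")
    cur.execute(
        "COPY INTO CUSTOMER_RAW FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
finally:
    conn.close()
```

In practice tools like WhereScape or ADF would generate or orchestrate this step, but the PUT-then-COPY pattern is the underlying mechanic.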

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Summary:
We are seeking a skilled and experienced Azure Databricks Engineer to join our growing data engineering team. The ideal candidate will have deep hands-on expertise in building scalable data pipelines and streaming architectures using Azure-native technologies. Prior experience in the banking or financial services domain is highly desirable, as you will be working with critical data assets and supporting regulatory, risk, and operational reporting use cases.

Key Responsibilities:
- Design, develop, and optimize data pipelines using Databricks (PySpark) for batch and real-time data processing.
- Implement CDC (Change Data Capture) and Delta Live Tables/Autoloader to support near-real-time ingestion.
- Integrate various structured and semi-structured data sources using ADF, ADLS, and Kafka (Confluent).
- Develop CI/CD pipelines for data engineering workflows using GitHub Actions or Azure DevOps.
- Write efficient and reusable SQL and Python code for data transformations and validations.
- Ensure data quality, lineage, governance, and security across all ingestion and transformation layers.
- Collaborate closely with business analysts, data scientists, and data stewards to support use cases in risk, finance, compliance, and operations.
- Participate in code reviews, architectural discussions, and documentation efforts.

Required Skills & Qualifications:
- Strong proficiency in SQL, Python, and PySpark.
- Proven experience with Azure Databricks, including notebooks, jobs, clusters, and Delta Lake.
- Experience with Azure Data Lake Storage (ADLS Gen2) and Azure Data Factory (ADF).
- Hands-on with Confluent Kafka for streaming data integration.
- Strong understanding of Autoloader, CDC mechanisms, and Delta Lake-based architecture.
- Experience implementing CI/CD pipelines using GitHub and/or Azure DevOps.
- Knowledge of data modeling, data warehousing, and data security best practices.
- Exposure to regulatory and risk data use cases in the banking/financial sector is a strong plus.

Preferred Qualifications:
- Azure certifications (e.g., Azure Data Engineer Associate).
- Experience with tools such as Delta Live Tables, Unity Catalog, and Lakehouse architecture.
- Familiarity with business glossaries, data lineage tools, and data governance frameworks.
- Understanding of financial data including GL, loan, customer, transaction, or market risk domains.
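A minimal sketch of the Auto Loader ingestion pattern this posting names (streaming landed files into a bronze Delta table). The paths are hypothetical, and `spark` is assumed to be the session provided by a Databricks runtime:

```python
# Illustrative only: Databricks Auto Loader reading JSON files as they arrive
# and appending them to a Delta table. Paths are hypothetical.
raw = (
    spark.readStream.format("cloudFiles")            # Auto Loader source
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/checkpoints/txn/schema")
    .load("/mnt/landing/transactions/")              # ADLS landing zone
)

(
    raw.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/txn/")  # exactly-once bookkeeping
    .outputMode("append")
    .start("/mnt/bronze/transactions/")              # bronze Delta table
)
```

The checkpoint location is what lets the stream restart without reprocessing files, which is the property that makes this pattern suitable for the near-real-time CDC ingestion the role describes.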

Posted 3 days ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
- Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
- Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics.
- Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
- Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL.
- Maintain best-practice standards for the development of cloud-based data warehouse solutioning, including naming standards.
- Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks.
- Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
- Work with other members of the project team to support delivery of additional project components (API interfaces).
- Evaluate the performance and applicability of multiple tools against customer requirements.
- Work within an Agile delivery / DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints.
- Integrate Databricks with other technologies (ingestion tools, visualization tools).

Requirements
- Proven experience working as a data engineer.
- Highly proficient in using the Spark framework (Python and/or Scala).
- Extensive knowledge of data warehousing concepts, strategies, and methodologies.
- Direct experience of building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks).
- Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics.
- Experience in designing and hands-on development of cloud-based analytics solutions.
- Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required.
- Designing and building data pipelines using API ingestion and streaming ingestion methods.
- Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
- Thorough understanding of Azure cloud infrastructure offerings.
- Strong experience in common data warehouse modeling principles, including Kimball.
- Working knowledge of Python is desirable.
- Experience developing security models.
- Databricks & Azure Big Data Architecture certification would be a plus.

Mandatory Skill Sets: ADE, ADB, ADF
Preferred Skill Sets: ADE, ADB, ADF
Years of Experience Required: 4-8 years
Education Qualification: BE, B.Tech, MCA, M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
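To make the "data pipelines using Azure Data Factory and Apache Spark" requirement concrete, here is a minimal batch PySpark sketch of the source-to-curated shape such pipelines typically take. Storage paths and column names are hypothetical placeholders:

```python
# Illustrative only: a batch Spark job reading raw data from ADLS, applying a
# transformation, and writing a curated output. All paths/columns hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

orders = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/orders/")

daily = (
    orders.where(F.col("status") == "COMPLETE")      # drop in-flight records
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("total_amount"))      # daily revenue per region
)

daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "abfss://curated@account.dfs.core.windows.net/orders_daily/"
)
```

In an ADF-orchestrated setup, a pipeline activity would trigger this job on a Databricks cluster on a schedule or on data arrival.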

Posted 3 days ago

Apply

10.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Senior Manager

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Role: Oracle BRM (OBRM) Technical Consultant/Developer

Responsibilities
- Experience in implementing complex BRM systems.
- Experience in development, configuration, and maintenance of the BRM system, focusing specifically on subscription management, comprehensive billing cycles, and diverse AR activities.
- Hands-on experience in developing and supporting integration of the BRM system with third-party tax systems, payment and billing systems, and the WEBS platform, ensuring seamless data consistency and process integrity.
- Strong project management, troubleshooting, and communication capabilities.

Mandatory Skill Sets
- Knowledge of Oracle BRM.
- Knowledge of subscription management modules, billing cycles, etc.
- Knowledge of the development of custom opcodes and MTAs.
- Knowledge of programming languages like C/C++ for developing customizations in Oracle BRM.
- Strong communication skills.

Preferred Skill Sets
- Knowledge of C/C++/Java.
- Knowledge of Oracle middleware and OBRM-related suites.

Years of Experience Required: 10 to 15 years
Education Qualification: BE/B.Tech/MBA/MCA/M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor Degree, Master Degree
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle
Optional Skills: Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:

Posted 3 days ago

Apply

3.0 - 6.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Operations
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Role: Senior Associate – Azure Data Engineer
Experience: 3-6 years
Location: Kolkata

Technical Skills:
- Strong expertise in Azure Databricks, Azure Data Factory (ADF), PySpark, SQL Server, and Python.
- Solid understanding of Azure Functions and their application in data processing workflows.
- Understanding of DevOps practices and CI/CD pipelines for data solutions.
- Experience with other ETL tools such as Informatica Intelligent Cloud Services (IICS) is a plus.
- Strong problem-solving skills and ability to work independently and collaboratively in a fast-paced environment.
- Excellent communication skills to effectively convey technical concepts to non-technical stakeholders.

Key Responsibilities:
- Develop, maintain, and optimize scalable data pipelines using Azure Databricks, Azure Data Factory (ADF), and PySpark.
- Collaborate with data architects and business stakeholders to translate requirements into technical solutions.
- Implement and manage data integration processes using SQL Server and Python.
- Design and deploy Azure Functions to support data processing workflows.
- Monitor and troubleshoot data pipeline performance and reliability issues.
- Ensure data quality, security, and compliance with industry standards and best practices.
- Document technical specifications and maintain clear and concise project documentation.

Mandatory Skill Sets: Azure Databricks, Azure Data Factory (ADF), and PySpark
Preferred Skill Sets: Azure Databricks, Azure Data Factory (ADF), and PySpark
Years of Experience Required: 3-6 years
Education Qualification: B.E.(B.Tech)/M.E/M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: ETL Tools, Microsoft Azure, PySpark
Optional Skills: Python (Programming Language)
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
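One responsibility above is designing Azure Functions that support data workflows. A minimal sketch of such a function, using the Azure Functions Python v2 programming model with a hypothetical route and payload check, might look like this:

```python
# Illustrative only: an HTTP-triggered Azure Function that validates a payload
# before it enters a downstream pipeline. Route and fields are hypothetical.
import json

import azure.functions as func

app = func.FunctionApp()

@app.route(route="validate", auth_level=func.AuthLevel.FUNCTION)
def validate(req: func.HttpRequest) -> func.HttpResponse:
    # Reject malformed payloads early, before they reach the pipeline.
    try:
        body = req.get_json()
    except ValueError:
        return func.HttpResponse("invalid JSON", status_code=400)
    missing = [k for k in ("id", "amount") if k not in body]
    if missing:
        return func.HttpResponse(f"missing fields: {missing}", status_code=422)
    return func.HttpResponse(json.dumps({"accepted": body["id"]}), status_code=200)
```

A timer or blob trigger would follow the same structure; only the decorator and binding change.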

Posted 3 days ago

Apply

0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Associate

Job Description & Summary
At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

We are seeking a Data Engineer to design, develop, and maintain data ingestion processes to a data platform built on Microsoft technologies, ensuring data quality and integrity. The role involves collaborating with data architects and business analysts to implement solutions using tools like ADF and Azure Databricks, and requires strong SQL skills.

Responsibilities
- Develop, test, and optimize ETL workflows and maintain documentation.
- ETL development experience in the Microsoft data track is required.
- Work with the business team to translate business requirements into technical requirements.
- Demonstrated expertise in Agile methodologies, including Scrum, Kanban, or SAFe.

Mandatory Skill Sets
- Strong proficiency in Azure Databricks, including Spark and Delta Lake.
- Experience with Azure Data Factory, Azure Data Lake Storage, and Azure SQL Database.
- Proficiency in data integration and ETL processes and T-SQL.
- Experience working in Python for data engineering.
- Experience working with the Postgres database.
- Experience working with a graph database.
- Experience in architecture design and data modelling.

Good-to-Have Skill Sets
- Unity Catalog / Purview
- Familiarity with Fabric/Snowflake service offerings
- Visualization tool: Power BI

Preferred Skill Sets
- Hands-on knowledge of Python and PySpark, and strong SQL knowledge. ETL and data warehousing experience is a must.
- Relevant certifications (any one, e.g., Databricks Data Engineer Associate, Microsoft Certified: Azure Data Engineer Associate, Azure Solution Architect) are mandatory.

Years of Experience Required: 5+ years
Education Qualification: Bachelor's degree in Computer Science, IT, or a related field
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Data Engineering
Optional Skills: Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Android, API Management, Appian (Platform), Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C#.NET, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration/Continuous Delivery (CI/CD), Debugging, Emotional Regulation {+ 41 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
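As a concrete example of the Databricks/Delta Lake ingestion work this role centres on, here is a minimal sketch of an incremental upsert (the usual CDC landing step) using the Delta Lake Python API. The table path and join key are hypothetical, and `spark` is assumed to come from a Databricks runtime:

```python
# Illustrative only: merging a batch of changed rows into a silver Delta table.
# Paths and the join key are hypothetical.
from delta.tables import DeltaTable

target = DeltaTable.forPath(spark, "/mnt/silver/customers/")
updates = spark.read.parquet("/mnt/landing/customers_delta/")  # today's changed rows

(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()      # apply changes to existing customers
    .whenNotMatchedInsertAll()   # add customers seen for the first time
    .execute()
)
```

The MERGE keeps the target table idempotent under re-runs, which is why it is the standard landing step for CDC feeds.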

Posted 3 days ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Role: Senior Associate
Experience: 3-6 years
Location: Kolkata

Technical Skills:
- Strong expertise in Azure Databricks, Azure Data Factory (ADF), PySpark, SQL Server, and Python.
- Solid understanding of Azure Functions and their application in data processing workflows.
- Understanding of DevOps practices and CI/CD pipelines for data solutions.
- Experience with other ETL tools such as Informatica Intelligent Cloud Services (IICS) is a plus.
- Strong problem-solving skills and ability to work independently and collaboratively in a fast-paced environment.
- Excellent communication skills to effectively convey technical concepts to non-technical stakeholders.

Key Responsibilities:
- Develop, maintain, and optimize scalable data pipelines using Azure Databricks, Azure Data Factory (ADF), and PySpark.
- Collaborate with data architects and business stakeholders to translate requirements into technical solutions.
- Implement and manage data integration processes using SQL Server and Python.
- Design and deploy Azure Functions to support data processing workflows.
- Monitor and troubleshoot data pipeline performance and reliability issues.
- Ensure data quality, security, and compliance with industry standards and best practices.
- Document technical specifications and maintain clear and concise project documentation.

Mandatory Skill Sets: Azure Databricks, Azure Data Factory (ADF), and PySpark
Preferred Skill Sets: Azure Databricks, Azure Data Factory (ADF), and PySpark
Years of Experience Required: 3-6 years
Education Qualification: B.E.(B.Tech)/M.E/M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 11 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:

Posted 3 days ago

Apply

3.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Accountabilities
- Hands-on experience in Azure data components like ADF / Databricks / Azure SQL
- Good programming logic sense in SQL
- Good PySpark knowledge for Azure Databricks
- Understanding of data lake and data warehouse concepts
- Understanding of unit and integration testing
- Good communication skills to express thoughts and interact with business users
- Understanding of data security and data compliance
- Understanding of the Agile model
- Understanding of project documentation
- Certification (good to have)
- Domain knowledge

Mandatory Skill Sets: Azure DE, ADB, ADF, ADL
Preferred Skill Sets: Azure DE, ADB, ADF, ADL
Years of Experience Required: 3 to 9 years
Education Qualification: Graduate Engineer or Management Graduate
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:

Posted 3 days ago

Apply

3.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Accountabilities
- Hands-on experience in Azure data components like ADF / Databricks / Azure SQL
- Good programming logic sense in SQL
- Good PySpark knowledge for Azure Databricks
- Understanding of data lake and data warehouse concepts
- Understanding of unit and integration testing
- Good communication skills to express thoughts and interact with business users
- Understanding of data security and data compliance
- Understanding of the Agile model
- Understanding of project documentation
- Certification (good to have)
- Domain knowledge

Mandatory Skill Sets: Azure DE, ADB, ADF, ADL
Preferred Skill Sets: Azure DE, ADB, ADF, ADL
Years of Experience Required: 3 to 9 years
Education Qualification: Graduate Engineer or Management Graduate
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:

Posted 3 days ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Title: AI/ML Engineer
Location: TechM Blr ECITY
Years of Experience: 10+ years

Job Summary
Chevron invites applications for the role of AI/ML Engineer within our Enterprise AI team in India. This position is integral to designing and developing AI/ML models that significantly accelerate the delivery of business value. We are looking for a Machine Learning Engineer who brings expertise, an innovative attitude, and excitement for solving complex problems with modern technologies and approaches. We seek individuals with a passion for exploring, innovating, and delivering innovative Data Science solutions that provide immense value to our business.

Responsibilities
- Design and develop AI/ML models to enhance business processes and deliver actionable insights.
- Implement machine learning frameworks and libraries, ensuring robust model performance and scalability.
- Collaborate with cross-functional teams to integrate AI solutions into existing workflows.
- Manage the lifecycle of machine learning models, including training, validation, deployment, and monitoring.
- Develop and maintain custom APIs for machine learning models to facilitate training and inference.
- Utilize Azure services to build and deploy machine learning pipelines effectively.
- Engage with technical experts to identify opportunities for applying machine learning and analytics.
- Communicate findings and insights clearly to stakeholders at all levels.

Mandatory Skills
- Minimum 5 years of experience in object-oriented programming in Python.
- Proven experience with Azure IaaS services, particularly in building machine learning pipelines using Azure Machine Learning and/or Fabric.
- Strong understanding of software engineering principles, including source control, architecture, and testing methodologies.
- Experience with containers and container management (Docker, Kubernetes).
- Proficient in orchestrating large-scale ML/DL jobs and leveraging modern data platform tooling.
- Experience in designing custom APIs for machine learning models.
- Knowledge of mathematics (linear algebra, probability, statistics) and algorithms.
- Ability to communicate effectively in both oral and written forms.

Preferred Skills
- Experience implementing machine learning frameworks such as MLflow.
- Familiarity with data engineering and transformation tools like Azure Databricks, Spark, and Azure ADF.
- History of working with large-scale model optimization and neural network hyperparameter tuning.
- Experience with unstructured data using Azure Cognitive Services and/or Computer Vision.
- Understanding of enterprise SaaS complexities, including security, scalability, and production support.

Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 7-10 years of relevant experience in AI/ML engineering.
- Strong problem-solving skills and a methodical approach to software design and development.
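The preferred skills mention MLflow. A minimal sketch of the experiment-tracking loop it provides, wrapped around a small scikit-learn model with a hypothetical experiment name and hyperparameters, might look like this:

```python
# Illustrative only: logging a training run's parameters, metrics, and artifact
# with MLflow. Experiment name, data, and hyperparameters are hypothetical.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

mlflow.set_experiment("churn-baseline")            # hypothetical experiment
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_tr, y_tr)
    mlflow.log_param("n_estimators", 200)          # record the configuration...
    mlflow.log_metric("accuracy", model.score(X_te, y_te))  # ...and the result
    mlflow.sklearn.log_model(model, "model")       # version the artifact itself
```

Logging parameters, metrics, and the model together is what makes runs reproducible and comparable, which is the lifecycle-management concern the responsibilities list.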

Posted 3 days ago

Apply

8.0 - 12.0 years

0 Lacs

Goregaon, Maharashtra, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Key Responsibilities:
- Designs, implements, and maintains reliable and scalable data infrastructure.
- Writes, deploys, and maintains software to build, integrate, manage, maintain, and quality-assure data.
- Designs, develops, and delivers large-scale data ingestion, data processing, and data transformation projects on the Azure cloud.
- Mentors and shares knowledge with the team through design reviews, discussions, and prototypes.
- Works with customers to deploy, manage, and audit standard processes for cloud products.
- Adheres to and advocates for software & data engineering standard processes (e.g. technical design and review, unit testing, monitoring, alerting, source control, code review & documentation).
- Deploys secure and well-tested software that meets privacy and compliance requirements; develops, maintains, and improves CI/CD pipelines.
- Service reliability and site-reliability engineering standard processes: on-call rotations for services they maintain; responsible for defining and maintaining SLAs.
- Designs, builds, deploys, and maintains infrastructure as code. Containerizes server deployments.
- Part of a cross-disciplinary team working closely with other data engineers, software engineers, data scientists, data managers, and business partners in a Scrum/Agile setup.

Job Requirements:
Education: Bachelor or higher degree in computer science, engineering, information systems, or other quantitative fields.
Experience: 8 to 12 years of relevant experience.
- Deep and hands-on experience designing, planning, productionizing, maintaining, and documenting reliable and scalable data infrastructure and data products in complex environments.
- Hands-on experience with:
  - Spark for data processing (batch and/or real-time)
  - Configuring Delta Lake on Azure Databricks
  - Languages: SQL, PySpark, Python
  - Cloud platforms: Azure, with Azure Data Factory (must), Azure Data Lake (must), Azure SQL DB (must), Synapse (must), SQL Pools (must), Databricks (good to have)
  - Designing data solutions in Azure, including data distributions and partitions, scalability, cost management, disaster recovery, and high availability
  - Azure DevOps (or similar tools) for source control and building CI/CD pipelines
- Experience designing and implementing large-scale distributed systems.
- Customer management and front-ending, and the ability to lead large organizations through influence.

Desirable Criteria:
- Strong customer management: own the delivery for the Data track with customer stakeholders.
- Continuous learning and improvement attitude.

Key Behaviors:
- Empathetic: cares about our people, our community, and our planet.
- Curious: seeks to explore and excel.
- Creative: imagines the extraordinary.
- Inclusive: brings out the best in each other.

Mandatory Skill Sets ('must have' knowledge, skills, and experiences): Synapse, ADF, Spark, SQL, PySpark, Spark SQL
Preferred Skill Sets ('good to have' knowledge, skills, and experiences): Cosmos DB, data modeling, Databricks, Power BI, experience of having built an analytics solution with SAP as the data source for ingestion pipelines.
Depth: The candidate should have in-depth hands-on experience with end-to-end solution design in Azure Data Lake, ADF pipeline development and debugging, various file formats, Synapse, and Databricks, with excellent coding skills in PySpark and SQL and strong logic-building capabilities. He/she should have sound knowledge of optimizing workloads.

Years of Experience Required: 8 to 12 years of relevant experience
Education Qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above)
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Bachelor of Technology, Master of Business Administration, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Apache Synapse
Optional Skills: Microsoft Power Business Intelligence (BI)
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
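To illustrate the "data distributions and partitions" and workload-optimization expectations above, here is a minimal PySpark sketch of partitioning and compacting a Delta table. Paths and columns are hypothetical, and the OPTIMIZE/ZORDER statement assumes a Databricks runtime:

```python
# Illustrative only: writing a partitioned Delta table, then compacting it.
# All paths and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events_repartition").getOrCreate()

events = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/events/")

# Partition on the column most queries filter by, so reads prune whole directories.
(
    events.write.format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save("/mnt/gold/events/")
)

# Compact small files and cluster within partitions by a second filter column.
spark.sql("OPTIMIZE delta.`/mnt/gold/events/` ZORDER BY (customer_id)")
```

Choosing the partition column is the main cost/performance lever: too fine a partition scheme creates small files, too coarse a scheme forces full scans.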

Posted 3 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Zenoti provides an all-in-one, cloud-based software solution for the beauty and wellness industry. Our solution allows users to seamlessly manage every aspect of the business in a comprehensive mobile solution: online appointment bookings, POS, CRM, employee management, inventory management, built-in marketing programs and more. Zenoti helps clients streamline their systems and reduce costs, while simultaneously improving customer retention and spending. Our platform is engineered for reliability and scale and harnesses the power of enterprise-level technology for businesses of all sizes.

Zenoti powers more than 30,000 salons, spas, medspas and fitness studios in over 50 countries. This includes a vast portfolio of global brands, such as European Wax Center, Hand & Stone, Massage Heights, Rush Hair & Beauty, Sono Bello, Profile by Sanford, Hair Cuttery, CorePower Yoga and TONI&GUY. Our recent accomplishments include surpassing a $1 billion unicorn valuation, being named Next Tech Titan by GeekWire, raising an $80 million investment from TPG, and ranking as the 316th fastest-growing company in North America on Deloitte's 2020 Technology Fast 500™. We are also proud to be recognized as Great Place to Work Certified™ for 2021-2022, which reaffirms our commitment to empowering people to feel good and find their greatness. To learn more about Zenoti visit: https://www.zenoti.com

What will I be doing?
Design, architect, develop and maintain components of Zenoti.
Collaborate with a team of product managers, developers, and quality assurance engineers to define, design and deploy new features and functionality.
Build software that ensures the best possible usability, performance, quality and responsiveness of features.
Work in a team following agile development practices (SCRUM).
Learn to scale your features to handle 2x to 4x growth every year and manage code that has to deal with millions of records and terabytes of data.
Release new features into production every month and get real feedback from thousands of customers to refine your designs.
Be proud of what you work on, and obsess about the quality of your work. Join our team to do the best work of your career.

What skills do I need?
6+ years' experience developing ETL solutions and data pipelines, with expertise in processing trillions of records efficiently.
6+ years' experience with SQL Server, T-SQL, and stored procedures, and a deep understanding of SQL performance tuning for large-scale data processing.
Strong understanding of ETL concepts, data modeling, and data warehousing principles, with hands-on experience building data pipelines using Python.
Extensive experience with big data platforms, including Azure Fabric, Azure Databricks, Azure Data Factory (ADF), Amazon Redshift, Apache Spark, and Delta Lake.
Expert-level SQL skills for complex data transformations, aggregations, and query optimization to handle trillions of records with optimal performance.
Hands-on experience creating data lakehouse architectures and implementing data governance and security best practices across big data platforms.
Strong logical, analytical, and problem-solving skills, with the ability to design and optimize distributed computing clusters for maximum throughput.
Excellent communication skills for cross-functional collaboration, and the ability to work in a fast-paced environment with changing priorities.
Experience with cloud-native data solutions, including Azure Data Lake, Azure Synapse, and containerization technologies (Docker, Kubernetes).
Proven track record of implementing CI/CD pipelines for data engineering workflows, automating data pipeline deployment, and monitoring performance at scale.

Benefits
Attractive compensation.
Comprehensive medical coverage for yourself and your immediate family.
An environment where wellbeing is a high priority: access to regular yoga, meditation, breathwork, nutrition counseling, stress management, and inclusion of family in most benefit awareness sessions.
Opportunities to be part of a community and give back: social activities are part of our culture; you can look forward to regular engagement, social work, and community give-back initiatives.

Zenoti provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
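The combination above, Python pipelines plus T-SQL tuning over very large tables, often comes down to patterns like keyset (seek-based) pagination rather than OFFSET when draining data out of SQL Server. A hedged sketch; the connection details, table and key column are assumptions:

```python
# Sketch: drain a large SQL Server table in batches using keyset pagination.
# Connection string, table and column names are illustrative assumptions.
import os
import pyodbc

BATCH = 50_000

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example.database.windows.net;DATABASE=sales;"
    f"UID=etl_user;PWD={os.environ['SQL_PASSWORD']};Encrypt=yes"
)
cursor = conn.cursor()

last_id = 0
while True:
    # Seeking on the clustered key avoids the ever-growing scans OFFSET incurs.
    rows = cursor.execute(
        "SELECT TOP (?) id, payload FROM dbo.events WHERE id > ? ORDER BY id;",
        BATCH, last_id,
    ).fetchall()
    if not rows:
        break
    last_id = rows[-1].id
    # ... transform and load the batch downstream here ...
```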

Posted 3 days ago

Apply

0.0 years

0 Lacs

Varthur, Bengaluru, Karnataka

On-site

Job Description: Application Developer, Bangalore, Karnataka, India

AXA XL offers risk transfer and risk management solutions to clients globally. We offer worldwide capacity, flexible underwriting solutions, a wide variety of client-focused loss prevention services and a team-based account management approach. AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable, enabling AXA XL's executive leadership team to maximize benefits and facilitate sustained advantage.

What you'll be DOING
What will your essential responsibilities include? We are seeking an experienced ETL Developer to support and evolve our enterprise data integration workflows. The ideal candidate will have deep expertise in Informatica PowerCenter, strong hands-on experience with Azure Data Factory and Databricks, and a passion for building scalable, reliable ETL pipelines. This role is critical for both day-to-day operational reliability and long-term modernization of our data engineering stack in the Azure cloud.

Key Responsibilities:
Maintain, monitor, and troubleshoot existing Informatica PowerCenter ETL workflows to ensure operational reliability and data accuracy.
Enhance and extend ETL processes to support new data sources, updated business logic, and scalability improvements.
Develop and orchestrate PySpark notebooks in Azure Databricks for data transformation, cleansing, and enrichment.
Configure and manage Databricks clusters for performance optimization and cost efficiency.
Implement Delta Lake solutions that support ACID compliance, versioning, and time travel for reliable data lake operations.
Automate data workflows using Databricks Jobs and Azure Data Factory (ADF) pipelines.
Design and manage scalable ADF pipelines, including parameterized workflows and reusable integration patterns.
Integrate with Azure Blob Storage and ADLS Gen2 using Spark APIs for high-performance data ingestion and output.
Ensure data quality, consistency, and governance across legacy and cloud-based pipelines.
Collaborate with data analysts, engineers, and business teams to deliver clean, validated data for reporting and analytics.
Participate in the full Software Development Life Cycle (SDLC) from design through deployment, with an emphasis on maintainability and audit readiness.
Develop maintainable and efficient ETL logic and scripts following best practices in security and performance.
Troubleshoot pipeline issues across data infrastructure layers, identifying and resolving root causes to maintain reliability.
Create and maintain clear documentation of technical designs, workflows, and data processing logic for long-term maintainability and knowledge sharing.
Stay informed on emerging cloud and data engineering technologies to recommend improvements and drive innovation.
Follow internal controls, audit protocols, and secure data handling procedures to support compliance and operational standards.
Provide accurate time and effort estimates for assigned development tasks, accounting for complexity and risk.
What you will BRING
We're looking for someone who has these abilities and skills:
Advanced experience with Informatica PowerCenter, including mappings, workflows, session tuning, and parameterization.
Expertise in Azure Databricks and PySpark, including: notebook development; cluster configuration and tuning; Delta Lake (ACID, versioning, time travel); job orchestration via Databricks Jobs or ADF; and integration with Azure Blob Storage and ADLS Gen2 using Spark APIs.
Strong hands-on experience with Azure Data Factory: building and managing pipelines; parameterization and dynamic datasets; notebook integration and pipeline monitoring.
Proficiency in SQL, PL/SQL, and scripting languages such as Python, Bash, or PowerShell.
Strong understanding of data warehousing, dimensional modeling, and data profiling.
Familiarity with Git, CI/CD pipelines, and modern DevOps practices.
Working knowledge of data governance, audit trails, metadata management, and compliance standards such as HIPAA and GDPR.
Effective problem-solving and troubleshooting skills, with the ability to resolve performance bottlenecks and job failures.
Awareness of Azure Functions, App Services, API Management, and Application Insights.
Understanding of Azure Key Vault for secrets and credential management.
Familiarity with Spark-based big data ecosystems (e.g., Hive, Kafka) is a plus.

Who WE are
AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don't just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business: property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What we OFFER
Inclusion
AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture enables business growth and is critical to our success. That's why we have made a strategic commitment to attract, develop, advance and retain the most inclusive workforce possible, and create a culture where everyone can bring their full selves to work and reach their highest potential. It's about helping one another, and our business, to move forward and succeed.
Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion, with 20 chapters around the globe.
Robust support for Flexible Working Arrangements.
Enhanced family-friendly leave benefits.
Named to the Diversity Best Practices Index.
Signatory to the UK Women in Finance Charter.
Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards
AXA XL's Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We're committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.
Sustainability
At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called "Roots of resilience", focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars:
Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems, the foundation of a sustainable planet and society, are essential to our future. We're committed to protecting and restoring nature, from mangrove forests to the bees in our backyard, by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.
Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.
Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We're training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.
AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL's "Hearts in Action" programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day, the Global Day of Giving.

For more information, please see axaxl.com/sustainability.
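The Delta Lake responsibilities listed for this role (ACID compliance, versioning, time travel) map directly onto the Delta APIs. A hedged sketch of an upsert plus a versioned read; the table paths and the policy_id join key are illustrative assumptions:

```python
# Sketch: ACID upsert into a Delta table, then a time-travel read of it.
# Paths and the policy_id join key are illustrative assumptions.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("delta-merge-sketch").getOrCreate()

updates = spark.read.parquet("abfss://staging@examplelake.dfs.core.windows.net/policies/")
target = DeltaTable.forPath(spark, "abfss://curated@examplelake.dfs.core.windows.net/policies/")

(target.alias("t")
 .merge(updates.alias("s"), "t.policy_id = s.policy_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())  # atomic: readers never observe a half-applied merge

# Versioning / time travel: reread the table as of an earlier version for audit.
as_of = (spark.read.format("delta")
         .option("versionAsOf", 0)  # or .option("timestampAsOf", "2024-01-01")
         .load("abfss://curated@examplelake.dfs.core.windows.net/policies/"))
```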

Posted 3 days ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Job description: 8+ years of experience in the IT industry, with strong .NET/.NET Core, Azure Cloud Services, and Azure DevOps skills. This is a client-facing role, so strong communication skills are essential. The client is US-based, and the resource should have hands-on experience in coding and Azure Cloud.

Working hours: 8 hours, with a mandatory 4-hour overlap with the EST time zone (12 PM to 9 PM), as meetings happen during this window.

Responsibilities:
Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery.
Integrate and support third-party APIs and external services.
Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack.
Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC).
Participate in Agile/Scrum ceremonies and manage tasks using Jira.
Understand technical priorities, architectural dependencies, risks, and implementation challenges.
Troubleshoot, debug, and optimize existing solutions with a strong focus on performance.

Skills:
8+ years of hands-on development experience with C#, .NET Core 6/8+, Entity Framework / EF Core, JavaScript, jQuery, and REST APIs.
Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types.
Skilled in unit testing with XUnit and MSTest.
Strong in software design patterns, system architecture, and scalable solution design.
Ability to lead and inspire teams through clear communication, technical mentorship, and ownership.
Strong problem-solving and debugging capabilities.
Ability to write reusable, testable, and efficient code.
Develop and maintain frameworks and shared libraries to support large-scale applications.
Excellent technical documentation, communication, and leadership skills.
Microservices and Service-Oriented Architecture (SOA).
Experience in API integrations.
2+ years of hands-on experience with Azure Cloud Services, including Azure Functions, Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, and Application Insights.

Skills (good to have):
Familiarity with AngularJS, ReactJS, and other front-end frameworks.
Experience with Azure API Management (APIM).
Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes).
Experience with Azure Data Factory (ADF) and Logic Apps.
Exposure to application support and operational monitoring.
Azure DevOps CI/CD pipelines (Classic / YAML).
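Several of the Azure services this role lists, Key Vault among them, have thin SDK surfaces. The posting itself is .NET-centric; purely as a language-neutral illustration, here is the same pattern via the Python SDK, with the vault URL and secret name as assumptions:

```python
# Sketch: fetch a connection string from Azure Key Vault using managed identity.
# The vault URL and secret name are illustrative assumptions.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(
    vault_url="https://example-vault.vault.azure.net",
    credential=DefaultAzureCredential(),  # works locally (az login) and in Azure
)

secret = client.get_secret("sql-connection-string")
print(secret.name)  # never log secret.value; pass it straight to the consumer
```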

Posted 3 days ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

You are an experienced Integration Developer who will design, develop, and deploy integrations using Azure Service Bus, Logic Apps, Power Automate, Power Apps, Azure Functions, and ADF to integrate third-party applications. Your background should demonstrate strong familiarity with Azure services and substantial experience in integration development.

In this role, you will be expected to have:
- 6+ years of experience in an Integration Developer role, focusing on designing, developing, and deploying integrations using Azure Service Bus, Logic Apps, Power Automate, Power Apps, Azure Functions, and ADF for third-party application integration.
- Proficiency and hands-on experience with Azure Functions, Logic Apps, and Azure Service Bus.
- Previous exposure to BizTalk integration.
- Sound knowledge of Azure services and a proven track record in integration development.
- Excellent communication and collaboration skills to work effectively with team members and stakeholders.
- Ability to work independently and contribute effectively as part of a team.

If you meet these qualifications and are excited about the prospect of working with cutting-edge technologies in a dynamic environment, we encourage you to send your resume along with a brief introduction to prasanth.bhavanari@valuelabs.com. We look forward to hearing from you soon!
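Of the integration services named here, Azure Service Bus is the most code-visible. A hedged sketch of a send/receive round trip with the Python SDK; the namespace connection string, queue name, and payload are assumptions:

```python
# Sketch: publish and receive an integration message via Azure Service Bus.
# Connection string, queue name and payload are illustrative assumptions.
import json
from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONN = "Endpoint=sb://example.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."

with ServiceBusClient.from_connection_string(CONN) as client:
    with client.get_queue_sender("orders") as sender:
        sender.send_messages(ServiceBusMessage(json.dumps({"order_id": 42})))

    with client.get_queue_receiver("orders", max_wait_time=5) as receiver:
        for msg in receiver:
            print(json.loads(str(msg)))
            receiver.complete_message(msg)  # settle so the queue does not redeliver
```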

Posted 3 days ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Lead Azure Data Engineer at CGI, you will have the opportunity to be part of a dynamic team of builders who are dedicated to helping clients succeed. With our global resources, expertise, and stability, we aim to achieve results for our clients and members. If you are looking for a challenging role that offers professional growth and development, this is the perfect opportunity for you.

In this role, you will be responsible for supporting the development and maintenance of our trading and risk data platform. Your main focus will be on designing and building data foundations and end-to-end solutions to maximize the value from data. You will collaborate with other data professionals to integrate and enrich trade data from various ETRM systems and create scalable solutions to enhance the usage of TRM data across different platforms and teams.

Key Responsibilities:
- Implement and manage the lakehouse using Databricks and the Azure tech stack (ADLS Gen2, ADF, Azure SQL).
- Utilize SQL, Python, Apache Spark, and Delta Lake for data engineering tasks.
- Implement data integration techniques, ETL processes, and data pipeline architectures.
- Develop CI/CD pipelines for code management using Git.
- Create and maintain technical documentation for the platform.
- Ensure the platform is developed following software engineering, data analytics, and data security best practices.
- Optimize data processing and storage systems for high performance, reliability, and security.
- Work in Agile methodology and utilize ADO Boards for sprint deliveries.
- Demonstrate excellent communication skills to convey technical and business concepts effectively.
- Collaborate with team members at all levels to share ideas and knowledge effectively.

Required Qualifications:
- Bachelor's degree in computer science or a related field.
- 6 to 10 years of experience in software development/engineering.
- Proficiency in Azure technologies, including Databricks, ADLS Gen2, ADF, and Azure SQL.
- Strong hands-on experience with SQL, Python, Apache Spark, and Delta Lake.
- Knowledge of data integration techniques, ETL processes, and data pipeline architectures.
- Experience in building CI/CD pipelines and using Git for code management.
- Familiarity with Agile methodology and ADO Boards for sprint deliveries.

At CGI, we believe in ownership, teamwork, respect, and belonging. As a CGI Partner, you will have the opportunity to turn meaningful insights into action, develop innovative solutions, and collaborate with a diverse team to shape your career and contribute to our collective success. Join us on this exciting journey of growth and innovation at one of the largest IT and business consulting services firms in the world.
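For the trade-data enrichment this platform role describes, the day-to-day work often looks like joining curated Delta tables in PySpark. A small sketch under assumed mount paths, schemas, and join key:

```python
# Sketch: enrich trade records with book reference data in the lakehouse.
# Mount paths, schemas and the book_id join key are illustrative assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("trade-enrich-sketch").getOrCreate()

trades = spark.read.format("delta").load("/mnt/curated/trades")
books = spark.read.format("delta").load("/mnt/reference/books")

enriched = (trades
            .join(books, "book_id", "left")  # keep trades even with no book match
            .withColumnRenamed("desk", "trading_desk"))

(enriched.write
 .format("delta")
 .mode("append")
 .save("/mnt/serving/trades_enriched"))
```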

Posted 3 days ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

We are seeking a talented and experienced technical lead to join our team and oversee our data engineering projects. The ideal candidate will be based in Bangalore or Hyderabad and have 7-12 years of experience. As a technical lead, your responsibilities will include designing the technical architecture and developing and maintaining data pipelines, data warehouses, data models, and reports using Microsoft Azure and Power BI. You will be in charge of managing project scope, timeline, and deliverables. Additionally, you will mentor and guide team members, collaborate with stakeholders, and ensure the quality and performance of our data solutions.

To excel in this role, you should possess a bachelor's degree in computer science, engineering, or a related field, along with a minimum of 8 years of experience in data engineering, data analysis, and business intelligence. Proficiency in COSMOS (configurations and governance), Power Automate flows, Azure DevOps, MS Forms, Power BI, Azure Data Factory, and Azure SQL is essential. Experience in leading and managing technical teams and projects using agile methodology is preferred. Strong communication, problem-solving, and analytical skills are required for this position, as is the willingness to work in shifts.

Posted 3 days ago

Apply

7.0 - 11.0 years

0 Lacs

Chandigarh

On-site

As a Senior Azure Data Engineer at iO Associates in Mohali, you will be responsible for building and optimizing data pipelines, supporting data integration across systems, and enhancing the Azure-based Enterprise Data Platform (EDP). The company leads the real estate sector, with headquarters in Mohali and offices in the US and over 17 other countries.

Your key responsibilities will include building and enhancing the Azure-based EDP using modern tools such as Databricks, Synapse, ADF, and ADLS Gen2. You will develop and maintain ETL pipelines, collaborate with teams to deliver efficient data solutions, create data products for enterprise-wide use, mentor team members, promote code reusability, and contribute to documentation, reviews, and architecture planning.

To excel in this role, you should have at least 7 years of experience in data engineering, with expertise in Databricks, Python, Scala, Azure Synapse, and ADF. You should have a proven track record of building and managing ETL/data pipelines across various sources and formats, along with strong skills in data modeling, warehousing, and CI/CD practices.

This is an excellent opportunity to join a company that values your growth, emphasizes work-life balance, and recognizes your contributions. If you are interested in this position, please email us at [Email Address].
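ETL pipelines on an Azure EDP like the one described are typically orchestrated in ADF. A hedged sketch of starting a parameterized pipeline run through the Python management SDK; the subscription ID, resource group, factory and pipeline names are all assumptions:

```python
# Sketch: trigger a parameterized ADF pipeline run programmatically.
# Subscription ID, resource group, factory and pipeline names are assumptions.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="00000000-0000-0000-0000-000000000000",
)

run = client.pipelines.create_run(
    resource_group_name="rg-data-platform",
    factory_name="adf-edp",
    pipeline_name="pl_ingest_daily",
    parameters={"run_date": "2024-01-01"},
)
print(run.run_id)  # poll client.pipeline_runs.get(...) with this id for status
```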

Posted 3 days ago

Apply

6.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About ValGenesis
ValGenesis is a leading digital validation platform provider for life sciences companies. The ValGenesis suite of products is used by 30 of the top 50 global pharmaceutical and biotech companies to achieve digital transformation, total compliance, and manufacturing excellence/intelligence across their product lifecycle. Learn more about working for ValGenesis, the de facto standard for paperless validation in Life Sciences: https://www.youtube.com/watch?v=tASq7Ld0JsQ

About the Role:
We are looking for an experienced database developer to join our engineering team and build enterprise applications for our global customers. If you are a technology enthusiast with a passion for developing enterprise cloud products with quality, security, and performance, we are eager to discuss the potential role with you.

Responsibilities
Database Development
Utilize expertise in MS SQL Server and PostgreSQL to design and develop efficient and scalable database solutions.
Collaborate with development stakeholders to understand and implement database requirements.
Write and optimize complex SQL queries, stored procedures, and functions.
Perform database tuning and server configuration.
Knowledge of both cloud and on-premises databases.
Knowledge of SaaS-based application development.

ETL Integration
Leverage experience with ETL tools such as ADF and SSIS to facilitate seamless data migration.
Design and implement data extraction, transformation, and loading processes.
Ensure data integrity during the ETL process and troubleshoot any issues that may arise.

Reporting
Develop and maintain SSRS reports based on customer needs.
Collaborate with stakeholders to understand reporting requirements and implement effective solutions.

Performance Tuning
Analyze database performance using Dynatrace, New Relic, or similar tools.
Analyze query performance and implement tuning strategies to optimize database performance.
Conduct impact analysis and resolve production issues within specified SLAs.

Version Control and Collaboration
Utilize Git and SVN for version control of database scripts and configurations.
Collaborate with cross-functional teams using tools such as JIRA for story mapping, tracking, and issue resolution.

Documentation
Document database architecture, processes, and configurations.
Provide detailed RCA (Root Cause Analysis) for any database-related issues.

Requirements
6-9 years of hands-on experience in software development.
Extensive experience in stored procedure development and performance fine-tuning.
Proficiency in SQL, MS SQL Server, SSRS, and SSIS.
Working knowledge of C# ASP.NET web application development.
Ability to grasp new concepts and facilitate continuous learning.
Strong sense of responsibility and accountability.

We're on a Mission
In 2005, we disrupted the life sciences industry by introducing the world's first digital validation lifecycle management system. ValGenesis VLMS® revolutionized compliance-based corporate validation activities and has remained the industry standard. Today, we continue to push the boundaries of innovation, enhancing and expanding our portfolio beyond validation with an end-to-end digital transformation platform. We combine our purpose-built systems with world-class consulting services to help every facet of GxP meet evolving regulations and quality expectations.

The Team You'll Join
Our customers' success is our success. We keep the customer experience centered in our decisions, from product to marketing to sales to services to support.
Life sciences companies exist to improve humanity's quality of life, and we honor that mission.
We work together. We communicate openly, support each other without reservation, and never hesitate to wear multiple hats to get the job done.
We think big. Innovation is the heart of ValGenesis. That spirit drives product development as well as personal growth. We never stop aiming upward.
We're in it to win it. We're on a path to becoming the number one intelligent validation platform in the market, and we won't settle for anything less than being a market leader.

How We Work
Our Chennai, Hyderabad and Bangalore offices are onsite, 5 days per week. We believe that in-person interaction and collaboration foster creativity and a sense of community, and are critical to our future success as a company.

ValGenesis is an equal-opportunity employer that makes employment decisions on the basis of merit. Our goal is to have the best-qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristics protected by local law.
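The stored-procedure and tuning focus above is easy to exercise from Python as well. A hedged sketch of invoking and timing a procedure over ODBC; the server, database, procedure and parameter values are illustrative assumptions:

```python
# Sketch: execute and time a SQL Server stored procedure from Python.
# Server, database, procedure and parameter values are illustrative assumptions.
import time
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlserver.example.internal;DATABASE=vlms;"
    "Trusted_Connection=yes;Encrypt=yes"
)
cursor = conn.cursor()

start = time.perf_counter()
cursor.execute("{CALL dbo.usp_refresh_validation_summary (?)}", "2024-01")
conn.commit()  # the proc writes; commit before measuring end-to-end latency
print(f"procedure completed in {time.perf_counter() - start:.2f}s")
```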

Posted 4 days ago

Apply

6.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About ValGenesis
ValGenesis is a leading digital validation platform provider for life sciences companies. The ValGenesis suite of products is used by 30 of the top 50 global pharmaceutical and biotech companies to achieve digital transformation, total compliance, and manufacturing excellence/intelligence across their product lifecycle. Learn more about working for ValGenesis, the de facto standard for paperless validation in Life Sciences: https://www.youtube.com/watch?v=tASq7Ld0JsQ

About the Role:
We are looking for an experienced database developer to join our engineering team and build enterprise applications for our global customers. If you are a technology enthusiast with a passion for developing enterprise cloud products with quality, security, and performance, we are eager to discuss the potential role with you.

Responsibilities
Database Development
Utilize expertise in MS SQL Server and PostgreSQL to design and develop efficient and scalable database solutions.
Collaborate with development stakeholders to understand and implement database requirements.
Write and optimize complex SQL queries, stored procedures, and functions.
Perform database tuning and server configuration.
Knowledge of both cloud and on-premises databases.
Knowledge of SaaS-based application development.

ETL Integration
Leverage experience with ETL tools such as ADF and SSIS to facilitate seamless data migration.
Design and implement data extraction, transformation, and loading processes.
Ensure data integrity during the ETL process and troubleshoot any issues that may arise.

Reporting
Develop and maintain SSRS reports based on customer needs.
Collaborate with stakeholders to understand reporting requirements and implement effective solutions.

Performance Tuning
Analyze database performance using Dynatrace, New Relic, or similar tools.
Analyze query performance and implement tuning strategies to optimize database performance.
Conduct impact analysis and resolve production issues within specified SLAs.

Version Control and Collaboration
Utilize Git and SVN for version control of database scripts and configurations.
Collaborate with cross-functional teams using tools such as JIRA for story mapping, tracking, and issue resolution.

Documentation
Document database architecture, processes, and configurations.
Provide detailed RCA (Root Cause Analysis) for any database-related issues.

Requirements
6-9 years of hands-on experience in software development.
Extensive experience in stored procedure development and performance fine-tuning.
Proficiency in SQL, MS SQL Server, SSRS, and SSIS.
Working knowledge of C# ASP.NET web application development.
Ability to grasp new concepts and facilitate continuous learning.
Strong sense of responsibility and accountability.

We're on a Mission
In 2005, we disrupted the life sciences industry by introducing the world's first digital validation lifecycle management system. ValGenesis VLMS® revolutionized compliance-based corporate validation activities and has remained the industry standard. Today, we continue to push the boundaries of innovation, enhancing and expanding our portfolio beyond validation with an end-to-end digital transformation platform. We combine our purpose-built systems with world-class consulting services to help every facet of GxP meet evolving regulations and quality expectations.

The Team You'll Join
Our customers' success is our success. We keep the customer experience centered in our decisions, from product to marketing to sales to services to support.
Life sciences companies exist to improve humanity's quality of life, and we honor that mission.
We work together. We communicate openly, support each other without reservation, and never hesitate to wear multiple hats to get the job done.
We think big. Innovation is the heart of ValGenesis. That spirit drives product development as well as personal growth. We never stop aiming upward.
We're in it to win it. We're on a path to becoming the number one intelligent validation platform in the market, and we won't settle for anything less than being a market leader.

How We Work
Our Chennai, Hyderabad and Bangalore offices are onsite, 5 days per week. We believe that in-person interaction and collaboration foster creativity and a sense of community, and are critical to our future success as a company.

ValGenesis is an equal-opportunity employer that makes employment decisions on the basis of merit. Our goal is to have the best-qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristics protected by local law.

Posted 4 days ago

Apply