
1640 ADF Jobs - Page 5

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 years

6 - 8 Lacs

Bengaluru

On-site

DATA ENGINEER II
3+ years of experience in building data pipelines with Python/PySpark
Professional experience in the Azure ETL stack (e.g., ADLS, ADF, ADB, ASQL/Synapse)
3+ years of experience with SQL
Proficient understanding of code versioning tools such as Git and PM tools like Jira
Exposure to CI/CD practices and monitoring/debugging of data workflows
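As an illustration of the kind of PySpark pipeline this role describes, here is a minimal sketch: read a raw CSV feed from ADLS Gen2, apply basic cleanup, and land partitioned Parquet in a curated zone. The storage account, container names, paths, and columns are hypothetical placeholders, not anything from the posting.

```python
# Minimal PySpark batch pipeline sketch (illustrative paths and columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Read a raw CSV feed from an ADLS Gen2 container (hypothetical account).
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/2025-07-01/"))

cleaned = (raw
           .dropDuplicates(["order_id"])                # drop duplicate events
           .filter(F.col("order_amount") > 0)           # discard invalid rows
           .withColumn("load_date", F.current_date()))  # audit column

# Land partitioned Parquet in the curated zone.
(cleaned.write
 .mode("overwrite")
 .partitionBy("load_date")
 .parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```

In an ADF-orchestrated setup, a job like this would typically run as a Databricks notebook or job activity triggered on the pipeline schedule.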

Posted 5 days ago

Apply

5.0 years

0 Lacs

Bengaluru

On-site

Job Requirements

Role Summary: We are looking for an energetic, self-motivated, and exceptional Database/ETL Tester to work on extraordinary enterprise products based on AI and Big Data engineering. You will work with a star team of Architects, Data Scientists/AI Specialists, Data Engineers, and Integration specialists.

Experience Level: 5+ years

Responsibilities:
Test SQL Server transaction processing and Business Intelligence solutions in an Agile development environment.
Ensure data is consistent and accurate across all layers.
Independently investigate data anomalies or inaccuracy patterns to solve data problems and ensure that all data is up to date.
Develop and maintain test plans and create reusable test cases.
Perform data analysis and create test data.
Execute test scripts manually and via automation.
Analyze and document defects found during test execution using JIRA.
Assist with writing and performing data audits.

Work Experience Requirements:
Hands-on testing of database elements modeled with SQL Server.
Experience in an Azure environment (Azure SQL, ADF, Blob Storage, etc.) is preferred.
Strong at validating data completeness and correctness between the data feeds and the data lake (see the sketch below).
Hands-on DB testing with SQL scripting experience.
Strong knowledge of ETL processes.
Exposure to AWS is a plus (S3, RDS-MSSQL, PostgreSQL, etc.).
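Since the posting emphasizes validating completeness and correctness between feeds and the data lake, here is a minimal sketch of such a check in PySpark. The mount paths, dataset, and key column are hypothetical placeholders.

```python
# Sketch: feed-vs-lake completeness and key-level correctness check.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl_validation").getOrCreate()

source = spark.read.parquet("/mnt/feeds/customers/")  # incoming feed
target = spark.read.parquet("/mnt/lake/customers/")   # data lake copy

# Completeness: row counts should match.
src_count, tgt_count = source.count(), target.count()
assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"

# Correctness: no business keys missing on either side.
missing_in_target = source.select("customer_id").subtract(target.select("customer_id"))
unexpected_in_target = target.select("customer_id").subtract(source.select("customer_id"))
assert missing_in_target.count() == 0, "Keys in feed but missing from lake"
assert unexpected_in_target.count() == 0, "Keys in lake not present in feed"
```

In practice a tester would log these results to JIRA-linked reports rather than hard assertions, but the comparison logic is the same.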

Posted 5 days ago

Apply

3.0 years

2 - 3 Lacs

Bengaluru

On-site

We’re looking for a dedicated Print Support Engineer to provide on-ground technical support for enterprise-grade printing solutions across client locations in and around Bangalore. The ideal candidate will ensure optimal performance of print assets and deliver timely resolutions to enhance client productivity.

Key Responsibilities:
Visit client locations across Bangalore to provide technical support for multifunctional print devices.
Troubleshoot issues related to ADF, RADF, DADF, SPDF, and networked print environments.
Conduct routine maintenance, diagnostics, and firmware updates.
Collaborate with internal teams and escalate issues when needed.
Document service activities and maintain accurate asset records.
Educate clients on best practices for eco-friendly and efficient printing.

Job Type: Full-time
Pay: ₹240,000.00 - ₹300,000.00 per year
Benefits: Cell phone reimbursement, health insurance, internet reimbursement
Experience: Desktop support: 3 years (Required)
Work Location: In person
Expected Start Date: 01/08/2025

Posted 5 days ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Hiring for one of our clients.

Job Summary: We are looking for an experienced and detail-oriented DAX Modeler with strong expertise in Power BI and Azure data services to join our team as an Individual Contributor. The ideal candidate will have a proven track record of designing and implementing robust data models using DAX and working with large-scale data environments on the Azure platform.

Key Responsibilities:
Develop and maintain complex DAX models in Power BI, ensuring optimal performance and accuracy.
Design and implement tabular data models using Azure Analysis Services (AAS).
Work on end-to-end Power BI development, including data modeling, report building, and dashboard creation.
Handle large-scale data (terabyte-level) from various sources and optimize data pipelines for performance.
Collaborate with data engineers, architects, and business users to ensure seamless data flow and transformation.
Use Azure Data Factory (ADF) and other Azure services to support ETL processes and data orchestration.
Contribute to data warehousing strategies and ensure scalable data architecture.
Ensure data quality and adherence to governance and compliance standards.

Required Skill Set:
Minimum 6 years of hands-on experience in DAX and Power BI.
Strong expertise in DAX modeling and tabular model design.
Proficiency in SQL and understanding of ETL processes.
Solid experience in Azure Analysis Services (AAS) and Azure Data Factory (ADF).
Familiarity with data engineering practices and data warehousing concepts.
Experience handling large-scale datasets (terabyte size).
Excellent analytical and problem-solving skills.
Ability to work independently and manage priorities in a fast-paced environment.

Interested candidates, kindly forward your resume to swetha.s@thompsonshr.com

Posted 5 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Company Description: Vichara is a financial-services-focused products and services firm headquartered in NY, building systems for some of the largest investment banks and hedge funds in the world.

Job Description:
Design, build, and manage scalable ELT/ETL pipelines using Snowflake, AWS Glue, S3, and other AWS services.
Write and optimize complex SQL queries, stored procedures, and data transformation logic.
Support and improve existing data processes, and participate in continuous performance tuning.
Implement data quality checks and monitoring to ensure data accuracy, consistency, and reliability.

Qualifications:
At least 3 years’ experience as a Data Engineer, including development and maintenance support.
Hands-on development experience in Python and SQL.
SQL tuning: proficient in SQL and able to use query metrics to evaluate performance and tune SQL to improve query run time (see the sketch below).
Experience with AWS services (S3, Glue, EMR, Lambda, CloudWatch, etc.) or Azure (ADF, etc.).
Strong expertise in Snowflake, including query optimization, Snowpipe, and data modeling.
Experience in SSIS and Fabric is a plus.

Additional Information:
Compensation: 35-50 lakhs p.a.
Benefits: Extended health care, dental care, life insurance
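The posting calls out using query metrics to tune SQL. Below is a hedged sketch of that workflow against Snowflake, using the snowflake-connector-python package and the standard INFORMATION_SCHEMA.QUERY_HISTORY table function; all connection parameters are placeholders.

```python
# Sketch: surface the slowest recent Snowflake queries for tuning review.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",    # placeholder
    user="example_user",          # placeholder
    password="example_password",  # placeholder
    warehouse="ANALYTICS_WH",     # placeholder
)
cur = conn.cursor()
# QUERY_HISTORY exposes elapsed time and bytes scanned per query.
cur.execute("""
    SELECT query_id, total_elapsed_time, bytes_scanned, query_text
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
""")
for query_id, elapsed_ms, bytes_scanned, text in cur.fetchall():
    print(f"{query_id}: {elapsed_ms} ms, {bytes_scanned} bytes -> {text[:80]}")
conn.close()
```

Queries that scan disproportionate bytes for their result size are the usual first candidates for clustering keys or a rewrite.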

Posted 5 days ago

Apply

2.0 - 5.0 years

0 Lacs

India

Remote

Data Engineer (Remote)

Experience Required: 2 to 5 years
Location: Remote
Budget: ~1.15 lakh per month (for 4–5 years of experience); ~1.00 lakh per month (for 2 years of experience)

Key Skills: Azure Data Factory (ADF), Azure Databricks, PySpark

Job Description: We are looking for a skilled Data Engineer with 2–5 years of experience to join our remote team. The ideal candidate should have hands-on experience working with Azure Data Factory, Azure Databricks, and PySpark, and must be capable of building scalable and efficient data pipelines.

Roles & Responsibilities:
Design and build data pipelines using ADF and Databricks (see the sketch below).
Optimize ETL processes for performance and scalability.
Work with cross-functional teams to gather requirements and deliver data solutions.
Ensure data quality and implement data governance practices.
Troubleshoot and debug pipeline issues.
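One pattern that comes up constantly in ADF-plus-Databricks pipelines like those described here is incremental loading against a watermark. The sketch below is illustrative only: the bookkeeping dataset, paths, and columns are invented, and it assumes the watermark store has been seeded.

```python
# Sketch: incremental (watermark-based) load in PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("incremental_load").getOrCreate()

# Read the last high-water mark from a tiny bookkeeping dataset
# (assumed to exist and contain at least one row).
watermark = spark.read.parquet("/mnt/meta/watermarks/sales/")
last_loaded = watermark.agg(F.max("loaded_until")).first()[0]

# Only process rows newer than the watermark.
incoming = spark.read.parquet("/mnt/raw/sales/")
new_rows = incoming.filter(F.col("updated_at") > F.lit(last_loaded))

if new_rows.count() > 0:
    new_rows.write.mode("append").parquet("/mnt/curated/sales/")
    # Advance the watermark for the next scheduled run.
    (new_rows.agg(F.max("updated_at").alias("loaded_until"))
     .write.mode("overwrite").parquet("/mnt/meta/watermarks/sales/"))
```

ADF would typically supply the schedule and run parameters; Databricks executes the transformation.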

Posted 5 days ago

Apply

10.0 years

0 Lacs

India

Remote

🚀 We’re Hiring: Senior Data Engineer (Remote – India | Full-Time or Contract)

💼 Position: Senior Data Engineer
🌍 Location: Remote (India)
📅 Type: Full-Time / Contract
📊 Experience: 10+ years (willing to work U.S.-overlapping hours)

We are helping our client hire a Senior Data Engineer with more than 10 years of experience in modern data platforms. This is a remote role open across India, available on both a full-time and a contract basis.

🔧 Must-Have Skills:
Data Engineering, Data Warehousing, ETL
Azure Databricks
PySpark, SparkSQL
Python, SQL

👀 What We’re Looking For: We are hiring for two different positions:
Lead Developer (PySpark, Azure Databricks)
Databricks Admin
A strong background in building and managing data pipelines.
Hands-on experience in cloud platforms, especially Azure.
Ability to work independently and collaborate in distributed teams.

📩 How to Apply: Please send your resume to [your email] with the subject line: "Senior Data Engineer – Remote India"

⚠️ Along with your resume, kindly include the following details: Full Name; Mobile Number; Total Experience; Relevant Experience; Current CTC; Expected CTC; Notice Period; Current Location; Are you fine with contract, full-time, or both?; Willing to work IST/US overlapping hours: Yes/No; Do you have a PF account? (Yes/No)

🔔 Follow our company page to stay updated on future job openings!

#DataEngineer #AzureDatabricks #ADF #PySpark #SQL #RemoteJobsIndia #HiringNow #ContractJobs #IndiaJobs

Posted 5 days ago

Apply

6.0 years

0 Lacs

Delhi, India

On-site

Job Summary: We are looking for a Tech Lead – Data Engineering with 6+ years of hands-on experience in designing and building robust data pipelines and architectures on the Azure cloud platform. The ideal candidate should have strong technical expertise in Azure Data Factory (ADF), Synapse Analytics, and Databricks, with solid coding skills in PySpark and SQL. Experience with Data Mesh architecture and Microsoft Fabric is highly preferred. You will play a key role in end-to-end solutioning, leading data engineering teams, and delivering scalable, high-performance data solutions.

Key Responsibilities:
· Lead and mentor a team of data engineers across projects and ensure high-quality delivery.
· Design, build, and optimize large-scale data pipelines and data integration workflows using ADF and Synapse Analytics.
· Architect and implement scalable data solutions on Azure cloud, including Databricks and Microsoft Fabric.
· Write efficient and maintainable code using PySpark and SQL for data transformations and processing (a brief sketch follows this listing).
· Collaborate with data architects, analysts, and business stakeholders to define data strategies and requirements.
· Implement and advocate for Data Mesh principles within the organization.
· Provide architectural guidance and perform solutioning for new and existing data projects on Azure.
· Ensure data quality, governance, and security best practices are followed.
· Stay updated with evolving Azure services and data technologies.

Required Skills & Experience:
· 6+ years of professional experience in data engineering and solution architecture.
· Expertise in Azure Data Factory (ADF) and Azure Synapse Analytics.
· Strong hands-on experience with Databricks, PySpark, and advanced SQL.
· Good knowledge of Microsoft Fabric and its use cases.
· Deep understanding of Azure cloud services related to data storage, processing, and integration.
· Familiarity with Data Mesh architecture and distributed data product ownership.
· Strong problem-solving and debugging skills.
· Excellent communication and stakeholder management abilities.

Good to Have:
· Experience with CI/CD pipelines for data solutions.
· Knowledge of data security and compliance practices on Azure.
· Certification in Azure Data Engineering or Solution Architecture.
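As referenced above, here is a brief hypothetical sketch of the PySpark-plus-SQL transformation work the role describes: stage a DataFrame as a temp view and express the business logic in SQL, which tends to keep complex aggregations reviewable. Paths, view, and column names are illustrative.

```python
# Sketch: PySpark + Spark SQL transformation.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sales_summary").getOrCreate()

orders = spark.read.parquet("/mnt/curated/orders/")
orders.createOrReplaceTempView("orders")

# Daily revenue per region, expressed in SQL for maintainability.
summary = spark.sql("""
    SELECT region,
           CAST(order_ts AS DATE) AS order_date,
           SUM(amount)            AS daily_revenue
    FROM orders
    GROUP BY region, CAST(order_ts AS DATE)
""")

summary.write.mode("overwrite").parquet("/mnt/gold/sales_summary/")
```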

Posted 5 days ago

Apply

7.0 - 8.0 years

0 Lacs

India

Remote

Hiring Azure Developers. Remote opportunity. We need someone with 7 to 8 years of experience whose career has been spent throughout in Azure development: Azure Functions, design, infrastructure, storage, and Service Bus. Primary coding is in C# on current projects.

Notice period: immediate joiners only.

Requirements - hands-on experience with the technologies below:
MS Azure (Key Vault, Service Bus, App Services [Web App, Logic App, Mobile App], Azure Functions, Chat Bot Service, Relays, Databases, Blob Storage)
APIs (REST, RPC, SOAP), Web Services
MS SQL Server (specifically database design and development)
Agile Software Development Lifecycle

Posted 5 days ago

Apply

6.0 years

0 Lacs

India

On-site

Responsibilities:
Design and develop data pipelines and ETL processes.
Collaborate with data scientists and analysts to understand data needs.
Maintain and optimize data warehousing solutions.
Ensure data quality and integrity throughout the data lifecycle.
Develop and implement data validation and cleansing routines (see the sketch below).
Work with large datasets from various sources.
Automate repetitive data tasks and processes.
Monitor data systems and troubleshoot issues as they arise.

Qualifications:
Bachelor’s degree in Computer Science, Information Technology, or a related field.
Proven experience as a Data Engineer or similar role (minimum 6+ years of experience as a Data Engineer).
Strong proficiency in Python and PySpark.
Excellent problem-solving abilities.
Strong communication skills to collaborate with team members and stakeholders.

Individual Contributor Technical Skills Required:
Expert: Python, PySpark, and SQL/Snowflake.
Advanced: Data warehousing, data pipeline design, data quality, data validation, data cleansing.
Intermediate/Basic: Microsoft Fabric, ADF, Databricks, Master Data Management/Data Governance, Data Mesh, Data Lake/Lakehouse architecture.
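For the validation and cleansing routines referenced above, a minimal reusable sketch in PySpark might look like the following; the function name and column conventions are invented for illustration.

```python
# Sketch: a reusable cleansing routine for PySpark DataFrames.
from pyspark.sql import DataFrame
from pyspark.sql import functions as F

def cleanse(df: DataFrame, key_cols: list) -> DataFrame:
    # Trim whitespace on every string column.
    for name, dtype in df.dtypes:
        if dtype == "string":
            df = df.withColumn(name, F.trim(F.col(name)))
    # Normalise empty strings to nulls, then drop rows missing their keys.
    df = df.replace("", None).dropna(subset=key_cols)
    # De-duplicate on the business key.
    return df.dropDuplicates(key_cols)
```

Usage would be something like `cleanse(raw_df, ["customer_id"])` before the data is written to the warehouse.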

Posted 5 days ago

Apply

8.0 years

0 Lacs

India

On-site

Oracle ADF - Senior Functional Consultant
Third Party / C2H; immediate joiners only
Job Location: Bangalore, Chennai, Hyderabad, Noida, Pune

Oracle ADF Consultant Roles and Responsibilities: We are looking for an experienced Oracle WebCenter Portal Lead Developer with a minimum of 8 years of hands-on experience in ADF, JavaScript, and REST APIs. The role involves client-facing responsibilities and requirement gathering for portal development projects.

Key Responsibilities:
· Lead Oracle WebCenter Portal projects.
· Develop ADF applications and integrate REST APIs.
· Extensive knowledge of Oracle Custom Components.
· Hands-on knowledge of integration with REST APIs, RIDC, and IDOC Script.
· Excellent problem-solving and communication skills.

Preferred Skills:
· Knowledge of Oracle WebLogic Server and Oracle Database.
· Knowledge of Oracle IDCS (Oracle Identity Cloud Service) is an added advantage.
· Knowledge of Oracle WebCenter Content is an added advantage.
· Good to have certifications in OIC, WebCenter, Java, etc.
· Knowledge of SQL is required.

Posted 5 days ago

Apply

6.0 years

0 Lacs

India

On-site

Sr. Python Data Engineer

Responsibilities:
Design and develop data pipelines and ETL processes.
Collaborate with data scientists and analysts to understand data needs.
Maintain and optimize data warehousing solutions.
Ensure data quality and integrity throughout the data lifecycle.
Develop and implement data validation and cleansing routines.
Work with large datasets from various sources.
Automate repetitive data tasks and processes.
Monitor data systems and troubleshoot issues as they arise.

Qualifications:
Bachelor’s degree in Computer Science, Information Technology, or a related field.
Proven experience as a Data Engineer or similar role (minimum 6+ years of experience as a Data Engineer).
Strong proficiency in Python and PySpark.
Excellent problem-solving abilities.
Strong communication skills to collaborate with team members and stakeholders.

Individual Contributor Technical Skills Required:
Expert: Python, PySpark, and SQL/Snowflake.
Advanced: Data warehousing, data pipeline design, data quality, data validation, data cleansing.
Intermediate/Basic: Microsoft Fabric, ADF, Databricks, Master Data Management/Data Governance, Data Mesh, Data Lake/Lakehouse architecture.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description

Job Title: Intermediate Data Developer – Azure ADF and Databricks
Experience Range: 5-7 Years
Location: Chennai, Hybrid
Employment Type: Full-Time

About UPS: UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™: The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About the Role: We are seeking an experienced Senior Data Developer to join our data engineering team responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. The Senior Data Developer will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure. This role also involves coaching junior developers, conducting code reviews, and driving strategic improvements in data architecture and design patterns.

Key Responsibilities:
Data Solution Design and Development: Design and develop scalable and high-performance data pipelines using Azure Data Factory (ADF). Implement data transformations and processing using Azure Databricks. Develop and maintain NoSQL data models and queries in Cosmos DB (a query sketch follows this listing). Optimize data pipelines for performance, scalability, and cost efficiency.
Data Integration and Architecture: Integrate structured and unstructured data from diverse data sources. Collaborate with data architects to design end-to-end data flows and system integrations. Implement data security, governance, and compliance standards.
Performance Tuning and Optimization: Monitor and tune data pipelines and processing jobs for performance and cost efficiency. Optimize data storage and retrieval strategies for Azure SQL and Cosmos DB.
Collaboration and Mentoring: Collaborate with cross-functional teams including data testers, architects, and business analysts. Conduct code reviews and provide constructive feedback to improve code quality. Mentor junior developers, fostering best practices in data engineering and cloud development.

Primary Skills: Data Engineering: Azure Data Factory (ADF), Azure Databricks. Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB). Data Modeling: NoSQL data modeling, data warehousing concepts. Performance Optimization: Data pipeline performance tuning and cost optimization. Programming Languages: Python, SQL, PySpark.

Secondary Skills: DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation. Security and Compliance: Implementing data security and governance standards. Agile Methodologies: Experience in Agile/Scrum environments. Leadership and Mentoring: Strong communication and coaching skills for team collaboration.

Soft Skills: Strong problem-solving abilities and attention to detail. Excellent communication skills, both verbal and written. Effective time management and organizational capabilities. Ability to work independently and within a collaborative team environment. Strong interpersonal skills to engage with cross-functional teams.

Educational Qualifications: Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. Relevant certifications in Azure and Data Engineering, such as Microsoft Certified: Azure Data Engineer Associate; Microsoft Certified: Azure Solutions Architect Expert; Databricks Certified Data Engineer Associate or Professional.

About the Team: As a Senior Data Developer, you will be working with a dynamic, cross-functional team that includes developers, product managers, and other quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications.

Employee Type: Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
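As referenced in the responsibilities above, here is a hypothetical sketch of Cosmos DB query work using the azure-cosmos Python SDK; the endpoint, key, database, container, and fields are all placeholders.

```python
# Sketch: parameterised cross-partition query against Cosmos DB.
from azure.cosmos import CosmosClient

client = CosmosClient(
    url="https://example-account.documents.azure.com:443/",  # placeholder
    credential="<account-key>",                              # placeholder
)
container = (client
             .get_database_client("shipments_db")
             .get_container_client("shipments"))

query = "SELECT c.id, c.status FROM c WHERE c.updated_at > @since"
items = container.query_items(
    query=query,
    parameters=[{"name": "@since", "value": "2025-07-01T00:00:00Z"}],
    enable_cross_partition_query=True,
)
for item in items:
    print(item["id"], item["status"])
```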

Posted 6 days ago

Apply

0 years

7 - 10 Lacs

Pune

On-site

Job Title: Data Engineer - Data Solutions Delivery + Data Catalog & Quality Engineer

About Advanced Energy: Advanced Energy Industries, Inc. (NASDAQ: AEIS) enables design breakthroughs and drives growth for leading semiconductor and industrial customers. Our precision power and control technologies, along with our applications know-how, inspire close partnerships and innovation in thin-film and industrial manufacturing. We are proud of our rich heritage and award-winning technologies, and we value the talents and contributions of all Advanced Energy employees worldwide.

Department: Data and Analytics
Team: Data Solutions Delivery Team

Job Summary: We are seeking a highly skilled Data Engineer to join our Data and Analytics team. As a member of the Data Solutions Delivery team, you will be responsible for designing, building, and maintaining scalable data solutions. The ideal candidate should have extensive knowledge of Databricks, Azure Data Factory, and Google Cloud, along with strong data warehousing skills from data ingestion to reporting. Familiarity with the manufacturing and supply chain domains is highly desirable. Additionally, the candidate should be well versed in data engineering, data product, and data platform concepts, data mesh, medallion architecture, and establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview. The candidate should also have proven experience in implementing data quality practices using tools like Great Expectations, Deequ, etc. (a sketch follows this listing).

Key Responsibilities:
Design, build, and maintain scalable data solutions using Databricks, ADF, and Google Cloud.
Develop and implement data warehousing solutions, including ETL processes, data modeling, and reporting.
Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
Ensure data integrity, quality, and security across all data platforms.
Provide expertise in data engineering, data product, and data platform concepts.
Implement data mesh principles and medallion architecture to build scalable data platforms.
Establish and maintain enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
Implement data quality practices using tools like Great Expectations, Deequ, etc.
Work closely with the manufacturing and supply chain teams to understand domain-specific data requirements.
Develop and maintain documentation for data solutions, data flows, and data models.
Act as an individual contributor, picking up tasks from technical solution documents and delivering high-quality results.

Qualifications:
Bachelor’s degree in Computer Science, Information Technology, or a related field.
Proven experience as a Data Engineer or in a similar role.
In-depth knowledge of Databricks, Azure Data Factory, and Google Cloud.
Strong data warehousing skills, including ETL processes, data modeling, and reporting.
Familiarity with manufacturing and supply chain domains.
Proficiency in data engineering, data product, and data platform concepts, data mesh, and medallion architecture.
Experience in establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
Proven experience in implementing data quality practices using tools like Great Expectations, Deequ, etc.
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills.
Ability to work independently and as part of a team.

Preferred Qualifications:
Master's degree in a related field.
Experience with cloud-based data platforms and tools.
Certification in Databricks, Azure, or Google Cloud.

As part of our total rewards philosophy, we believe in offering and maintaining competitive compensation and benefits programs for our employees to attract and retain a talented, highly engaged workforce. Our compensation programs are focused on equitable, fair pay practices, including market-based base pay and an annual pay-for-performance incentive plan, and we offer a strong benefits package in each of the countries in which we operate.

Advanced Energy is committed to diversity in its workforce, including Equal Employment Opportunity for Minorities, Females, Protected Veterans, and Individuals with Disabilities. We are committed to protecting and respecting your privacy. We take your privacy seriously and will only use your personal information to administer your application in accordance with RA No. 10173, also known as the Data Privacy Act of 2012.
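Great Expectations is one of the data quality tools the posting names; below is a minimal sketch using its classic pandas-based API. The dataset and expectations are invented for illustration, and newer Great Expectations releases favour a different, context-based API.

```python
# Sketch: declarative data quality checks with Great Expectations.
import great_expectations as ge
import pandas as pd

df = pd.DataFrame({
    "order_id": [1, 2, 3],
    "amount": [10.5, 20.0, 7.25],
})
gdf = ge.from_pandas(df)

# Each expectation returns a result object with a success flag.
results = [
    gdf.expect_column_values_to_not_be_null("order_id"),
    gdf.expect_column_values_to_be_unique("order_id"),
    gdf.expect_column_values_to_be_between("amount", min_value=0),
]
assert all(r.success for r in results), "Data quality check failed"
```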

Posted 6 days ago

Apply

7.0 years

0 Lacs

Bengaluru

On-site

Technical Expert BSI

We are looking for a Technical Expert to be part of our Business Solutions Integrations team in the Analytics, Data and Integration stream.

Position Snapshot:
Location: Bengaluru
Type of Contract: Permanent
Stream: Analytics, Data and Integration
Type of Work: Hybrid
Work Language: Fluent Business English

The role: The Integration Technical Expert will work in the Business Solution Integration team, focused on product engineering and operations related to the Data Integration, Digital Integration, and Process Integration products in Business Solution Integration, and on the initiatives where these products are used. You will work together with the Product Manager and Product Owners, as well as various other counterparts, on the evolution of the DI, PI, and Digital products. You will work with architects to orchestrate the design of integration solutions, act as the first point of contact for project teams to manage demand, and help drive the transition from engineering to sustain as per BSI standards. You will also work with Operations Managers and Sustain teams on the orchestration of operations activities, proposing improvements for better performance of the platforms.

What you’ll do:
Work with architects to understand and orchestrate the design choices between the different Data, Process, and Digital Integration patterns for fulfilling data needs.
Translate the various requirements into deliverables for the development and implementation of Process, Data, and Digital Integration solutions, following up on requests to get the work done.
Design, develop, and implement integration solutions using ADF, LTRS, Data Integration, SAP PO, CPI, Logic Apps, MuleSoft, and Confluent.
Work with the Operations Managers and Sustain teams to orchestrate performance and operational issues.

We offer you: We offer more than just a job. We put people first and inspire you to become the best version of yourself. Great benefits, including a competitive salary and a comprehensive social benefits package: we have one of the most competitive pension plans on the market, as well as flexible remuneration with tax advantages (health insurance, restaurant card, mobility plan, etc.). Personal and professional growth through ongoing training and constant career opportunities, reflecting our conviction that people are our most important asset.

Minimum qualifications:
Minimum of 7 years of industry experience in software delivery projects.
Experience in project and product management, agile methodologies, and solution delivery at scale.
Skilled and experienced Technical Integration Expert with experience across various integration platforms and tools, including ADF, LTRS, Data Integration, SAP PO, CPI, Logic Apps, MuleSoft, and Confluent.
Ability to contribute to a high-performing, motivated workgroup by applying interpersonal and collaboration skills to achieve goals.
Fluency in English with excellent oral and written communication skills.
Experience working with cultural diversity: respect for various cultures and understanding how to work with a variety of cultures in the most effective way.

Bonus points if you have:
Experience with the Azure platform (especially with Data Factory).
Experience with Azure DevOps and with ServiceNow.
Experience with Power Apps and Power BI.

About the IT Hub: We are a team of IT professionals from many countries and diverse backgrounds, each with unique missions and challenges in the biggest health, nutrition and wellness company of the world. We innovate every day through forward-looking technologies to create opportunities for Nestlé’s digital challenges with our consumers, customers and at the workplace. We collaborate with our business partners around the world to deliver standardized, integrated technology products and services to create tangible business value.

About Nestlé: We are Nestlé, the largest food and beverage company. We are approximately 275,000 employees strong, driven by the purpose of enhancing the quality of life and contributing to a healthier future. Our values are rooted in respect: respect for ourselves, respect for others, respect for diversity and respect for our future. With more than CHF 94.4 billion in sales in 2022, we have an expansive presence, with 344 factories in 77 countries. Want to learn more? Visit us at www.nestle.com.

We encourage the diversity of applicants across gender, age, ethnicity, nationality, sexual orientation, social background, religion or belief and disability. Step outside your comfort zone; share your ideas, way of thinking and working to make a difference to the world, every single day. You own a piece of the action – make it count. Join IT Hub Nestlé. #beaforceforgood

How we will proceed: You send us your CV. We contact relevant applicants. Interviews. Feedback. Job offer communication to the finalist. First working day.

Posted 6 days ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Explore your next opportunity at an organization that ranks among the world's 500 largest companies. Envision innovative opportunities, discover our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy, or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow.

Job Description

Job Title: Senior Data Developer – Azure ADF and Databricks
Experience Range: 8-12 Years
Location: Chennai, Hybrid
Employment Type: Full-Time

About UPS: UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™: The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About the Role: We are seeking an experienced Senior Data Developer to join our data engineering team responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. The Senior Data Developer will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure. This role also involves coaching junior developers, conducting code reviews, and driving strategic improvements in data architecture and design patterns.

Key Responsibilities:
Data Solution Design and Development: Design and develop scalable and high-performance data pipelines using Azure Data Factory (ADF). Implement data transformations and processing using Azure Databricks. Develop and maintain NoSQL data models and queries in Cosmos DB. Optimize data pipelines for performance, scalability, and cost efficiency.
Data Integration and Architecture: Integrate structured and unstructured data from diverse data sources. Collaborate with data architects to design end-to-end data flows and system integrations. Implement data security, governance, and compliance standards.
Performance Tuning and Optimization: Monitor and tune data pipelines and processing jobs for performance and cost efficiency (a tuning sketch follows this listing). Optimize data storage and retrieval strategies for Azure SQL and Cosmos DB.
Collaboration and Mentoring: Collaborate with cross-functional teams including data testers, architects, and business analysts. Conduct code reviews and provide constructive feedback to improve code quality. Mentor junior developers, fostering best practices in data engineering and cloud development.

Primary Skills: Data Engineering: Azure Data Factory (ADF), Azure Databricks. Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB). Data Modeling: NoSQL data modeling, data warehousing concepts. Performance Optimization: Data pipeline performance tuning and cost optimization. Programming Languages: Python, SQL, PySpark.

Secondary Skills: DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation. Security and Compliance: Implementing data security and governance standards. Agile Methodologies: Experience in Agile/Scrum environments. Leadership and Mentoring: Strong communication and coaching skills for team collaboration.

Soft Skills: Strong problem-solving abilities and attention to detail. Excellent communication skills, both verbal and written. Effective time management and organizational capabilities. Ability to work independently and within a collaborative team environment. Strong interpersonal skills to engage with cross-functional teams.

Educational Qualifications: Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. Relevant certifications in Azure and Data Engineering, such as Microsoft Certified: Azure Data Engineer Associate; Microsoft Certified: Azure Solutions Architect Expert; Databricks Certified Data Engineer Associate or Professional.

About the Team: As a Senior Data Developer, you will be working with a dynamic, cross-functional team that includes developers, product managers, and other quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications.

Contract Type: Permanent (CDI)

At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.
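For the performance tuning responsibilities this role describes, one common Databricks technique is broadcasting a small dimension table to avoid a shuffle-heavy join. The sketch below is illustrative and the tables are hypothetical.

```python
# Sketch: broadcast join to cut shuffle cost in Spark.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("join_tuning").getOrCreate()

facts = spark.read.parquet("/mnt/curated/shipment_events/")  # large table
dims = spark.read.parquet("/mnt/curated/dim_locations/")     # small table

# Broadcasting ships the small side to every executor, turning a
# sort-merge join into a cheaper broadcast hash join.
joined = facts.join(broadcast(dims), on="location_id", how="left")
joined.write.mode("overwrite").parquet("/mnt/gold/shipments_enriched/")
```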

Posted 6 days ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description

Job Title: Senior Data Developer – Azure ADF and Databricks
Experience Range: 8-12 Years
Location: Chennai, Hybrid
Employment Type: Full-Time

About UPS: UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™: The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About the Role: We are seeking an experienced Senior Data Developer to join our data engineering team responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. The Senior Data Developer will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure. This role also involves coaching junior developers, conducting code reviews, and driving strategic improvements in data architecture and design patterns.

Key Responsibilities:
Data Solution Design and Development: Design and develop scalable and high-performance data pipelines using Azure Data Factory (ADF). Implement data transformations and processing using Azure Databricks. Develop and maintain NoSQL data models and queries in Cosmos DB. Optimize data pipelines for performance, scalability, and cost efficiency.
Data Integration and Architecture: Integrate structured and unstructured data from diverse data sources. Collaborate with data architects to design end-to-end data flows and system integrations. Implement data security, governance, and compliance standards.
Performance Tuning and Optimization: Monitor and tune data pipelines and processing jobs for performance and cost efficiency. Optimize data storage and retrieval strategies for Azure SQL and Cosmos DB.
Collaboration and Mentoring: Collaborate with cross-functional teams including data testers, architects, and business analysts. Conduct code reviews and provide constructive feedback to improve code quality. Mentor junior developers, fostering best practices in data engineering and cloud development.

Primary Skills: Data Engineering: Azure Data Factory (ADF), Azure Databricks. Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB). Data Modeling: NoSQL data modeling, data warehousing concepts. Performance Optimization: Data pipeline performance tuning and cost optimization. Programming Languages: Python, SQL, PySpark.

Secondary Skills: DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation. Security and Compliance: Implementing data security and governance standards. Agile Methodologies: Experience in Agile/Scrum environments. Leadership and Mentoring: Strong communication and coaching skills for team collaboration.

Soft Skills: Strong problem-solving abilities and attention to detail. Excellent communication skills, both verbal and written. Effective time management and organizational capabilities. Ability to work independently and within a collaborative team environment. Strong interpersonal skills to engage with cross-functional teams.

Educational Qualifications: Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. Relevant certifications in Azure and Data Engineering, such as Microsoft Certified: Azure Data Engineer Associate; Microsoft Certified: Azure Solutions Architect Expert; Databricks Certified Data Engineer Associate or Professional.

About the Team: As a Senior Data Developer, you will be working with a dynamic, cross-functional team that includes developers, product managers, and other quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications.

Employee Type: Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.

Posted 6 days ago

Apply

5.0 years

0 Lacs

Haveli, Maharashtra, India

On-site

Module Lead - SQL, Snowflake

Job Date: Jun 29, 2025
Job Requisition Id: 61771
Location: Pune, IN; Indore, IN; Hyderabad, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future.

We are looking forward to hiring MS SQL professionals in the following areas:

Experience: 5 - 7 Years

Job Summary: We are looking for a skilled SQL Server and Snowflake Developer to join our data and analytics team. The ideal candidate will have strong experience in developing and maintaining data solutions using SQL Server and Snowflake. You will play a key role in building scalable data pipelines, designing data models, and delivering business intelligence solutions.

Key Responsibilities:
Develop and optimize complex SQL queries, stored procedures, and ETL processes in SQL Server.
Design and implement data pipelines and models in Snowflake.
Build and maintain SSIS packages for ETL workflows.
Migrate and integrate data between on-premise SQL Server and the Snowflake cloud platform.
Collaborate with business analysts and stakeholders to understand reporting needs.
Ensure data quality, performance tuning, and error handling across all solutions.
Maintain technical documentation and support data governance initiatives.

Required Skills & Qualifications:
5-7 years of experience with SQL Server (T-SQL).
2+ years of hands-on experience with Snowflake.
Strong understanding of ETL/ELT processes and data warehousing principles.
Experience with data modeling, performance tuning, and data integration.
Familiarity with Azure cloud platforms is a plus.
Good communication and problem-solving skills.

Preferred / Good-to-Have Skills:
Experience with Azure Data Factory (ADF) for orchestrating data workflows.
Experience with Power BI or other visualization tools.
Exposure to CI/CD pipelines and DevOps practices in data environments.

Required Technical/Functional Competencies:
Domain/Industry Knowledge: Basic knowledge of the customer's business processes and relevant technology platform or product. Able to prepare process maps, workflows, business cases, and simple business models in line with customer requirements with assistance from an SME, and apply industry standards/practices in implementation with guidance from experienced team members.
Requirement Gathering and Analysis: Working knowledge of requirement management processes and requirement analysis processes, tools, and methodologies. Able to analyse the impact of a requested change/enhancement/defect fix and identify dependencies or interrelationships among requirements and transition requirements for an engagement.
Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products, provide inputs in design and architecture adhering to industry standards/practices in implementation, analyze various frameworks/tools, review code, and provide feedback on improvement opportunities.
Architecture Tools and Frameworks: Working knowledge of architecture industry tools and frameworks. Able to identify the pros and cons of available tools and frameworks in the market, use them as per customer requirements, and explore new tools/frameworks for implementation.
Architecture Concepts and Principles: Working knowledge of architectural elements, SDLC, and methodologies. Able to provide architectural design/documentation at an application or functional capability level, implement architectural patterns in solutions and engagements, and communicate architecture direction to the business.
Analytics Solution Design: Knowledge of statistical and machine learning techniques such as classification, linear regression modelling, clustering, and decision trees. Able to identify the cause of errors and their potential solutions.
Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial and open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.

Required Behavioral Competencies:
Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
Collaboration: Shares information within the team, participates in team activities, asks questions to understand other points of view.
Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
Drives Results: Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets.
Resolves Conflict: Displays sensitivity in interactions and strives to understand others’ views and concerns.

Certifications: Mandatory

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.

Posted 6 days ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

Are you passionate about problem-solving and eager to learn cutting-edge technologies? Do you have a keen interest in innovation and a customer-centric approach? If so, Oracle is looking for you to join our Customer Success Services (CSS) team as a Sr. / Sr. Principal Support Engineer - EBS Apps Developer. As part of our team, you will be supporting over 6,000 companies worldwide by building and maintaining their technical landscapes through tailored support services.

The ideal candidate for this role is an experienced technical professional with a solid understanding of business solutions, industry best practices, and technology designs within Oracle Applications supporting products and technologies. You should have hands-on experience in the implementation or support of large to medium Oracle Applications implementation projects, as well as strong technical knowledge in Oracle applications, SQL, PL/SQL, OAF, XML, Oracle Forms and Reports, AME, WF, APEX, Java, ADF, JET, and PaaS skills.

In addition to technical skills, we are looking for someone who is flexible, open-minded, and capable of working with different technologies and addressing complex architectures in on-premises, cloud, or hybrid environments. You should be able to collaborate effectively with global teams to provide the best-tailored solutions to Oracle customers.

The successful candidate should have a minimum of 10 years of relevant experience with excellent problem-solving and troubleshooting skills. You should be self-driven, result-oriented, and have strong communication and teamwork skills. Additionally, you should be able to work effectively in a team, collaborate with stakeholders, and ensure on-time delivery of assigned tasks while meeting deadlines.

Your responsibilities will include developing technical solutions to meet business requirements, resolving key issues related to code change requirements and bug fixes, supporting Oracle ERP products and services, and conducting knowledge transfer sessions. You will also be responsible for safeguarding customer satisfaction, engaging in architectural tasks, detecting and addressing performance challenges and security issues, and analyzing, troubleshooting, and solving customer issues using Oracle products.

If you are ready to take on this exciting opportunity and work with a team of highly skilled technical experts, apply now to join Oracle's CSS team and be a part of our commitment to empowering an inclusive workforce that promotes opportunities for all.

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana

On-site

The ideal candidate for this position in Gurugram should have strong proficiency in Python coding, as the interview process will focus on coding skills. Experience in PySpark and ADF is preferred as a secondary area, with a minimum of 2 years of experience required. The main responsibilities of this role include reviewing and optimizing existing code, working on text parsers and scrapers (a small parsing sketch follows below), and reviewing incoming pipelines. Experience with ADF pipelines is also a key requirement.

In addition to technical skills, the candidate should possess certain additional qualities for Position 1. These include the ability to provide guidance to the team on a regular basis, offer technical support, collaborate with other leads, and ensure faster deliverables. There is also an emphasis on exploring how AI can be integrated into work processes, including using or building LLMs to enhance the go-to-market strategy.
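As referenced above, here is a small sketch of the kind of text-parsing work this role screens for: a regex-based parser that extracts structured records from semi-structured lines. The line format is invented purely for illustration.

```python
# Sketch: regex-based text parser for semi-structured log-style lines.
import re

LINE_RE = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>INFO|WARN|ERROR) "
    r"(?P<message>.*)$"
)

def parse_lines(lines):
    """Yield a dict per matching line; silently skip malformed lines."""
    for line in lines:
        match = LINE_RE.match(line.strip())
        if match:
            yield match.groupdict()

sample = [
    "2025-07-01 10:15:42 INFO pipeline started",
    "2025-07-01 10:16:03 ERROR upstream feed missing",
    "malformed line",
]
for record in parse_lines(sample):
    print(record["ts"], record["level"], record["message"])
```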

Posted 6 days ago

Apply

0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Primary skills: Technology->AWS->DevOps; Technology->Cloud Integration->Azure Data Factory (ADF); Technology->Cloud Platform->AWS Database; Technology->Cloud Platform->Azure DevOps->Azure Pipelines; Technology->DevOps->Continuous Integration - Mainframe

A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data. Awareness of the latest technologies and trends. Logical thinking and problem-solving skills along with an ability to collaborate. Ability to assess current processes, identify improvement areas, and suggest technology solutions. Knowledge of one or two industry domains.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Greater Nashik Area

On-site

Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do?

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You.

Job Description

Job Title: Azure Data Engineer
Location: Bengaluru
Reporting to: Senior Manager, Data Engineering

Purpose of the role: We are seeking an experienced Data Engineer with over 4 years of expertise in data engineering and a focus on leveraging GenAI solutions. The ideal candidate will have a strong background in Azure services, relational databases, and programming languages, including Python and PySpark. You will play a pivotal role in designing, building, and optimizing scalable data pipelines while integrating AI-driven solutions to enhance our data capabilities.

Key Tasks & Accountabilities:
Data Pipeline Development: Design and implement efficient ETL/ELT pipelines using Azure Data Factory (ADF) and Azure Databricks (ADB). Ensure high performance and scalability of data pipelines.
Relational Database Management: Work with relational databases to structure and query data efficiently. Design, optimize, and maintain database schemas.
Programming and Scripting: Write, debug, and optimize Python, PySpark, and SQL code to process large datasets. Develop reusable code components and libraries for data processing.
Data Quality and Governance: Implement data validation, cleansing, and monitoring mechanisms. Ensure compliance with data governance policies and best practices.
Performance Optimization: Identify and resolve bottlenecks in data processing and storage. Optimize resource utilization on Azure services.
Collaboration and Communication: Work closely with cross-functional teams, including AI, analytics, and product teams. Document processes, solutions, and best practices for future use.

Qualifications, Experience, Skills:
Previous Work Experience: 4+ years of experience in data engineering. Proficiency in Azure Data Factory (ADF) and Azure Databricks (ADB). Expertise in relational databases and advanced SQL. Strong programming skills in Python and PySpark. Experience with GenAI solutions is a plus. Familiarity with data governance and best practices.
Level of Educational Attainment Required: Bachelor's degree in Computer Science, Information Technology, or a related field.
Technical Expertise: Knowledge of machine learning pipelines and GenAI workflows. Experience with Azure Synapse or other cloud data platforms. Familiarity with CI/CD pipelines for data workflows.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.

Posted 1 week ago

Apply

5.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Discover your next opportunity with an organization that ranks among the world's 500 largest companies. Explore innovative opportunities, experience our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy, or leadership to guide teams, there are roles suited to your aspirations and skills, today and tomorrow.

Job Description
Job Title: Intermediate Data Developer – Azure ADF and Databricks
Experience Range: 5-7 Years
Location: Chennai, Hybrid
Employment Type: Full-Time

About UPS
UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™
The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About The Role
We are seeking an experienced Senior Data Developer to join our data engineering team, responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. The Senior Data Developer will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure. This role also involves coaching junior developers, conducting code reviews, and driving strategic improvements in data architecture and design patterns.

Key Responsibilities
Data Solution Design and Development: Design and develop scalable, high-performance data pipelines using Azure Data Factory (ADF). Implement data transformations and processing using Azure Databricks. Develop and maintain NoSQL data models and queries in Cosmos DB. Optimize data pipelines for performance, scalability, and cost efficiency.
Data Integration and Architecture: Integrate structured and unstructured data from diverse data sources. Collaborate with data architects to design end-to-end data flows and system integrations. Implement data security, governance, and compliance standards.
Performance Tuning and Optimization: Monitor and tune data pipelines and processing jobs for performance and cost efficiency. Optimize data storage and retrieval strategies for Azure SQL and Cosmos DB.
Collaboration and Mentoring: Collaborate with cross-functional teams including data testers, architects, and business analysts. Conduct code reviews and provide constructive feedback to improve code quality. Mentor junior developers, fostering best practices in data engineering and cloud development.

Primary Skills
Data Engineering: Azure Data Factory (ADF), Azure Databricks
Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB)
Data Modeling: NoSQL data modeling, data warehousing concepts
Performance Optimization: Data pipeline performance tuning and cost optimization
Programming Languages: Python, SQL, PySpark

Secondary Skills
DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation
Security and Compliance: Implementing data security and governance standards
Agile Methodologies: Experience in Agile/Scrum environments
Leadership and Mentoring: Strong communication and coaching skills for team collaboration

Soft Skills
Strong problem-solving abilities and attention to detail. Excellent communication skills, both verbal and written. Effective time management and organizational capabilities. Ability to work independently and within a collaborative team environment. Strong interpersonal skills to engage with cross-functional teams.

Educational Qualifications
Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. Relevant certifications in Azure and Data Engineering, such as:
Microsoft Certified: Azure Data Engineer Associate
Microsoft Certified: Azure Solutions Architect Expert
Databricks Certified Data Engineer Associate or Professional

About The Team
As a Senior Data Developer, you will be working with a dynamic, cross-functional team that includes developers, product managers, and other quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications.

Contract type: Permanent (CDI)
At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.
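As an illustration of the Cosmos DB work this role describes, here is a minimal sketch using the azure-cosmos Python SDK. The endpoint, key, database, container, and document shape are all placeholders, not details from the posting:

```python
from azure.cosmos import CosmosClient

# Placeholder endpoint/key; in practice these would come from Key Vault
# or a managed identity rather than being hard-coded.
client = CosmosClient("https://<account>.documents.azure.com:443/",
                      credential="<key>")

container = (
    client.get_database_client("supply_chain")   # hypothetical database
          .get_container_client("shipments")     # hypothetical container
)

# Upsert a document; the container's partition key path (e.g. /customerId)
# determines how Cosmos DB distributes and looks up items.
container.upsert_item({
    "id": "SHP-1001",
    "customerId": "CUST-42",
    "status": "IN_TRANSIT",
})

# Point reads by id + partition key are the cheapest Cosmos DB operation;
# cross-partition queries cost more RUs and should be the exception.
item = container.read_item(item="SHP-1001", partition_key="CUST-42")
print(item["status"])
```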

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Zensar Technologies is hiring an Azure Data Engineer for Hyderabad. If you're passionate about Azure Databricks, PySpark, and Synapse, this could be a great fit!
Azure Databricks and hands-on PySpark, with tuning
Azure Data Factory pipelines for loading various data into ADB, with performance tuning
Azure Synapse
Azure Monitoring and Log Analytics (error handling in ADF pipelines and ADB)
Logic Apps and Functions
Performance tuning of Databricks, Data Factory, and Synapse
Databricks data loading (layered approach) and export (choosing the right connection options and the best approach for fast report access)
If you're interested or know someone who might be, please share your updated resume with me at Divyanka.kumari2@zensar.com
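For context on the layered Databricks loading mentioned above, here is a minimal PySpark/Delta sketch of a bronze-to-silver load; the paths, table names, and columns are illustrative assumptions, not from the posting:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("layered_load").getOrCreate()

# Bronze: land raw files as-is, keeping an ingest timestamp for lineage.
bronze = (
    spark.read.json("/mnt/raw/events/")          # hypothetical landing path
         .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("bronze.events")

# Silver: apply typing and filtering so reports read clean, query-ready data.
silver = (
    spark.read.table("bronze.events")
         .withColumn("event_date", F.to_date("event_ts"))
         .filter(F.col("event_id").isNotNull())
)
(silver.write.format("delta")
       .mode("overwrite")
       .partitionBy("event_date")                # partitioning speeds report scans
       .saveAsTable("silver.events"))
```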

Posted 1 week ago

Apply

6.0 years

5 - 9 Lacs

Hyderabad

Remote

Technical Lead – Big Data & Python skillset

As a Technical Lead, you will be a strong full-stack developer and individual contributor responsible for designing application modules and delivering them from the technical standpoint. You will need a high level of skill in producing high-level designs, working with the architect and lead on module implementations. You must be a strong developer with the ability to innovate, and should be the go-to person for the assigned modules, applications/projects, and initiatives. You will maintain appropriate certifications and apply the respective skills on project engagements.

Work you’ll do
A unique opportunity to be part of a growing Delivery, Methods & Tools team that drives consistency, quality, and efficiency of the services delivered to stakeholders.

Responsibilities:
Full-stack, hands-on developer and strong individual contributor; the go-to person on assigned projects.
Able to understand and implement the project as per the proposed architecture.
Implements best design principles and patterns.
Understands and implements the security aspects of the application.
Knows Azure DevOps (ADO) and is familiar with using it.
Obtains/maintains appropriate certifications and applies the respective skills on project engagements.
Leads or contributes significantly to the Practice.
Estimates and prioritizes product backlogs; defines work items.
Works on unit test automation.
Recommends improvements to existing software programs as deemed necessary.
Go-to person in the team for any technical issues; conducts peer reviews and tech sessions within the team.
Provides input to standards and guidelines; implements best practices to enable consistency across all projects.
Participates in the continuous improvement processes, as assigned.
Mentors and coaches juniors in the team; contributes to POCs.
Supports the QA team with clarifications/doubts.
Takes ownership of deployment and tollgate activities.
Oversees the development of documentation.
Participates in regular work and status communications and stakeholder updates.
Supports development of intellectual capital; contributes to the knowledge network.
Acts as a technical escalation point.
Conducts sprint reviews.
Optimizes code and advises the team on best practices.

Skills:
Education qualification: BE/B.Tech (IT/CS/Electronics), MCA, or MSc Computer Science
6-9 years of IT experience in application development, support, or maintenance activities
2+ years of experience in team management
In-depth knowledge of software development lifecycles, including agile development and testing
Enterprise data management framework, data security & compliance (optional):
- Data ingestion, storage, and transformation
- Data auditing and validation (optional)
- Data visualization with Power BI (optional)
- Data analytics systems (optional)
- Scaling and handling large data sets
Designing and building data services, with at least 2+ years in: Azure SQL DB, SQL Warehouse, ADF, Azure Storage, ADO CI/CD, Azure Synapse
Data model design – data entities: modeling and depiction; metadata management (optional)
Database development patterns and practices: SQL/NoSQL (relational/non-relational – native JSON), flexible schemas, indexing practices, master/child data model management, columnar and row stores, API/SDK operations and management for NoSQL databases
Design and implementation of data warehouses: Azure Synapse, Data Lake, Delta Lake, Apache Spark management
Programming languages: PySpark/Python, C# (optional)
APIs: invoke/request and response
PowerShell with Azure CLI (optional)
Git with ADO: repo management, branching strategies, version control management, rebasing, filtering, cloning, merging
Debugging, performance tuning, and optimization skills: ability to analyze PySpark code and PL/SQL, improve response times, manage GC, and apply debugging, logging, and alerting techniques
Prior experience that demonstrates good business understanding is needed (experience in a professional services organization is a plus).
Excellent written and verbal communication, organization, analytical, planning, and leadership skills.
Strong management, communication, technical, and remote collaboration skills are a must.
Experience in dealing with multiple projects and cross-functional teams, and ability to coordinate across teams in a large matrix organization environment.
Ability to effectively conduct technical discussions directly with project/product management and clients.
Excellent team collaboration skills.

Education & Experience:
Education qualification: BE/B.Tech (IT/CS/Electronics), MCA, or MSc Computer Science
6-9 years of domain experience or other relevant industry experience
2+ years of Product Owner, Business Analyst, or system analysis experience
Minimum 3+ years of software development experience on .NET projects
3+ years of experience in Agile/Scrum methodology

Work timings: 9am-4pm, 7pm-9pm
Location: Hyderabad
Experience: 6-9 yrs

The team
At Deloitte, the Shared Services center improves overall efficiency and control while giving every business unit access to the company’s best and brightest resources. It also lets business units focus on what really matters – satisfying customers and developing new products and services to sustain competitive advantage. A shared services center is a simple concept, but making it work is anything but easy. It involves consolidating and standardizing a wildly diverse collection of systems, processes, and functions. And it requires a high degree of cooperation among business units that generally are not accustomed to working together – with people who do not necessarily want to change. The USI shared services team provides a wide array of services to the U.S., and it is constantly evaluating and expanding its portfolio. The shared services team provides call center support, document services support, financial processing and analysis support, record management support, ethics and compliance support, and admin assistant support.

How you’ll grow
At Deloitte, we’ve invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as part of our efforts, we provide our professionals with a variety of learning and networking opportunities – including exposure to leaders, sponsors, coaches, and challenging assignments – to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning. DU: The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad offices, is an extension of Deloitte University (DU) in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Explore DU: The Leadership Center in India.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Deloitte’s culture
Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte.

Corporate citizenship
Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world.

#CAP-PD

Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially – and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300914
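Given the PySpark analysis and performance-tuning skills this role calls for, here is a minimal illustrative sketch of two common tuning moves, broadcast joins and selective caching; the table names are hypothetical, not from the posting:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuning_demo").getOrCreate()

facts = spark.read.table("dw.fact_orders")     # hypothetical large fact table
dims = spark.read.table("dw.dim_customers")    # hypothetical small dimension

# Broadcasting the small dimension avoids shuffling the large fact table:
# each executor receives a full copy of the dimension for a map-side join.
joined = facts.join(broadcast(dims), on="customer_id", how="left")

# Cache only when the result is reused several times; otherwise caching
# spends memory without saving any recomputation.
joined.cache()
joined.count()  # materializes the cache

joined.groupBy("order_date").count().show()
joined.groupBy("region").count().show()
joined.unpersist()
```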

Posted 1 week ago

Apply