130 Data Factory Jobs - Page 4

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4 - 9 years

4 - 9 Lacs

Chennai

Work from Office

Naukri logo

TCS C2H position, Chennai location.

Role name: Analyst
Role description: Design and develop DBT models and ADF pipelines; write SQL/Snowflake queries. Troubleshoot day-to-day job failures and provide solutions. Maintain data integrity and database performance, stability, and scalability. Make recommendations to optimize database and application performance and efficiency.
Competencies: Digital: Snowflake
Experience (years): 4-6
Essential skills:
• Good knowledge of SQL
• Experience in writing complex SQL queries
• Experience in Azure Data Factory and Azure Storage (Blob, Tables, etc.)
• Good understanding of ETL
• Experience in any cloud-based ETL tool
• Experience in Snowflake
Desirable skills:
• Experience in SSIS or any other ETL tool
• Development of DBT models
• Experience with Agile and DevOps concepts
• Strong development/programming skills
• Experience with CI/CD pipelines and test-driven frameworks
• Experience in troubleshooting and debugging advanced SQL queries
• Strong working knowledge of Azure services and best practices
Urgent requirement: please share your CV immediately to bhavani@coventine.com (WhatsApp: 7995348267).
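The troubleshooting duty this role describes (spotting and triaging day-to-day pipeline job failures) can be sketched in plain Python. The record layout and field names below are hypothetical illustrations, not from the posting or any specific ADF API:

```python
from collections import Counter

def triage_failures(runs):
    """Group failed pipeline runs by pipeline name so the noisiest
    failures can be investigated first.

    `runs` is a list of dicts with hypothetical keys:
    'pipeline', 'status', 'error'.
    """
    failed = [r for r in runs if r["status"] == "Failed"]
    by_pipeline = Counter(r["pipeline"] for r in failed)
    # Most frequent failures first: likely systemic, worth fixing before one-offs.
    return by_pipeline.most_common()

runs = [
    {"pipeline": "load_orders", "status": "Succeeded", "error": None},
    {"pipeline": "load_orders", "status": "Failed", "error": "timeout"},
    {"pipeline": "load_customers", "status": "Failed", "error": "schema drift"},
    {"pipeline": "load_orders", "status": "Failed", "error": "timeout"},
]
print(triage_failures(runs))  # [('load_orders', 2), ('load_customers', 1)]
```

In practice the run records would come from the orchestrator's monitoring API rather than a hard-coded list; the grouping logic stays the same.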

Posted 2 months ago

Apply

6 - 11 years

15 - 30 Lacs

Pune, Goregaon, Bengaluru

Hybrid


Hi, hope you are looking for a job change. We have an opening for an Azure Data Engineer with an MNC in Mumbai. I'm sharing the JD with you; please have a look and revert with the details below and your updated resume. Apply only if you can join within 15 days. It is hybrid mode (3 days from office).

Role: Azure Data Engineer
Experience: 6-12 years
Mode: Permanent
Work location: Mumbai (Goregaon East)
Notice period: Immediate to 20 days
Work mode: Hybrid
Mandatory skills: Azure Data Engineering, Azure Databricks, PySpark, Power BI, Azure Data Lake, SQL Azure, Synapse, Azure Data Factory, Python

Please share: full name, email ID, mobile number, alternate number, total experience, relevant experience, current organization, working as permanent employee (Y/N), payroll company, experience in Azure Data Engineering / Azure Databricks (and PySpark version) / Power BI / Azure Data Lake / Azure / Synapse / Python, notice period, current location, preferred location, current CTC, expected CTC, PAN card number, date of birth, any offer in hand, last working day, serving notice period (Y/N), and whether you can join immediately.

Regards,
Rejeesh S
Email: rejeesh.s@jobworld.jobs
Mobile: +91-9188336668

Posted 2 months ago

Apply

5 - 10 years

13 - 23 Lacs

Pune, Bengaluru, Hyderabad

Work from Office


5+ yrs Azure Data Engineer. PySpark, Java, SQL expert. Data modeling, ingestion, DW. Azure: ADF, Databricks, ADLS. Hadoop, Hive, Cloudera. Databricks project experience. Strong design and leadership skills. Agile, Git, ML con. Degree in CS/Engineering.

Posted 2 months ago

Apply

3 - 6 years

10 - 12 Lacs

Bengaluru

Remote


Job Title: Data Engineer (3-5 years' experience)
Location: Hybrid, mostly remote
Employment Type: Full-time/Contract

Job Summary: We are seeking a highly skilled Data Engineer with 3-5 years of experience in designing, building, and optimizing data pipelines and cloud-based data solutions. The ideal candidate should have in-depth expertise in one of the following primary technologies: AWS, Azure, Databricks, or Snowflake, while being familiar with other cloud and big data platforms. This role requires strong ETL development skills, data modeling knowledge, and a deep understanding of cloud-based data engineering best practices.

Key Responsibilities:
• Design, develop, and optimize data ingestion and transformation pipelines using ETL/ELT frameworks.
• Implement big data processing solutions using Databricks (PySpark, Scala), Snowflake, AWS, or Azure.
• Develop and maintain structured and semi-structured data models, ensuring efficient data warehousing and analytics.
• Optimize data pipeline performance, handling large-scale datasets efficiently.
• Ensure data governance, security, and compliance in cloud-based environments.
• Work with cloud-based storage, compute, and database services (e.g., AWS Redshift, Azure Synapse, Snowflake, Databricks Delta Lake).
• Collaborate with analysts, data scientists, and business teams to support reporting, analytics, and machine learning initiatives.
• Troubleshoot and resolve data pipeline and infrastructure issues.
• Implement monitoring and alerting mechanisms to maintain data pipeline health.

Required Skills & Qualifications:
• 3-5 years of experience in data engineering, cloud computing, and ETL development.
• Primary expertise in one of the following:
• Databricks: strong experience in Apache Spark, PySpark, Scala, and performance optimization.
• Snowflake: deep knowledge of Snowflake architecture, SQL optimization, data sharing, and security best practices.
• AWS: experience with AWS Glue, Redshift, S3, Lambda, and Step Functions for data workflows.
• Azure: strong expertise in Azure Synapse, Data Factory, ADLS, and Databricks on Azure.
• Proficiency in SQL, Python, and data modeling techniques.
• Hands-on experience with structured and semi-structured data formats (JSON, Avro, Parquet, XML).
• Experience with data orchestration tools (Airflow, DBT, Prefect).
• Strong understanding of data security, governance, and compliance best practices.
• Excellent problem-solving and troubleshooting skills in big data environments.

Preferred Qualifications:
• Experience with CI/CD pipelines for data engineering.
• Familiarity with real-time data streaming using Kafka, Kinesis, or Spark Streaming.
• Exposure to machine learning workflows and MLOps best practices.
• Knowledge of BI and visualization tools like Power BI, Looker, or Tableau.
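The "semi-structured data formats" requirement usually boils down to landing nested JSON into flat warehouse columns. A minimal sketch in plain Python (the dotted-name convention and the sample record are illustrative assumptions, not tied to any particular warehouse):

```python
def flatten(record, parent_key="", sep="."):
    """Recursively flatten a nested dict into dotted column names,
    a common first step when landing JSON into a tabular store."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Recurse into nested objects, carrying the prefix along.
            items.update(flatten(value, new_key, sep=sep))
        else:
            items[new_key] = value
    return items

event = {"id": 1, "user": {"name": "asha", "geo": {"city": "Pune"}}}
print(flatten(event))
# {'id': 1, 'user.name': 'asha', 'user.geo.city': 'Pune'}
```

Real pipelines would add handling for lists (typically exploded into child tables) and schema drift, which this sketch deliberately omits.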

Posted 2 months ago

Apply

8 - 10 years

19 - 30 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Work from Office


Azure Databricks, Pyspark, Azure Data Factory

Posted 2 months ago

Apply

8 - 12 years

10 - 20 Lacs

Bengaluru

Hybrid


Job Description
Years of experience: 8-12
Mandatory skills: Data Factory, Databricks, SQL DB, Python, ADLS, PySpark
Preferred skills: Power BI, Power Apps stack, data modelling, AWS Lambda, Glue, EMR, Airflow, Kinesis, Redshift

Role and responsibilities:
• Create service offerings
• Create customer presentations (pitch deck based on service offerings)
• Help win new deals and manage them
• Help take the offerings to market along with the Sales team
• Help in account mining/farming to grow customer accounts
• Ensure the services are delivered as per the contractual agreement with the customer

Required skills and qualifications:
• Architecting and overall end-to-end design, deployment, and delivery of Azure Data Platforms, across data lakes, data warehouses, data lakehouses, pipelines, Databricks, BI, and data analytics solutions
• Required technical skill set: SQL, Azure Data Factory, Azure Data Lake, Azure SQL, Azure Synapse
• Remaining up to date in new and emerging technologies
• Working with clients to develop data technology strategy and roadmaps, and plan delivery
• Oversight and support of delivery team outputs
• Data modelling, design, and build
• Infrastructure as Code delivery
• Enforcing technical architecture and documentation standards, policies, and procedures
• Analysing, implementing, and resolving issues with existing Azure Data Platforms
• Working with business experts and customers to understand business needs, and translating business requirements into reporting and data analytics functionality
• Assisting in scoping, estimation, and task planning for assigned projects
• Following project work plans to meet functionality requirements, project objectives, and timelines
• Providing accurate and complete technical architecture documents
• Addressing customer queries and issues in a timely manner
• Providing mentoring and hands-on guidance to other team members
• Experience in designing and implementing Azure Data solutions using services such as: Azure Synapse Analytics, Azure Databricks, Azure Data Lake Storage Gen2, Azure SQL Database, Azure Data Factory, Azure DevOps, Azure Stream Analytics, Azure Blob Storage, Azure Cosmos DB, ARM templates
• Familiar with Microsoft Power BI and Azure Purview
• An understanding of Master Data Management and Data Governance frameworks
• Familiar with Infrastructure as Code approaches and implementations
• Familiar with development approaches such as CI/CD, and with Azure DevOps
• Strong communication and collaboration skills
• Strong analytical thinking and problem-solving skills
• Ability to work as a team member and leader in a diverse technical environment
• Customer-service oriented; able to work in a fast-paced, changing environment
• Proficient in spoken and written English; willing to travel abroad when required
• Graduate-level education in Computer Science or a relevant field, or a widely recognised professional qualification at a comparable level
• Formal training and/or certification on related technologies is highly valued
• Minimum of three years working in a similar role
• Knowledge of Common Data Models/Industry Data Models/Synapse Analytics database templates will be considered an asset
• Experience in OLAP technology and the Microsoft on-premises BI stack (SSIS/SSRS/SSAS) will be useful but is not compulsory

The role is highly technical and requires a robust understanding of, and hands-on expertise in, Microsoft Azure cloud technologies, data architecture, and modelling concepts. It also demands strong analytical, problem-solving, and planning skills, strong communication and collaboration skills, and the motivation to achieve results in a dynamic business environment.

Posted 2 months ago

Apply

8 - 12 years

27 - 32 Lacs

Bengaluru

Work from Office


As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
• Comprehensive feature development and issue resolution: work on end-to-end feature development and solve challenges faced in the implementation.
• Stakeholder collaboration and issue resolution: collaborate with key stakeholders, internal and external, to understand problems and issues with the product and features, and solve them as per defined SLAs.
• Continuous learning and technology integration: be eager to learn new technologies and implement them in feature development.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
• Proficient in .NET Core with React or Angular
• Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards
• Azure Functions, Azure Service Bus, Azure Storage Account (mandatory); Azure Durable Functions
• Azure Data Factory, and Azure SQL or Cosmos DB (database): required
• Ability to write calculation rules and configurable consolidation rules

Preferred technical and professional experience:
• Excellent written and verbal interpersonal skills for coordinating across teams
• At least 2 end-to-end implementation experiences
• Ability to write and update the rules of historical overrides

Posted 2 months ago

Apply

4 - 8 years

15 - 30 Lacs

Delhi NCR, Gurgaon, Noida

Hybrid


Salary: 20 to 30 LPA
Experience: 3 to 8 years
Location: Gurgaon
Notice: Immediate to 30 days

Key responsibilities & skillsets:

Common skillsets:
• 4+ years of experience in analytics, Azure Databricks, Power BI, SQL, and associated data engineering jobs
• Must have experience managing and transforming big data sets using Azure ETL
• Excellent communication and presentation skills
• Experience managing Python code and collaborating with customers on model evolution
• Good knowledge of database management and Hadoop/Spark, SQL, Hive, Python (expertise)
• Superior analytical and problem-solving skills
• Able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision
• Good communication skills for client interaction

Data management skillsets:
• Ability to understand data models and identify ETL optimization opportunities; exposure to ETL tools is preferred
• Strong grasp of advanced SQL functionality (joins, nested queries, and procedures)
• Strong ability to translate functional specifications/requirements into technical requirements

Posted 2 months ago

Apply

8 - 13 years

0 - 0 Lacs

Mumbai Suburbs, Navi Mumbai, Mumbai

Work from Office


Role & responsibilities:
• Review infra-architecture of applications against the Well-Architected Framework.
• Design and execute the overall strategic roadmap for the cloud architecture.
• Deploy infrastructure for cloud, and guide other teams in deploying infra and applications.
• Guide and support application teams through the cloud journey from day 0 to go-live.
• Collaborate with engineering and development teams to evaluate and identify optimal cloud solutions.
• Define standards (and/or select cloud vendor products) for the overall architecture in coordination with solution architects and engineering leads.
• Continue improving cloud product reliability, availability, maintainability, and cost/benefit, incl. developing fault-tolerant tools to ensure general robustness of the cloud infra.
• Manage capacity across public and private cloud resource pools, incl. automating scale-down/up of environments.
• Support developers in optimizing and automating cloud engineering activities, e.g. real-time migration, provisioning, and deployment.
• Provide inputs to IT financial management for cloud costs associated with capacity build-out, forecasts for cloud IT investments, etc.
• Develop and maintain cloud solutions in accordance with best practices.
• Ensure efficient functioning of cloud resources/functions in accordance with company security policies and best practices in cloud security.
• Educate teams on the implementation of new cloud-based initiatives.
• Employ exceptional problem-solving skills, with the ability to see and solve issues before they affect business productivity.
• Orchestrate and automate cloud-based platforms throughout the company.

Requirements:
• Experience in engineering infrastructure design; 6+ years in cloud engineering roles, with experience leading teams as a lead engineer/architect.
• Expertise in developing Terraform code and deploying Azure infrastructure.
• Expertise in deploying and managing data engineering services such as Databricks (including Unity Catalog) and Data Factory. The engineer works in tandem with data engineering developers, so good knowledge of ADB notebooks, ADB clusters, ADF linked services, the different types of IR, and the networking behaviour of ADB and ADF is needed.
• Expertise in deploying Azure services such as AKS, AML, ADB, ADF, PaaS DBs (Cosmos DB, SQL, Postgres), Azure AI services (Custom Vision, Document Intelligence, QnA Maker, OpenAI), and Azure OpenShift.
• Expertise in troubleshooting deployment/network-related issues for cloud infrastructure.
• Experience deploying with Terraform Enterprise and GitHub.
• Expertise in developing Azure PowerShell scripts; expertise in Azure CLI.
• Basic familiarity with network and security features, e.g. cloud network topology, BGP, routing, TCP/IP, DNS, SMTP, HTTPS, security, guardrails, etc.
• High-availability engineering experience (region, availability zone, data replication, clustering).
• Awareness of open-source tools and scripting languages (PowerShell, shell).
• Deep understanding of software development lifecycles and cloud economics, incl. knowledge of consumption-driven TCO.
• Good knowledge of the security implications of public and private cloud infra design.
• Azure certifications preferred.
• Basic database experience, including knowledge of SQL and NoSQL and related data stores such as Postgres.

Posted 2 months ago

Apply

3 - 6 years

14 - 24 Lacs

Bengaluru, Hyderabad

Hybrid


Role: Data Engineer/Senior Data Engineer
Experience: 3 to 5 years
Location: Bangalore

Role & responsibilities:
• Build a cross-platform data strategy to aggregate multiple sources and process development datasets.
• Proven work experience as a Data Engineer or in a similar role, focusing on data pipeline development, API integration, database design, and data management.
• Strong expertise in Azure cloud services, particularly Azure Data Factory (ADF), Azure SQL Database, Azure Blob Storage, and Azure Databricks.
• Proficient in SQL and Python programming, with experience working with relational databases, data warehousing, and data modelling concepts.
• Experience with API integration, including RESTful APIs, data extraction, and data ingestion from external systems.
• Solid understanding of ETL processes, data integration, and data transformation techniques.
• Proficiency in Python programming and experience with data manipulation, automation, and scripting tasks.
• Strong problem-solving and analytical skills, with the ability to identify and resolve complex data-related issues.

Perks and benefits:
• Competitive salary and performance-based bonuses.
• Comprehensive insurance plans.
• Collaborative and supportive work environment.
• Chance to learn and grow with a talented team.
• A positive and fun work environment.
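The "data ingestion from external systems" via RESTful APIs that this role calls for typically means paging through an endpoint until it is exhausted. A minimal sketch in Python; the page-fetching callable is injected so the example stays runnable offline, and all names here are hypothetical:

```python
def extract_all(fetch_page, page_size=2):
    """Pull every record from a paginated REST-style endpoint.

    `fetch_page(offset, limit)` stands in for an HTTP call (e.g. one made
    with `requests`); injecting it keeps the sketch testable without a
    live API.
    """
    records, offset = [], 0
    while True:
        batch = fetch_page(offset, page_size)
        if not batch:  # empty page means the source is exhausted
            break
        records.extend(batch)
        offset += len(batch)
    return records

# Fake source standing in for an external system.
DATA = [{"id": i} for i in range(5)]

def fake_fetch(offset, limit):
    return DATA[offset:offset + limit]

print(len(extract_all(fake_fetch)))  # 5
```

A production version would add retries, rate-limit handling, and incremental watermarks; the termination logic (stop on an empty page) is the core of the pattern.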

Posted 2 months ago

Apply

6 - 8 years

25 - 40 Lacs

Bengaluru

Work from Office


Person should have 6+ years in Azure Cloud, with experience in data engineering and architecture. Experience working on Azure services like Azure Data Factory, Azure Functions, Azure SQL, Azure Databricks, Azure Data Lake, Synapse Analytics, etc.

Posted 2 months ago

Apply

6 - 8 years

27 - 42 Lacs

Bengaluru

Work from Office


Experience working on Azure services like Azure Data Factory, Azure Functions, Azure SQL, Azure Databricks, Azure Data Lake, Synapse Analytics, etc. Strong SQL, data modeling, and Agile experience. Mentor teams, deploy platforms. Big Data & ML knowledge a plus.

Posted 2 months ago

Apply

3 - 6 years

4 - 8 Lacs

Andhra Pradesh

Work from Office


Job Summary / Role Value Proposition: MetLife's Data & Analytics organization is a team of expert technologists responsible for building big data platforms and data services with innovative technologies to enable MetLife businesses to generate insights and value for their customers. The team is MetLife's center of excellence in data engineering and plays a key role in data enablement through multiple data stores supporting different kinds of analytical use cases, to derive predictive, prescriptive, and descriptive insights. The Azure Data Engineer III serves as a big data development expert within the data analytics engineering organization of MetLife Data & Analytics. This position is responsible for building ETL, data warehousing, and reusable components using cutting-edge big data and cloud technologies. The resource will collaborate with business systems analysts, technical leads, project managers, and business/operations teams in building data enablement solutions across different LOBs and use cases.

Key responsibilities:
• Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes.
• Develop reusable frameworks to reduce development effort, ensuring cost savings for projects.
• Develop quality code with well-thought-through performance optimizations in place right at the development stage.
• Appetite to learn new technologies and readiness to work on cutting-edge cloud technologies.
• Work with a team spread across the globe to drive project delivery and recommend development and performance improvements.

Essential business experience and technical skills:
• Ingesting huge volumes of data from various platforms for analytics needs and writing high-performance, reliable, and maintainable ETL code.
• Strong analytic skills related to working with unstructured datasets.
• Strong experience in building/designing data warehouses and data stores for analytics consumption, on-prem and cloud (real-time as well as batch use cases).
• Ability to interact with business analysts and functional analysts to gather requirements and implement ETL solutions.

Required:
• 10+ years of solutions development and delivery experience.
• 5+ years of leadership experience in the delivery of enterprise-scale technology programs.
• Hands-on expertise in Azure SQL, Synapse, Cosmos DB, and Data Factory; Python, Spark, and Scala experience is a must.
• Building and implementing data ingestion and curation processes using cloud data tools such as Azure SQL, Synapse, Cosmos DB, Data Factory, Spark (Scala/Python), Databricks, Delta Lake, etc.
• Ingesting huge volumes of data from various platforms for reporting, analytics, data supply, and transactional (operational data store and API) needs.
• Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance.

Additional details: Global grade: C. Named job posting: No (if yes, needs approval by SCSC). Remote work possibility: No. Global role family: 60242 (P) Data Management. Local role name: 60327 Data Engineer. Local skills: 6170 SQL. Languages required: English. Level and role rarity: to be defined.

Posted 2 months ago

Apply

5 - 10 years

16 - 20 Lacs

Noida

Work from Office


Profile: Database Administrator
Experience: 5-10 yrs
Location: Sector 64, Noida

Role purpose: We are seeking an experienced Database Administrator (DBA) with deep expertise in Microsoft Azure to manage, optimize, and secure our database systems. The ideal candidate will have a solid foundation in both on-premises and cloud-based environments, ensuring the reliability, availability, and performance of our databases.

Role and responsibilities:
• Design, implement, and maintain databases in Microsoft Azure, including Azure SQL Database, Cosmos DB, and other Azure database services.
• Manage and optimize database infrastructure, ensuring high availability, security, and reliability.
• Utilize Azure tools like Azure Monitor, Azure Security Center, and Azure Automation for monitoring, alerting, and automation.
• Perform performance tuning, query optimization, and index management.
• Troubleshoot and resolve database-related issues to ensure optimal performance.
• Analyze and refine database performance using data tools and SQL Profiler.
• Develop and implement robust backup and disaster recovery solutions in Azure.
• Manage database security, including user access, encryption, and auditing; ensure compliance with data privacy regulations (e.g., GDPR).
• Regularly maintain databases, including patching, updates, and configuration changes.
• Automate routine database tasks to streamline operations; manage database version control and deployments.
• Work closely with development, DevOps, and IT teams to provide database support.
• Support application teams with data migrations, transformations, and ETL processes; provide expertise and support during system integration and database migrations to Azure.

Requirements:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 5+ years of experience as a Database Administrator.
• Expertise in Microsoft Azure database services (Azure SQL Database, Cosmos DB, etc.).
• Strong knowledge of T-SQL, query optimization, and indexing.
• Experience with backup, recovery, and high-availability strategies in cloud environments.
• Familiarity with Azure DevOps, CI/CD pipelines, and Infrastructure-as-Code (IaC) tools.
• Microsoft Azure certifications (e.g., Azure Database Administrator Associate, Azure Solutions Architect).
• Experience with NoSQL databases and big data tools.
• Knowledge of database security, compliance standards, and regulatory requirements.
• Proficiency with PowerShell, Python, or other scripting languages for automation.
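The "index management" and "automate routine database tasks" duties above are often combined: a script reads fragmentation stats and decides per index whether to reorganize or rebuild. A minimal Python sketch of the decision step, using the widely cited 5% / 30% thresholds from common SQL Server maintenance guidance (the index names and stats below are made up for illustration):

```python
def maintenance_action(fragmentation_pct):
    """Classic index-maintenance heuristic: reorganize at moderate
    fragmentation, rebuild when it is heavy, skip otherwise.
    The 5% / 30% cutoffs are conventional defaults; tune per workload."""
    if fragmentation_pct > 30:
        return "REBUILD"
    if fragmentation_pct >= 5:
        return "REORGANIZE"
    return "SKIP"

# Hypothetical stats, as they might come from sys.dm_db_index_physical_stats.
stats = {"ix_orders_date": 42.0, "ix_orders_customer": 12.5, "pk_orders": 1.2}
plan = {name: maintenance_action(pct) for name, pct in stats.items()}
print(plan)
# {'ix_orders_date': 'REBUILD', 'ix_orders_customer': 'REORGANIZE', 'pk_orders': 'SKIP'}
```

An automation job would then emit and execute the corresponding `ALTER INDEX ... REBUILD/REORGANIZE` statements during a maintenance window.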

Posted 2 months ago

Apply

4 - 8 years

6 - 10 Lacs

Bengaluru

Work from Office


As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
• Comprehensive feature development and issue resolution: work on end-to-end feature development and solve challenges faced in the implementation.
• Stakeholder collaboration and issue resolution: collaborate with key stakeholders, internal and external, to understand problems and issues with the product and features, and solve them as per defined SLAs.
• Continuous learning and technology integration: be eager to learn new technologies and implement them in feature development.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
• Proficient in .NET Core with React or Angular
• Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards
• Azure Functions, Azure Service Bus, Azure Storage Account (mandatory); Azure Durable Functions
• Azure Data Factory, and Azure SQL or Cosmos DB (database): required
• Ability to write calculation rules and configurable consolidation rules

Preferred technical and professional experience:
• Excellent written and verbal interpersonal skills for coordinating across teams
• At least 2 end-to-end implementation experiences
• Ability to write and update the rules of historical overrides

Posted 2 months ago

Apply

3 - 5 years

5 - 9 Lacs

Pune

Work from Office


Job Title: Azure Data Engineer

Responsibilities: A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs to solution design based on your areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and professional requirements: Azure Data Engineering: Azure Data Factory, Azure Databricks, and Azure SaaS.

Preferred skills: Technology->Cloud Integration->Azure Data Factory (ADF)

Additional responsibilities:
• Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
• Good knowledge of software configuration management systems
• Awareness of the latest technologies and industry trends
• Logical thinking and problem-solving skills, along with an ability to collaborate
• Understanding of the financial processes for various types of projects and the various pricing models available
• Ability to assess current processes, identify improvement areas, and suggest technology solutions
• Knowledge of one or two industry domains
• Client-interfacing skills
• Project and team management

Educational requirements: Master of Comp. Applications, Master of Engineering, Master of Science, Master of Technology, Bachelor of Comp. Applications, Bachelor of Science, Bachelor of Engineering, Bachelor of Technology

Service line: Engineering Services. (Location of posting is subject to business requirements.)

Posted 2 months ago

Apply

4 - 9 years

0 Lacs

Mysore, Bengaluru, Hyderabad

Hybrid


Open & direct walk-in drive | Hexaware Technologies - Azure Data Engineer/Architect in Bangalore, Karnataka on 29th March (Saturday) 2025 - Azure Databricks/Data Factory/SQL & PySpark

Dear Candidate, I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as an Azure Data Engineer. We are hosting an open walk-in drive in Bangalore, Karnataka on 29th March (Saturday) 2025, and we believe your skills in Databricks, Data Factory, SQL, and PySpark align perfectly with what we are seeking.

Details of the walk-in drive:
Date: 29th March (Saturday) 2025
Experience: 4 to 15 years
Time: 9.00 AM to 5 PM
Venue: Hexaware Technologies Ltd, Shanti Niketan, 11th Floor, Crescent - 2 Prestige, Whitefield Main Rd, Mahadevapura, Bengaluru, Karnataka 560048
Point of contact: Azhagu Kumaran Mohan / +91-9789518386

Key skills and experience: as an Azure Data Engineer, we are looking for candidates with expertise in Databricks, Data Factory, SQL, and PySpark/Spark.

Roles and responsibilities: as part of our dynamic team, you will be responsible for designing, implementing, and maintaining data pipelines; collaborating with cross-functional teams to understand data requirements; optimizing and troubleshooting data processes; and leveraging Azure data services to build scalable solutions.

What to bring: updated resume, photo ID, passport-size photo.

How to register: to express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies.

If you have any questions or require further information, please feel free to reach out to me at AzhaguK@hexaware.com / +91-9789518386. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.

Note: candidates with less than 4 years of total experience will not be shortlisted to attend the interview.

Posted 2 months ago

Apply

5 - 10 years

15 - 30 Lacs

Hyderabad

Work from Office


Lead Data Engineer - Data Management

Job description

Company Overview:
Accordion works at the intersection of sponsors and management teams throughout every stage of the investment lifecycle, providing hands-on, execution-focused support to elevate data and analytics capabilities. So, what does it mean to work at Accordion? It means joining 1,000+ analytics, data science, finance & technology experts in a high-growth, agile, and entrepreneurial environment while transforming how portfolio companies drive value. It also means making your mark on Accordion's future by embracing a culture rooted in collaboration and a firm-wide commitment to building something great, together. Headquartered in New York City with 10 offices worldwide, Accordion invites you to join our journey.

Data & Analytics (Accordion | Data & Analytics):
Accordion's Data & Analytics (D&A) team delivers cutting-edge, intelligent solutions to a global clientele, leveraging a blend of domain knowledge, sophisticated technology tools, and deep analytics capabilities to tackle complex business challenges. We partner with Private Equity clients and their Portfolio Companies across diverse sectors, including Retail, CPG, Healthcare, Media & Entertainment, Technology, and Logistics. The D&A team delivers data and analytical solutions designed to streamline reporting capabilities and enhance business insights across vast and complex data sets spanning Sales, Operations, Marketing, Pricing, Customer Strategies, and more.

Location: Hyderabad

Role Overview:
Accordion is looking for a Lead Data Engineer, responsible for the design, development, configuration/deployment, and maintenance of the technology stack above. He/she must have an in-depth understanding of the various tools and technologies in this domain to design and implement robust, scalable solutions that address the client's current and future requirements at optimal cost. The Lead Data Engineer should be able to evaluate existing architectures and recommend ways to upgrade and improve their performance, for both on-premises and cloud-based solutions. A successful Lead Data Engineer possesses strong working business knowledge and familiarity with multiple tools and techniques, along with industry standards and best practices in Business Intelligence and Data Warehousing environments, plus strong organizational, critical-thinking, and communication skills.

What you will do:
- Partner with clients to understand their business and create comprehensive business requirements.
- Develop an end-to-end Business Intelligence framework based on those requirements, including recommending the appropriate architecture (on-premises or cloud), analytics, and reporting.
- Work closely with the business and technology teams to guide solution development and implementation.
- Work closely with the business teams to arrive at methodologies for developing KPIs and metrics.
- Work with the Project Manager to develop and execute project plans within the assigned schedule and timeline.
- Develop standard reports and functional dashboards based on business requirements.
- Conduct training programs and knowledge-transfer sessions for junior developers when needed.
- Recommend improvements to provide optimal reporting solutions.
- Stay curious about new tools and technologies to provide forward-looking solutions for clients.

Ideally, you have:
- An undergraduate degree (B.E./B.Tech.); tier-1/tier-2 colleges preferred.
- More than 5 years of experience in a related field.
- Proven expertise in SSIS, SSAS, and SSRS (MSBI suite).
- In-depth knowledge of databases (SQL Server, MySQL, Oracle, etc.) and of a data warehouse (any one of Azure Synapse, AWS Redshift, Google BigQuery, Snowflake, etc.).
- In-depth knowledge of business intelligence tools (any one of Power BI, Tableau, Qlik, DOMO, Looker, etc.).
- A good understanding of Azure or AWS: Azure (Data Factory & Pipelines, SQL Database & Managed Instances, DevOps, Logic Apps, Analysis Services) or AWS (Glue, Aurora Database, DynamoDB, Redshift, QuickSight).
- A proven ability to take initiative and be innovative.
- An analytical mind with a problem-solving attitude.

Why explore a career at Accordion:
- High-growth environment: semi-annual performance management and promotion cycles, coupled with a strong meritocratic culture, enable a fast track to leadership responsibility.
- Cross-domain exposure: interesting and challenging work streams across industries and domains that keep you excited, motivated, and on your toes.
- Entrepreneurial environment: intellectual freedom to make decisions and own them. We expect you to spread your wings and assume larger responsibilities.
- Fun culture and peer group: a non-bureaucratic, fun working environment with a strong peer group that will challenge you and accelerate your learning curve.
- Other benefits for full-time employees: health and wellness programs, including employee health insurance covering immediate family members and parents, term life insurance for employees, free health camps, discounted health services (including vision and dental) for employees and family members, free doctor consultations, counsellors, etc.; corporate meal-card options for ease of use and tax benefits; team lunches and company-sponsored team outings and celebrations; cab reimbursement for women employees beyond a certain time of day; a robust leave policy to support work-life balance, with a specially designed leave structure to support women employees for maternity and related requests; a reward and recognition platform to celebrate professional and personal milestones; and a positive, transparent work environment with various employee engagement and benefit initiatives to support personal and professional learning and development.
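The SQL and BI expertise this role asks for centers on building KPI reports and dashboards. A minimal, illustrative sketch of a KPI aggregation, using Python's built-in SQLite as a stand-in for SQL Server or a cloud warehouse (the `sales` table and its columns are hypothetical):

```python
import sqlite3

# In-memory SQLite stands in for SQL Server or a cloud warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 100.0), ("North", 50.0), ("South", 70.0)],
)

# A typical dashboard KPI: total revenue per region, highest first.
kpi = conn.execute(
    """
    SELECT region, SUM(amount) AS revenue
    FROM sales
    GROUP BY region
    ORDER BY revenue DESC
    """
).fetchall()
print(kpi)  # [('North', 150.0), ('South', 70.0)]
```

The same GROUP BY/ORDER BY shape is what a Power BI or SSRS report would push down to the warehouse.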

Posted 2 months ago

Apply

10 - 15 years

35 - 40 Lacs

Mumbai

Work from Office


Job Description

About the company:
With over 2.5 crore customers, over 5,000 distribution points and nearly 2,000 branches, IndusInd Bank is a universal bank with a widespread banking footprint across the country. IndusInd offers a wide array of products and services for individuals and corporates, including microfinance, personal loans, personal and commercial vehicle loans, credit cards, and SME loans. Over the years, IndusInd has grown ceaselessly and dynamically, driven by a zeal to offer customers banking services on par with the highest quality standards in the industry. IndusInd is a pioneer in digital-first solutions, bringing together the power of a next-gen digital product stack, customer excellence, and the trust of an established bank.

Job Purpose:
To lead the Azure Data Engineering tracks of the Bank's Data Office. The role takes end-to-end ownership of bringing in data from different source systems and creating models on top of it for end-user consumption.

Experience:
Overall experience of 10 to 15 years; the applicant must have a minimum of 7-10 years of hands-on professional experience as an Azure Data Engineer using Azure services.

Technical Skills:
- Strong foundation in software engineering principles and an architectural mindset.
- Knowledge of Big Data technologies, such as Spark.
- Experience architecting software solutions on a public cloud like Azure.
- Well versed in Azure services such as Databricks, Data Factory, Cosmos DB, Azure Functions, and Synapse.
- Experience authoring or reviewing system design documents for enterprise solutions.
- Knowledge of testing frameworks and libraries.
- Working knowledge of the pros, cons, and usage of various Azure services.
- Experience working in Agile delivery.
- Good knowledge of database query languages, e.g. SQL and PySpark SQL.
- Experience in data hygiene procedures, identity resolution capabilities, or data management is a plus.
- Familiarity with industry best practices for the collection and use of data.

Responsibilities:
- Work closely with Product Owners and stakeholders to design the technical architecture for the data platform to meet the requirements of the proposed solution.
- Work with leadership to set the standards for software engineering practices within the machine learning engineering team, and support them across other disciplines.
- Play an active role in leading team meetings and workshops with business units.
- Choose and use the right analytical libraries, programming languages, and frameworks for each task.
- Help the Data Engineering team produce high-quality code that allows us to put solutions into production.
- Create and own the technical product backlogs for products; help the team close the backlogs on time.
- Refactor code into reusable libraries, APIs, and tools.
- Help shape the next generation of our products.

Qualifications:
- Bachelor's in Computer Science or equivalent.
- Should be certified as a Microsoft Data Engineer Expert (AZ-400 and DP-200).

Behavioral Competencies:
- Natural leadership skills.
- Excellent problem-solving and time management skills.
- Strong analytical ability.
- Excellent communication skills; clear and demonstrative communication.
- Process-oriented with a flexible execution mindset.
- Identifies, tracks, and escalates risks in a timely manner.

Selection Process:
Interested candidates are required to apply through the listing on Jigya; only applications received through Jigya will be evaluated further. Shortlisted candidates may need to appear for an online assessment and/or a technical screening interview administered by Jigya on behalf of IndusInd Bank. Candidates selected after the screening rounds will be processed further by IndusInd Bank.

Posted 2 months ago

Apply

5 - 10 years

7 - 17 Lacs

Bengaluru

Work from Office


- Proficient in Azure platform development (Azure Functions, Azure services, etc.).
- 5 to 15 years of relevant software development experience, with a fairly full-stack profile.
- Proficient in cloud-native deployment with CI/CD pipelines.
- Proficient in one or more data development technologies (SQL databases, NoSQL, cloud datastores, etc.).
- Good and effective communication skills to understand requirements and articulate solutions.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Proficient in .NET Core with React or Angular.
- Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards.
- Azure Function, Azure Service Bus, Azure Storage Account - mandatory.
- Azure Durable Functions.
- Azure Data Factory; Azure SQL or Cosmos DB (database) - required.
- Ability to write calculation rules and configurable consolidation rules.

Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams.
- At least 2 end-to-end implementation experiences.
- Ability to write and update the rules of historical overrides.

Posted 2 months ago

Apply

6 - 8 years

30 - 45 Lacs

Bengaluru

Work from Office


Experience working with Azure services such as Azure Data Factory, Azure Functions, Azure SQL, Azure Databricks, Azure Data Lake, and Synapse Analytics. Experience in Data Warehousing with Big Data or Cloud. Experience working in an Agile delivery model.

Posted 2 months ago

Apply

7 - 12 years

15 - 30 Lacs

Bengaluru, Hyderabad, Gurgaon

Hybrid


- At least 8 years of experience in a Data Engineering role on analytical projects, preferably on Microsoft Azure.
- Proficient in Azure technologies such as ADF, Azure Synapse, Databricks, and Analysis Services.
- Proficient in Azure Tables, Cache, SQL Server, and Azure AD.
- Solid understanding of cloud security, leveraging Windows operating systems, Active Directory, and federated AD with market-leading SSO solutions.
- Knowledge of Python, PySpark, or Spark SQL.
- Experience in Azure analytics and DevOps.
- Preparing requirements analysis and data architecture designs.
- Designing and delivering Azure data analytics solutions.
- Providing the Azure technology vision in project implementations for analytical projects.
- Taking part in Proofs of Concept (POCs) and pilot solution preparation.
- Leading the implementation, with responsibility for delivery of the architecture designs and data flow strategy.
- Experience preparing data for Data Science and Machine Learning purposes.
- Ability to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows.
- Experience in business process mapping of data and analytics solutions.
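The data profiling this role calls for can be sketched in plain Python; in practice it would run via PySpark or a cataloguing tool, and the sample rows and column names here are hypothetical:

```python
def profile(rows, columns):
    """Per-column null count and distinct count over a list of tuples:
    the first pass of a data-profiling step before mapping data flows."""
    stats = {}
    for i, col in enumerate(columns):
        values = [row[i] for row in rows]
        stats[col] = {
            "nulls": sum(v is None for v in values),
            "distinct": len({v for v in values if v is not None}),
        }
    return stats

rows = [(1, "a"), (2, None), (2, "b")]
stats = profile(rows, ["id", "code"])
print(stats)  # {'id': {'nulls': 0, 'distinct': 2}, 'code': {'nulls': 1, 'distinct': 2}}
```

The same counts map directly onto Spark SQL aggregates (`COUNT(*) - COUNT(col)` for nulls, `COUNT(DISTINCT col)` for cardinality) when run at scale.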

Posted 2 months ago

Apply

6 - 11 years

12 - 22 Lacs

Chennai, Pune, Gurgaon

Work from Office


About Client: Hiring for one of our multinational corporations!

Job Title: Data Engineer
Qualification: Any Graduate or above
Relevant Experience: 5 to 8 years

Must-Have Skills:
- Python
- PySpark
- AWS/GCP/Azure

Roles and Responsibilities:
- Data Pipeline Development: Design, build, and optimize end-to-end data pipelines on cloud platforms (GCP, AWS, or Azure) to support business intelligence, reporting, and analytical solutions.
- Cloud Infrastructure Management: Leverage cloud services for efficient data processing, storage, and management. Work with technologies like Google Cloud Platform (GCP), Amazon Web Services (AWS), or Microsoft Azure to build scalable, fault-tolerant, and cost-effective data solutions.
- Data Integration: Collaborate with cross-functional teams to integrate data from multiple sources, ensuring accurate and timely delivery of data to end users and analytical systems.
- Data Modeling: Design and maintain data models for efficient data processing, storage, and retrieval. Ensure data quality and consistency across various datasets.
- Automation & Optimization: Write efficient, reusable, and maintainable Python code to automate repetitive tasks and optimize data processing workflows using tools like PySpark.
- Data Security & Compliance: Implement data security, privacy, and governance best practices while adhering to regulatory compliance requirements (GDPR, HIPAA, etc.).
- Troubleshooting & Debugging: Identify, troubleshoot, and resolve data pipeline and infrastructure issues. Perform root-cause analysis and implement preventive measures.
- Collaboration & Communication: Work closely with data scientists, data analysts, and business teams to understand requirements and deliver solutions that meet business needs.

Location: Chennai, Bangalore, Hyderabad, Pune, Gurgaon
CTC Range: Up to 40 LPA (Lakhs Per Annum)
Notice Period: 90 days
Mode of Interview: Virtual
Mode of Work: Work From Office

--
Thanks & Regards,
SHRIVIDYA
Black and White Business Solutions Pvt. Ltd.
Bangalore, Karnataka, India.
Direct Number: 08067432486
shrividya@blackwhite.in | www.blackwhite.in
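The "Data Pipeline Development" duty above often comes down to incremental loads. A minimal watermark pattern sketched in plain Python (the in-memory `source` list stands in for a real table or event stream; all names are hypothetical):

```python
def incremental_load(source, state):
    """Pull only rows newer than the stored watermark, then advance it.
    Re-running against unchanged data returns an empty batch (idempotent)."""
    watermark = state.get("watermark", 0)
    batch = [row for row in source if row[0] > watermark]
    if batch:
        state["watermark"] = max(ts for ts, _ in batch)
    return batch

state = {}
source = [(1, "a"), (2, "b"), (3, "c")]
first = incremental_load(source, state)   # initial load: all three rows
source.append((4, "d"))
second = incremental_load(source, state)  # next run: only the new row
print(second, state)  # [(4, 'd')] {'watermark': 4}
```

The same shape underlies watermark-based delta copies in ADF and AWS Glue: persist the high-water mark after each successful run, and filter the source on it at the start of the next.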

Posted 2 months ago

Apply

4 - 6 years

7 - 15 Lacs

Pune, Delhi NCR, Bengaluru

Hybrid


Exp: 4 to 6 years
Job Location: Noida / Mumbai / Pune / Bangalore / Gurgaon / Kochi (hybrid work)
Notice: Immediate to 30 days
Skill set: ADF, PySpark, SQL

Role & responsibilities

Key Responsibilities:
• Develop scalable data pipelines using Azure Data Factory (ADF), Databricks, PySpark, and Delta Lake to support ML and AI workloads.
• Optimize and transform large datasets for feature engineering, model training, and real-time AI inference.
• Build and maintain a lakehouse architecture using Azure Data Lake Storage (ADLS) & Delta Lake.
• Work closely with ML engineers & data scientists to deliver high-quality, structured data for training Generative AI models.
• Implement MLOps best practices for continuous data processing, versioning, and model retraining workflows.
• Monitor & improve data quality using Azure Data Quality Services.
• Ensure cost-efficient data processing in Databricks using Photon, Delta Caching, and auto-scaling clusters.
• Secure data pipelines by implementing RBAC, encryption, and governance.

Required Skills & Experience:
• 3+ years of experience in Data Engineering with Azure & Databricks.
• Proficiency in PySpark, SQL, and Delta Lake for large-scale data transformations.
• Strong experience with Azure Data Factory (ADF), Azure Synapse, and Event Hubs.
• Hands-on experience building feature stores for ML models.
• Experience with ML model deployment and MLOps pipelines (MLflow, Kubernetes, or Azure ML) is a plus.
• Good understanding of Generative AI concepts and handling unstructured data (text, images, video, embeddings).
• Familiarity with Azure DevOps, CI/CD for data pipelines, and Infrastructure as Code (Terraform, Bicep).
• Strong problem-solving, debugging, and performance-optimization skills.

Preferred candidate profile:
Interested candidates, kindly share an updated resume at simpy.bagati@infogain.com

Posted 2 months ago

Apply

5 - 9 years

25 - 40 Lacs

Pune, Bengaluru, Gurgaon

Hybrid


Salary: 25 to 40 LPA
Exp: 5 to 9 years
Location: Gurgaon/Bangalore/Pune
Notice: Immediate to 30 days

Key Responsibilities & Skillsets:

Common Skillsets:
- 5+ years of experience in analytics, Azure Databricks, Power BI, SQL, and associated data engineering jobs.
- Must have experience managing and transforming big data sets using Azure ETL.
- Excellent communication & presentation skills.
- Experience managing Python code and collaborating with customers on model evolution.
- Good knowledge of database management and Hadoop/Spark, SQL, HIVE, and Python (expertise).
- Superior analytical and problem-solving skills.
- Able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision.
- Good communication skills for client interaction.

Data Management Skillsets:
- Ability to understand data models and identify ETL optimization opportunities. Exposure to ETL tools is preferred.
- Strong grasp of advanced SQL functionality (joins, nested queries, and procedures).
- Strong ability to translate functional specifications/requirements into technical requirements.
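The "advanced SQL functionality" this posting asks about (joins, nested queries) can be illustrated with Python's built-in SQLite as a stand-in engine; the `orders` table and its contents are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme', 120.0), (2, 'acme', 80.0), (3, 'beta', 30.0);
""")

# Nested query: customers whose total spend beats the average order value.
heavy = conn.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    HAVING total > (SELECT AVG(amount) FROM orders)
""").fetchall()
print(heavy)  # [('acme', 200.0)]
```

The subquery in the HAVING clause is evaluated once against the whole table, which is the kind of construct interviewers probe for alongside correlated subqueries and multi-way joins.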

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
