5.0 - 10.0 years
5 - 15 Lacs
Hyderabad, Pune
Work from Office
We are looking for a seasoned Azure Data Engineer with deep hands-on experience in Databricks and Azure data services to join our high-impact Data & AI team. JD: https://tinyurl.com/d6jheey2
Posted 1 month ago
10.0 - 14.0 years
40 - 45 Lacs
Hyderabad
Work from Office
Skills: Cloudera, Big Data, Hadoop, Spark, Kafka, Hive, CDH clusters

Responsibilities:
• Design and implement Cloudera-based data platforms, including cluster sizing, configuration, and optimization.
• Install, configure, and administer Cloudera Manager and CDP clusters, managing all aspects of the cluster lifecycle.
• Monitor and troubleshoot platform performance, identifying and resolving issues promptly.
• Review and maintain the data ingestion and processing pipelines on the Cloudera platform.
• Collaborate with data engineers and data scientists to design and optimize data models, ensuring efficient data storage and retrieval.
• Implement and enforce security measures for the Cloudera platform, including authentication, authorization, and encryption.
• Manage platform user access and permissions, ensuring compliance with data privacy regulations and internal policies.
• Create technology roadmaps for the Cloudera platform.
• Stay up to date with the latest Cloudera and big data technologies, and recommend and implement relevant updates and enhancements to the platform.
• Plan, test, and execute upgrades involving Cloudera components, ensuring platform stability and security.
• Document platform configurations, processes, and procedures, and provide training and support to other team members as needed.

Requirements:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• Proven experience as a Cloudera platform engineer or in a similar role, with a strong understanding of Cloudera Manager and CDH clusters.
• Expertise in designing, implementing, and maintaining scalable, high-performance data platforms using Cloudera technologies such as Hadoop, Spark, Hive, and Kafka.
• Strong knowledge of big data concepts and technologies, data modeling, and data warehousing principles.
• Familiarity with data security and compliance requirements, and experience implementing security measures for Cloudera platforms.
• Proficiency in Linux system administration and scripting languages (e.g., Shell, Python).
• Strong troubleshooting and problem-solving skills, with the ability to diagnose and resolve platform issues quickly.
• Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams.
• Experience with Azure Data Factory, Azure Databricks, or Azure Synapse is a plus.

Timings: 10 am to 7:30 pm; 2 days WFO and 3 days WFH.
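For a flavor of the hands-on side of this stack, here is a minimal, illustrative PySpark structured-streaming job of the Kafka-ingestion kind this role administers. The broker, topic, and paths are hypothetical placeholders, and the job assumes the spark-sql-kafka connector package is available on the cluster:

```python
# Minimal sketch: consume a Kafka topic and land it as Parquet on the cluster.
# Broker, topic, and paths are placeholders, not from any specific employer.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("kafka-ingest-sketch")
         .enableHiveSupport()
         .getOrCreate())

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
          .option("subscribe", "events_topic")                # placeholder topic
          .load()
          .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)", "timestamp"))

query = (events.writeStream
         .format("parquet")
         .option("path", "/warehouse/events")           # placeholder HDFS path
         .option("checkpointLocation", "/tmp/chk/events")
         .trigger(processingTime="1 minute")
         .start())
query.awaitTermination()
```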
Posted 1 month ago
6.0 - 11.0 years
25 - 37 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
Azure expertise: proven experience with Azure cloud services, especially Azure Data Factory, Azure SQL Database, and Azure Databricks. Expert in PySpark data processing and analytics. Strong background in building and optimizing data pipelines and workflows.

Required candidate profile: solid experience with data modeling, ETL processes, and data warehousing. Performance tuning: the ability to optimize data pipelines and jobs to ensure scalability and performance, troubleshooting and resolving performance issues.
Posted 1 month ago
4.0 - 9.0 years
16 - 20 Lacs
Pune
Work from Office
Azure Data Engineer
Skills: SQL, ETL, Azure, Python, PySpark, Databricks
Experience: min 4 years
Immediate joiners: 45 days (max)
Location: Pune
Shifts: UK
CTC offered: 16-20 LPA
Contact: divyam@genesishrs.com | 8905344933
Posted 1 month ago
5.0 - 10.0 years
15 - 20 Lacs
Pune
Work from Office
AZURE DATA ENGINEER
Skills: strong technical experience in Azure, SQL, Azure Data Factory, ETL, Databricks
Graduation a must
Experience: 5-10 years
CTC: 14-20 LPA
21st June: F2F interview only (Pune)
Contact: 7742324144
Posted 1 month ago
4.0 - 9.0 years
15 - 20 Lacs
Pune
Work from Office
Job Role: Azure Data Engineer
Job Location: Pune
Experience: 4+ years
Skills: SQL + ETL + Azure + Python + PySpark + Databricks

Job Description: As an Azure Data Engineer, you will play a crucial role in designing, implementing, and maintaining our data infrastructure on the Azure platform. You will collaborate with cross-functional teams to develop robust data pipelines, optimize data workflows, and ensure data integrity and reliability.

Responsibilities:
• Design, develop, and deploy data solutions on Azure, leveraging Azure SQL, Azure Data Factory, and Databricks.
• Build and maintain scalable data pipelines to ingest, transform, and load data from various sources into Azure data repositories.
• Implement data security and compliance measures to safeguard sensitive information.
• Collaborate with data scientists and analysts to support their data requirements and enable advanced analytics and machine learning initiatives.
• Optimize and tune data workflows for performance and efficiency.
• Troubleshoot data-related issues and provide timely resolution.
• Stay updated with the latest Azure data services and technologies and recommend best practices for data engineering.

Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Proven experience as a data engineer, preferably in a cloud environment.
• Strong proficiency in Azure SQL for database design, querying, and optimization.
• Hands-on experience with Azure Data Factory for ETL/ELT workflows.
• Familiarity with Azure Databricks for big data processing and analytics.
• Experience with other Azure data services such as Azure Synapse Analytics, Azure Cosmos DB, and Azure Data Lake Storage is a plus.
• Solid understanding of data warehousing concepts, data modeling, and dimensional modeling.
• Excellent problem-solving and communication skills.
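As a flavor of the pipeline work described above, a minimal PySpark ETL sketch: ingest raw CSV from ADLS, clean it, and write a partitioned table. The storage account, container, and column names are hypothetical:

```python
# Minimal illustrative ETL: raw CSV in ADLS -> cleaned, partitioned Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adls-etl-sketch").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplestore.dfs.core.windows.net/sales/"))  # placeholder

cleaned = (raw
           .withColumn("amount", F.col("amount").cast("double"))
           .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
           .dropDuplicates(["order_id"])
           .filter(F.col("amount").isNotNull()))

(cleaned.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("abfss://curated@examplestore.dfs.core.windows.net/sales/"))  # placeholder
```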
Posted 1 month ago
4.0 - 5.0 years
17 - 25 Lacs
Pune
Work from Office
Seeking an experienced Azure Data Engineer to design, build, and maintain scalable data solutions using ADF, Databricks, Synapse, Azure SQL, and more. Strong Python/SQL skills, 4+ yrs exp, and Azure cloud expertise required.
Posted 1 month ago
3.0 - 8.0 years
15 - 30 Lacs
Navi Mumbai, Pune
Work from Office
We're Hiring: Data Scientist (Databricks & ML Deployment Expert)
Location: Mumbai/Pune
Experience: 3-8 years
Apply now!

Are you passionate about deploying real-world machine learning solutions? We're looking for a versatile Data Scientist with deep expertise in Databricks, PySpark, and end-to-end ML deployment to drive impactful projects in the Retail and Automotive domains.

What You'll Do:
• Develop scalable ML models (regression, classification, clustering)
• Deliver advanced use cases like CLV modeling, predictive maintenance, and time series forecasting
• Design and automate ML workflows on Databricks using PySpark
• Build and deploy APIs to serve ML models (Flask, FastAPI, Django)
• Own model deployment and monitoring in production environments
• Work closely with Data Engineering and DevOps teams for CI/CD integration
• Optimize pipelines and model performance (code and infrastructure level)

Must-Have Skills:
• Strong hands-on experience with Databricks and PySpark
• Proven track record in ML model development and deployment (min. 2 production deployments)
• Solid grasp of regression, classification, clustering, and time series
• Proficiency in SQL, workflow automation, and ELT/ETL processes
• API development (Flask, FastAPI, Django)
• CI/CD, deployment automation, and ML pipeline optimization
• Familiarity with Medallion Architecture

Domain Expertise:
• Retail: CLV, pricing, demand forecasting
• Automotive: predictive maintenance, time series

Nice to Have:
• MLflow, Docker, Kubernetes
• Cloud: Azure, AWS, or GCP

If you're excited to build production-ready ML systems that create real business impact, we want to hear from you! Apply now to chaity.mukherjee@celebaltech.com.
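A minimal sketch of the model-serving pattern this posting names (FastAPI is one of the frameworks it lists). The model artifact and feature names are hypothetical placeholders:

```python
# Illustrative model-serving endpoint; run with: uvicorn app:app
import joblib
import pandas as pd
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.pkl")  # placeholder artifact from a prior training run

class Features(BaseModel):
    recency_days: float
    frequency: float
    monetary: float

@app.post("/predict")
def predict(f: Features):
    # Build a single-row frame in the column order the model was trained on.
    X = pd.DataFrame([{"recency_days": f.recency_days,
                       "frequency": f.frequency,
                       "monetary": f.monetary}])
    return {"prediction": float(model.predict(X)[0])}
```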
Posted 1 month ago
4.0 - 9.0 years
11 - 21 Lacs
Pune (Hinjewadi)
Work from Office
SECTION A: POSITION SUMMARY
This role is accountable for developing, expanding, and optimizing Data Management Architecture, Design & Implementation under Singtel Data Platform & Management.
1. Design, develop, and implement data governance and management solutions covering data quality, privacy, protection, and associated control technology, as per industry best practice.
2. Review, evaluate, and implement Data Management standards, primarily Data Classification and Data Retention, across systems.
3. Design, develop, and implement automated data discovery rules to identify the presence of PII attributes.
4. Drive development, optimization, testing, and tooling to improve overall data control management (security, data privacy, protection, data quality).
5. Review, analyze, benchmark, and approve solution designs from product companies, internal teams, and vendors.
6. Ensure that proposed solutions are aligned with and conform to the data landscape, big data architecture guidelines, and roadmap.

SECTION B: KEY RESPONSIBILITIES AND RESULTS
1. Design and implement data management standards such as Catalog Management, Data Quality, Data Classification, and Data Retention.
2. Drive BAU processes, testing, and tooling to improve data security, privacy, and protection.
3. Identify, design, and implement internal process improvements: automating manual processes, and controlling and optimizing data technology service delivery.
4. Implement and support the Data Management Technology solution throughout its lifecycle: user onboarding, upgrades, fixes, access management, etc.

SECTION C: QUALIFICATIONS / EXPERIENCE / KNOWLEDGE REQUIRED
Education and Qualifications: Diploma in Data Analytics, Data Engineering, IT, Computer Science, Software Engineering, or equivalent.
Work Experience: exposure to Data Management and Big Data concepts; knowledge of and experience with Data Management, Data Integration, and Data Quality products.
Technical Skills: Informatica CDGC, Collibra, Alation, Informatica Data Quality, Data Privacy Management, Azure Databricks.
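As a toy illustration of the automated PII discovery described in point 3 above: rule-based scanning of sampled column values. The patterns, table, and sample data are hypothetical; a product such as Informatica CDGC would do this at scale:

```python
# Sketch: tag columns whose sampled values match simple PII patterns.
import re

PII_PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "phone": re.compile(r"\+?\d[\d\s-]{7,}\d"),
}

def classify_columns(rows, columns):
    """rows: list of dicts sampled from a table; returns column -> PII tags."""
    tags = {c: set() for c in columns}
    for row in rows:
        for c in columns:
            value = str(row.get(c, ""))
            for tag, pattern in PII_PATTERNS.items():
                if pattern.search(value):
                    tags[c].add(tag)
    return {c: sorted(t) for c, t in tags.items() if t}

# Hypothetical sample pulled from a customer table.
sample = [{"name": "A", "contact": "a@example.com"},
          {"name": "B", "contact": "+91 98765 43210"}]
print(classify_columns(sample, ["name", "contact"]))  # {'contact': ['email', 'phone']}
```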
Posted 1 month ago
3.0 - 8.0 years
11 - 21 Lacs
Pune
Work from Office
P2 grade (3+ to 5 yrs): CTC up to 11.50 LPA
P3 grade (5 to 8 yrs): CTC up to 21 LPA

POSITION SUMMARY
This role is accountable for developing, expanding, and optimizing Data Management Architecture, Design & Implementation under Singtel Data Platform & Management.
1. Design, develop, and implement data governance and management solutions covering data quality, privacy, protection, and associated control technology, as per industry best practice.
2. Review, evaluate, and implement Data Management standards, primarily Data Classification and Data Retention, across systems.
3. Design, develop, and implement automated data discovery rules to identify the presence of PII attributes.
4. Drive development, optimization, testing, and tooling to improve overall data control management (security, data privacy, protection, data quality).
5. Review, analyze, benchmark, and approve solution designs from product companies, internal teams, and vendors.
6. Ensure that proposed solutions are aligned with and conform to the data landscape, big data architecture guidelines, and roadmap.

KEY RESPONSIBILITIES AND RESULTS
1. Design and implement data management standards such as Catalog Management, Data Quality, Data Classification, and Data Retention.
2. Drive BAU processes, testing, and tooling to improve data security, privacy, and protection.
3. Identify, design, and implement internal process improvements: automating manual processes, and controlling and optimizing data technology service delivery.
4. Implement and support the Data Management Technology solution throughout its lifecycle: user onboarding, upgrades, fixes, access management, etc.

QUALIFICATIONS / EXPERIENCE / KNOWLEDGE REQUIRED
Education and Qualifications: Diploma in Data Analytics, Data Engineering, IT, Computer Science, Software Engineering, or equivalent.
Work Experience: exposure to Data Management and Big Data concepts; knowledge of and experience with Data Management, Data Integration, and Data Quality products.
Technical Skills: Informatica CDGC, Collibra, Alation, Informatica Data Quality, Data Privacy Management, Azure Databricks.
Posted 1 month ago
3.0 - 6.0 years
0 Lacs
Noida
Work from Office
Role & responsibilities
• Bachelor's degree in Computer Science, Engineering, or a related field.
• Proven experience as a Data Engineer, with a minimum of 2 years of hands-on experience working with Azure Data Factory.
• Strong proficiency in SQL and experience with relational databases (e.g., SQL Server, Azure SQL Database).
• Solid understanding of data modeling concepts and ETL principles.
• Experience with cloud-based data technologies, specifically Microsoft Azure (Azure Data Lake Storage, Azure SQL Data Warehouse, etc.).
• Familiarity with data orchestration and workflow scheduling tools (e.g., Azure Data Factory).
• Knowledge of programming languages such as Python is a plus.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
• Azure certifications (e.g., Azure Data Engineer, Azure Developer) are desirable but not required.
Posted 1 month ago
6.0 - 11.0 years
30 - 35 Lacs
Hyderabad, Delhi / NCR
Hybrid
Support enhancements to the MDM and Performance platform. Track system performance, troubleshoot issues, and resolve production issues.

Required candidate profile: 5+ years in Python and advanced SQL, including profiling and refactoring. Experience with REST APIs and hands-on AWS Glue, EMR, etc. Experience with Markit EDM, Semarchy, or MDM will be a plus.
Posted 1 month ago
4.0 - 8.0 years
8 - 18 Lacs
Hyderabad
Hybrid
Azure Data Engineer | Hyderabad (Onsite)
Experience: 4 to 8 years
Job Type: Full Time
Timings: 9 AM to 6 PM IST

Roles & Responsibilities:
• Design and develop scalable, multi-terabyte data models and data marts.
• Build and maintain cloud-based analytics solutions using Azure services.
• Create and manage data pipelines and implement streaming ingestion methods.
• Analyze complex data sets and derive actionable insights.
• Develop solutions leveraging Azure Databricks, Synapse, SQL, and Data Lake.
• Ensure robust cloud infrastructure using Azure platform components.
• Follow DevOps processes, including CI/CD and Infrastructure as Code (IaC).
• Apply strong data warehouse modeling principles in solution architecture.
• Collaborate with stakeholders to understand analytics requirements.
• Work in agile teams to deliver projects efficiently and on time.
• Troubleshoot, debug, and optimize data engineering solutions.

Preferred Skills:
• Strong knowledge of the Azure ecosystem and data warehouse concepts.
• Scripting with Python/PowerShell; familiarity with Power BI.
• Added edge: KQL and LLM exposure.
• Quick learner, agile contributor, and problem-solver.

Apply now / share CVs: sirisha.nethi@quadranttechnologies.com | ajay.pesaru@quadranttechnologies.com
Posted 1 month ago
6.0 - 11.0 years
20 - 32 Lacs
Pune, Gurugram
Hybrid
Key Responsibilities:
• Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
• Build and maintain data integration workflows from various data sources to Snowflake.
• Write efficient and optimized SQL queries for data extraction and transformation.
• Work with stakeholders to understand business requirements and translate them into technical solutions.
• Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
• Maintain and enforce data quality, governance, and documentation standards.
• Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
• Strong experience with Azure cloud platform services.
• Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
• Proficiency in SQL for data analysis and transformation.
• Hands-on experience with Snowflake and SnowSQL for data warehousing.
• Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
• Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
• Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
• Familiarity with Python or PySpark for custom data transformations.
• Understanding of CI/CD pipelines and DevOps for data workflows.
• Exposure to data governance, metadata management, or data catalog tools.
• Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.
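An illustrative Python sketch of the Snowflake side of such a pipeline: merging a staged batch into a target table via the Snowflake connector. Account, credentials, and table names are placeholders; in practice the transformation would often live in a DBT model instead:

```python
# Sketch: merge a staged batch into a target table in Snowflake.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",    # placeholder account locator
    user="ETL_USER",      # placeholder
    password="***",       # placeholder; prefer key-pair auth or a secret store
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)

MERGE_SQL = """
MERGE INTO ANALYTICS.CORE.ORDERS t
USING ANALYTICS.STAGING.ORDERS_BATCH s
  ON t.ORDER_ID = s.ORDER_ID
WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS
WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS) VALUES (s.ORDER_ID, s.STATUS)
"""

with conn.cursor() as cur:
    cur.execute(MERGE_SQL)
conn.close()
```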
Posted 1 month ago
6.0 - 11.0 years
20 - 32 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Key Responsibilities:
• Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
• Build and maintain data integration workflows from various data sources to Snowflake.
• Write efficient and optimized SQL queries for data extraction and transformation.
• Work with stakeholders to understand business requirements and translate them into technical solutions.
• Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
• Maintain and enforce data quality, governance, and documentation standards.
• Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
• Strong experience with Azure cloud platform services.
• Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
• Proficiency in SQL for data analysis and transformation.
• Hands-on experience with Snowflake and SnowSQL for data warehousing.
• Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
• Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
• Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
• Familiarity with Python or PySpark for custom data transformations.
• Understanding of CI/CD pipelines and DevOps for data workflows.
• Exposure to data governance, metadata management, or data catalog tools.
• Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.
Posted 1 month ago
7.0 - 12.0 years
20 - 35 Lacs
Noida, Chennai
Hybrid
Deployment, configuration, and maintenance of Databricks clusters and workspaces. Security and access control. Automate administrative tasks using tools like Python, PowerShell, and Terraform. Integrations with Azure Data Lake and Key Vault; implement CI/CD pipelines.

Required candidate profile: Azure, AWS, or GCP (Azure experience preferred). Strong skills in Python, PySpark, PowerShell, and SQL. Experience with Terraform, ETL processes, data pipelines, and big data technologies. Security and compliance.
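A minimal sketch of the admin automation this role describes: listing workspace clusters through the Databricks REST API. The workspace URL and token are placeholders; Terraform would handle the declarative provisioning side:

```python
# Sketch: enumerate Databricks clusters and print their state.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace
TOKEN = "dapi..."                                             # placeholder PAT

resp = requests.get(
    f"{HOST}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

for cluster in resp.json().get("clusters", []):
    print(cluster["cluster_id"], cluster["cluster_name"], cluster["state"])
```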
Posted 1 month ago
8.0 - 13.0 years
15 - 25 Lacs
Pune, Bengaluru, Delhi / NCR
Hybrid
Skills: Azure Data Factory, Azure Databricks, Azure Data Lake, lead experience, PySpark, SQL
Experience: 8-13 yrs
Location: PAN India
Posted 1 month ago
1.0 - 4.0 years
0 - 2 Lacs
Nagpur
Work from Office
BRIEF OF JOB PROFILE & ROLE AND RESPONSIBILITIES
Your duties as Junior Data Scientist are as follows:
• Design, develop, test, deploy, and maintain BI development projects, including but not limited to report and dashboard development, statistical modelling, machine learning, and artificial intelligence.
• Deliver client presentations, create data visualizations, and exhibit data storytelling skills.
• Perform complex development tasks and use the technologies deemed appropriate for the projects by the Employer. Technologies will include SQL, Power BI, Tableau, Python, R, TensorFlow, and cloud-based data intelligence and analytics services and tools such as SQL Server Analysis Services (SSAS), Databricks, Azure Data Factory, and AWS Glue.
• Work directly or indirectly with clients to understand and define requirements; work with other application development teams on various projects.
• Work with the team of developers. Take direction well from the Team Lead/Architect and work together with him/her to contribute to all development, production, and support activities.
• Participate and work using Agile methodologies like Scrum and conduct the Scrum events appropriately.
• Other duties as may arise from time to time and as may be assigned to the employee.
• Ensure and implement the best possible performance, quality, and responsiveness of applications.
• Work together with the team in development and testing efforts.
• Identify bottlenecks and bugs, and devise solutions to mitigate and address these issues.
• Exhibit a flexible and proactive working style with strong personal ownership of problem resolution.
• Work to ensure deadlines are met.
Posted 1 month ago
6.0 - 11.0 years
9 - 19 Lacs
Pune, Chennai, Bengaluru
Hybrid
Role & responsibilities
Primary skills: Azure Synapse, Azure Data. Looking for an Architect and Developers.
Nice to have: Amazon Redshift; Big Data - Hadoop; Big Data - Hive; Scala; Apache Spark; Snowflake; Databricks; Google Cloud Platform; SAP; Apache Cassandra; Flume; Big Data - HBase; Apache Impala; Apache NiFi; Apache Pig; Sqoop; PySpark; Python for Data Science; Dremio; Apache Hadoop
Posted 1 month ago
5.0 - 9.0 years
14 - 17 Lacs
Pune
Work from Office
Diacto is seeking an experienced and highly skilled Data Architect to lead the design and development of scalable and efficient data solutions. The ideal candidate will have strong expertise in Azure Databricks, Snowflake (with DBT, GitHub, Airflow), and Google BigQuery. This is a full-time, on-site role based out of our Baner, Pune office.

Qualifications:
• B.E./B.Tech in Computer Science, IT, or a related discipline
• MCS/MCA or equivalent preferred

Key Responsibilities:
• Design, build, and optimize robust data architecture frameworks for large-scale enterprise solutions
• Architect and manage cloud-based data platforms using Azure Databricks, Snowflake, and BigQuery
• Define and implement best practices for data modeling, integration, governance, and security
• Collaborate with engineering and analytics teams to ensure data solutions meet business needs
• Lead development using tools such as DBT, Airflow, and GitHub for orchestration and version control
• Troubleshoot data issues and ensure system performance, reliability, and scalability
• Guide and mentor junior data engineers and developers
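As an illustration of the DBT/Airflow orchestration named above, a minimal Airflow DAG that runs `dbt run` followed by `dbt test` on a nightly schedule. It assumes Airflow 2.x; the project path and schedule are hypothetical:

```python
# Sketch: nightly dbt build-and-test DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # nightly at 02:00
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run",  # placeholder project path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test",
    )
    dbt_run >> dbt_test  # only test after a successful build
```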
Posted 1 month ago
3.0 - 8.0 years
8 - 18 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Hybrid
Job Description: We are seeking a skilled Azure Data Engineer with hands-on experience in Azure Data Factory, Azure Databricks, PySpark, and SQL. The ideal candidate will be responsible for designing, developing, and maintaining scalable data pipelines and ETL processes on the Azure cloud platform.

Key Responsibilities:
• Develop and manage data workflows using Azure Data Factory.
• Build and optimize big data solutions using Azure Databricks and PySpark.
• Write efficient SQL queries for data transformation and analysis.
• Collaborate with data architects and business stakeholders to deliver data-driven solutions.

Requirements:
• Strong experience with Azure Data Factory and Azure Databricks.
• Proficiency in PySpark for big data processing.
• Solid understanding of SQL and relational databases.
• Experience in building scalable and secure data pipelines on Azure.
Posted 1 month ago
4.0 - 7.0 years
6 - 9 Lacs
Hyderabad, Bengaluru
Hybrid
Job Summary
We are seeking a skilled Azure Data Engineer with 4 years of overall experience, including at least 2 years of hands-on experience with Azure Databricks (must). The ideal candidate will have strong expertise in building and maintaining scalable data pipelines and working across cloud-based data platforms.

Key Responsibilities:
• Design, develop, and optimize large-scale data pipelines using Azure Data Factory, Azure Databricks, and Azure Synapse.
• Implement data lake solutions and work with structured and unstructured datasets in Azure Data Lake Storage (ADLS).
• Collaborate with data scientists, analysts, and engineering teams to design and deliver end-to-end data solutions.
• Develop ETL/ELT processes and integrate data from multiple sources.
• Monitor, debug, and optimize workflows for performance and cost-efficiency.
• Ensure data governance, quality, and security best practices are maintained.

Must-Have Skills:
• 4+ years of total experience in data engineering.
• 2+ years of experience with Azure Databricks (PySpark, notebooks, Delta Lake).
• Strong experience with Azure Data Factory, Azure SQL, and ADLS.
• Proficient in writing SQL queries and Python/Scala scripting.
• Understanding of CI/CD pipelines and version control systems (e.g., Git).
• Solid grasp of data modeling and warehousing concepts.

Skills: Azure Synapse, data modeling, data engineering, Azure, Azure Databricks, Azure Data Lake Storage (ADLS), CI/CD, ETL, ELT, data warehousing, SQL, Scala, Git, Azure Data Factory, Python
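A short sketch of a routine Delta Lake task for such a role: an upsert (MERGE) into a Delta table. Paths and keys are hypothetical, and it assumes a Databricks or delta-spark runtime where `delta.tables` is available:

```python
# Sketch: merge a landed batch into a curated Delta table.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

updates = spark.read.parquet("/mnt/landing/customers_batch")   # placeholder path
target = DeltaTable.forPath(spark, "/mnt/curated/customers")   # placeholder path

(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")  # hypothetical key
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```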
Posted 1 month ago
7.0 - 10.0 years
9 - 12 Lacs
Mumbai
Work from Office
Experience: 7-10 years
Educational qualification: bachelor's degree or higher in a related field

Summary
Applies the principles of software engineering to design, develop, maintain, test, and evaluate computer software that provides business capabilities, solutions, and/or product suites. Provides systems life cycle management (e.g., analysis, technical requirements, design, coding, testing, implementation of systems and applications software, etc.) to ensure delivery of technical solutions is on time and within budget. Researches and supports the integration of emerging technologies. Provides knowledge and support for applications development, integration, and maintenance. Develops program logic for new applications or analyzes and modifies logic in existing applications. Analyzes requirements, tests, and integrates application components. Ensures that system improvements are successfully implemented. May focus on web/internet applications specifically, using a variety of languages and platforms. Defines application complexity drivers, estimates development efforts, creates milestones and/or timelines, and tracks progress towards completion.

Application Development/Programming
Identifies areas for improvement and develops innovative enhancements using available software development tools, following the customer's design requirements.

System and Technology Integration
Interprets internal/external business challenges and recommends integration of the appropriate systems, applications, and technology to provide a fully functional solution to a business problem.

Development and support of the activities outlined below. (Note: other items may arise that are not directly referenced in this scope, including technology updates, technology expansion, DevOps pipeline changes, information security, and technical debt compliance.)

New Development
Development of new features/functionality driven by PI (Program Increment). This will include documenting features and stories, obtaining approvals from Business and UPS IT Product Owners, story analysis, design of the required solution, review with UPS SME(s), coding, testing, non-functional requirements (reporting, production capacity, performance, security), and migration/deployment.

Scope at a high level
The scope of this project includes the following activities, but is not limited to:
• Develop new integration pipelines with SC360: Databricks, Azure Functions, Azure Data Factory, Azure DevOps, Cosmos DB, Oracle, Azure SQL, SSIS packages.
• Work in alignment with business teams to support development effort for all SC360 data-related PI items.
• Develop fixes for defects and issues identified in the production environment.
• Build POCs as needed to supplement the SC360 platform.
• Develop and implement architectural changes as needed in the SC360 platform to increase efficiency, reduce cost, and monitor the platform.
• Provide production support assistance as needed.
• NFRs include, but are not limited to, the ability to build according to UPS coding standards, including security compliance.

Required Skills

General skills:
• Strong communication skills (both oral and written); will need to work closely with UPS IT, Business Product Owners, and potentially directly with UPS customers.
• Agile life-cycle management.
• Vulnerability/threat analysis.
• Testing.
• Deployments across environments and segregation of duties.

Technical skills:
• Experience with Azure Databricks, SQL, and ETL SSIS packages (mandatory, very critical).
• Azure Data Factory, Function Apps, DevOps (a must).
• Experience with Azure and other cloud technologies.
• Databases: Oracle, SQL Server, and Cosmos DB experience needed.
• Azure services (Key Vault, App Config, Blob Storage, Redis Cache, Service Bus, Event Grid, ADLS, App Insights, etc.).
• Knowledge of STRIIM (nice to have).
• Microservices experience preferred.
• Experience with Angular and .NET Core (not critical).
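A hedged sketch of one integration this scope lists: upserting a processed record into Cosmos DB from Python via the azure-cosmos SDK. The endpoint, key, database, and container names are hypothetical placeholders:

```python
# Sketch: write a processed record into a Cosmos DB container.
from azure.cosmos import CosmosClient

client = CosmosClient(
    url="https://example-account.documents.azure.com:443/",  # placeholder endpoint
    credential="***",  # placeholder key; prefer Key Vault or managed identity
)
container = (client
             .get_database_client("sc360")        # hypothetical database name
             .get_container_client("shipments"))  # hypothetical container name

container.upsert_item({
    "id": "shp-001",          # hypothetical document id
    "status": "IN_TRANSIT",
    "updatedBy": "etl-pipeline",
})
```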
Posted 1 month ago
8.0 - 12.0 years
25 - 30 Lacs
Chennai
Hybrid
Job Title: Senior Data Developer (Azure ADF and Databricks)
Experience Range: 8-12 years
Location: Chennai, Hybrid
Employment Type: Full-Time

About the role
We are seeking an experienced Senior Data Developer to join our data engineering team, responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. The Senior Data Developer will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure. This role also involves coaching junior developers, conducting code reviews, and driving strategic improvements in data architecture and design patterns.

Key Responsibilities

Data Solution Design and Development:
• Design and develop scalable, high-performance data pipelines using Azure Data Factory (ADF).
• Implement data transformations and processing using Azure Databricks.
• Develop and maintain NoSQL data models and queries in Cosmos DB.
• Optimize data pipelines for performance, scalability, and cost efficiency.

Data Integration and Architecture:
• Integrate structured and unstructured data from diverse data sources.
• Collaborate with data architects to design end-to-end data flows and system integrations.
• Implement data security, governance, and compliance standards.

Performance Tuning and Optimization:
• Monitor and tune data pipelines and processing jobs for performance and cost efficiency.
• Optimize data storage and retrieval strategies for Azure SQL and Cosmos DB.

Collaboration and Mentoring:
• Collaborate with cross-functional teams including data testers, architects, and business analysts.
• Conduct code reviews and provide constructive feedback to improve code quality.
• Mentor junior developers, fostering best practices in data engineering and cloud development.

Primary Skills
• Data Engineering: Azure Data Factory (ADF), Azure Databricks
• Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB)
• Data Modeling: NoSQL data modeling, data warehousing concepts
• Performance Optimization: data pipeline performance tuning and cost optimization
• Programming Languages: Python, SQL, PySpark

Secondary Skills
• DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation
• Security and Compliance: implementing data security and governance standards
• Agile Methodologies: experience in Agile/Scrum environments
• Leadership and Mentoring: strong communication and coaching skills for team collaboration

Soft Skills
• Strong problem-solving abilities and attention to detail.
• Excellent communication skills, both verbal and written.
• Effective time management and organizational capabilities.
• Ability to work independently and within a collaborative team environment.
• Strong interpersonal skills to engage with cross-functional teams.

Educational Qualifications
• Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
• Relevant certifications in Azure and Data Engineering, such as: Microsoft Certified: Azure Data Engineer Associate; Microsoft Certified: Azure Solutions Architect Expert; Databricks Certified Data Engineer Associate or Professional.

About the Team
As a Senior Data Developer, you will be working with a dynamic, cross-functional team that includes developers, product managers, and other quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications.
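To illustrate the pipeline performance tuning this role emphasizes, a small PySpark sketch: repartitioning on the join key, broadcasting a small dimension table, and checking the physical plan. Paths and column names are hypothetical:

```python
# Sketch: a tuning pass on a skew-prone join before writing curated output.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

orders = spark.read.parquet("/mnt/raw/orders")        # placeholder path
customers = spark.read.parquet("/mnt/raw/customers")  # placeholder path

joined = (orders.repartition(200, "customer_id")          # align partitions with join key
          .join(customers.hint("broadcast"), "customer_id"))  # small dim -> broadcast join

joined.explain()  # verify the broadcast join actually appears in the plan

(joined.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("/mnt/curated/orders_enriched"))
```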
Posted 1 month ago
8.0 - 12.0 years
25 - 30 Lacs
Mumbai
Work from Office
Job Title: Senior Data Developer (Azure ADF and Databricks)
Experience Range: 8-12 years
Location: Chennai, Hybrid
Employment Type: Full-Time

About the role
We are seeking an experienced Senior Data Developer to join our data engineering team, responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. The Senior Data Developer will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure. This role also involves coaching junior developers, conducting code reviews, and driving strategic improvements in data architecture and design patterns.

Key Responsibilities

Data Solution Design and Development:
• Design and develop scalable, high-performance data pipelines using Azure Data Factory (ADF).
• Implement data transformations and processing using Azure Databricks.
• Develop and maintain NoSQL data models and queries in Cosmos DB.
• Optimize data pipelines for performance, scalability, and cost efficiency.

Data Integration and Architecture:
• Integrate structured and unstructured data from diverse data sources.
• Collaborate with data architects to design end-to-end data flows and system integrations.
• Implement data security, governance, and compliance standards.

Performance Tuning and Optimization:
• Monitor and tune data pipelines and processing jobs for performance and cost efficiency.
• Optimize data storage and retrieval strategies for Azure SQL and Cosmos DB.

Collaboration and Mentoring:
• Collaborate with cross-functional teams including data testers, architects, and business analysts.
• Conduct code reviews and provide constructive feedback to improve code quality.
• Mentor junior developers, fostering best practices in data engineering and cloud development.

Primary Skills
• Data Engineering: Azure Data Factory (ADF), Azure Databricks
• Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB)
• Data Modeling: NoSQL data modeling, data warehousing concepts
• Performance Optimization: data pipeline performance tuning and cost optimization
• Programming Languages: Python, SQL, PySpark

Secondary Skills
• DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation
• Security and Compliance: implementing data security and governance standards
• Agile Methodologies: experience in Agile/Scrum environments
• Leadership and Mentoring: strong communication and coaching skills for team collaboration

Soft Skills
• Strong problem-solving abilities and attention to detail.
• Excellent communication skills, both verbal and written.
• Effective time management and organizational capabilities.
• Ability to work independently and within a collaborative team environment.
• Strong interpersonal skills to engage with cross-functional teams.

Educational Qualifications
• Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
• Relevant certifications in Azure and Data Engineering, such as: Microsoft Certified: Azure Data Engineer Associate; Microsoft Certified: Azure Solutions Architect Expert; Databricks Certified Data Engineer Associate or Professional.

About the Team
As a Senior Data Developer, you will be working with a dynamic, cross-functional team that includes developers, product managers, and other quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications.
Posted 1 month ago