Home
Jobs
Companies
Resume

127 Data Factory Jobs

Filter
Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

10.0 - 15.0 years

12 - 17 Lacs

Bengaluru

Work from Office

Naukri logo

Responsibilities:
- Create the Solution Outline and Macro Design describing end-to-end product implementation on Data Platforms, covering system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles
- Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation
- Contribute to reusable components / assets / accelerators to support capability development
- Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies
- Participate in customer PoCs to deliver the outcomes

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience designing data products that provide descriptive, prescriptive, and predictive analytics to end users or other systems
- 10-15 years of experience in data engineering and architecting data platforms
- 5-8 years of experience architecting and implementing data platforms on the Azure cloud platform
- 5-8 years of experience architecting and implementing data platforms on-prem (Hadoop or a DW appliance)
- Experience on the Azure cloud is mandatory: ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake, Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow
- Experience with the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) on Cloudera or Hortonworks

Preferred technical and professional experience:
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.
- Experience delivering both business decision support systems (reporting, analytics) and data science domains / use cases

Posted 10 hours ago

Apply

6.0 - 7.0 years

8 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Total experience 6-7 years (relevant 4-5 years)
- Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob
- Ability to use programming languages like Java, Python, Scala, etc. to build pipelines that extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
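As an illustration of the extract-transform-load pattern this role centres on, here is a minimal sketch in plain Python (all names are hypothetical; a production pipeline for a role like this would typically use PySpark or Azure Data Factory rather than in-memory lists):

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into a list of row dicts (the 'extract' step)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Cleanse and reshape: drop rows missing an id, normalise city names,
    and cast salary to int (the 'transform' step)."""
    out = []
    for r in rows:
        if not r.get("id"):
            continue  # quarantine-worthy row: no primary key
        out.append({
            "id": r["id"],
            "city": r["city"].strip().title(),
            "salary_inr": int(r["salary"]),
        })
    return out

def load(rows: list[dict], sink: list) -> int:
    """Append to an in-memory sink standing in for a warehouse table."""
    sink.extend(rows)
    return len(rows)

raw = "id,city,salary\n1, bengaluru ,1200000\n,chennai,900000\n2,Pune,1500000\n"
sink: list = []
loaded = load(transform(extract(raw)), sink)
```

The same three-stage shape carries over directly to a PySpark job, where each stage becomes a DataFrame read, a chain of transformations, and a write.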

Posted 10 hours ago

Apply

7.0 - 11.0 years

11 - 15 Lacs

Kolkata

Work from Office

Naukri logo

Skill required: Tech for Operations - Tech Solution Architecture
Designation: Solution Architecture Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do: In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover. Join our dynamic Service Supply Chain (SSC) team and be at the forefront of helping world-class organizations unlock their full potential. Imagine a career where your innovative work makes a real impact, and every day brings new challenges and opportunities for growth. We're on the lookout for passionate, talented individuals ready to make a difference. If you're eager to shape the future and drive success, this is your chance: join us now and let's build something extraordinary together!

The Technical Solution Architect I is responsible for evaluating an organization's business needs and determining how IT can support those needs leveraging software like Azure and Salesforce. Aligning IT strategies with business goals has become paramount, and a solutions architect can help determine, develop, and improve technical solutions in support of business goals. The Technical Solution Architect I also bridges communication between IT and business operations to ensure everyone is aligned in developing and implementing technical solutions for business problems. The process requires regular feedback, adjustments, and problem solving in order to properly design and implement potential solutions. To be successful as a Technical Solution Architect I, you should have excellent technical, analytical, and project management skills.

What are we looking for:
- Minimum of 5 years of IT experience
- Minimum of 1 year of experience in solution architecture
- Minimum of 1 year of enterprise-scale project delivery experience
- Microsoft Azure Cloud Services
- Microsoft Azure Data Factory
- Microsoft Azure Databricks
- Microsoft Azure DevOps
- Written and verbal communication
- Ability to establish strong client relationships
- Problem-solving skills
- Strong analytical skills
- Expert knowledge of Azure Cloud Services
- Experience with Azure data platforms (Logic Apps, Service Bus, Databricks, Data Factory, Azure Integration Services)
- CI/CD and version-control experience using Azure DevOps
- Python programming
- Knowledge of both traditional and modern data architecture and processing concepts, including relational databases, data warehousing, and business analytics (e.g., NoSQL, SQL Server, Oracle, Hadoop, Spark, Knime)
- Good understanding of the security processes, best practices, standards, and issues involved in multi-tier cloud or hybrid applications
- Proficiency in both high-level and low-level design to build an architecture using customization or configuration on Salesforce Service Cloud, Field Service Lightning, Apex, Visualforce, Lightning, and Community
- Expertise in designing and building real-time/batch integrations between Salesforce and other systems
- Design of the Apex and Lightning framework, including the Lightning pattern, error-logging framework, etc.

Roles and Responsibilities:
- Meet with clients to understand their needs (lead architect assessment meetings) and determine gaps between those needs and technical functionality
- Communicate with key stakeholders across different stages of the Software Development Life Cycle
- Work on creating the high-level design and lead architectural decisions
- Interact with clients to create end-to-end specifications for Azure and Salesforce cloud solutions
- Provide clarification and answer any questions regarding the solution architecture
- Lead the development of custom enterprise solutions
- Responsible for application architecture, ensuring high performance, scalability, and availability for those applications
- Responsible for the overall data architecture, modeling, and related standards enforced throughout the enterprise ecosystem, including data, master data, metadata, processes, governance, and change control
- Unify the data architecture used within all applications and identify appropriate systems of record, reference, and management
- Share engagement experience with internal audiences and enrich collective IP
- Conduct architecture workshops and other enablement sessions

Qualification: Any Graduation

Posted 4 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

- Working with event-based / streaming technologies to ingest and process data
- Working with other members of the project team to support delivery of additional project components (API interfaces, Search)
- Evaluating the performance and applicability of multiple tools against customer requirements
- Working within an Agile delivery / DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints
- Strong knowledge of data management principles
- Experience in building ETL / data warehouse transformation processes
- Direct experience of building data pipelines using Databricks
- Experience using geospatial frameworks on Apache Spark and the associated design and development patterns
- Experience working in a DevOps environment with tools such as Terraform
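The event-based ingestion this listing describes usually boils down to folding micro-batches of events into a running aggregate. A minimal plain-Python sketch of that idea (hypothetical event shape; Spark Structured Streaming would express the same thing as a `groupBy().count()` over an event stream):

```python
from collections import defaultdict

def process_microbatch(events, counts):
    """Fold one micro-batch of events into running per-type counts,
    mimicking a streaming aggregation over an event feed."""
    for e in events:
        counts[e["type"]] += 1
    return counts

counts = defaultdict(int)
# two micro-batches arriving over time
process_microbatch([{"type": "click"}, {"type": "view"}], counts)
process_microbatch([{"type": "click"}], counts)
```

The state dict here stands in for the checkpointed state store a real streaming engine maintains between batches.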

Posted 4 days ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Bengaluru, Malaysia

Work from Office

Naukri logo

Core Competences Required and Desired Attributes:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Proficiency in Azure Data Factory, Azure Databricks and Unity Catalog, Azure SQL Database, and other Azure data services
- Strong programming skills in SQL, Python, and PySpark
- Experience in the Asset Management domain would be preferable
- Strong proficiency in data analysis and data modelling, with the ability to extract insights from complex data sets
- Hands-on experience in Power BI, including creating custom visuals, DAX expressions, and data modelling
- Familiarity with Azure Analysis Services, data modelling techniques, and optimization
- Experience with data quality and data governance frameworks, with the ability to debug, fine-tune, and optimise large-scale data processing jobs
- Strong analytical and problem-solving skills, with a keen eye for detail
- Excellent communication and interpersonal skills, with the ability to work collaboratively in a team environment
- Proactive and self-motivated, with the ability to manage multiple tasks and deliver high-quality results within deadlines

Posted 6 days ago

Apply

4.0 - 8.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation
- Stakeholder collaboration and issue resolution: collaborating with key internal and external stakeholders to understand problems and issues with the product and its features, and solving those issues per the defined SLAs
- Continuous learning and technology integration: being eager to learn new technologies and implementing them in feature development

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Proficient in .NET Core with React or Angular
- Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards
- Azure Functions, Azure Service Bus, Azure Storage Account (mandatory); Azure Durable Functions; Azure Data Factory; Azure SQL or Cosmos DB (database) required
- Ability to write calculation rules and configurable consolidation rules

Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams
- At least 2 end-to-end implementation experiences
- Ability to write and update the rules of historical overrides

Posted 1 week ago

Apply

4.0 - 9.0 years

2 - 6 Lacs

Bengaluru

Work from Office

Naukri logo

Roles and Responsibilities:
- 4+ years of experience as a data developer using Python
- Knowledge of Spark and PySpark preferable but not mandatory
- Azure cloud experience preferred (alternate cloud experience is fine); experience in the Azure platform, including Azure Data Lake, Databricks, and Data Factory, preferred
- Working knowledge of different file formats such as JSON, Parquet, CSV, etc.
- Familiarity with data encryption and data masking
- Database experience in SQL Server preferable; experience in NoSQL databases like MongoDB preferred
- Team player; reliable, self-motivated, and self-disciplined
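The data-masking skill mentioned here is commonly implemented by replacing sensitive fields with a stable digest before data lands in a shared zone. A minimal sketch in plain Python (the field names and record are hypothetical; real pipelines often do this in PySpark over whole columns):

```python
import hashlib
import json

def mask_pii(record: dict, fields=("email", "phone")) -> dict:
    """Return a copy of the record with the named fields replaced by a
    truncated SHA-256 digest: records stay joinable on the masked value,
    but the raw value is not recoverable from the output."""
    masked = dict(record)
    for f in fields:
        if f in masked and masked[f] is not None:
            masked[f] = hashlib.sha256(str(masked[f]).encode()).hexdigest()[:12]
    return masked

raw = {"id": 7, "email": "a@b.com", "phone": "9999999999", "city": "Bengaluru"}
masked = mask_pii(raw)
line = json.dumps(masked)  # e.g. one line of a JSON-lines landing file
```

Because the digest is deterministic, masking the same source value twice yields the same token, which is what keeps joins across masked datasets working.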

Posted 1 week ago

Apply

4.0 - 6.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Naukri logo

- 4-6 years of experience building resilient, highly available, and scalable cloud-native platforms and solutions
- Extensive experience with the .NET framework and related technologies: C#, Web API
- Experience using a broad range of Azure services, mainly: Web Apps, WebJobs, Storage, Azure Key Vault, Blueprint Assignment, Azure Policy, Azure Service Bus
- Expertise in the creation and usage of ARM templates is required
- Knowledge of deploying infrastructure as code using tools such as Terraform is required
- Advanced knowledge of the IaaS and PaaS services of Azure
- Knowledge of monitoring tools (Application Insights) is required
- Comprehensive understanding of the Azure platform and services
- Knowledge of IAM (Identity and Access Management) is needed
- App Insights, Azure SQL DB, Cosmos DB, Functions, Azure Bot Service, ExpressRoute, Azure VM, Azure VNet, Azure Active Directory, Azure AD B2C, Azure analytics services (Azure Analysis Services, SQL Data Warehouse, Data Factory, Databricks)
- Develop and maintain an Azure-based cloud solution, with an emphasis on best-practice cloud security
- Automating tasks using Azure DevOps and CI/CD pipelines
- Expertise in one of the languages such as PowerShell, Python, .NET, or C# is preferable
- Strong knowledge of DevOps and tools in Azure
- Infrastructure and application monitoring across production and non-production platforms
- Experience with DevOps orchestration / configuration / continuous-integration management technologies
- Knowledge of hybrid public cloud design concepts
- Good understanding of high-availability and disaster-recovery concepts for infrastructure
- Problem solving: ability to analyze and resolve complex infrastructure resource and application deployment issues
- Excellent communication skills, understanding of customer needs, negotiation skills, and vendor management skills

Education (degree): Bachelor's degree in Computer Science, Business Information Systems, or relevant experience and accomplishments

Technical Skills:
1. Cloud provisioning and management: Azure
2. Programming languages: C#, .NET Core, PowerShell
3. Web APIs

Posted 1 week ago

Apply

4.0 years

0 Lacs

India

On-site

Linkedin logo

Mandatory Skills: Azure Cloud Technologies, Azure Data Factory, Azure Databricks (advanced knowledge), PySpark, CI/CD pipelines (Jenkins, GitLab CI/CD, or Azure DevOps), Data Ingestion, SQL

Seeking a skilled Data Engineer with expertise in Azure cloud technologies, data pipelines, and big data processing. The ideal candidate will be responsible for designing, developing, and optimizing scalable data solutions.

Responsibilities:
- Azure Databricks and Azure Data Factory expertise: demonstrate proficiency in designing, implementing, and optimizing data workflows using Azure Databricks and Azure Data Factory; provide expertise in configuring and managing data pipelines within the Azure cloud environment
- PySpark proficiency: possess a strong command of PySpark for data processing and analysis; develop and optimize PySpark code to ensure efficient and scalable data transformations
- Big data and CI/CD experience: troubleshoot and optimize data processing tasks on large datasets; design and implement automated CI/CD pipelines for data workflows, using tools like Jenkins, GitLab CI/CD, or Azure DevOps to automate the building, testing, and deployment of data pipelines
- Data pipeline development and deployment: design, implement, and maintain end-to-end data pipelines for various data sources and destinations, including unit tests for individual components, integration tests to ensure that different components work together correctly, and end-to-end tests to verify the entire pipeline's functionality; familiarity with GitHub repositories for deployment of code; ensure data quality, integrity, and reliability throughout the entire data pipeline
- Extraction, ingestion, and consumption frameworks: develop frameworks for efficient data extraction, ingestion, and consumption; implement best practices for data integration and ensure seamless data flow across the organization
- Collaboration and communication: collaborate with cross-functional teams to understand data requirements and deliver scalable solutions; communicate effectively with stakeholders to gather and clarify data-related requirements

Requirements:
- Bachelor's or master's degree in Computer Science, Data Engineering, or a related field
- 4+ years of relevant hands-on experience in data engineering with Azure cloud services and advanced Databricks
- Strong analytical and problem-solving skills in handling large-scale data pipelines
- Experience in big data processing and working with structured and unstructured datasets
- Expertise in designing and implementing data pipelines for ETL workflows
- Strong proficiency in writing optimized queries and working with relational databases
- Experience in developing data transformation scripts and managing big data processing using PySpark

Skills: SQL, Azure, Azure Databricks, PySpark, data ingestion, Azure cloud technologies, Azure Data Factory, CI/CD pipelines (Jenkins, GitLab CI/CD, or Azure DevOps)
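The unit-testable pipeline components and data-quality checks this listing asks for are typically written as pure functions that CI (Jenkins, GitLab CI/CD, or Azure DevOps) can exercise without a cluster. A minimal sketch, with hypothetical field names:

```python
def validate_batch(rows, required=("id", "amount")):
    """Data-quality gate for a pipeline stage: split a batch into rows that
    pass the checks and rows to route to a quarantine table, rather than
    failing the whole run. Being a pure function, it is trivially
    unit-testable in a CI pipeline."""
    good, bad = [], []
    for r in rows:
        # required fields present, and amount is non-negative
        if all(r.get(k) is not None for k in required) and r["amount"] >= 0:
            good.append(r)
        else:
            bad.append(r)
    return good, bad

good, bad = validate_batch([
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},
    {"id": 2, "amount": -3.0},
])
```

In a Databricks job the same gate would run against DataFrame partitions, with the quarantined rows written to a separate table for inspection.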

Posted 1 week ago

Apply

8.0 - 12.0 years

25 - 30 Lacs

Chennai

Work from Office

Naukri logo

Job Title: Manager Data Engineer - Azure
Location: Chennai (On-site)
Experience: 8-12 years
Employment Type: Full-Time

About the Role: We are seeking a highly skilled Senior Azure Data Solutions Architect to design and implement scalable, secure, and efficient data solutions supporting enterprise-wide analytics and business intelligence initiatives. You will lead the architecture of modern data platforms, drive cloud migration, and collaborate with cross-functional teams to deliver robust Azure-based solutions.

Key Responsibilities:
- Architect and implement end-to-end data solutions using Azure services (Data Factory, Databricks, Data Lake, Synapse, Cosmos DB)
- Design robust and scalable data models, including relational, dimensional, and NoSQL schemas
- Develop and optimize ETL/ELT pipelines and data lakes using Azure Data Factory, Databricks, and open formats such as Delta and Iceberg
- Integrate data governance, quality, and security best practices into all architecture designs
- Support analytics and machine learning initiatives through structured data pipelines and platforms
- Collaborate with data engineers, analysts, data scientists, and business stakeholders to align solutions with business needs
- Drive CI/CD integration with Databricks using Azure DevOps and tools like DBT
- Monitor system performance, troubleshoot issues, and optimize data infrastructure for efficiency and reliability
- Stay current with Azure platform advancements and recommend improvements

Required Skills & Experience:
- Extensive hands-on experience with Azure services: Data Factory, Databricks, Data Lake, Azure SQL, Cosmos DB, Synapse
- Expertise in data modeling and design (relational, dimensional, NoSQL)
- Proven experience with ETL/ELT processes, data lakes, and modern lakehouse architectures
- Proficiency in Python, SQL, Scala, and/or Java
- Strong knowledge of data governance, security, and compliance frameworks
- Experience with CI/CD, Azure DevOps, and infrastructure as code (Terraform or ARM templates)
- Familiarity with BI and analytics tools such as Power BI or Tableau
- Excellent communication, collaboration, and stakeholder management skills
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field

Preferred Qualifications:
- Experience in regulated industries (finance, healthcare, etc.)
- Familiarity with data cataloging, metadata management, and machine learning integration
- Leadership experience guiding teams and presenting architectural strategies to leadership

Why Join Us?
- Work on cutting-edge cloud data platforms in a collaborative, innovative environment
- Lead strategic data initiatives that impact enterprise-wide decision-making
- Competitive compensation and opportunities for professional growth

Posted 1 week ago

Apply

9.0 - 14.0 years

10 - 16 Lacs

Chennai

Work from Office

Naukri logo

Azure Databricks, Data Factory, PySpark, SQL. If you are interested in this position, attach your CV and send it to muniswamyinfyjob@gmail.com

Posted 1 week ago

Apply

6.0 - 9.0 years

4 - 8 Lacs

Pune

Work from Office

Naukri logo

Your Role: As a senior software engineer with Capgemini, you will have 6+ years of experience in Azure technology with a strong project track record. In this role you will play a key role through:
- Strong customer orientation, decision making, problem solving, communication, and presentation skills
- Very good judgement skills and the ability to shape compelling solutions and solve unstructured problems with assumptions
- Very good collaboration skills and the ability to interact with multi-cultural and multi-functional teams spread across geographies
- Strong executive presence and entrepreneurial spirit
- Superb leadership and team-building skills, with the ability to build consensus and achieve goals through collaboration rather than direct line authority

Your Profile:
- Experience with Azure Databricks and Data Factory
- Experience with Azure data components such as Azure SQL Database, Azure SQL Warehouse, and Synapse Analytics
- Experience in Python/PySpark/Scala/Hive programming
- Experience with Azure Databricks (ADB) is a must-have
- Experience with building CI/CD pipelines in data environments

Posted 1 week ago

Apply

7.0 - 12.0 years

10 - 15 Lacs

Bangalore Rural

Work from Office

Naukri logo

A candidate with an understanding of distributed computing and experience with SQL, Spark, and ETL. Experience using databases such as MySQL, PostgreSQL, and Oracle. AWS- or Data Factory-based ETL on Azure is required.

Posted 1 week ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Naukri logo

Contractual hiring. Hiring manager profile: linkedin.com/in/yashsharma1608. Payroll of: https://www.nyxtech.in/

1. Azure Data Engineer with Fabric
Role: Lead Data Engineer (payroll; client - Brillio)
Experience: 6 to 8 years
Location: Bangalore, Hyderabad, Pune, Chennai, Gurgaon (Hyderabad preferred)
Notice: 15 days / 30 days
Budget: 15 LPA
Azure Fabric experience mandatory
Skills: Azure OneLake, data pipelines, Apache Spark, ETL, Data Factory, Azure Fabric, SQL, Python/Scala

Key Responsibilities:
- Data pipeline development: lead the design, development, and deployment of data pipelines using Azure OneLake, Azure Data Factory, and Apache Spark, ensuring efficient, scalable, and secure data movement across systems
- ETL architecture: architect and implement ETL (Extract, Transform, Load) workflows, optimizing the process for data ingestion, transformation, and storage in the cloud
- Data integration: build and manage data integration solutions that connect multiple data sources (structured and unstructured) into a cohesive data ecosystem; use SQL, Python, Scala, and R to manipulate and process large datasets
- Azure OneLake expertise: leverage Azure OneLake and Azure Synapse Analytics to design and implement scalable data storage and analytics solutions that support big data processing and analysis
- Collaboration with teams: work closely with data scientists, data analysts, and BI engineers to ensure that the data infrastructure supports analytical needs and is optimized for performance and accuracy
- Performance optimization: monitor, troubleshoot, and optimize data pipeline performance to ensure high availability, fast processing, and minimal downtime
- Data governance and security: implement best practices for data governance, data security, and compliance within the Azure ecosystem, ensuring data privacy and protection
- Leadership and mentorship: lead and mentor a team of data engineers, promoting a collaborative and high-performance team culture; oversee code reviews, design decisions, and the implementation of new technologies
- Automation and monitoring: automate data engineering workflows, job scheduling, and monitoring to ensure smooth operations; use tools like Azure DevOps, Airflow, and other relevant platforms for automation and orchestration
- Documentation and best practices: document data pipeline architecture, data models, and ETL processes, and contribute to the establishment of engineering best practices, standards, and guidelines
- Innovation: stay current with industry trends and emerging technologies in data engineering, cloud computing, and big data analytics, driving innovation within the team
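The automation-and-monitoring responsibility above centres on resilient task execution, which orchestrators like Azure Data Factory and Airflow provide via configurable retry policies. A minimal plain-Python sketch of that pattern (the helper and the flaky task are hypothetical, for illustration only):

```python
import time

def run_with_retry(task, retries=3, delay=0.0):
    """Run a pipeline task with bounded retries, the basic resilience
    pattern that orchestrators expose as a retry policy per activity."""
    last_err = None
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as err:  # demo only; real code catches narrower types
            last_err = err
            time.sleep(delay)  # a real policy would back off exponentially
    raise RuntimeError(f"task failed after {retries} attempts") from last_err

calls = {"n": 0}
def flaky():
    """Simulated ingestion step that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("transient source outage")
    return "loaded"

result = run_with_retry(flaky, retries=5)
```

In ADF the equivalent is the activity-level `retry` and `retryIntervalInSeconds` settings; the point of the sketch is only the control flow.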

Posted 1 week ago

Apply

5.0 - 8.0 years

2 - 6 Lacs

Bengaluru

Work from Office

Naukri logo

Job Information:
Job Opening ID: ZR_1628_JOB
Date Opened: 09/12/2022
Industry: Technology
Job Type:
Work Experience: 5-8 years
Job Title: Data Engineer
City: Bangalore
Province: Karnataka
Country: India
Postal Code: 560001
Number of Positions: 4

Roles and Responsibilities:
- 4+ years of experience as a data developer using Python
- Knowledge of Spark and PySpark preferable but not mandatory
- Azure cloud experience preferred (alternate cloud experience is fine); experience in the Azure platform, including Azure Data Lake, Databricks, and Data Factory, preferred
- Working knowledge of different file formats such as JSON, Parquet, CSV, etc.
- Familiarity with data encryption and data masking
- Database experience in SQL Server preferable; experience in NoSQL databases like MongoDB preferred
- Team player; reliable, self-motivated, and self-disciplined

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Noida, Hyderabad, Delhi / NCR

Work from Office

Naukri logo

Job Role: Azure Data Engineer
Location: Greater Noida & Hyderabad
Experience: 5 to 10 years
Notice Period: Immediate to 30 days

Job Description:
- Bachelor's degree in Computer Science, Computer Engineering, Technology, Information Systems (CIS/MIS), Engineering, or a related technical discipline, or equivalent experience/training
- Many years of software solution development using agile and DevOps, operating in a product model that includes designing, developing, and implementing large-scale applications or data engineering solutions
- 3 years of data engineering experience using SQL
- 2 years of cloud development (Microsoft Azure preferred), including Azure Event Hub, Azure Data Factory, Azure Databricks, Azure DevOps, Azure Blob Storage, Azure Power Apps, and Power BI
- A combination of development, administration, and support experience in several of the following tools/platforms is required:
  a. Scripting: Python, PySpark, Unix, SQL
  b. Data platforms: Teradata, SQL Server
  c. Azure Data Explorer (administration skills are a plus)
  d. Azure cloud technologies: Azure Data Factory, Azure Databricks, Azure Blob Storage, Azure Power Apps, and Azure Functions
  e. CI/CD: GitHub, Azure DevOps, Terraform

Posted 1 week ago

Apply

6.0 - 11.0 years

20 - 22 Lacs

Hyderabad, Bengaluru

Work from Office

Naukri logo

Detailed job description - skill set:
- 6+ years of experience in database and data warehouse technologies (Azure Synapse/ADB, ADF, SQL Server, SAP HANA)
- Experience in ADF, ADB, and Azure Synapse
- Solid knowledge of Python, Spark, and SQL
- Experience with PowerShell scripting
- Effectively analyzes heterogeneous source data and writes SQL scripts to integrate data from multiple data sources
- Performs statistical analysis and assesses the results to generate actionable insights, presenting the findings to business users for informed decision making
- Performs data mining, which provides actionable data in response to changing business requirements
- Adapts to changing business requirements and supports the development and implementation of best-known methods with respect to data analytics

Mandatory skills (top 3 technical skills only):
- Experience in ADF, ADB, and Azure Synapse
- Solid knowledge of Python, PySpark, and SQL
- Experience with PowerShell scripting

Posted 2 weeks ago

Apply

12.0 - 15.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP Master Data Management & Architecture
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that solutions are effectively implemented across multiple teams, while maintaining a focus on quality and efficiency in application delivery.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Expected to provide solutions to problems that apply across multiple teams
- Facilitate knowledge-sharing sessions to enhance team capabilities
- Monitor project progress and ensure alignment with strategic goals

Professional & Technical Skills:
- Must-have skills: proficiency in SAP Master Data Management & Architecture
- Strong understanding of data governance principles and practices
- Experience with data integration techniques and tools
- Ability to design and implement data models that support business processes
- Familiarity with data quality management and data lifecycle management

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP Master Data Management & Architecture
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Qualification: 15 years full-time education

Posted 2 weeks ago


5.0 - 10.0 years

15 - 30 Lacs

Pune, Gurugram, Bengaluru

Work from Office


Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!

Job Description:
Exp: 5-10 yrs
Location: Gurugram/Bangalore/Pune
Skill: Azure Data Engineer

Interested candidates can share their resume to sangeetha.spstaffing@gmail.com with the details below inline:
Full Name as per PAN:
Mobile No:
Alt No/WhatsApp No:
Total Exp:
Relevant Exp in Databricks:
Rel Exp in PySpark:
Rel Exp in DWH:
Rel Exp in Python:
Current CTC:
Expected CTC:
Notice Period (Official):
Notice Period (Negotiable)/Reason:
Date of Birth:
PAN number:
Reason for Job Change:
Offer in Pipeline (Current Status):
Availability for virtual interview on weekdays between 10 AM - 4 PM (please mention a time):
Current Res Location:
Preferred Job Location:
Whether educational % in 10th std, 12th std, UG is all above 50%?
Do you have any gaps in your education or career? If so, please mention the duration in months/years:

Posted 2 weeks ago


3.0 - 6.0 years

1 - 6 Lacs

Gurugram

Work from Office


Role & responsibilities
- Design, develop, and maintain scalable Python applications for data processing and analytics.
- Build and manage ETL pipelines using Databricks on Azure/AWS cloud platforms.
- Collaborate with analysts and other developers to understand business requirements and implement data-driven solutions.
- Optimize and monitor existing data workflows to improve performance and scalability.
- Write clean, maintainable, and testable code following industry best practices.
- Participate in code reviews and provide constructive feedback.
- Maintain documentation and contribute to project planning and reporting.

Skills & Experience
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Prior experience as a Python Developer or in a similar role, with a strong portfolio showcasing past projects.
- 2-5 years of Python experience, with strong proficiency in Python programming.
- Hands-on experience with the Databricks platform (Notebooks, Delta Lake, Spark jobs, cluster configuration, etc.).
- Good knowledge of Apache Spark and its Python API (PySpark).
- Experience with cloud platforms (preferably Azure or AWS) and working with Databricks on cloud.
- Familiarity with data pipeline orchestration tools (e.g., Airflow, Azure Data Factory).
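To give a flavour of the "clean, testable" pipeline code this role describes: a minimal ETL-style transform sketch in plain Python (hypothetical schema and field names; in the Databricks/PySpark setting the posting assumes, the same logic would typically be a DataFrame transformation):

```python
# Hypothetical cleansing step of an ETL pipeline: drop unusable rows,
# normalise a text field, and coerce a numeric field — written as a
# pure function so it is easy to unit-test, as the posting asks.

def clean_records(rows):
    """Drop rows missing an id, normalise names, parse amount as float."""
    cleaned = []
    for row in rows:
        if not row.get("id"):
            continue  # skip rows without a key, like a dropna on the id column
        cleaned.append({
            "id": row["id"],
            "name": row.get("name", "").strip().title(),
            "amount": float(row.get("amount") or 0),
        })
    return cleaned

raw = [
    {"id": "1", "name": "  alice ", "amount": "10.5"},
    {"id": None, "name": "bob", "amount": "3"},    # dropped: no id
    {"id": "2", "name": "carol", "amount": None},  # amount defaults to 0.0
]
result = clean_records(raw)
```

Keeping transforms as pure functions like this (rather than inlining them in notebook cells) is what makes the "testable code" and "code review" bullets above practical.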

Posted 2 weeks ago


4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office


At BCE Global Tech, immerse yourself in exciting projects that are shaping the future of both consumer and enterprise telecommunications. This involves building innovative mobile apps to enhance user experiences and enable seamless connectivity on the go. Thrive in diverse roles like Full Stack Developer, Backend Developer, UI/UX Designer, DevOps Engineer, Cloud Engineer, Data Science Engineer, and Scrum Master, at a workplace that encourages you to freely share your bold and different ideas. If you are passionate about technology and eager to make a difference, we want to hear from you! Apply now to join our dynamic team in Bengaluru.

We are seeking a talented Site Reliability Engineer (SRE) to join our team. The ideal candidate will have a strong background in software engineering and systems administration, with a passion for building scalable and reliable systems. As an SRE, you will collaborate with development and operations teams to ensure our services are reliable, performant, and highly available.

Key Responsibilities
- Ensure the 24/7 operations and reliability of data services in our production GCP and on-premise Hadoop environments.
- Collaborate with the data engineering development team to design, build, and maintain scalable, reliable, and secure data pipelines and systems.
- Develop and implement monitoring, alerting, and incident response strategies to proactively identify and resolve issues before they impact production.
- Drive the implementation of security and reliability best practices across the software development life cycle.
- Contribute to the development of tools and automation to streamline the management and operation of data services.
- Participate in the on-call rotation and respond to incidents in a timely and effective manner.
- Continuously evaluate and improve the reliability, scalability, and performance of data services.

Technology Skills
- 4+ years of experience in site reliability engineering or a similar role.
- Strong experience with Google Cloud Platform (GCP) services, including BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Experience with on-premise Hadoop environments and related technologies (HDFS, Hive, Spark, etc.).
- Proficiency in at least one programming language (Python, Scala, Java, Go, etc.).

Required qualifications to be successful in this role
- Bachelor's degree in computer science, engineering, or a related field.
- 8-10 years of experience as an SRE.
- Proven experience as an SRE, DevOps engineer, or in a similar role.
- Strong problem-solving skills and ability to work under pressure.
- Excellent communication and collaboration skills.
- Flexible to work in EST time zones (9-5 EST).

Additional Information
Job Type: Full Time
Work Profile: Hybrid (Work from Office/Remote)
Years of Experience: 8-10 Years
Location: Bangalore

What We Offer
- Competitive salaries and comprehensive health benefits
- Flexible work hours and remote work options
- Professional development and training opportunities
- A supportive and inclusive work environment

Posted 2 weeks ago


6.0 - 10.0 years

20 - 27 Lacs

Bengaluru

Remote


Greetings!

Position: Azure Data Engineer
Budget: 28.00 LPA
Type: FTE/Lateral
Location: Pan India (WFH)
Experience: 6-10 Yrs

JOB DESCRIPTION
Azure Data Engineer
Must-have skills: Azure, Databricks, Data Factory, Python, PySpark

Posted 2 weeks ago


4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office


Design and implement data solutions using Microsoft Azure and Databricks platforms. You will work with cloud-based tools for data engineering, analysis, and machine learning. Expertise in Azure, Databricks, and cloud data solutions is required.

Posted 2 weeks ago


4.0 - 8.0 years

9 - 14 Lacs

Hyderabad

Work from Office


Software Engineering Senior Analyst

ABOUT EVERNORTH:
Evernorth exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable and simple health care, we solve the problems others don't, won't or can't.

Excited to grow your career? We value our talented employees, and whenever possible strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think the open position you see is right for you, we encourage you to apply! Our people make all the difference in our success.

We are looking for an engineer to develop, optimize and fine-tune AI models for performance, scalability, and accuracy. In this role you will support the full software lifecycle of design, development, testing, and support for technical delivery. This role requires working with both onsite and offshore team members to properly define testable scenarios based on requirements/acceptance criteria.

Responsibilities
- Participate in the daily stand-up meeting to verify the status of all ongoing tickets.
- Estimate data ingestion work in the datalake based on entity count and complexity.
- Design suitable Azure cloud data management solutions to address business stakeholders' needs with regard to data ingestion, processing, and transmission to downstream systems.
- Participate in discussions with the team to understand requirements to ingest and transform data into the datalake and make processed data available to different target databases.
- Develop ingestion code to ingest data from different sources into the datalake.
- Export the processed data to target databases so that it can be used in reporting.
- Optimize data ingestion and data transformation workflows, including long-running jobs.
- Develop Azure migration flows and Azure Databricks jobs for the Lakehouse.
- Keep track of jobs after deployment and identify performance bottlenecks, failures, and data growth.
- Track support tickets; triage, fix, and deploy.
- Monitor and execute the nightly ETL process, which loads data into the Azure data warehouse system.
- On-board new clients for the Member Model, Remittance, and Paid Claims.
- Prepare Root Cause Analysis documents and suggest solutions to mitigate future recurrence of issues.

Qualifications
Required Skills:
- Minimum of 3-5 years of professional experience
- Experience administering the following Data Warehouse Architecture Components:
- 2+ years with Azure Technologies
- 2+ years with Azure Data Factory (ADF), ADLS Gen2, Storage Account, Lakehouse Analytics, Synapse, SQL DB, Databricks
- 2+ years with SQL Server, Python, Scala, SSIS, SSRS
- Understanding of data access, data retention, and archiving
- Good hands-on experience troubleshooting data errors and ETL jobs
- Good understanding of the ETL process and agile framework
- Good communication skills

Required Experience & Education:
- Software engineer (with 3-5 years of overall experience) with at least 2 years in the key skills listed above
- Bachelor's degree in Computer Science or equivalent preferred.

Location & Hours of Work: HIH-Hyderabad & General Shift (11:30 AM - 8:30 PM IST)

Equal Opportunity Statement
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
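Nightly ETL loads of the kind this role monitors commonly use an incremental "high watermark" pattern: each run ingests only rows modified after the last successful load, then advances the watermark. A minimal sketch in plain Python (hypothetical row shape; in practice this logic would live in an ADF pipeline or a Databricks job, not a standalone script):

```python
# Hypothetical incremental-load step: filter source rows by a modification
# timestamp and compute the new watermark for the next run.
from datetime import datetime

def incremental_load(source_rows, last_watermark):
    """Return rows modified after last_watermark, plus the new watermark."""
    new_rows = [r for r in source_rows if r["modified"] > last_watermark]
    # If nothing changed, keep the old watermark rather than moving it back.
    new_watermark = max((r["modified"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

rows = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 1, 3)},
]
batch, wm = incremental_load(rows, datetime(2024, 1, 2))
# batch contains only row id 2; wm advances to 2024-01-03
```

Persisting the watermark between runs (e.g., in a control table) is what makes failed nightly jobs safely re-runnable, which is the point of the monitoring and triage duties listed above.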

Posted 2 weeks ago


5.0 - 8.0 years

11 - 16 Lacs

Hyderabad

Work from Office


Software Engineering Senior Analyst

ABOUT EVERNORTH:
Evernorth exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable and simple health care, we solve the problems others don't, won't or can't. We are always looking upward. And that starts with finding the right talent to help us get there.

Position Overview

Responsibilities
- Participate in the daily stand-up meeting to verify the status of all ongoing tickets.
- Estimate data ingestion work in the datalake based on entity count and complexity.
- Design suitable Azure cloud data management solutions to address business stakeholders' needs with regard to data ingestion, processing, and transmission to downstream systems.
- Participate in discussions and lead the team to understand requirements to ingest and transform data into the datalake and make processed data available to different target databases.
- Review developed ingestion code that ingests data from different sources into the datalake.
- Review and perform impact analysis on proposed solutions for optimizing long-running jobs.
- Keep track of jobs after deployment and identify performance bottlenecks, failures, and data growth.
- Track support tickets; triage, fix, and deploy.
- Review prepared Root Cause Analysis documents.
- Maintain a firm grasp of the processes and standard operating procedures, and influence fellow team members to follow them.
- Engage in fostering and improving organizational culture.

Qualifications
Required Skills:
- Minimum of 5-8 years of professional experience
- Experience administering the following Data Warehouse Architecture Components:
- 5+ years with Azure Technologies
- 5+ years with Azure Data Factory (ADF), ADLS Gen2, Storage Account, Lakehouse Analytics, Synapse, SQL DB, Databricks
- 5+ years with SQL Server, Python, Scala, SSIS, SSRS
- Understanding of data access, data retention, and archiving
- Good hands-on experience troubleshooting data errors and ETL jobs
- Good understanding of the ETL process and agile framework
- Good communication skills

Required Experience & Education:
- Software engineer (with 5-8 years of overall experience) with at least 5 years in the key skills listed above
- Bachelor's degree equivalent in Information Technology, Computer Science, Technology Management, or a related field of study.

Location & Hours of Work: Hyderabad and Hybrid (11:30 AM IST to 8:30 PM IST)

Equal Opportunity Statement
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Posted 2 weeks ago
