
130 Datafactory Jobs - Page 2

JobPe aggregates listings for easy access, but applications are completed directly on the original job portal.

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office


Design and implement data solutions using Microsoft Azure and Databricks platforms. You will work with cloud-based tools for data engineering, analysis, and machine learning. Expertise in Azure, Databricks, and cloud data solutions is required.
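To make the day-to-day work this posting describes concrete, here is a minimal sketch of an Azure/Databricks pipeline, assuming a Databricks workspace with access to an ADLS Gen2 container; the storage path, container, and table names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

# Read raw CSV files landed in the data lake (path is illustrative)
raw = (spark.read
       .option("header", True)
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

# Basic cleansing: de-duplicate and normalize the date column
clean = (raw.dropDuplicates(["order_id"])
            .withColumn("order_date", F.to_date("order_date")))

# Publish as a Delta table for downstream analysis and machine learning
clean.write.format("delta").mode("overwrite").saveAsTable("curated.orders")
```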

Posted 2 weeks ago

Apply

4.0 - 8.0 years

9 - 14 Lacs

Hyderabad

Work from Office


Software Engineering Senior Analyst

ABOUT EVERNORTH: Evernorth exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable and simple health care, we solve the problems others don't, won't or can't.

Excited to grow your career? We value our talented employees, and whenever possible strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think the open position you see is right for you, we encourage you to apply! Our people make all the difference in our success.

We are looking for an engineer to develop, optimize and fine-tune AI models for performance, scalability, and accuracy. In this role you will support the full software lifecycle of design, development, testing, and support for technical delivery. The role requires working with both onsite and offshore team members to properly define testable scenarios based on requirements/acceptance criteria.

Responsibilities:
- Participate in the daily stand-up meeting to verify the status of all ongoing tickets
- Estimate data ingestion work in the data lake based on entity count and complexity
- Design suitable Azure cloud data management solutions that address business stakeholders' needs for data ingestion, processing, and transmission to downstream systems
- Participate in discussions with the team to understand requirements for ingesting and transforming data into the data lake and making processed data available to different target databases
- Develop ingestion code to load data from different sources into the data lake
- Export processed data to target databases for use in reporting
- Optimize data ingestion and transformation workflows, including long-running jobs
- Develop Azure migration flows and Azure Databricks jobs for the Lakehouse
- Track jobs after deployment and identify performance bottlenecks, failures, and data growth
- Track support tickets; triage, fix, and deploy
- Monitor and execute the nightly ETL process that loads data into the Azure data warehouse system
- On-board new clients for Member Model, Remittance, and Paid Claims
- Prepare Root Cause Analysis documents and suggest solutions to prevent recurrence of issues

Qualifications

Required Skills:
- Minimum of 3-5 years of professional experience
- Experience administering the following Data Warehouse Architecture components:
  - 2+ years with Azure technologies
  - 2+ years with Azure Data Factory (ADF), ADLS Gen2, Storage Account, Lakehouse Analytics, Synapse, SQL DB, Databricks
  - 2+ years with SQL Server, Python, Scala, SSIS, SSRS
- Understanding of data access, data retention, and archiving
- Good hands-on experience troubleshooting data errors and ETL jobs
- Good understanding of ETL processes and the agile framework
- Good communication skills

Required Experience & Education:
- Software engineer (with 3-5 years of overall experience) with at least 2 years in the key skills listed above
- Bachelor's degree in Computer Science or equivalent preferred

Location & Hours of Work: HIH-Hyderabad, General Shift (11:30 AM - 8:30 PM IST)

Equal Opportunity Statement: Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.
About Evernorth Health Services Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
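The responsibilities above centre on ingesting source data into the data lake and exporting processed data to reporting databases. As a hedged illustration only (none of this is Evernorth's actual setup), the sketch below shows that two-step flow in PySpark over JDBC; hostnames, credentials, and table names are placeholders, and a SQL Server JDBC driver is assumed to be available on the cluster.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Step 1: ingest from a source database into the bronze zone of the data lake
src = (spark.read.format("jdbc")
       .option("url", "jdbc:sqlserver://source-host:1433;databaseName=claims")
       .option("dbtable", "dbo.paid_claims")
       .option("user", "svc_reader")
       .option("password", "<from-key-vault>")
       .load())
src.write.format("delta").mode("append").save("/mnt/datalake/bronze/paid_claims")

# Step 2: export curated data to a target reporting database
gold = spark.read.format("delta").load("/mnt/datalake/gold/paid_claims")
(gold.write.format("jdbc")
 .option("url", "jdbc:sqlserver://report-host:1433;databaseName=reporting")
 .option("dbtable", "dbo.paid_claims")
 .option("user", "svc_writer")
 .option("password", "<from-key-vault>")
 .mode("overwrite")
 .save())
```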

Posted 2 weeks ago

Apply

5.0 - 8.0 years

11 - 16 Lacs

Hyderabad

Work from Office


Software Engineering Senior Analyst

ABOUT EVERNORTH: Evernorth exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable and simple health care, we solve the problems others don't, won't or can't. We are always looking upward. And that starts with finding the right talent to help us get there.

Position Overview

Responsibilities:
- Participate in the daily stand-up meeting to verify the status of all ongoing tickets
- Estimate data ingestion work in the data lake based on entity count and complexity
- Design suitable Azure cloud data management solutions that address business stakeholders' needs for data ingestion, processing, and transmission to downstream systems
- Lead discussions with the team to understand requirements for ingesting and transforming data into the data lake and making processed data available to different target databases
- Review ingestion code developed to load data from different sources into the data lake
- Review and perform impact analysis on proposed solutions for optimizing long-running jobs
- Track jobs after deployment and identify performance bottlenecks, failures, and data growth
- Track support tickets; triage, fix, and deploy
- Review prepared Root Cause Analysis documents
- Maintain a firm grasp of processes and standard operating procedures, and influence fellow team members to follow them
- Engage in fostering and improving organizational culture

Qualifications

Required Skills:
- Minimum of 5-8 years of professional experience
- Experience administering the following Data Warehouse Architecture components:
  - 5+ years with Azure technologies
  - 5+ years with Azure Data Factory (ADF), ADLS Gen2, Storage Account, Lakehouse Analytics, Synapse, SQL DB, Databricks
  - 5+ years with SQL Server, Python, Scala, SSIS, SSRS
- Understanding of data access, data retention, and archiving
- Good hands-on experience troubleshooting data errors and ETL jobs
- Good understanding of ETL processes and the agile framework
- Good communication skills

Required Experience & Education:
- Software engineer (with 5-8 years of overall experience) with at least 5 years in the key skills listed above
- Bachelor's degree in Information Technology, Computer Science, Technology Management, or a related field of study

Location & Hours of Work: Hyderabad, Hybrid (11:30 AM to 8:30 PM IST)

Equal Opportunity Statement: Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services: Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Kolkata

Work from Office


Capgemini Invent

Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

Your role:
- Develop and maintain data pipelines tailored to Azure environments, ensuring security and compliance with client data standards
- Collaborate with cross-functional teams to gather data requirements, translate them into technical specifications, and develop data models
- Leverage Python libraries for data handling, enhancing processing efficiency and robustness
- Ensure SQL workflows meet client performance standards and handle large data volumes effectively
- Build and maintain reliable ETL pipelines, supporting full and incremental loads and ensuring data integrity and scalability (see the sketch after this listing)
- Implement CI/CD pipelines for automated deployment and testing of data solutions
- Optimize and tune data workflows and processes to ensure high performance and reliability
- Monitor, troubleshoot, and optimize data processes for performance and reliability
- Document data infrastructure and workflows, and maintain industry knowledge in data engineering and cloud technology

Your profile:
- Bachelor's degree in Computer Science, Information Systems, or a related field
- 4+ years of data engineering experience with a strong focus on Azure data services for client-centric solutions
- Extensive expertise in Azure Synapse, Data Lake Storage, Data Factory, Databricks, and Blob Storage, ensuring secure, compliant data handling for clients
- Good interpersonal communication skills
- Skilled in designing and maintaining scalable data pipelines tailored to client needs in Azure environments
- Proficient in SQL and PL/SQL for complex data processing and client-specific analytics

What you will love about working here: We recognize the significance of flexible work arrangements. Be it remote work or flexible work hours, you will get an environment that supports a healthy work-life balance. At the heart of our mission is your career growth; our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
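As referenced in the role bullets, a common pattern behind "full and incremental loads" is a watermark-driven incremental pull. The sketch below is one minimal way to express it in PySpark; the control table, column names, and schemas are assumptions for illustration, not Capgemini's design.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Read the last successfully loaded watermark from a small control table
last_ts = (spark.table("etl_control.watermarks")
           .filter(F.col("table_name") == "sales")
           .agg(F.max("loaded_until"))
           .collect()[0][0])

# Pull only the rows changed since that watermark
increment = (spark.table("staging.sales")
             .filter(F.col("modified_at") > F.lit(last_ts)))

# Append into the curated zone; a Delta MERGE would be used instead if
# updates must overwrite existing rows rather than accumulate
increment.write.format("delta").mode("append").saveAsTable("curated.sales")
```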

Posted 2 weeks ago

Apply

3.0 years

2 - 10 Lacs

Gurgaon

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Onboard clients via the components of our data engineering pipeline, which consists of UIs, Azure Databricks, Azure Service Bus, Apache Airflow, and various container-based services configured through UIs, SQL, PL/SQL, Python, YAML, Node, and Shell, with code managed in GitHub, deployed through Jenkins, and monitored through Prometheus and Grafana
- Work as part of our client implementation team to ensure the highest standards of product configuration that meet client requirements
- Test and troubleshoot the data pipeline using sample and live client data; utilize Jenkins, Python, Groovy scripts and Java to automate these tests; must be able to parse logs to determine next actions
- Work with product teams to ensure the product is configured appropriately
- Utilize dashboards for Kubernetes/OpenShift to diagnose high-level issues and ensure services are healthy
- Support the implementation immediately after go-live, and work with the O&M team to transition support to that team
- Participate in daily Agile meetings and estimate project deliverables
- Configure and test REST APIs, and utilize manual tools to interact with APIs
- Work with data providers to clarify requirements and remove roadblocks
- Drive automation into everyday activities
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- 3+ years of experience working with SQL (preferably Oracle PL/SQL and Spark SQL) and data at scale
- 3+ years of ETL experience ensuring source-to-target data integrity; familiarity with various file types (Delimited Text, Fixed Width, XML, JSON, Parquet)
- 1+ years of coding experience with one or more of the following languages: Java, C#, Python, NodeJS, using Git, with practical experience working collaboratively through Git branching strategies
- 1+ years of experience with Microsoft Azure cloud infrastructure: Databricks, Data Factory, Data Lake, Airflow and Cosmos DB
- 1+ years of experience reading and configuring YAML (a hedged example follows this listing)
- 1+ years of experience with Service Bus, setting up ingress and egress within a subscription, or relevant Azure cloud services administrative experience
- 1+ years of experience with unit testing, code quality tools, CI/CD technologies, security and container technologies
- 1+ years of Agile development experience and knowledge of Agile ceremonies and practices

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
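Since the qualifications above call out reading and configuring YAML for pipeline onboarding, here is a small generic sketch using PyYAML; the config file name and schema are invented for illustration and are not Optum's actual format.

```python
import yaml

# Load a client onboarding config (file name and schema are hypothetical)
with open("client_pipeline.yaml") as f:
    cfg = yaml.safe_load(f)

# The file might look like:
#   client: acme
#   source:
#     type: sftp
#     path: /incoming/claims
#   servicebus_topics:
#     - claims-raw
#     - claims-validated
print(f"onboarding client: {cfg['client']}")
for topic in cfg.get("servicebus_topics", []):
    print(f"would configure Service Bus topic: {topic}")
```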

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Onboard clients via the components of our data engineering pipeline, which consists of UIs, Azure Databricks, Azure Service Bus, Apache Airflow, and various container-based services configured through UIs, SQL, PL/SQL, Python, YAML, Node, and Shell, with code managed in GitHub, deployed through Jenkins, and monitored through Prometheus and Grafana
- Work as part of our client implementation team to ensure the highest standards of product configuration that meet client requirements
- Test and troubleshoot the data pipeline using sample and live client data; utilize Jenkins, Python, Groovy scripts and Java to automate these tests; must be able to parse logs to determine next actions
- Work with product teams to ensure the product is configured appropriately
- Utilize dashboards for Kubernetes/OpenShift to diagnose high-level issues and ensure services are healthy
- Support the implementation immediately after go-live, and work with the O&M team to transition support to that team
- Participate in daily Agile meetings and estimate project deliverables
- Configure and test REST APIs, and utilize manual tools to interact with APIs
- Work with data providers to clarify requirements and remove roadblocks
- Drive automation into everyday activities
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- 3+ years of experience working with SQL (preferably Oracle PL/SQL and Spark SQL) and data at scale
- 3+ years of ETL experience ensuring source-to-target data integrity; familiarity with various file types (Delimited Text, Fixed Width, XML, JSON, Parquet)
- 1+ years of coding experience with one or more of the following languages: Java, C#, Python, NodeJS, using Git, with practical experience working collaboratively through Git branching strategies
- 1+ years of experience with Microsoft Azure cloud infrastructure: Databricks, Data Factory, Data Lake, Airflow and Cosmos DB
- 1+ years of experience reading and configuring YAML
- 1+ years of experience with Service Bus, setting up ingress and egress within a subscription, or relevant Azure cloud services administrative experience
- 1+ years of experience with unit testing, code quality tools, CI/CD technologies, security and container technologies
- 1+ years of Agile development experience and knowledge of Agile ceremonies and practices

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Posted 2 weeks ago

Apply

6.0 - 8.0 years

18 - 20 Lacs

Bengaluru

Work from Office


The Development Lead will oversee the design, development, and delivery of advanced data solutions using Azure Databricks, SQL, and data visualization tools like Power BI. The role involves leading a team of developers, managing data pipelines, and creating insightful dashboards and reports to drive data-driven decision-making across the organization. The individual will ensure best practices are followed in data architecture, development, and reporting while maintaining alignment with business objectives.

Key Responsibilities:
- Data Integration & ETL Processes: Design, build, and optimize ETL pipelines to manage the flow of data from various sources into data lakes, data warehouses, and reporting platforms
- Data Visualization & Reporting: Lead the development of interactive dashboards and reports using Power BI, ensuring that business users have access to actionable insights and performance metrics
- SQL Development & Optimization: Write, optimize, and review complex SQL queries for data extraction, transformation, and reporting, ensuring high performance and scalability across large datasets
- Azure Cloud Solutions: Implement and manage cloud-based solutions using Azure services (Azure Databricks, Azure SQL Database, Data Lake) to support business intelligence and reporting initiatives
- Collaboration with Stakeholders: Work closely with business leaders and cross-functional teams to understand reporting and analytics needs, translating them into technical requirements and actionable data solutions
- Quality Assurance & Best Practices: Implement and maintain best practices in development, ensuring code quality, version control, and adherence to data governance standards
- Performance Monitoring & Tuning: Continuously monitor the performance of data systems, reporting tools, and dashboards to ensure they meet SLAs and business requirements
- Documentation & Training: Create and maintain comprehensive documentation for all data solutions, including architecture diagrams, ETL workflows, and data models; provide training and support to end users on Power BI reports and dashboards

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- Proven experience as a Development Lead or Senior Data Engineer with expertise in Azure Databricks, SQL, Power BI, and data reporting/visualization
- Hands-on experience in Azure Databricks for large-scale data processing and analytics, including Delta Lake, Spark SQL, and integration with Azure Data Lake
- Strong expertise in SQL for querying, data transformation, and database management
- Proficiency in Power BI for developing advanced dashboards, data models, and reporting solutions
- Experience in ETL design and data integration across multiple systems, with a focus on performance optimization
- Knowledge of Azure cloud architecture, including Azure SQL Database, Data Lake, and other relevant services
- Experience leading agile development teams, with a strong focus on delivering high-quality, scalable solutions
- Strong problem-solving skills, with the ability to troubleshoot and resolve complex data and reporting issues
- Excellent communication skills, with the ability to interact with both technical and non-technical stakeholders

Preferred Qualifications:
- Knowledge of additional Azure services (e.g., Azure Synapse, Data Factory, Logic Apps) is a plus
- Experience in Power BI for data visualization and custom calculations
Mandatory Key Skills: Azure Databricks, SQL, Power BI, data reporting, Data Factory, Spark SQL, Logic Apps, ETL design, agile development, Synapse, Delta Lake, Azure Data Lake, Azure cloud architecture
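To ground the Spark SQL and Delta Lake bullets above, here is a hedged sketch of building a reporting aggregate and then running table maintenance; table and column names are placeholders, and the OPTIMIZE/ZORDER statement assumes a Databricks runtime.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Build an aggregate table that a Power BI dataset could sit on
spark.sql("""
    CREATE OR REPLACE TABLE reporting.daily_sales AS
    SELECT order_date, region, SUM(amount) AS total_amount
    FROM curated.sales
    GROUP BY order_date, region
""")

# Databricks-specific maintenance to keep large scans fast
spark.sql("OPTIMIZE curated.sales ZORDER BY (order_date)")
```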

Posted 2 weeks ago

Apply

2.0 - 4.0 years

5 - 8 Lacs

Gurugram

Work from Office


Proficient in Databricks for big data processing and analytics. Experience with Power BI for creating complex reports. Strong knowledge of SQL and experience with relational databases. Mail Us - info@a1selectors.com

Posted 2 weeks ago

Apply

18.0 years

0 Lacs

Pune, Maharashtra, India

On-site


We are M&G Global Services Private Limited (formerly known as 10FA India Private Limited, and prior to that Prudential Global Services Private Limited). We are a fully owned subsidiary of the M&G plc group of companies, operating as a Global Capability Centre providing a range of value-adding services to the Group since 2003.

At M&G our purpose is to give everyone real confidence to put their money to work. As an international savings and investments business with roots stretching back more than 170 years, we offer a range of financial products and services through Asset Management, Life and Wealth. All three operating segments work together to deliver attractive financial outcomes for our clients, and superior shareholder returns.

M&G Global Services has rapidly transformed itself into a powerhouse of capability that is playing an important role in M&G plc's ambition to be the best loved and most successful savings and investments company in the world. Our diversified service offerings, extending from Digital Services (Digital Engineering, AI, Advanced Analytics, RPA, and BI & Insights), Business Transformation, Management Consulting & Strategy, Finance, Actuarial, Quants, Research, Information Technology, Customer Service, Risk & Compliance and Audit, provide our people with exciting career growth opportunities. Through our behaviours of telling it like it is, owning it now, and moving it forward together with care and integrity, we are creating an exceptional place to work for exceptional talent.

Key Qualifications:
- Inspire the teams to deliver exceptional performance by helping them to connect the purpose of their work, beyond the impact to the bottom line
- Lead by example in demonstrating the organization's values and principles
- Support direct reports to make effective decisions, collaborate across silos, speak up and take personal accountability
- Encourage the Investment Platform team to solve business problems creatively and collaboratively and question the status quo
- Be curious and insightful about customers' needs and support the Investment Platform team to adapt to the changing environment
- Simplify complexity and reduce unnecessary bureaucracy

Key Accountabilities and Responsibilities:
- Define the technology roadmap for the portfolio or portfolios in line with the organisational objectives
- Be aware of the technology budget for the portfolio and be accountable for the operational costs of the platform, including Azure
- Ensure implementation of best practices in software development, including coding standards, code reviews, and testing across squads
- Drive tech initiatives, including modernisation and cost optimisation, through a community of Lead Engineers
- Continuously assess and improve engineering processes to enhance efficiency and productivity
- Identify potential risks in projects and operations, and develop mitigation strategies
- Ensure compliance with security and regulatory requirements
- Establish key performance indicators (KPIs) to measure the success of engineering initiatives, and regularly report on the performance and progress of engineering activities
- Provide guidance and mentorship to the engineering squads; be a role model and lead by example in demonstrating the organization's values and principles
- Stay abreast of emerging technologies and industry trends, and evaluate their potential impact on the organization
- Deliver business-prioritised changes, demonstrating a solid understanding of the technology and domain knowledge
- Oversee the quality of engineering of the squads, ensuring any solutions designed within the squad meet M&G standards
- Foster strong relationships with the other key stakeholders: outcome delivery managers, engineering lead, enterprise architecture, technology partners, SRE, and other platforms
- Play an active role in the engineering community for learning and sharing
- Assist with end-to-end solutions and see them through until they reach the end customer; lead tech improvements within the squad
- Provide technical guidance and mentorship to the squad members, and build and maintain strong relationships with all members of the team
- Contribute actively in the Lead Engineering Team and lead initiatives within the squad
- Partner with and assist the Investment Platform's change and run teams to deliver their objectives
- Work across multiple disciplines across the Investment Platform and on opportunities to connect and improve working practices
- Consistently identify improvement opportunities within the team, and implement processes, ways of working or training to address them
- Work within established frameworks and procedures, with the freedom to interpret them to solve a range of problems
- Build and maintain strong relationships with all key stakeholders in the Front Office, Sustainability-focused teams, Transformation & Innovation and the engineering community

Key Stakeholder Management: Internal: M&G Investments. External: N/A

Knowledge, Skills, Experience & Educational Qualification

Knowledge and skills (key):
- Advanced software engineering skills paired with expertise in some of the following programming languages: C#, Java, Python, JavaScript, HTML/CSS or equivalent
- Strong understanding of DevOps principles and experience in building CI/CD pipelines
- Strong experience working with data on Azure, including Azure Data Lake, ADLA, Cosmos DB, SQL, and Data Factory
- Demonstrates architectural excellence by developing solutions using the guiding tenets of the Azure Well-Architected Framework, covering reliability, security, cost optimisation, operational excellence and performance efficiency
- Strong Azure experience on PaaS services such as App Service, Function App, Logic App, Data Factory, Service Bus and Key Vault
- Deep understanding of Azure AD, Service Principals and Managed Identities to configure role-based access for web applications (a sketch of this pattern follows this listing)
- Able to investigate complex application issues using telemetry and Application Insights information
- Good experience with REST APIs and microservices, and API management on Apigee
- Follows best coding practices using SOLID principles, design patterns and other industry standards
- Proactive self-starter who can manage their own workload and juggle multiple priorities at the same time
- Experience working in an agile environment and a good understanding of integrating testing within the SDLC
- Applies a "solutions" mentality to any assigned outcome and ensures proper solution design is considered
- Great interpersonal skills, with the ability to communicate clearly and effectively, both written and orally, within a project team
- Ability to identify problems and the drive to follow them through to resolution
- Excellent attention to detail, and ability to prioritize and work efficiently to project deadlines
- People management skills and the ability to develop high-performing teams

Knowledge and skills (desirable):
- Relevant experience developing back-end or front-end solutions (e.g. on Azure Cloud, Angular, React, Node.js software frameworks)
- Exposure to financial markets and asset management processes, with an understanding of analysis across a wide variety of asset classes and associated analytics (e.g. Equity, Fixed Income, Private Assets)
- Exposure to data visualisation tools, such as Power BI or equivalent

Experience:
- 18+ years of total experience in software engineering
- 5+ years of experience in a core engineering role in the cloud

Education, qualifications necessary: Graduate in any discipline

We have a diverse workforce and an inclusive culture at M&G Global Services; regardless of gender, ethnicity, age, sexual orientation, nationality, disability or long-term condition, we are looking to attract, promote and retain exceptional people. We also welcome those who take part in military service and those returning from career breaks.
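As flagged in the key skills list, the managed-identity pattern typically looks like the sketch below: DefaultAzureCredential resolves to a managed identity when the app runs in Azure, and the app reads secrets from Key Vault with no credentials in code. The vault URL and secret name are hypothetical placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Resolves to a managed identity in Azure, or developer credentials locally
credential = DefaultAzureCredential()

client = SecretClient(
    vault_url="https://example-kv.vault.azure.net",  # placeholder vault
    credential=credential,
)

secret = client.get_secret("storage-connection-string")  # placeholder name
print(secret.name)  # the value itself is available as secret.value
```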

Posted 2 weeks ago

Apply

4.0 - 8.0 years

10 - 15 Lacs

Bengaluru

Work from Office


As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: work on end-to-end feature development and solve challenges faced in the implementation
- Stakeholder collaboration and issue resolution: collaborate with key stakeholders, internal and external, to understand problems and issues with the product and features, and resolve them per the defined SLAs
- Continuous learning and technology integration: be eager to learn new technologies and apply them in feature development

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Proficient in .NET Core with React or Angular
- Experience in Agile teams applying the best architectural, design, and unit testing patterns and practices, with an eye for code quality and standards
- Azure Functions, Azure Service Bus, Azure Storage Account (mandatory); Azure Durable Functions; Azure Data Factory; Azure SQL or Cosmos DB (database) required
- Ability to write calculation rules and configurable consolidation rules

Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams
- At least 2 end-to-end implementation experiences
- Ability to write and update the rules of historical overrides

Posted 2 weeks ago

Apply

7.0 - 12.0 years

10 - 18 Lacs

Bengaluru

Hybrid


Job Goals:
- Design and implement resilient data pipelines to ensure data reliability, accuracy, and performance
- Collaborate with cross-functional teams to maintain the quality of production services and smoothly integrate data processes
- Oversee the implementation of common data models and data transformation pipelines, ensuring alignment to standards
- Drive continuous improvement in internal data frameworks and support the hiring process for new Data Engineers
- Regularly engage with collaborators to discuss considerations and manage the impact of changes
- Support architects in shaping the future of the data platform and help land new capabilities into business-as-usual operations
- Identify relevant emerging trends and build compelling cases for adoption, such as tool selection

Ideal Skills & Capabilities:
- A minimum of 6 years of experience in a comparable Data Engineer position is required
- Data engineering expertise: proficiency in designing and implementing resilient data pipelines, ensuring data reliability, accuracy, and performance, with practical knowledge of modern cloud data technology stacks (Azure)
- Technical proficiency: experience with Azure Data Factory and Databricks, and skilled in Python, Apache Spark, or other distributed data programming frameworks
- Operational knowledge: in-depth understanding of data concepts, data structures, modelling techniques, and provisioning data to support varying consumption needs, along with accomplished ETL/ELT engineering skills
- Automation & DevOps: experience using DevOps toolchains for managing CI/CD, and an automation-first mindset in building solutions, including self-healing and fault-tolerant methods (a sketch follows this listing)
- Data management principles: practical application of data management principles such as security and data privacy, with experience handling sensitive data through techniques like anonymisation/tokenisation/pseudo-anonymisation
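One simple expression of the "self-healing and fault-tolerant" idea flagged above is retrying transient pipeline-step failures with exponential backoff before surfacing the error. This is a generic sketch, not the employer's framework; the step callable is a stand-in.

```python
import time

def run_with_retries(step, max_attempts=3, base_delay=5):
    """Run `step` (a zero-argument callable), retrying transient failures
    with exponential backoff before giving up."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception:  # in practice, catch narrower exception types
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical usage:
# run_with_retries(lambda: copy_raw_to_bronze("sales"))
```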

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


The Job: We're looking for an experienced, smart, driven individual who will help analyze the data used in our linking solutions through quantitative approaches, blending analytical, mathematical, and technical skills.

Must-Have Skills:
- Bachelor's or master's degree in computer science, engineering, mathematics, statistics, or an equivalent technical discipline
- 7+ years of experience working with data mapping, data analysis, and managing large data sets/data warehouses
- Strong application development experience in Java
- Strong proficiency in Angular 1.x
- Familiarity with Singleton and MVC design patterns
- Strong proficiency in SQL and/or MySQL, including optimization techniques (at least MySQL)
- Experience using tools such as Eclipse, Git, Postman, JIRA, and Confluence
- Knowledge of test-driven development
- Solid understanding of object-oriented programming
- Proficiency in Azure Databricks, Azure Data Explorer, ADLS Gen2, and Event Hub technologies
- Hands-on experience with Docker, GitHub, and CI/CD pipelines
- Experience working with Cosmos DB is preferred
- Ability to analyze, evaluate, and make data-driven recommendations from big data
- Strong understanding of data structures, algorithms, and their applications in solving business problems
- Excellent analytical, problem-solving, and communication skills, both verbal and written
- Strong organizational, project planning, time management, and change management skills

Good-to-Have Skills:
- Expertise in Spring Boot, microservices, and API development
- Familiarity with OAuth 2.0 patterns (experience with at least 2 patterns)
- Knowledge of graph databases (e.g., Neo4j, Apache TinkerPop, Gremlin)
- Experience with Kafka messaging
- Familiarity with Docker, Kubernetes, and cloud development
- Experience with CI/CD tools like Jenkins and GitHub Actions
- Knowledge of industry-wide technology trends and best practices
- Familiarity with Azure Data Factory
- Experience in containerization and deployment processes (Docker, GitHub, CI/CD)
- Experience with data visualization tools and techniques
- Knowledge of cloud-native technologies and architectures

Experience Range: 5+ years of relevant experience in data engineering, data analysis, and working with large datasets

Hiring Locations: Chennai, Mumbai, Gurgaon

Skills: Java, Angular, Azure (Databricks, Data Factory)

Posted 3 weeks ago

Apply

8.0 - 12.0 years

20 - 25 Lacs

Hyderabad, Pune

Hybrid


Job Title: Data Engineer
Work Location: India, Pune / Hyderabad (Hybrid)

Responsibilities include:
- Design, implement, and optimize end-to-end data pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data
- Develop data pipelines to extract and transform data in near real time using cloud-native technologies (a hedged sketch follows this listing)
- Implement data validation and quality checks to ensure accuracy and consistency
- Monitor system performance, troubleshoot issues, and implement optimizations to enhance reliability and efficiency
- Collaborate with business users, analysts, and other stakeholders to understand data requirements and deliver tailored solutions
- Document technical designs, workflows, and best practices to facilitate knowledge sharing and maintain system documentation
- Provide technical guidance and support to team members and stakeholders as needed

Desirable Competencies:
- 8+ years of work experience
- Proficiency in writing complex SQL queries on MPP systems (Snowflake/Redshift)
- Experience with Databricks and Delta tables
- Data engineering experience with Spark/Scala/Python
- Experience with the Microsoft Azure stack (Azure Storage Accounts, Data Factory and Databricks)
- Experience with Azure DevOps and CI/CD pipelines
- Working knowledge of Python
- Comfortable participating in 2-week sprint development cycles

About Us: Founded in 1956, Williams-Sonoma Inc. is the premier specialty retailer of high-quality products for the kitchen and home in the United States. Today, Williams-Sonoma, Inc. is one of the United States' largest e-commerce retailers with some of the best known and most beloved brands in home furnishings. Our family of brands includes Williams-Sonoma, Pottery Barn, Pottery Barn Kids, Pottery Barn Teen, West Elm, Williams-Sonoma Home, Rejuvenation, GreenRow and Mark and Graham. We currently operate retail stores globally, and our products are also available to customers through our catalogues and online worldwide. Williams-Sonoma has established a technology center in Pune, India to enhance its global operations. The India Technology Center serves as a critical hub for innovation and focuses on developing cutting-edge solutions in areas such as e-commerce, supply chain optimization, and customer experience management. By integrating advanced technologies like artificial intelligence, data analytics, and machine learning, the India Technology Center plays a crucial role in accelerating Williams-Sonoma's growth and maintaining its competitive edge in the global market.
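For the near-real-time bullet above, a common cloud-native approach is Spark Structured Streaming into a Delta table. The sketch below assumes a Kafka-compatible endpoint (Azure Event Hubs exposes one) and uses placeholder broker, topic, and path names.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Subscribe to a stream of events (endpoint and topic are illustrative)
stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "orders")
          .load())

# Payloads arrive as bytes; cast to string and land incrementally as Delta,
# with a checkpoint so the job can recover exactly where it left off
query = (stream.selectExpr("CAST(value AS STRING) AS payload")
         .writeStream.format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/orders")
         .start("/mnt/delta/orders_raw"))
```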

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

New Delhi, Delhi, India

On-site


TCS Hiring for Azure Admin + Azure Platform Engineer

Experience: 5 to 8 Years Only
Job Location: New Delhi, Kolkata, Mumbai, Pune, Bangalore

Required Technical Skill Set: Deployment through Terraform, Azure Administration, Data Factory, Databricks, Active Directory, Unity Catalog, Machine Learning, AI, and Identity and Access Management
- 3+ years of prior product/technical support customer-facing experience
- Must have good knowledge of working in Azure cloud technical support
- Good-to-have technical skills and hands-on experience in the following areas: deployment through Terraform, PowerShell/CLI, Identity Management, Azure Resource Group management, Azure PaaS services (e.g. ADF, Databricks, Storage Account)
- Understanding of machine learning and AI concepts related to infrastructure
- End-to-end Unity Catalog process to migrate from Hive to UC
- Excellent team player with good interpersonal and communication skills
- Experience in the Life Science and Health Care domain preferred

Roles & Responsibilities:
- Resource Group creation along with deployment of various components using Terraform templates (a hedged Python-SDK sketch follows this listing)
- Management of user access in Azure PaaS products such as Azure SQL, WebApp, App Service, Storage Account, Databricks, Data Factory
- Creation of Service Principals/AD groups, and using them to manage access to various applications
- Troubleshoot issues regarding access, data visualizations, and permissions

Kind Regards,
Priyankha M
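The posting asks for resource-group creation via Terraform templates; as a hedged, roughly equivalent illustration in Python (no Terraform appears on this page), the sketch below uses the azure-mgmt-resource SDK. The subscription ID, group name, and tags are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()
client = ResourceManagementClient(credential, subscription_id="<subscription-id>")

# Create (or update) a resource group that platform components deploy into
rg = client.resource_groups.create_or_update(
    "rg-dataplatform-dev",  # placeholder name
    {"location": "centralindia", "tags": {"env": "dev"}},
)
print(rg.name, rg.location)
```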

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


TCS Hiring for Azure Admin + Azure Platform Engineer

Experience: 5 to 8 Years Only
Job Location: New Delhi, Kolkata, Mumbai, Pune, Bangalore

Required Technical Skill Set: Deployment through Terraform, Azure Administration, Data Factory, Databricks, Active Directory, Unity Catalog, Machine Learning, AI, and Identity and Access Management
- 3+ years of prior product/technical support customer-facing experience
- Must have good knowledge of working in Azure cloud technical support
- Good-to-have technical skills and hands-on experience in the following areas: deployment through Terraform, PowerShell/CLI, Identity Management, Azure Resource Group management, Azure PaaS services (e.g. ADF, Databricks, Storage Account)
- Understanding of machine learning and AI concepts related to infrastructure
- End-to-end Unity Catalog process to migrate from Hive to UC
- Excellent team player with good interpersonal and communication skills
- Experience in the Life Science and Health Care domain preferred

Roles & Responsibilities:
- Resource Group creation along with deployment of various components using Terraform templates
- Management of user access in Azure PaaS products such as Azure SQL, WebApp, App Service, Storage Account, Databricks, Data Factory
- Creation of Service Principals/AD groups, and using them to manage access to various applications
- Troubleshoot issues regarding access, data visualizations, and permissions

Kind Regards,
Priyankha M

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


TCS Hiring for Azure Admin + Azure Platform Engineer

Experience: 5 to 8 Years Only
Job Location: New Delhi, Kolkata, Mumbai, Pune, Bangalore

Required Technical Skill Set: Deployment through Terraform, Azure Administration, Data Factory, Databricks, Active Directory, Unity Catalog, Machine Learning, AI, and Identity and Access Management
- 3+ years of prior product/technical support customer-facing experience
- Must have good knowledge of working in Azure cloud technical support
- Good-to-have technical skills and hands-on experience in the following areas: deployment through Terraform, PowerShell/CLI, Identity Management, Azure Resource Group management, Azure PaaS services (e.g. ADF, Databricks, Storage Account)
- Understanding of machine learning and AI concepts related to infrastructure
- End-to-end Unity Catalog process to migrate from Hive to UC
- Excellent team player with good interpersonal and communication skills
- Experience in the Life Science and Health Care domain preferred

Roles & Responsibilities:
- Resource Group creation along with deployment of various components using Terraform templates
- Management of user access in Azure PaaS products such as Azure SQL, WebApp, App Service, Storage Account, Databricks, Data Factory
- Creation of Service Principals/AD groups, and using them to manage access to various applications
- Troubleshoot issues regarding access, data visualizations, and permissions

Kind Regards,
Priyankha M

Posted 3 weeks ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Warm greetings from SP Staffing Services Private Limited!

We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 6-15 Yrs
Location: Pan India

Job Description:
- Primarily looking for a data engineer with expertise in processing data pipelines using Databricks Spark SQL on Hadoop distributions like AWS EMR, Databricks, Cloudera, etc. (a hedged sketch follows this listing)
- Should be very proficient in large-scale data operations using Databricks, and overall very comfortable using Python
- Familiarity with AWS compute, storage, and IAM concepts
- Experience working with S3 Data Lake as the storage tier
- Any ETL background (Talend, AWS Glue, etc.) is a plus but not required
- Cloud warehouse experience (Snowflake, etc.) is a huge plus
- Carefully evaluates alternative risks and solutions before taking action; optimizes the use of all available resources
- Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices and procedures of the corporation, department and business unit

Skills:
- Hands-on experience with Databricks Spark SQL and the AWS Cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
- Experience with shell scripting
- Exceptionally strong analytical and problem-solving skills
- Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses
- Strong experience with relational databases and data access methods, especially SQL
- Excellent collaboration and cross-functional leadership skills
- Excellent communication skills, both written and verbal
- Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment
- Ability to leverage data assets to respond to complex questions that require timely answers
- Working knowledge of migrating relational and dimensional databases to the AWS Cloud platform

Interested candidates can share their resume with sankarspstaffings@gmail.com with the below details inline:
Over All Exp:
Relevant Exp:
Current CTC:
Expected CTC:
Notice Period:
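To make the core requirement concrete, here is a hedged sketch of Databricks Spark SQL over an S3 data lake, as the description calls out; the bucket, prefix, and column names are invented.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Register raw Parquet files in S3 as a queryable view
(spark.read.parquet("s3://example-data-lake/events/")
      .createOrReplaceTempView("events"))

# Large-scale aggregation expressed as Spark SQL
daily = spark.sql("""
    SELECT event_date, COUNT(*) AS event_count
    FROM events
    GROUP BY event_date
    ORDER BY event_date
""")
daily.show()
```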

Posted 3 weeks ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Warm greetings from SP Staffing Services Private Limited!

We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 6-15 Yrs
Location: Pan India

Job Description:
- Candidate must be proficient in Databricks
- Understands where to obtain information needed to make appropriate decisions
- Demonstrates ability to break down a problem into manageable pieces and implement effective, timely solutions
- Identifies the problem versus the symptoms; manages problems that require the involvement of others to solve; reaches sound decisions quickly
- Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices and procedures of the corporation, department and business unit

Roles & Responsibilities:
- Provide innovative and cost-effective solutions using Databricks
- Optimize the use of all available resources
- Learn and adapt quickly to new technologies as per the business need
- Develop a team of Operations Excellence, building tools and capabilities that the development teams leverage to maintain high levels of performance, scalability, security and availability

Skills:
- The candidate must have 7-10 yrs of experience in Databricks Delta Lake
- Hands-on experience with Azure
- Experience in Python scripting
- Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses
- Strong experience with relational databases and data access methods, especially SQL
- Knowledge of Azure architecture and design

Interested candidates can share their resume with sankarspstaffings@gmail.com with the below details inline:
Over All Exp:
Relevant Exp:
Current CTC:
Expected CTC:
Notice Period:

Posted 3 weeks ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office


Dear Candidate, Seeking a Cloud Monitoring Specialist to set up observability and real-time monitoring in cloud environments. Key Responsibilities: Configure logging and metrics collection. Set up alerts and dashboards using Grafana, Prometheus, etc. Optimize system visibility for performance and security. Required Skills & Qualifications: Familiar with ELK stack, Datadog, New Relic, or Cloud-native monitoring tools. Strong troubleshooting and root cause analysis skills. Knowledge of distributed systems. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Delivery Manager Integra Technologies
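As a small illustration of the observability stack this role names, the sketch below exposes a custom metric with the Python prometheus_client so a Prometheus server can scrape it and a Grafana dashboard can chart it; the metric name and port are examples only.

```python
import random
import time

from prometheus_client import Gauge, start_http_server

# A custom metric a dashboard or alert rule could watch (name is illustrative)
queue_depth = Gauge("pipeline_queue_depth", "Messages waiting in the ingest queue")

start_http_server(8000)  # Prometheus scrapes http://<host>:8000/metrics
while True:
    queue_depth.set(random.randint(0, 100))  # stand-in for a real measurement
    time.sleep(15)
```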

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Location: Chennai, Kolkata, Gurgaon, Bangalore and Pune
Experience: 8-12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipeline, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design

Overview: We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles and Responsibilities:
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate the data pipelines in a scheduler via Airflow (a minimal DAG sketch follows this listing)

Skills and Qualifications:
- Bachelor's and/or master's degree in computer science or equivalent experience
- Must have 6+ yrs of total IT experience and 3+ years' experience in data warehouse/ETL projects
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python and Spark (PySpark)
- Must have experience in the AWS/Azure stack
- Desirable to have ETL with batch and streaming (Kinesis)
- Experience in building ETL/data warehouse transformation processes
- Experience with Apache Kafka for streaming/event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail
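For the Airflow orchestration duty referenced above, a minimal DAG sketch looks like the following; the task callables, IDs, and schedule are placeholders rather than any client's actual pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source")  # stand-in for real extraction logic

def load():
    print("write to warehouse")  # stand-in for real load logic

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract, then load
```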

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Location: Chennai, Kolkata, Gurgaon, Bangalore and Pune
Experience: 8-12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipeline, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design

Overview: We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles and Responsibilities:
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate the data pipelines in a scheduler via Airflow

Skills and Qualifications:
- Bachelor's and/or master's degree in computer science or equivalent experience
- Must have 6+ yrs of total IT experience and 3+ years' experience in data warehouse/ETL projects
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python and Spark (PySpark)
- Must have experience in the AWS/Azure stack
- Desirable to have ETL with batch and streaming (Kinesis)
- Experience in building ETL/data warehouse transformation processes
- Experience with Apache Kafka for streaming/event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Greater Kolkata Area

On-site

Linkedin logo

Location: Chennai,Kolkata,Gurgaon,Bangalore and Pune Experience: 8 -12 Years Work Mode: Hybrid Mandatory Skills: Python, Pyspark, SQL, ETL, Data Pipeline, Azure Databricks, Azure DataFactory, Azure Synapse, Airflow, and Architect Designing,Architect. Overview We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you! Primary Roles And Responsibilities Developing Modern Data Warehouse solutions using Databricks and AWS/ Azure Stack Ability to provide solutions that are forward-thinking in data engineering and analytics space Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix the issues Work with business to understand the need in reporting layer and develop data model to fulfill reporting needs Help joiner team members to resolve issues and technical challenges. Drive technical discussion with client architect and team members Orchestrate the data pipelines in scheduler via Airflow Skills And Qualifications Bachelor's and/or master’s degree in computer science or equivalent experience. Must have total 6+ yrs. of IT experience and 3+ years' experience in Data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of Data Management principles Good understanding of Databricks Data & AI platform and Databricks Delta Lake Architecture Should have hands-on experience in SQL, Python and Spark (PySpark) Candidate must have experience in AWS/ Azure stack Desirable to have ETL with batch and streaming (Kinesis). Experience in building ETL / data warehouse transformation processes Experience with Apache Kafka for use with streaming data / event-based data Experience with other Open-Source big data products Hadoop (incl. Hive, Pig, Impala) Experience with Open Source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4J) Experience working with structured and unstructured data including imaging & geospatial data. Experience working in a Dev/Ops environment with tools such as Terraform, CircleCI, GIT. Proficiency in RDBMS, complex SQL, PL/SQL, Unix Shell Scripting, performance tuning and troubleshoot Databricks Certified Data Engineer Associate/Professional Certification (Desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects Should have experience working in Agile methodology Strong verbal and written communication skills. Strong analytical and problem-solving skills with a high attention to detail. Skills: azure databricks,sql,data warehouse,skills,azure datafactory,pyspark,azure synapse,airflow,python,data pipeline,data engineering,architect,etl,pipelines,architect designing,data,azure Show more Show less

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Location: Chennai,Kolkata,Gurgaon,Bangalore and Pune Experience: 8 -12 Years Work Mode: Hybrid Mandatory Skills: Python, Pyspark, SQL, ETL, Data Pipeline, Azure Databricks, Azure DataFactory, Azure Synapse, Airflow, and Architect Designing,Architect. Overview We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you! Primary Roles And Responsibilities Developing Modern Data Warehouse solutions using Databricks and AWS/ Azure Stack Ability to provide solutions that are forward-thinking in data engineering and analytics space Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix the issues Work with business to understand the need in reporting layer and develop data model to fulfill reporting needs Help joiner team members to resolve issues and technical challenges. Drive technical discussion with client architect and team members Orchestrate the data pipelines in scheduler via Airflow Skills And Qualifications Bachelor's and/or master’s degree in computer science or equivalent experience. Must have total 6+ yrs. of IT experience and 3+ years' experience in Data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of Data Management principles Good understanding of Databricks Data & AI platform and Databricks Delta Lake Architecture Should have hands-on experience in SQL, Python and Spark (PySpark) Candidate must have experience in AWS/ Azure stack Desirable to have ETL with batch and streaming (Kinesis). Experience in building ETL / data warehouse transformation processes Experience with Apache Kafka for use with streaming data / event-based data Experience with other Open-Source big data products Hadoop (incl. Hive, Pig, Impala) Experience with Open Source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4J) Experience working with structured and unstructured data including imaging & geospatial data. Experience working in a Dev/Ops environment with tools such as Terraform, CircleCI, GIT. Proficiency in RDBMS, complex SQL, PL/SQL, Unix Shell Scripting, performance tuning and troubleshoot Databricks Certified Data Engineer Associate/Professional Certification (Desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects Should have experience working in Agile methodology Strong verbal and written communication skills. Strong analytical and problem-solving skills with a high attention to detail. Skills: azure databricks,sql,data warehouse,skills,azure datafactory,pyspark,azure synapse,airflow,python,data pipeline,data engineering,architect,etl,pipelines,architect designing,data,azure Show more Show less

Posted 3 weeks ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Naukri logo

Dear Candidate,

We are hiring a Cloud Architect to design and oversee scalable, secure, and cost-efficient cloud solutions. Great for architects who bridge technical vision with business needs.

Key Responsibilities:
- Design cloud-native solutions using AWS, Azure, or GCP
- Lead cloud migration and transformation projects
- Define cloud governance, cost control, and security strategies
- Collaborate with DevOps and engineering teams for implementation

Required Skills & Qualifications:
- Deep expertise in cloud architecture and multi-cloud environments
- Experience with containers, serverless, and microservices
- Proficiency in Terraform, CloudFormation, or equivalent (an IaC sketch follows this listing)
- Bonus: Cloud certification (AWS/Azure/GCP Architect)

Soft Skills:
- Strong troubleshooting and problem-solving skills
- Ability to work independently and in a team
- Excellent communication and documentation skills

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
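The posting asks for Terraform, CloudFormation, or an equivalent infrastructure-as-code tool. As a hedged illustration of the IaC idea in Python (the example language used elsewhere on this page), here is a minimal Pulumi program; Pulumi is one such "equivalent", and the bucket name and stack setup are hypothetical. It assumes the pulumi and pulumi-aws packages and an AWS-configured Pulumi stack.

```python
import pulumi
from pulumi_aws import s3

# Declare a private S3 bucket; `pulumi up` computes the diff between this
# desired state and what is currently deployed, then applies only the changes.
raw_data = s3.Bucket("raw-data-bucket", acl="private")  # hypothetical bucket

# Expose the generated bucket name as a stack output.
pulumi.export("bucket_name", raw_data.id)
```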

Posted 3 weeks ago

Apply

4.0 - 7.0 years

3 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

JLL empowers you to shape a brighter way. Our people at JLL and JLL Technologies are shaping the future of real estate for a better world by combining world-class services, advisory and technology for our clients. We are committed to hiring the best, most talented people and empowering them to thrive, grow meaningful careers and find a place where they belong. Whether you've got deep experience in commercial real estate, skilled trades or technology, or you're looking to apply your relevant experience to a new industry, join our team as we help shape a brighter way forward.

We are currently seeking a Software Engineer II to join our JLL Technologies Leasing Engineering team.

About JLL Technologies

JLL Technologies is a specialized group within JLL. We deliver unparalleled digital advisory, implementation, and services solutions to organizations globally. We provide best-in-class technologies to bring digital ambitions to life, aligning technology, people, and processes. Our goal is to leverage technology to increase the value and liquidity of the world's buildings, while enhancing the productivity and the happiness of those who occupy them. We are seeking self-starters who can work in a diverse and fast-paced environment and join our team to manage and deliver software.

What this job involves

As a Full Stack Engineer at JLL Technologies, your responsibilities are to:
- Develop a technical understanding of the existing architecture to design and expand its capability for new business requirements
- Independently develop, execute, and monitor complex web and business components, web services, and reports for assigned projects
- Maintain and improve the existing codebase and perform peer code reviews
- Explore and evaluate new technologies where relevant
- Perform unit testing, performance testing and system integration testing, and assist with user acceptance testing
- Provide ongoing support (troubleshoot, identify, and rectify production issues) for applications used within the organization
- Provide written technical documentation
- Participate in weekend deployments when required

Technical Skills & Competencies

Mandatory:
- Experience with React.js (with Redux) frontend technologies
- Knowledge of Node.js development
- Strong knowledge of C#, .NET Core, Web API
- Strong proficiency in JavaScript (ES6), including DOM manipulation and the JavaScript object model
- Strong proficiency in MS SQL, stored procedures and performance tuning
- Ability to write unit tests and integration tests using code coverage tools
- Proficiency in Material UI, HTML5 & CSS
- Familiarity with the Azure cloud offering, DevOps and the GitHub platform

Preferable:
- Experience in development on PaaS offerings such as Azure Functions, Azure Logic Apps, APIM, Data Factory (see the sketch after this listing)
- Experience in Elasticsearch or Azure Cognitive Search
- Experience adopting code quality tools such as SonarQube

Sound like the job you're looking for? Before you apply, it's also worth knowing what we're looking for:

Education and experience
- A Bachelor's degree in computer science, information systems, software engineering, or a related field
- 3-5 years of experience in application development, integration, implementation, and maintenance
- Reliable, self-motivated, and self-disciplined individual
- Effective written and verbal communication skills
- Excellent technical, analytical and organizational skills

What you can expect from us

We succeed together, across the desk and around the globe, and believe the best inspire the best, so we invest in supporting each other, learning together and celebrating our success. Our Total Rewards program reflects our commitment to helping you achieve your career ambitions, recognizing your contributions, investing in your well-being and providing competitive benefits and pay. Apply today!

Location: On-site, Bengaluru, KA
Scheduled Weekly Hours: 40

If this job description resonates with you, we encourage you to apply even if you don't meet all of the requirements. We're interested in getting to know you and what you bring to the table!

JLL Privacy Notice

Jones Lang LaSalle (JLL), together with its subsidiaries and affiliates, is a leading global provider of real estate and investment management services. We take our responsibility to protect the personal information provided to us seriously. Generally, the personal information we collect from you is for the purposes of processing in connection with JLL's recruitment process. We endeavour to keep your personal information secure with an appropriate level of security and keep it for as long as we need it for legitimate business or legal reasons. We will then delete it safely and securely. See the Candidate Privacy Statement.

For candidates in the United States, please see a full copy of our Equal Employment Opportunity and Affirmative Action policy here. Jones Lang LaSalle ("JLL") is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, including the online application and/or overall selection process, you may contact us at Accommodation Requests. This email is only to request an accommodation. Please direct any other general recruiting inquiries to our Contact Us page.
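For the "PaaS offerings such as Azure Functions" item above, here is a minimal sketch of an HTTP-triggered Azure Function. It uses the Python v2 programming model for consistency with the other sketches on this page, although the role itself centres on C#/.NET; the route and the stubbed payload are hypothetical. It assumes the azure-functions package is installed.

```python
import json

import azure.functions as func

# Function-level keys are required to call this endpoint.
app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

@app.route(route="leases/{lease_id}")
def get_lease(req: func.HttpRequest) -> func.HttpResponse:
    """Return a stubbed lease record for the requested id."""
    lease_id = req.route_params.get("lease_id")
    body = json.dumps({"leaseId": lease_id, "status": "active"})  # placeholder data
    return func.HttpResponse(body, mimetype="application/json", status_code=200)
```

Deployed to a Function App, the runtime discovers the decorated function and wires up the HTTP trigger without any separate function.json configuration.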

Posted 3 weeks ago

Apply

Featured Companies