
198 Synapse Jobs


7.0 - 11.0 years

0 Lacs

karnataka

On-site

As an experienced AI Analytics Engineer with expertise in Azure DevOps, you will design, implement, and optimize data pipelines, machine learning models, and analytics solutions. You will bridge the gap between data science, engineering, and DevOps practices to deliver scalable, production-ready AI/ML solutions.

Key Responsibilities:
- Design, develop, and deploy AI/ML models and analytics workflows.
- Build and manage end-to-end CI/CD pipelines in Azure DevOps for data and ML projects.
- Automate data ingestion, preprocessing, model training, testing, and deployment.
- Monitor model performance and implement retraining pipelines (see the sketch after this listing).
- Work closely with data scientists, data engineers, and business stakeholders to translate requirements into scalable solutions.
- Ensure solutions are secure, cost-optimized, and highly available on Azure.
- Perform root cause analysis and drive continuous improvement for production issues.

Qualifications Required:
- Hands-on experience with Azure DevOps (pipelines, repos, artifacts, boards).
- Strong programming skills in Python or R for AI/ML development.
- Experience with Azure Machine Learning, Databricks, Synapse, and Data Factory.
- Good understanding of MLOps principles and tools.
- Strong knowledge of data visualization and analytics (Power BI a plus).
- Familiarity with containerization (Docker, Kubernetes) for deploying ML workloads.
- Experience with version control (Git) and agile development practices.

The job description also emphasizes the following soft skills:
- Excellent communication and collaboration skills.
- Ability to translate technical insights into business value.
- Strong analytical thinking and attention to detail.
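The posting does not include code; as a rough illustration of the "monitor model performance and implement retraining pipelines" responsibility, here is a minimal Python sketch. `fetch_recent_predictions`, `trigger_retraining_pipeline`, and the accuracy threshold are hypothetical stand-ins, not part of any real API.

```python
# Minimal sketch: watch live model accuracy and flag retraining when it degrades.
# fetch_recent_predictions() and trigger_retraining_pipeline() are hypothetical
# stand-ins for a telemetry store and an Azure DevOps pipeline trigger.
from dataclasses import dataclass


@dataclass
class Prediction:
    predicted: int
    actual: int


def fetch_recent_predictions() -> list[Prediction]:
    # In a real system this would query a telemetry/feature store.
    return [Prediction(1, 1), Prediction(0, 1), Prediction(1, 1), Prediction(1, 1)]


def trigger_retraining_pipeline() -> None:
    # In a real system this could queue a pipeline run via the Azure DevOps REST API.
    print("Accuracy below threshold - retraining pipeline queued.")


ACCURACY_THRESHOLD = 0.90  # assumed SLA; tune per model


def check_model_health() -> None:
    preds = fetch_recent_predictions()
    accuracy = sum(p.predicted == p.actual for p in preds) / len(preds)
    print(f"Rolling accuracy: {accuracy:.2%}")
    if accuracy < ACCURACY_THRESHOLD:
        trigger_retraining_pipeline()


if __name__ == "__main__":
    check_model_health()
```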

Posted 1 day ago

Apply

7.0 - 11.0 years

0 Lacs

bhopal, madhya pradesh

On-site

You will design and build ETL pipelines in Azure Databricks (PySpark, Delta Lake) to load, clean, and deliver data across Bronze, Silver, and Gold layers. The role also involves implementing a Data Lakehouse architecture on Azure Data Lake Gen2 with partitioning, schema management, and performance optimization, and developing data models (dimensional/star schema) for reporting in Synapse and Power BI. You will integrate Databricks with Azure services such as ADF, Key Vault, Event Hub, Synapse Analytics, Purview, and Logic Apps, and build and manage CI/CD pipelines in Azure DevOps (YAML, Git repos, pipelines). You will optimize performance through cluster tuning, caching, Z-ordering, Delta optimization, and job parallelization (see the Bronze-to-Silver sketch after this listing). Ensuring data security and compliance (row-level security, PII masking, GDPR/HIPAA, audit logging) and collaborating with data architects and analysts to translate business needs into technical solutions are vital aspects of this role.

Required Skills:
- Strong experience in Azure Databricks (Python, PySpark, SQL).
- Proficiency with Delta Lake (ACID transactions, schema evolution, incremental loads).
- Hands-on with the Azure ecosystem: Data Factory, ADLS Gen2, Key Vault, Event Hub, Synapse.
- Knowledge of data governance and lineage tools (Purview; Unity Catalog is a plus).
- Strong understanding of data warehouse design and star schema.
- Azure DevOps (YAML, Git repos, pipelines) experience.
- Good debugging skills for performance tuning and schema-drift issues.

**Good to Have:**
- Experience with healthcare or financial data.
- Familiarity with FHIR, OMOP, OpenEHR (for healthcare projects).
- Exposure to AI/ML integration using the Databricks ML runtime.
- Experience with Unity Catalog for governance across workspaces.

If you are ready to take the lead in building scalable data solutions with Azure Databricks, this full-time position in Bhopal, MP, India awaits you!
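As a rough sketch of the Bronze-to-Silver hop and Delta maintenance this posting describes: it assumes a Databricks (Delta-enabled) Spark session, and the table and column names are illustrative only.

```python
# Sketch of a Bronze -> Silver hop on Databricks (assumes a Delta-enabled Spark
# session; table/column names are illustrative).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

bronze = spark.read.table("bronze.events_raw")

silver = (
    bronze
    .dropDuplicates(["event_id"])                     # basic cleansing
    .withColumn("event_date", F.to_date("event_ts"))  # derive partition column
    .filter(F.col("customer_id").isNotNull())
)

(silver.write.format("delta")
       .mode("overwrite")
       .partitionBy("event_date")
       .saveAsTable("silver.events"))

# Delta maintenance (Databricks SQL): compact files and Z-order by a common
# filter column to speed up selective reads.
spark.sql("OPTIMIZE silver.events ZORDER BY (customer_id)")
```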

Posted 2 days ago

Apply

4.0 - 10.0 years

0 Lacs

karnataka

On-site

As a Data Engineering Senior Associate working with Microsoft Fabric, Azure (Databricks & ADF), and PySpark, your role will involve:
- Requirement gathering and analysis
- Designing and implementing data pipelines using Microsoft Fabric and Databricks
- Extracting, transforming, and loading (ETL) data from various sources into Azure Data Lake Storage
- Implementing data security and governance measures
- Monitoring and optimizing data pipelines for performance and efficiency
- Troubleshooting and resolving data engineering issues
- Providing optimized solutions for any data engineering problem
- Working with a variety of sources such as relational databases, APIs, file systems, real-time streams, CDC, etc.
- Demonstrating strong knowledge of Databricks and Delta tables (see the incremental-load sketch after this listing)

Qualifications Required:
- 4-10 years of experience in Data Engineering or related roles
- Hands-on experience in Microsoft Fabric and Azure Databricks
- Proficiency in PySpark for data processing and scripting
- Strong command of Python and SQL for writing complex queries, performance tuning, etc.
- Experience working with Azure Data Lake Storage and data warehouse concepts (e.g., dimensional modeling, star/snowflake schemas)
- Hands-on experience in performance tuning and optimization on Databricks and MS Fabric
- Understanding of CI/CD practices in a data engineering context
- Excellent problem-solving and communication skills
- Exposure to BI tools like Power BI, Tableau, or Looker

Additional Details:
- Experience in Azure DevOps is a plus
- Familiarity with data security and compliance in the cloud
- Experience with databases such as Synapse, SQL DB, Snowflake, etc.
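A common pattern behind the CDC and incremental-load work mentioned above is a Delta MERGE upsert. This is a hedged sketch assuming a Databricks or Fabric lakehouse with the delta-spark library; the target and staging table names are illustrative.

```python
# Hedged sketch of an incremental (upsert) load with a Delta MERGE;
# table names and the join key are illustrative.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.table("staging.customers_cdc")   # e.g., latest CDC batch

target = DeltaTable.forName(spark, "gold.customers")

(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()      # apply changed rows
       .whenNotMatchedInsertAll()   # insert new rows
       .execute())
```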

Posted 3 days ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

noida, mumbai (all areas)

Hybrid

Role Overview: We are looking for a Data Engineer & Power BI Analyst with strong skills in Azure data integration and Power BI dashboarding. The ideal candidate will handle the end-to-end data lifecycle: extracting data from SAP, integrating it into Azure, transforming and aggregating it, and finally enabling business insights through interactive Power BI reports.

Key Responsibilities:
- Extract and integrate SAP data (IDocs, OData, RFC, etc.) into the Azure ecosystem (see the OData sketch after this listing).
- Build and optimize data pipelines using Azure Data Factory, Synapse, Databricks, Fabric, and Logic Apps.
- Perform data aggregation, transformation, and modeling using Databricks, Synapse Analytics, and Delta Lake.
- Design and develop Power BI dashboards and reports, including DAX calculations, RLS, and custom visuals.
- Collaborate with stakeholders to understand business needs and deliver actionable insights.
- Ensure data quality, performance optimization, and governance across the reporting layer.

Shift Timing: 12 PM to 9 PM

Why Inadev: We fuel growth by encouraging cross-department training and development and by sponsoring certifications and credentials for continued learning. You will get opportunities to be bold, try new things, and really make your mark. We understand that you have a life outside of work, and that's why we offer flexible time off and the ability to manage your work schedule as needed.
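As a rough illustration of the SAP OData extraction step: this sketch pulls an entity set from a SAP Gateway OData v2 service into a local staging file before landing it in Azure. The service URL, entity set, and credentials are placeholders, not a real endpoint.

```python
# Illustrative pull of SAP data over OData into a staging file; the URL,
# entity set, and credentials are placeholders.
import json
import requests

BASE_URL = "https://sap-gateway.example.com/sap/opu/odata/sap/ZSALES_SRV"  # placeholder
ENTITY = "SalesOrderSet"                                                   # placeholder

resp = requests.get(
    f"{BASE_URL}/{ENTITY}",
    params={"$format": "json", "$top": 1000},
    auth=("SERVICE_USER", "SECRET"),   # placeholder credentials
    timeout=30,
)
resp.raise_for_status()

rows = resp.json()["d"]["results"]     # classic SAP Gateway OData v2 envelope
with open("sales_orders_staging.json", "w") as fh:
    json.dump(rows, fh)
print(f"Staged {len(rows)} sales orders")
```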

Posted 3 days ago

Apply

6.0 - 11.0 years

22 - 37 Lacs

bengaluru

Work from Office

Why Join Decision Point?
- Work with a top-tier analytics consulting firm known for data-driven innovation.
- Get exposure to Fortune 500 clients and real-world impact projects.
- Be part of a collaborative culture that encourages leadership and learning.
- Solve complex business problems using cutting-edge tech across domains.

Key Responsibilities:
- Lead and deliver end-to-end data engineering projects in a fast-paced environment.
- Design and implement scalable data pipelines using PySpark, SQL, and orchestration tools (see the Airflow sketch after this listing).
- Develop robust data models to support analytics and reporting.
- Optimize data storage and processing on AWS (Glue, S3, Redshift, Lambda) and Azure (ADF, Synapse, Blob Storage).
- Collaborate with cross-functional teams (Data Science, Product, BI) to deliver business value.
- Manage team performance and mentor junior engineers.
- Ensure data quality, governance, and performance tuning.

Key Skills:
- Programming: Python (advanced), PySpark (main), SQL (main)
- Cloud Platforms: AWS (Glue, S3, Redshift, Lambda), Azure (ADF, Synapse, Blob Storage, Data Lake)
- Data Orchestration: Airflow, Azure Data Factory
- Data Modeling: Dimensional and relational modeling
- Leadership: Team management, stakeholder collaboration, project ownership
- Other: CI/CD for data pipelines, version control (Git), agile methodologies
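As a minimal sketch of the orchestration work this role describes, here is a three-task Airflow 2.x DAG. The task bodies and schedule are illustrative; in practice the callables would submit Glue/Databricks/ADF jobs.

```python
# Minimal Airflow 2.x DAG sketch for orchestrating an ETL sequence;
# task bodies and schedule are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pulling raw files from S3/ADLS")


def transform():
    print("running PySpark transformations")


def load():
    print("loading curated tables into Redshift/Synapse")


with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3   # linear dependency chain
```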

Posted 4 days ago

Apply

14.0 - 18.0 years

0 Lacs

karnataka

On-site

As a member of the Analytics and Insights Managed Services team, you will play a crucial role in leveraging industry expertise, technology, data management, and managed services experience to drive sustained outcomes for our clients and enhance business performance. Your responsibilities will involve empowering companies to revamp their analytics and insights approach, optimize processes for efficiency, and enhance client satisfaction. A deep understanding of IT services, operational excellence, and client-centric solutions will be key to excelling in this role.

**Key Responsibilities:**
- Serve as the primary point of contact for client interactions, nurturing strong relationships.
- Manage client escalations efficiently and ensure timely issue resolution.
- Act as the face of the team during strategic client discussions, governance meetings, and regular interactions.
- Lead the end-to-end delivery of managed data analytics services, ensuring adherence to business requirements, timelines, and quality standards.
- Implement standardized processes and best practices to drive operational efficiency.
- Collaborate with cross-functional teams to deliver comprehensive solutions.
- Monitor, manage, and report on service-level agreements (SLAs) and key performance indicators (KPIs).
- Drive innovation and automation in data integration, processing, analysis, and reporting workflows.
- Stay abreast of industry trends, emerging technologies, and regulatory requirements impacting managed services.
- Ensure data security, privacy, and compliance with relevant standards and regulations.
- Lead and mentor a team of service managers and technical professionals to foster high performance and continuous development.
- Collaborate with sales teams to identify growth opportunities and expand services.
- Contribute to solution responses and operating models for RFPs.

**Qualifications Required:**
- Bachelor's degree in Information Technology, Data Science, Computer Science, Statistics, or a related field (Master's degree preferred).
- Minimum of 14 years of experience, with at least 3 years in a managerial or leadership role.
- Proven experience in managing data analytics services for external clients, preferably within a managed services or consulting environment.
- Proficiency in data analytics tools such as Power BI, Tableau, and QlikView; data integration tools such as ETL, Informatica, Talend, and Snowflake; and programming languages such as Python, R, SAS, and SQL.
- Strong understanding of Data & Analytics cloud platforms (e.g., AWS, Azure, GCP), big data technologies such as Hadoop and Spark, traditional data warehousing tools such as Teradata and Netezza, and machine learning, AI, and automation in data analytics.
- Certification in data-related disciplines preferred.
- Demonstrated leadership abilities, project management skills, and excellent communication skills.

This role offers you an exciting opportunity to lead and excel in a dynamic environment where you can contribute to the success of our clients through innovative analytics and insights solutions.

Posted 4 days ago

Apply

1.0 - 6.0 years

0 Lacs

andhra pradesh

On-site

As a Data Engineer specializing in Microsoft Fabric, you will design, develop, and optimize data pipelines, reporting solutions, and analytics frameworks using Microsoft Fabric. The role involves collaborating with stakeholders and technical teams to deliver scalable, secure, and high-performing analytics solutions.

You will work closely with data architects, analysts, and business stakeholders to gather analytics requirements and build data solutions using Microsoft Fabric components such as Data Factory, OneLake, Synapse, and Power BI. Your responsibilities include developing and optimizing pipelines for ingestion, transformation, and integration, as well as creating and maintaining semantic models and datasets for reporting (see the dataset-refresh sketch after this listing). Ensuring compliance with best practices for performance, governance, and security of Fabric solutions will also be a key aspect of the role. Additionally, you will support migration projects, conduct proof-of-concepts, and create and maintain documentation for ETL processes, data flows, and data mappings. You will also play a crucial role in guiding and training client teams on Fabric adoption.

To excel in this role, you should have 4-6 years of experience in data analytics, BI, or cloud platforms, with at least 1 year of hands-on experience in Microsoft Fabric - specifically Data Factory, OneLake, Synapse, and Power BI semantic models and reporting. Strong SQL and data modeling skills, experience with ETL/ELT and performance tuning, familiarity with Azure and cloud data platforms, and strong communication and client-facing skills are essential requirements. Knowledge of the Azure Data Stack (ADF, Synapse, Databricks), governance, security, compliance, and consulting/IT services experience will be beneficial.

This is a full-time position located in Visakhapatnam, with health insurance and Provident Fund benefits. The work location is in person.
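One small automation that fits the semantic-model maintenance described above is queuing a dataset refresh through the Power BI REST API. This is a hedged sketch: the workspace/dataset GUIDs and the bearer token are placeholders, and a real application would acquire the token via MSAL or a service principal.

```python
# Hedged sketch: queue a Power BI semantic-model (dataset) refresh via the
# Power BI REST API; GROUP_ID/DATASET_ID and the token are placeholders.
import requests

ACCESS_TOKEN = "<aad-bearer-token>"   # placeholder; obtain via MSAL in practice
GROUP_ID = "<workspace-guid>"         # placeholder
DATASET_ID = "<dataset-guid>"         # placeholder

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "MailOnFailure"},
    timeout=30,
)
resp.raise_for_status()               # 202 Accepted means the refresh was queued
print("Refresh queued:", resp.status_code)
```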

Posted 5 days ago

Apply

12.0 - 14.0 years

0 Lacs

hyderabad, telangana, india

On-site

JOB DESCRIPTION

Roles & responsibilities - here are some of the key responsibilities of the Sr AI Research Scientist:
- Research and Development: Conduct original research on generative AI models, focusing on model architecture, training methodologies, fine-tuning techniques, and evaluation strategies. Maintain a strong publication record in top-tier conferences and journals, showcasing contributions to the fields of Natural Language Processing (NLP), Deep Learning (DL), and Machine Learning (ML). Experience with POCs on emerging and latest innovations in AI.
- Multimodal Development: Design and experiment with multimodal generative models that integrate various data types, including text, images, and other modalities, to enhance AI capabilities.
- Agentic AI Systems: Develop and design autonomous AI systems that exhibit agentic behavior, capable of making independent decisions and adapting to dynamic environments.
- Design and Implementation: Lead the design, development, and implementation of generative AI models and systems, ensuring a deep understanding of the domain. Select suitable models, train them on large datasets, fine-tune hyperparameters, and optimize overall performance.
- Algorithm Optimization: Optimize generative AI algorithms to enhance their efficiency, scalability, and computational performance through techniques such as parallelization, distributed computing, and hardware acceleration, maximizing the capabilities of modern computing architectures.
- Data Preprocessing and Feature Engineering: Manage large datasets by performing data preprocessing and feature engineering to extract critical information for generative AI models, including data cleaning, normalization, dimensionality reduction, and feature selection.
- Model Evaluation and Validation: Evaluate the performance of generative AI models using relevant metrics and validation techniques. Conduct experiments, analyze results, and iteratively refine models to meet desired performance benchmarks.
- Technical Leadership: Provide technical leadership and mentorship to junior team members, guiding their development in generative AI through work reviews, skill-building, and knowledge sharing. Ability to drive multiple teams and cross-collaborate to ensure quality delivery.
- Documentation and Reporting: Document research findings, model architectures, methodologies, and experimental results thoroughly. Prepare technical reports, presentations, and whitepapers to effectively communicate insights and findings to stakeholders.
- Continuous Learning and Innovation: Stay abreast of the latest advancements in generative AI by reading research papers, attending conferences, and engaging with relevant communities. Foster a culture of learning and innovation within the team to drive continuous improvement.

Mandatory technical & functional skills:
- Machine learning frameworks: PyTorch or TensorFlow.
- Deep learning algorithms: CNN, RNN, LSTM, Transformers, LLMs (BERT, GPT, etc.), and NLP algorithms.
- Design experience in fine-tuning open-source LLMs from Hugging Face, Meta LLaMA 3.1, BLOOM, Mistral AI, etc. (see the LoRA sketch at the end of this listing).
- Exposure to GCP Vertex AI, Azure AI Foundry, or AWS SageMaker.
- Scientific understanding of PEFT: LoRA, QLoRA, etc.
- In-depth conceptual understanding of emerging and latest innovations in AI; stay current with AI trends (MCP, A2A protocol, ACP, etc.).

Preferred Technical & Functional Skills:
- LangGraph/CrewAI/Autogen.
- Large-scale deployment of GenAI/DL/ML projects, with a good understanding of MLOps/LLMOps.
- Ensuring scalability and efficiency, handling data tasks; cloud computing experience on Azure/AWS/GCP, BigQuery/Synapse.

Key behavioral attributes/requirements:
- Ability to mentor Managers and Tech Leads.
- Ability to own project deliverables, not just individual tasks.
- Understand business objectives and functions to support data needs.

QUALIFICATIONS

This role is for you if you have the below:

Educational Qualifications:
- Master's (MS by Research)/PhD or equivalent degree in Computer Science.
- Preference given to research scholars from Tier 1 colleges: IITs, NITs, IISc, IIITs, ISIs, etc.

Work Experience:
- 12+ years of experience with a strong record of publications (at least 5) in top-tier conferences and journals.
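As a minimal sketch of the LoRA-based parameter-efficient fine-tuning named above, using the Hugging Face transformers and peft libraries: the checkpoint name, rank, and target modules are illustrative choices, not a prescription from the posting.

```python
# Sketch of parameter-efficient fine-tuning (LoRA) with Hugging Face PEFT;
# model name and hyperparameters are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "meta-llama/Llama-3.1-8B"      # assumes access to this gated checkpoint
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

lora_cfg = LoraConfig(
    r=16,                                 # low-rank adapter dimension
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections, LLaMA-style
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()        # typically <1% of weights are trainable
# From here, train with transformers' Trainer or a custom loop as usual.
```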

Posted 5 days ago

Apply

15.0 - 17.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Who We Are

At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward - always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role

Are you ready to embark on an exhilarating journey as a Data Consultant? Join Kyndryl and become a driving force behind the transformative power of data! We're seeking an exceptionally talented individual to accelerate the competitive performance of our customers worldwide, establishing us as their unrivaled business and technology consulting partner.

As Enterprise Consultant for Data and AI solutions, you are expected to create a vision of the future by working with leadership to identify opportunities and translate them into functional and non-functional requirements. You should be an expert in Data Science/Model Lifecycle with deep expertise in managing data solutions.

As an Architect in our Data and AI team, you will provide best-fit architectural solutions for one or more projects, leveraging your architectural skills to assist in defining scope and sizing of work and to anchor Proof of Concept developments. You will provide solution architecture for the business problem, platform integration with third-party services, and the design and development of complex features for clients' business needs. You will collaborate with some of the best talent in the industry to create and implement innovative solutions, and participate in pre-sales and various pursuits focused on our clients' business needs. You will also contribute in a variety of roles in thought leadership, mentorship, systems analysis, architecture, data management models, configuration, testing, debugging, and documentation. You will challenge your leading-edge solutions and your consultative and business skills through the diversity of work in multiple industry domains.

Responsibilities:
- Handle RFP / RFI / Government Tender technical solutions, detailed scope preparation, effort estimation, and response drafting; excellent presentation skills are essential.
- Understand client needs and translate them into business solutions that can be implemented.
- Own architecture, design, and development of scalable data engineering / AI solutions and standards for various business problems using cloud-native or third-party services on hyperscalers.
- Take ownership of technical solutions from a design and architecture perspective, ensure the right direction, and propose resolutions to potential data science/model-related problems.
- Deliver and present proofs of concept of key technology components to project stakeholders.
- Work within an Agile delivery / DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints.
- Design and develop model utilization benchmarks, metrics, and monitoring to measure and improve models.
- Detect model drift and alert on it using Prometheus, the Grafana stack, or cloud-native monitoring (see the drift-check sketch after this listing).
- Research, design, implement, and validate cutting-edge deployment methods across hybrid cloud scenarios.
- Develop and maintain documentation of model flows, integrations, pipelines, etc.
- Evaluate and create PoVs around the performance aspects of DSML platforms and tools in the market against customer requirements.
- Assist in driving improvements to the data engineering stack, with a focus on the digital experience for the user, as well as model performance and security, to meet the needs of the business and customers, now and in the future.

Required Experience:
Data Engineer with 15+ years of experience and the following skills:
- 5+ years of experience working with data modernisation solutions; working experience with on-prem or in-cloud AI and DWH solutions based on native and third-party offerings on AWS, Azure, GCP, Databricks, etc.
- Experience handling RFP / RFI / Tender responses and proposal preparation.
- Experience designing and architecting data lake/warehouse projects using PaaS and SaaS - such as Snowflake, Databricks, Redshift, Synapse, BigQuery - or data warehouse/data lake implementations on-premise.
- Good knowledge of designing ETL/ELT data pipelines and DS modules, implementing complex stored procedures and standard DWH and ETL concepts.
- Experience in data migration from on-premise RDBMS to cloud data warehouses.
- Good understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake, dimensional modelling).
- Hands-on experience in Python and PySpark programming for data integration projects.
- Support in providing resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface.

Who You Are

Preferred Skills:
- Understanding of cloud network, security, data security, and data access controls and design aspects.
- AI and data solutions on hyperscalers such as Databricks, MS Fabric, Copilot, AWS Redshift, GCP BigQuery, GCP Gemini, etc.
- Background in Agentic AI and GenAI technologies is an added advantage.
- Hands-on experience planning and executing POC / MVP / client projects involving data modernization and AI use-case development.

Required Skills:
- Bachelor's degree in Computer Science, Information Security, or a related field.
- Skilled in planning, organization, analytics, and problem-solving.
- Excellent communication and interpersonal skills to work collaboratively with clients and team members.
- Comfortable working with statistics.

Being You

Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect

With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred! If you know someone that works at Kyndryl, when asked "How Did You Hear About Us" during the application process, select "Employee Referral" and enter your contact's Kyndryl email address.
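As a rough illustration of the drift-detection responsibility in this listing: a two-sample Kolmogorov-Smirnov test comparing a training-time feature distribution with live data. The synthetic data and alert threshold are assumptions; in the stack described above, the resulting statistic would be exported to Prometheus and alerted on in Grafana.

```python
# Illustrative drift check: compare reference vs. live feature distributions
# with a two-sample KS test (synthetic data; threshold is an assumption).
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)
train_feature = rng.normal(loc=0.0, scale=1.0, size=5_000)  # reference window
live_feature = rng.normal(loc=0.4, scale=1.1, size=5_000)   # shifted production data

stat, p_value = ks_2samp(train_feature, live_feature)
print(f"KS statistic={stat:.3f}, p={p_value:.2e}")

ALERT_THRESHOLD = 0.1   # assumed; tune per feature
if stat > ALERT_THRESHOLD:
    print("Drift detected - raise alert / trigger model review")
```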

Posted 5 days ago

Apply

8.0 - 12.0 years

12 - 14 Lacs

remote, india

On-site

Role Overview: We are looking for an experienced Data Architect to design, develop, and optimize enterprise data solutions on Microsoft Azure.

Key Responsibilities:
- Lead architecture, design, and implementation of Data Warehouse and Data Lake solutions.
- Work across the Microsoft Azure stack: Azure Data Factory, Synapse, SQL Database, Key Vault, MS Fabric, DevOps, VNets.
- Design data models aligned with the Medallion Architecture (Bronze, Silver, Gold layers).
- Create and manage star/snowflake schemas, partitioning strategies, and scalable data flows (see the Gold-layer sketch after this listing).
- Develop optimized SQL (packages, procedures, triggers, transformations, performance tuning).
- Collaborate with BI teams using Power BI / Tableau for reporting solutions.
- Oversee environment setup, deployment strategies, and capacity planning.
- Ensure optimal use of cloud resources to balance performance and cost.
- Manage stakeholders and communicate effectively across teams.

Required Skills & Experience:
- 8+ years in Data Warehouse / Data Lake programming.
- At least 2 end-to-end project implementations (1 on Azure mandatory).
- Strong technical expertise in the Azure ecosystem and SQL.
- Preferred: Azure Solutions Architect Expert or Azure Data Engineer Associate certification.
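As a small sketch of the star-schema work in a Medallion layout: deriving one dimension and one fact table in the Gold layer from a cleansed Silver table. It assumes a Delta-enabled Spark session; table and column names are illustrative.

```python
# Sketch: derive a star schema (one dimension + one fact) in the Gold layer
# from a cleansed Silver table; names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
silver = spark.read.table("silver.sales")

# Dimension: one row per customer.
dim_customer = (
    silver.select("customer_id", "customer_name", "segment")
          .dropDuplicates(["customer_id"])
)
dim_customer.write.format("delta").mode("overwrite").saveAsTable("gold.dim_customer")

# Fact: daily grain, keyed to the dimension, partitioned for pruning.
fact_sales = (
    silver.groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
          .agg(F.sum("amount").alias("total_amount"),
               F.count("order_id").alias("order_count"))
)
(fact_sales.write.format("delta")
           .mode("overwrite")
           .partitionBy("order_date")
           .saveAsTable("gold.fact_sales"))
```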

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

Withum is a place where talent thrives - where who you are matters. It's a place of endless opportunities for growth. A place where entrepreneurial energy plus inclusive teamwork equals exponential results. Withum empowers clients and our professional staff with innovative tools and solutions to address their accounting, tax, and overall business management and operational needs. As a US nationally ranked Top 25 firm, we recruit only the best and brightest people with a genuine passion for the business.

We are seeking an experienced Lead Consultant, Data Engineering with a strong background in consulting services and hands-on skills in building modern, scalable data platforms and pipelines. This is a client-facing, delivery-focused role. Please note that this position is centered around external client delivery and is not part of an internal IT or product engineering team. This is a foundational hire: you will be responsible for delivering hands-on client work, supporting our proprietary data products, and building the team underneath you.

Withum's brand is a reflection of our people, our culture, and our strength. Withum has become synonymous with teamwork and client service excellence. The cornerstone of our success can truly be credited to the dedicated professionals who work here every day - people who are easy to work with, have a sense of purpose, care for their co-workers, and whose mission is to help our clients grow and thrive. But our commitment goes beyond our clients as we continue to live the Withum Way, promoting personal and professional growth for all team members, clients, and surrounding communities.

How You'll Spend Your Time:
- Architect, implement, and optimize data transformation pipelines, data lakes, and cloud-native warehouses for mid- and upper mid-market clients.
- Deliver hands-on engineering work across client environments - building fast, scalable, and well-documented pipelines that support both analytics and AI use cases.
- Lead technical design and execution using tools such as Tableau, Microsoft Fabric, Synapse, Power BI, Snowflake, and Databricks, with good hands-on familiarity with SQL databases.
- Optimize for sub-50GB datasets and local or lightweight cloud execution where appropriate - minimizing unnecessary reliance on cluster-based compute (see the local-first sketch after this listing).
- Collaborate with subject-matter experts to understand business use cases prior to designing data models.
- Operate as a client-facing consultant: conduct discovery, define solutions, and lead agile project delivery.
- Switch context rapidly across 2-3 active clients or service streams in a single day.
- Provide support for our proprietary data products as needed.
- Provide advisory and strategic input to clients on data modernization, AI enablement, and FP&A transformation efforts.
- Deliver workshops, demos, and consultative training to business and technical stakeholders.
- Implement coding modifications to pre-existing code/procedures in a manner that results in a validated case study.
- Take full ownership of hiring, onboarding, and mentoring future data engineers and analysts within the India practice.
- During bench time, contribute to building internal data products and tooling - powering our own consulting operations (e.g., utilization dashboards, delivery intelligence, practice forecasting).
- Help define and scale delivery methodology, best practices, and reusable internal accelerators for future engagements.
- Communicate openly about conflicting deadlines to ensure prioritization aligns with client expectations, with ample time to reset client expectations as needed.
- Ensure code is properly commented to explain the logic or purpose behind more complex sections.

Requirements:
- 6+ years of hands-on experience in data engineering roles, with at least 3+ years in a consulting or client delivery environment.
- Proven ability to context-switch, self-prioritize, and communicate clearly under pressure.
- Demonstrated experience owning full lifecycle delivery, from architecture through implementation and client handoff.
- Strong experience designing and implementing ETL/ELT pipelines, preferably in SQL-first tools.
- Experience with Microsoft SQL Server / SSIS for maintenance and development of ETL processes.
- Real-world experience with SQL databases, Databricks, Snowflake, and/or Synapse - and a healthy skepticism of when to use them.
- Deep understanding of data warehousing, data lakes, data modeling, and incremental processing.
- Proficiency in Python for ETL scripting, automation, and integration work.
- Experience with tools such as dbt Core in production environments.
- Strong practices around data testing, version control, documentation, and team-based dev workflows.
- Working knowledge of Power BI, Tableau, Looker, or similar BI tools.
- Experience building platforms for AI/ML workflows or supporting agentic architectures.
- Familiarity with Microsoft Fabric's Lakehouse implementation, Delta Lake, Iceberg, and Parquet.
- Background in DataOps, CI/CD for data pipelines, and metadata management.
- Microsoft certifications (e.g., Azure Data Engineer, Fabric Analytics Engineer) are a plus.

Website: www.withum.com
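As an illustration of the "sub-50GB, local-first" principle in this posting: DuckDB can run a full SQL-first transform on a laptop without cluster compute. File paths and column names are illustrative.

```python
# Sketch of a local-first, SQL-first transform with DuckDB; file and column
# names are illustrative.
import duckdb

con = duckdb.connect("analytics.duckdb")

# Ingest raw CSVs directly; DuckDB infers the schema and parallelizes the scan.
con.sql("""
    CREATE OR REPLACE TABLE orders AS
    SELECT * FROM read_csv_auto('raw/orders_*.csv')
""")

# A typical cleanse + monthly aggregate, entirely in SQL.
result = con.sql("""
    SELECT customer_id,
           date_trunc('month', order_ts) AS order_month,
           sum(amount)                   AS revenue
    FROM orders
    WHERE customer_id IS NOT NULL
    GROUP BY 1, 2
    ORDER BY 1, 2
""").df()                                # hand off to pandas for BI or export
print(result.head())
```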

Posted 6 days ago

Apply

8.0 - 12.0 years

0 Lacs

maharashtra

On-site

The responsibilities for this role include developing a detailed project plan with tasks, timelines, milestones, and dependencies. You will be responsible for solution architecture design and implementation, understanding the sources and outlining the ADF structure, and designing and scheduling packages using ADF (see the pipeline-trigger sketch after this listing). Collaboration and communication within the team are crucial to ensure a smooth workflow, and application performance optimization is also a key aspect of the role. You will monitor and manage resource allocation to ensure tasks are appropriately staffed, and generate detailed technical specifications, business requirements, and unit test report documents. It is important to ensure that the project follows best practices, coding standards, and technical requirements, and to collaborate with technical leads to address technical issues and mitigate risks.

The ideal candidate should have 8-11 years of experience in Data Engineering. Primary skill: Data Engineering. Additional skills: AWS CloudFormation, Azure Storage, AWS Apps, AWS Infra, Synapse, Azure Data Lake, Azure Data Factory, and AWS CloudTrail.
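One common programmatic touchpoint with ADF scheduling is kicking off a pipeline run through the Azure management REST API. This is a hedged sketch: subscription, resource group, factory, and pipeline names are placeholders, and it requires the azure-identity package plus an identity with Data Factory rights.

```python
# Hedged sketch: trigger an ADF pipeline run via the management REST API;
# all resource names are placeholders.
import requests
from azure.identity import DefaultAzureCredential

SUB = "<subscription-id>"     # placeholder
RG = "<resource-group>"       # placeholder
FACTORY = "<factory-name>"    # placeholder
PIPELINE = "<pipeline-name>"  # placeholder

token = DefaultAzureCredential().get_token("https://management.azure.com/.default")

url = (
    f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
    f"/providers/Microsoft.DataFactory/factories/{FACTORY}"
    f"/pipelines/{PIPELINE}/createRun?api-version=2018-06-01"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {token.token}"}, json={})
resp.raise_for_status()
print("Run ID:", resp.json()["runId"])   # use this ID to poll run status
```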

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

kochi, kerala

On-site

The responsibilities for this role include conducting original research on generative AI models, focusing on model architectures, training methods, fine-tuning, and evaluation strategies. You will build Proof of Concepts (POCs) with emerging AI innovations and assess their feasibility for production. Additionally, you will design and experiment with multimodal generative models encompassing text, images, audio, and other modalities. Developing autonomous, agent-based AI systems capable of adaptive decision-making is a key aspect of the role. You will lead the design, training, fine-tuning, and deployment of generative AI systems on large datasets, and optimize AI algorithms for efficiency, scalability, and computational performance using parallelization, distributed systems, and hardware acceleration. Managing data preprocessing and feature engineering - including cleaning, normalization, dimensionality reduction, and feature selection - is essential. You will evaluate and validate models using industry-standard benchmarks, iterating to achieve target KPIs. Providing technical leadership and mentorship to junior researchers and engineers is crucial, as is documenting research findings, model architectures, and experimental outcomes in technical reports and publications. It is important to stay updated with the latest advancements in NLP, DL, and generative AI to foster a culture of innovation within the team.

The mandatory technical and functional skills for this role include strong expertise in PyTorch or TensorFlow; proficiency in deep learning architectures such as CNNs, RNNs, LSTMs, Transformers, and LLMs (BERT, GPT, etc.); experience in fine-tuning open-source LLMs (Hugging Face, LLaMA 3.1, BLOOM, Mistral AI, etc.); and hands-on knowledge of PEFT techniques such as LoRA and QLoRA (see the QLoRA loading sketch after this listing). Familiarity with emerging AI frameworks and protocols (MCP, A2A, ACP, etc.) is a plus. Deployment experience with cloud AI platforms like GCP Vertex AI, Azure AI Foundry, or AWS SageMaker is essential, as is a proven track record in building POCs for cutting-edge AI use cases.

As for the desired candidate profile, experience with LangGraph, CrewAI, or Autogen for agent-based AI is preferred, as is large-scale deployment of GenAI/ML projects with MLOps/LLMOps best practices. Experience handling scalable data pipelines (BigQuery, Synapse, etc.) is a plus, and a strong understanding of cloud computing architectures (Azure, AWS, GCP) is beneficial.

Key behavioral attributes for this role include a strong ownership mindset - the ability to lead end-to-end project deliverables, not just tasks - along with the ability to align AI solutions with business objectives and data requirements, and excellent communication and collaboration skills for cross-functional projects.
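As a sketch of the QLoRA technique named above: loading a base model in 4-bit with bitsandbytes before attaching LoRA adapters. The checkpoint name and hyperparameters are illustrative, and running it requires the bitsandbytes package and a CUDA GPU.

```python
# Sketch of QLoRA-style 4-bit loading before attaching LoRA adapters;
# model name and settings are illustrative (requires bitsandbytes + CUDA).
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

bnb_cfg = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",             # normal-float 4-bit, per the QLoRA paper
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",           # illustrative checkpoint
    quantization_config=bnb_cfg,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)   # freeze + cast for stability
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))
model.print_trainable_parameters()
```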

Posted 6 days ago

Apply

5.0 - 10.0 years

0 - 0 Lacs

indore, madhya pradesh

On-site

We are looking for an experienced Presales Solution Architect with a strong background in the Microsoft Data Platform & AI, as well as exposure to modern data platforms like Snowflake, SAAL.AI, Databricks, Synapse, and BigQuery. The ideal candidate should have a successful history of architecting large-scale data and AI solutions, leading presales activities, and executing significant data platform projects. As part of this role, you will be responsible for establishing a top-performing Data & AI practice within our Cloud Business Unit and driving data consulting projects for clients in various industries.

In this position, your key responsibilities will include engaging in presales activities such as gathering requirements, creating solution architectures, estimating costs, and leading presales engagements. You will design enterprise-level Microsoft Data Platform solutions, integrate hybrid architectures with platforms like Snowflake, SAAL.AI, Databricks, BigQuery, and AWS Redshift, and produce proposals, SoWs, and responses to RFPs/RFIs with competitive positioning. Additionally, you will conduct solution demonstrations, Proofs of Concept (PoCs), and technical workshops for clients.

As part of delivery enablement, you will ensure the successful delivery of signed deals by maintaining quality and meeting timelines. You will define reusable frameworks, reference architectures, and accelerators for Microsoft and multi-platform data solutions, and provide technical leadership during the design and initial delivery phases. Moreover, you will act as a trusted advisor to CxO and senior stakeholders, leading consulting engagements to evaluate current data maturity and establish modernization roadmaps. Collaborating with partner ecosystems - including Microsoft, Snowflake, Databricks, Google Cloud, and others - to deliver joint value propositions will also be part of your responsibilities.

To further develop the Data & AI practice within the Cloud BU, you will be involved in recruitment, training, and capability building. This includes developing industry-specific AI solutions using Microsoft AI & GenAI services and partner platforms, and staying informed about emerging trends in data engineering, analytics, AI, and governance.

The ideal candidate should possess strong expertise in the Microsoft Data Platform & AI, experience integrating or working on other data platforms, hands-on delivery experience in enterprise data platform and AI implementations, and a solid understanding of data architecture, ETL/ELT, modeling, governance, and security. Familiarity with AI/ML, GenAI frameworks, and cognitive services is desirable, along with a proven track record in presales solutioning, cost estimation, and proposal creation.

Preferred qualifications include certifications such as Microsoft Certified: Azure Solutions Architect Expert, Microsoft Certified: Azure Data Engineer Associate, SnowPro, Databricks Certified Data Engineer, Azure AI Engineer Associate, or other AI/ML specialty certifications. Additionally, experience in the Public Sector, BFSI, Healthcare, and Energy verticals, exposure to vertical-led AI solutions, and the ability to establish and lead a Data & AI CoE are advantageous. Candidates should hold a Bachelor's or Master's degree in Computer Science, Data Science, Information Technology, or a related field.

Join us to lead innovative data & AI transformation initiatives for strategic clients, collaborate with Microsoft and leading data platform partners, shape the future of our Data & AI practice, and be part of a high-growth, innovation-focused Cloud BU.

Posted 6 days ago

Apply

6.0 - 8.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward, always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role: ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.

As a Data Engineer at Kyndryl, you'll be at the forefront of the data revolution, crafting and shaping data platforms that power our organization's success. This role is not just about code and databases; it's about transforming raw data into actionable insights that drive strategic decisions and innovation.

In this role, you'll be engineering the backbone of our data infrastructure, ensuring the availability of pristine, refined data sets. With a well-defined methodology, critical thinking, and a rich blend of domain expertise, consulting finesse, and software engineering prowess, you'll be the mastermind of data transformation.

Your journey begins by understanding project objectives and requirements from a business perspective, converting this knowledge into a data puzzle. You'll be delving into the depths of information to uncover quality issues and initial insights, setting the stage for data excellence. But it doesn't stop there. You'll be the architect of data pipelines, using your expertise to cleanse, normalize, and transform raw data into the final dataset: a true data alchemist. Armed with a keen eye for detail, you'll scrutinize data solutions, ensuring they align with business and technical requirements. Your work isn't just a means to an end; it's the foundation upon which data-driven decisions are made, and your lifecycle management expertise will ensure our data remains fresh and impactful.

So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important, you have a growth mindset: keen to drive your own personal and professional development. You are customer-focused: someone who prioritizes customer success in their work. And finally, you're open and borderless: naturally inclusive in how you work with others.
Required Technical and Professional Expertise
- 6-8 years of experience working as a Data Engineer or in Azure cloud modernization
- Good experience in Power BI for data visualization and dashboard development
- Strong experience in developing data pipelines using tools such as AWS Glue, Azure Databricks, Synapse, or Google Dataproc
- Proficiency with both relational and NoSQL databases, including PostgreSQL, DB2, and MongoDB
- Excellent problem-solving, analytical, and critical thinking skills
- Ability to manage multiple projects simultaneously while maintaining a high level of attention to detail
- Expertise in data mining, data storage, and Extract-Transform-Load (ETL) processes

Preferred Technical and Professional Experience
- Experience in data modelling: creating a conceptual model of how data is connected and how it will be used in business processes
- Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization
- Cloud platform certification, e.g., AWS Certified Data Analytics - Specialty, Elastic Certified Engineer, Google Cloud Professional Data Engineer, or Microsoft Certified: Azure Data Engineer Associate
- Understanding of social coding and Integrated Development Environments, e.g., GitHub and Visual Studio
- Degree in a scientific discipline, such as Computer Science, Software Engineering, or Information Technology

Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you, and everyone next to you, the ability to bring your whole self to work, individually and collectively, and supports the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate and build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter, wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone who works at Kyndryl, when asked "How did you hear about us?" during the application process, select "Employee Referral" and enter your contact's Kyndryl email address.

Posted 6 days ago

Apply

10.0 - 12.0 years

25 - 30 Lacs

pune

Hybrid

Responsible for collecting, storing, processing, and analysing multiple sets of data from multiple data sources. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. May be responsible for integrating solutions with the architecture used across the company.
- Selecting and integrating any data tools and frameworks required to provide the requested capabilities
- Implementing ETL processes when importing data from existing data sources
- Expert with SQL Server, Oracle, Azure ADLS, Synapse, Databricks, Snowflake, or other similar platforms
- Experience with building Python APIs and document processing programs
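As a loose illustration of the "Python API for document processing" requirement, here is a minimal FastAPI sketch; the framework choice, endpoint name, and processing logic are assumptions for demonstration only, since the posting does not specify a stack.

```python
# Minimal sketch of a document-processing API. Requires the fastapi,
# uvicorn, and python-multipart packages; all names here are illustrative.
from fastapi import FastAPI, UploadFile

app = FastAPI()

@app.post("/documents/word-count")
async def word_count(file: UploadFile):
    """Accept an uploaded text document and return a simple word count."""
    text = (await file.read()).decode("utf-8", errors="ignore")
    return {"filename": file.filename, "words": len(text.split())}
```

Assuming the file is saved as main.py, it can be run locally with `uvicorn main:app --reload` and exercised by POSTing a text file to the endpoint.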

Posted 6 days ago

Apply

6.0 - 10.0 years

0 Lacs

chennai, tamil nadu

On-site

As an Azure Databricks Lead with over 6 years of experience, based in Chennai, you will spearhead data engineering initiatives and provide technical leadership on Azure-based data platform projects. Your expertise in designing and building scalable big data pipelines using Azure Databricks, Spark, and related Azure services will be crucial to the success of these initiatives. Your key responsibilities will include leading the design, development, and deployment of scalable data pipelines utilizing Azure Databricks and Spark. Collaboration with stakeholders to understand business requirements and translate them into technical specifications will be essential. You will architect comprehensive data solutions by leveraging Azure services like Data Lake, Data Factory, Synapse, Key Vault, and Event Hub, and optimize existing data pipelines to ensure adherence to security, performance, and cost best practices. Guiding and mentoring a team of data engineers and developers will also be a significant aspect of your role. Implementation of data governance, quality checks, and CI/CD practices within the Databricks environment will be necessary for maintaining data integrity and efficiency. Your ability to work closely with cross-functional teams, including data scientists, analysts, and BI developers, will be pivotal in ensuring seamless collaboration and project success.
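As a hedged sketch of the pipeline work this role leads, the following Databricks PySpark snippet cleanses raw JSON events into a curated Delta table; the paths, schema, and column names are invented for illustration and would differ in any real project.

```python
# Minimal Databricks PySpark sketch: cleanse raw events into a curated
# Delta table. Paths, schema, and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

raw = spark.read.json("/mnt/raw/events/")  # hypothetical landing path

curated = (
    raw.dropDuplicates(["event_id"])                        # de-duplicate on business key
       .withColumn("event_ts", F.to_timestamp("event_ts"))  # normalize timestamps
       .withColumn("event_date", F.to_date("event_ts"))     # derive partition column
       .filter(F.col("event_id").isNotNull())               # basic quality gate
)

# Write as Delta, partitioned for downstream query performance
curated.write.format("delta").mode("overwrite") \
       .partitionBy("event_date").save("/mnt/curated/events/")
```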

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

delhi

On-site

The Analytics and AI Presales Consultant role, based in Mumbai/Delhi, is a full-time position that requires a highly skilled and motivated individual to join our dynamic team. The ideal candidate has a strong background in analytics and artificial intelligence, along with a successful track record in presales engagements. This role demands excellent communication skills, technical expertise, and the ability to collaborate effectively with clients and internal teams.

Your key responsibilities will include collaborating with the sales team and customers to understand client requirements; conducting product demonstrations, presentations, and workshops to showcase our analytics and AI offerings; providing technical support during the sales process; developing compelling proposals; staying updated with the latest trends in analytics and AI technologies; building strong relationships with clients and stakeholders; assisting in the development of marketing materials; and participating in industry events to promote our solutions.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Data Science, Engineering, or a related field, with a Master's degree considered a plus, and have a minimum of 5 years of experience in analytics, AI, or a related field, focusing on presales activities. Your expertise should include a strong understanding of analytics and AI technologies such as machine learning, data mining, and predictive analytics; proficiency in programming languages like Python, R, or SQL; experience with platforms and tools like TensorFlow, PyTorch, and Power BI; excellent communication and presentation skills; problem-solving abilities; and the capacity to work both independently and collaboratively in a fast-paced environment.

Preferred qualifications for this role include experience in a consulting or client-facing position, knowledge of cloud platforms like AWS, Azure, or Google Cloud, and certifications in analytics or AI technologies such as Microsoft Certified: Azure AI Engineer Associate or Google Cloud Professional Data Engineer.

Joining us offers you the opportunity to work with cutting-edge analytics and AI technologies in a collaborative and inclusive work environment. You will also benefit from a competitive salary and benefits package, as well as professional development and growth opportunities.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

andhra pradesh

On-site

The ideal candidate for this position must possess excellent communication skills and be available to work until 12 pm ET to ensure overlapping working hours. Strong, proven work experience with the following Microsoft tech stack is essential:
- C#
- .NET Framework
- SQL
- REST API
- Azure resources (Data Factory, App Insights, Cosmos, Function Apps, Service Bus, Synapse)

The responsibilities of the role include designing, modifying, developing, writing, and implementing software programming applications, as well as supporting and/or installing software applications and operating systems. Participation in the testing process through test review, analysis, and certification of software is a key aspect of the role. The ideal candidate will be familiar with a variety of field concepts, practices, and procedures, relying on experience and judgment to plan and accomplish goals while performing a variety of complicated tasks. The role also requires a high degree of creativity and latitude in decision-making.

Posted 1 week ago

Apply

5.0 - 7.0 years

0 Lacs

chennai, tamil nadu, india

On-site

Enhance enterprise-wide analytics with Microsoft Fabric
Are you interested in transforming data into actionable insights? As a Senior Data Engineer, you will play a pivotal role in designing and managing scalable data solutions using Microsoft Fabric's integrated services. This is a unique opportunity to be part of a global IT team supporting digital transformation and data-driven decision-making across the organization. You'll work with advanced technologies like OneLake, Synapse, and Power BI, enabling real-time analytics and business intelligence. Join us in advancing data capabilities in a collaborative, international environment. Do you want to be a key contributor in our digital growth journey?

Design and deliver scalable data solutions
As a Senior Data Engineer, you will be responsible for building robust data pipelines, managing enterprise data storage, and enabling advanced analytics through Microsoft Fabric. You'll collaborate closely with cross-functional teams to translate business needs into technical solutions that support strategic decision-making. Your responsibilities will be to:
- Design and maintain data pipelines using Data Factory and Synapse
- Optimize OneLake storage for performance and consistency
- Build enterprise-grade data warehouses within Microsoft Fabric
- Develop interactive dashboards and reports using Power BI
- Ensure data governance, security, and compliance across platforms

You will report to the Chief Enterprise Architect and work closely with global IT teams and business stakeholders. The role is based in Chennai and may involve occasional international travel for training and collaboration.

Experienced Data Engineer with a Global Mindset
We are looking for a structured and results-oriented person who thrives in a collaborative environment. You take responsibility for your work, communicate effectively, and enjoy solving complex data challenges. You are someone who values teamwork, inclusivity, and continuous learning. You also have:
- 5+ years of experience in data engineering or BI development
- Experience with Microsoft Fabric (Data Factory, Synapse, Power BI)
- Proficiency in SQL, Python, and Spark/PySpark
- Solid understanding of data modeling, warehousing, and governance
- Familiarity with CI/CD, version control, and DevOps practices
- Preferred: experience with real-time data streaming and API integrations

Play a key role in the development of enterprise analytics
NKT is committed to developing a diverse organization and culture where people of diverse backgrounds can grow and are inspired to do their best. We are establishing gender diversity at NKT and encourage all interested candidates to apply, even if you don't tick all the boxes described. We believe that a diverse organization enables long-term performance, and that an inclusive and welcoming culture creates a better work environment. At NKT, you'll be part of a collaborative team with opportunities to grow your skills in an international setting. We offer a work environment where innovation and knowledge sharing are encouraged. You'll have access to training, mentorship, and the chance to work on impactful projects that support our global digital strategy. "As a leader, I believe in empowering individuals to innovate and grow through collaboration and continuous learning," says Hiring Manager Sapna Anand. Read more about our offer and listen to some voices of NKT Connectors here! We will review applications continuously, but we recommend you apply no later than 30th of September.
Be aware that personality and cognitive tests might be included in the recruitment process. Please note that due to GDPR regulations we cannot accept any applications via e-mail. Be a Connector of the green tomorrow! About NKT: NKT connects a greener world with high-quality power cable technology and takes centre stage as the world moves towards green energy. NKT designs, manufactures and installs low-, medium- and high-voltage power cable solutions enabling sustainable energy transmission. Since 1891, NKT has innovated power cable technology, building the infrastructure from the first light bulbs to the megawatts created by renewable energy today. NKT is headquartered in Denmark and employs 6,000 people. NKT is listed on Nasdaq Copenhagen and realised a revenue of EUR 3.3 billion in 2024. We connect a greener world. www.nkt.com
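As a hedged sketch of the Fabric pipeline work described in the role above, the snippet below performs an incremental Delta upsert of the kind a Fabric or Synapse Spark notebook might run; the table name, landing path, and key column are illustrative assumptions, and a Spark runtime with Delta Lake is presumed.

```python
# Hedged sketch: incremental upsert into a Lakehouse Delta table from a
# Spark notebook. Table, path, and column names are illustrative assumptions.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # supplied by the notebook runtime

updates = spark.read.parquet("Files/landing/customers/")  # hypothetical OneLake path

target = DeltaTable.forName(spark, "customers")  # hypothetical Lakehouse table
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()     # refresh existing rows
    .whenNotMatchedInsertAll()  # add new rows
    .execute()
)
```

Merging instead of overwriting keeps the table consistent for downstream Power BI reports while only touching changed rows.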

Posted 1 week ago

Apply

4.0 - 8.0 years

15 - 20 Lacs

kolkata, bengaluru, mumbai (all areas)

Hybrid

Job Title: Azure DevOps Engineer with Data Engineering Capabilities
Experience: 4-8 years
Location: Pan India
Employment Type: Full-Time
Technology: Azure DevOps, Terraform, CI/CD, Azure Kubernetes Service, Kubernetes, Docker, Databricks, ADF, Synapse

Key Responsibilities:
- Work with cloud-based database platforms such as Azure Synapse Analytics, Azure SQL Database, and Snowflake to enable scalable data warehousing and analytics solutions.
- Design and develop scalable data pipelines leveraging Azure Data Factory, Azure Databricks, and Synapse Analytics for efficient data ingestion and transformation.
- Implement ETL workflows to extract data from diverse sources, including relational databases, REST APIs, file systems, real-time streams, and Change Data Capture (CDC), and load it into Azure Data Lake Storage.
- Monitor data pipeline health and implement performance tuning and cost optimization measures to ensure reliability and efficiency.
- Administer cloud infrastructure in Azure environments, focusing on networking, security configurations, monitoring setups, and cost management.
- Lead cloud migration projects, transitioning existing workloads to Azure with attention to architectural best practices and operational continuity.
- Develop and maintain infrastructure-as-code (IaC) and automate deployment processes using Azure DevOps pipelines, ensuring repeatable and reliable application and infrastructure delivery.
- Manage source control, branching strategies, and repository administration using Azure DevOps Repos and GitLab.
- Use containerization technologies such as Docker for packaging, deployment, and orchestration of applications, ensuring security and scalability.
- Write and maintain automation scripts and tooling in languages including PowerShell and Python to support deployment, monitoring, and configuration tasks (see the sketch after this posting).
- Build and manage CI/CD workflows integrating code repositories, automated testing, security scans, and deployment across multiple environments.
- Leverage Azure DevOps tools, including build and release pipelines, work item tracking, and REST APIs, to support the software development lifecycle and team collaboration.

Required Skills:
- 4-8 years' experience in Data Engineering or related roles.
- Expertise in Azure Databricks, Azure Data Factory, and Synapse Analytics.
- Proficiency in Python scripting and advanced SQL query writing and optimization.
- Experience with Azure Data Lake Storage and data warehousing concepts (star/snowflake schemas).
- Familiarity with CI/CD processes in data engineering environments.
- Hands-on experience with Delta Lake and distributed processing frameworks like Spark; Scala is a plus.
- Working knowledge of BI tools such as Power BI, Tableau, or Looker.
- Strong command of Azure DevOps for source control and pipeline automation.
- Understanding of cloud data security and compliance.
- Excellent problem-solving and communication skills.
- Leadership experience in managing or mentoring engineering teams preferred.
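The sketch below illustrates the Python automation-scripting responsibility referenced above: triggering an Azure Data Factory pipeline run through the azure-mgmt-datafactory SDK. The subscription, resource group, factory, pipeline, and parameter names are placeholders, not values from the posting.

```python
# Hedged sketch: trigger an Azure Data Factory pipeline run from a Python
# automation script. Requires azure-identity and azure-mgmt-datafactory;
# all resource names below are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = DefaultAzureCredential()  # environment, CLI, or managed-identity auth
client = DataFactoryManagementClient(credential, "<subscription-id>")

run = client.pipelines.create_run(
    resource_group_name="rg-data-platform",  # hypothetical resource group
    factory_name="adf-ingestion",            # hypothetical data factory
    pipeline_name="pl_daily_load",           # hypothetical pipeline
    parameters={"load_date": "2024-01-01"},  # hypothetical pipeline parameter
)
print(f"Started pipeline run: {run.run_id}")
```

A script like this is the kind of building block that a CI/CD stage or scheduled job might call to kick off and then poll an ingestion run.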

Posted 1 week ago

Apply

3.0 - 6.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Job Title: Power BI Developer
Location: Bangalore (ITPL)
Experience: 3-6 years (with a minimum of 2 years in Power BI development)
Employment Type: Full-time

Job Summary
We are seeking a skilled Power BI Developer to design, develop, and maintain business intelligence solutions that transform raw data into actionable insights. The ideal candidate has strong expertise in Power BI, data modeling, and DAX, along with experience working with stakeholders to deliver interactive dashboards and reports that support business decision-making.

Key Responsibilities
- Design, develop, and deploy Power BI dashboards, reports, and visualizations.
- Connect to multiple data sources (SQL, Excel, APIs, cloud platforms, etc.) and ensure data accuracy and integrity.
- Develop data models and perform advanced calculations using DAX.
- Collaborate with business stakeholders to gather requirements and translate them into technical solutions.
- Optimize Power BI solutions for performance and usability.
- Implement row-level security (RLS) and manage user access for reports.
- Perform data analysis to identify trends, patterns, and business insights.
- Document processes, models, and dashboards for knowledge sharing.
- Stay updated with the latest Power BI features and industry best practices.

Required Skills & Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 3-6 years of experience in BI development, with at least 2 years in Power BI.
- Proficiency in Power BI Desktop, Power BI Service, DAX, and Power Query.
- Strong understanding of data modeling, ETL processes, and data warehouse concepts.
- Hands-on experience with SQL Server and other relational databases.
- Knowledge of Azure Data Services (Azure SQL, Data Factory, Synapse) is a plus.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management skills.

Posted 1 week ago

Apply

1.0 - 11.0 years

0 Lacs

karnataka

On-site

You will be joining Northern Trust, a globally recognized financial institution with over 130 years of experience and a commitment to providing innovative financial services to successful individuals, families, and institutions. As a Risk Senior Consultant, your primary responsibility will be to support the ex-ante risk reporting process for asset owner clients. This role will involve providing complex risk analyses, overseeing the implementation of new risk analysis products, and communicating regularly with clients, investment managers, and team members. Additionally, you will be expected to develop client presentations, manage data quality and reporting processes, and build tailored solutions for clients using tools like Alteryx and Synapse. Your role will also involve collaborating with Product and DevOps teams to enhance existing tools, researching new ways to model non-publicly traded asset classes, attending industry events to stay updated on regulatory requirements, and mentoring junior team members. To excel in this role, you should have strong analytical skills, previous investment risk experience, and knowledge of investment markets, asset allocation analysis, and industry regulations. Excellent communication skills, time management abilities, and a proactive problem-solving approach will be essential. A CFA and/or FRM certification, along with a background in Finance, Statistics, or Mathematics, will be advantageous. If you have at least 11 years of portfolio analytics experience and are looking to work in a collaborative and flexible environment where career growth is encouraged, then this opportunity at Northern Trust may be the perfect fit for you. Join us in our commitment to making a positive impact in the communities we serve and explore a rewarding career with one of the world's most admired financial institutions. Apply today to be a part of our team!

Posted 1 week ago

Apply