Position: Python Intern specializing in AI
Location: Bhopal, Madhya Pradesh (Work from Office)
Duration: 3 to 6 months

✅ Must-Have Skills
- Core Python Programming: functions, loops, list comprehensions, classes; error handling (try-except), logging; file I/O operations, working with JSON/CSV
- Python Libraries for AI/ML: numpy, pandas (data manipulation & analysis); matplotlib, seaborn (data visualization); scikit-learn (classical machine learning models); basic familiarity with tensorflow or pytorch; working knowledge of OpenAI / Transformers (bonus)
- AI/ML Fundamentals: supervised and unsupervised learning (e.g., regression, classification, clustering); overfitting, underfitting, and the bias-variance tradeoff; train-test split, cross-validation; evaluation metrics: accuracy, precision, recall, F1-score, confusion matrix
- Data Preprocessing: handling missing data and outliers; data normalization, encoding techniques; feature selection & dimensionality reduction (e.g., PCA)
- Jupyter Notebook Proficiency: writing clean, well-documented notebooks; using markdown for explanations and visualizing outputs
- Version Control: Git basics (clone, commit, push, pull); using GitHub/GitLab for code collaboration

✅ Good-to-Have Skills
- Deep Learning: basic understanding of CNNs, RNNs, transformers; familiarity with keras or torch.nn for model building
- Generative AI: prompt engineering, working with LLM APIs like OpenAI or Hugging Face; experience with vector databases (Qdrant, FAISS)
- NLP: tokenization, stemming, lemmatization; TF-IDF, Word2Vec, BERT basics; projects in sentiment analysis or text classification
- Tools & Platforms: VS Code, JupyterLab; Google Colab / Kaggle; Docker (basic understanding)
- Math for AI: linear algebra, probability & statistics; basic understanding of gradients and calculus

✅ Soft Skills & Project Experience
- Participation in mini-projects (e.g., spam detector, digit recognizer)
- Kaggle competition experience
- Ability to clearly explain model outputs and results
- Documenting findings and creating simple dashboards or reports

Job Types: Full-time, Internship
Pay: From ₹5,000.00 per month
Schedule: Day shift
Work Location: In person
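The evaluation workflow this posting names (train-test split, cross-validation, accuracy, precision, recall, F1-score, confusion matrix) can be illustrated with a short scikit-learn sketch. The built-in breast-cancer dataset and the logistic-regression model below are placeholders chosen to keep the example self-contained; they are not requirements of the role.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, confusion_matrix)

# Built-in dataset as a stand-in for whatever tabular CSV the project uses
X, y = load_breast_cancer(return_X_y=True)

# Hold out a test set so the metrics reflect unseen data
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Scale features, then fit a simple classifier
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
y_pred = model.predict(X_test)

# The evaluation metrics named in the posting
print("accuracy :", accuracy_score(y_test, y_pred))
print("precision:", precision_score(y_test, y_pred))
print("recall   :", recall_score(y_test, y_pred))
print("F1       :", f1_score(y_test, y_pred))
print("confusion matrix:\n", confusion_matrix(y_test, y_pred))

# Cross-validation gives a more stable estimate than a single split
print("5-fold CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```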
Position: Proposal Writing Intern
Location: Bhopal, Madhya Pradesh (Work From Office)
Minimum Duration: 3 months

Roles & Responsibilities:
- Research and draft bidding documents, including RFPs and RFQs.
- Assist in preparing NDAs, agreements, and MoUs.
- Draft, proofread, and communicate effectively in English.
- Create clear, concise, and compliant content.
- Ensure accuracy and consistency in all documents.
- Research market trends and summarize key findings.

Skills Required:
- Strong English communication and writing skills.
- Basic understanding of legal and compliance principles.
- Attention to detail and excellent proofreading abilities.
- Proficiency in MS Office (Word, Excel, PowerPoint).

Job Types: Full-time, Internship
Contract length: 3 months
Pay: From ₹5,000.00 per month
Schedule: Day shift
Work Location: In person
You should have proficiency in Python with skills in file handling, RESTful APIs, lists, tuples, etc. Additionally, experience in ML/Data Science using NumPy, Pandas, EDA, TensorFlow, Keras, Matplotlib, etc. is required. Familiarity with Large Language Models (LLMs) is also preferred. Good communication skills are a must for this role. The minimum duration for this position is 6 months. The stipend offered depends on the candidate's experience (DOE) and ranges from 5K to 25K. This is a remote job opportunity and the selected candidates will be working from home. Please note that this opportunity is for deserving candidates only.
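As a hedged illustration of the EDA workflow mentioned above (NumPy, Pandas, Matplotlib), the sketch below builds a small synthetic frame in place of a real dataset; the column names and the injected missing values are invented purely for the example.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Small synthetic frame standing in for a real dataset loaded from CSV/JSON
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.integers(18, 65, size=200),
    "salary": rng.normal(50_000, 12_000, size=200).round(2),
    "dept": rng.choice(["eng", "sales", "hr"], size=200),
})
df.loc[df.sample(frac=0.05, random_state=0).index, "salary"] = np.nan  # inject missing values

# Core EDA steps: shape, dtypes, summary stats, missing values, group comparisons
print(df.shape)
print(df.dtypes)
print(df.describe())
print(df.isna().sum())
print(df.groupby("dept")["salary"].mean())

# Quick visual check of the salary distribution
df["salary"].hist(bins=30)
plt.xlabel("salary")
plt.ylabel("count")
plt.title("Salary distribution")
plt.show()
```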
The job involves managing social media to enhance brand presence, devising and implementing marketing strategies, managing client communication both online and in person, researching industry trends, and participating in IT events. The preferred tools for this role include Google Ads, SEO, Email Marketing, Canva, and MS Office (Word, Excel, PowerPoint). To excel in this position, you should have experience in marketing, possess social media and engagement skills, demonstrate proficiency in client handling and communication, and exhibit a self-driven and proactive approach. This is a full-time position with a day shift schedule that requires working in person. If you meet these requirements and are eager to grow professionally, we encourage you to apply for this opportunity.
Overview
We are seeking a Product Delivery Lead with 10+ years of proven experience in leading software product development and delivery, with a strong focus on AI-driven and innovative solutions. This role is for a senior professional who will be accountable for the end-to-end delivery of multiple software products, ensuring that every product is developed strictly to requirements, completed on time, and validated for market readiness. You will be accountable for ensuring product features are fit for purpose, commercially viable, and aligned with innovation goals.

This is not a generic project management role. We need a hands-on leader with a proven track record of bringing multiple products to market, leading cross-functional teams, working with AI-driven technologies, demonstrating strong market awareness, and bridging the gap between business strategy, customer needs, and technical execution.

Key Responsibilities
- Own and lead the full product development and delivery lifecycle across a portfolio of AI and innovation-driven products, from concept through launch.
- Ensure all products are developed strictly against defined requirements, tested thoroughly, and made available to the market within agreed timelines.
- Review and validate product features to ensure market suitability, usability, and commercial viability.
- Drive AI-first product innovation, integrating emerging technologies into product design and execution.
- Collaborate with engineering, design, data science, marketing, and sales to guarantee on-time, high-quality delivery.
- Conduct rigorous progress reviews and enforce alignment with scope, deadlines, budgets, and quality standards.
- Monitor competitor products, AI/tech trends, and customer feedback to refine product strategy.
- Oversee post-launch performance, driving enhancements and next-generation features.
- Provide strategic leadership, mentoring, and direction to cross-functional teams.

Key Requirements (Must-Have)
- 10+ years of experience in software product development, product delivery, or product management.
- Proven track record of leading multiple software products to successful market launch.
- Strong experience in AI/ML-powered products, cloud platforms, or data-driven solutions.
- Deep expertise in product lifecycle management and Agile/Lean delivery frameworks.
- Exceptional leadership, communication, and stakeholder management skills.
- Strong analytical and problem-solving ability to evaluate product-market fit and innovation value.
- Ability to balance strategic thinking with hands-on product delivery.

Preferred Qualifications
- Background in software engineering, AI development, or data science.
- Experience in managing innovation-driven product portfolios.
- Exposure to healthcare, fintech, or enterprise AI product delivery is an advantage.

Why This Role Is Different
- You will own delivery outcomes, not just coordinate tasks.
- You will lead multiple AI-driven product lines, not a single backlog.
- You will ensure products are ready for real-world markets, not just “feature complete.”
- Only candidates with proven product-to-market delivery experience should apply.

Job Type: Full-time
Pay: From ₹50,000.63 per month
Work Location: In person
The IT Tender & Bid Management Intern (Proposal Writer/Strategic Writer) position based in Bhopal, Madhya Pradesh, offers a work-from-office opportunity for a duration of 3-6 months with a stipend based on performance. As an intern in this role, you will be responsible for identifying relevant IT tenders by tracking and analyzing tenders from various platforms. Your tasks will involve reviewing and analyzing tender documents to understand legal, financial, and technical specifications. It will be essential to assess if all prerequisites and compliance requirements are met while highlighting any risks or missing elements before submission. Moreover, you will play a crucial role in bid preparation and proposal writing by translating requirements to the technical, design, and finance teams. Your responsibilities will include developing high-quality, compliant, and compelling proposals focusing on winning strategies. Additionally, you will assist in drafting costing, compliance, and technical documentation to support the proposals. As an intern, you will also be involved in submitting and tracking bids, ensuring all documents are submitted within the deadlines and following up with procurement authorities for updates. The ideal candidate for this position would be a final-year student or recent graduate in Business, IT, or Management with an interest in IT consulting, procurement, and business development. You should possess a good understanding of RFPs, RFIs, RFQs, and procurement processes, along with strong communication skills and the ability to coordinate across teams. Furthermore, analytical and time management skills are essential for success in this role.,
As a Python Intern specializing in AI at our Bhopal, Madhya Pradesh office, you will have the opportunity to work on various AI/ML projects for a duration of 3 to 6 months. Your role will involve utilizing your skills in core Python programming, Python libraries for AI/ML, AI/ML fundamentals, data preprocessing, Jupyter Notebook proficiency, version control using Git, and more.

In terms of core Python programming, you should be well-versed in functions, loops, list comprehensions, classes, error handling, logging, file I/O operations, and working with JSON/CSV files. Additionally, familiarity with Python libraries such as numpy, pandas, matplotlib, seaborn, and scikit-learn is essential for data manipulation, analysis, and visualization.

A good understanding of AI/ML fundamentals is required, including supervised and unsupervised learning, the concepts of overfitting, underfitting, and the bias-variance tradeoff, and evaluation metrics such as accuracy, precision, recall, F1-score, and the confusion matrix. You should also be proficient in data preprocessing techniques such as handling missing data and outliers, normalization, encoding, feature selection, and dimensionality reduction.

Your ability to work with Jupyter Notebooks to write clean, well-documented notebooks, use markdown for explanations, and visualize outputs will be crucial. Proficiency in version control using Git for basic operations and collaborating on GitHub/GitLab will also be expected.

While not mandatory, good-to-have skills in deep learning, generative AI, NLP, tools and platforms such as VS Code, JupyterLab, Google Colab, Kaggle, and Docker, and a strong foundation in math for AI will be beneficial for your role. Soft skills such as participation in mini-projects, Kaggle competition experience, clear communication of model outputs, and documenting findings will further enhance your profile.

This position offers a full-time internship opportunity with a day shift schedule and an in-person work location. If you are passionate about AI and Python programming with a keen interest in machine learning, this role will provide you with hands-on experience and learning opportunities in the field.
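The preprocessing steps listed here (missing-value handling, encoding, normalization, dimensionality reduction) compose naturally in a scikit-learn pipeline. The sketch below is a minimal illustration with invented columns and toy data, not project code.

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.decomposition import PCA

# Toy frame with a gappy numeric column and a categorical column (invented data)
df = pd.DataFrame({
    "income": [52000.0, np.nan, 61500.0, 48200.0, np.nan, 75000.0],
    "city": ["bhopal", "indore", "bhopal", "delhi", "indore", "delhi"],
})

preprocess = ColumnTransformer(
    transformers=[
        # Median-impute missing numerics, then standardise
        ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                          ("scale", StandardScaler())]), ["income"]),
        # One-hot encode the categorical column
        ("cat", OneHotEncoder(handle_unknown="ignore"), ["city"]),
    ],
    sparse_threshold=0.0,  # keep the output dense so PCA can consume it
)

# Chain preprocessing with PCA for dimensionality reduction
pipeline = Pipeline([("prep", preprocess), ("pca", PCA(n_components=2))])
reduced = pipeline.fit_transform(df)
print(reduced.shape)  # (6, 2)
```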
As an IT Tender & Bid Management Executive (Proposal Writer/Strategic Writer) at SAA Consultancy Limited, based in Bhopal, Madhya Pradesh, you will be responsible for identifying relevant IT tenders, tracking and analyzing tenders, reviewing and analyzing tender documents, preparing bids and writing proposals, and submitting and tracking bids to ensure successful submissions.

Your primary responsibilities will include identifying relevant IT tenders by tracking and analyzing tenders from various platforms. You will be required to review and analyze tender documents to understand legal, financial, and technical specifications, assess compliance requirements, and highlight any risks or missing requirements before submission. Additionally, you will translate tender requirements to the technical, design, and finance teams to prepare proposal documents. You will play a key role in developing high-quality, compliant, and compelling proposals with a focus on winning strategies. Furthermore, you will assist in drafting costing, compliance, and technical documentation. As part of your role, you will be responsible for ensuring all bid documents are submitted within the deadlines and following up with procurement authorities for updates.

The ideal candidate for this position should have 2+ years of experience in tender management, procurement, or bid writing, preferably on IT and digital transformation projects. You should have a strong understanding of legal, financial, and technical tender documentation, as well as RFPs, RFIs, RFQs, and procurement processes. Moreover, you should be able to collaborate effectively with designers, developers, and finance teams to create winning proposals. Strong communication skills are essential for engaging with procurement authorities and clients, and excellent time management skills and the ability to meet strict deadlines are also crucial for success in this role.

If you are skilled in tender identification, bid management, and proposal writing, with expertise in analyzing IT and digital transformation tenders, we encourage you to apply for this position and be part of our team at SAA Consultancy Limited.
Location: Bhopal, MP, India
Experience: 7+ years

Key Responsibilities
- Design and build ETL pipelines in Azure Databricks (PySpark, Delta Lake) to load, clean, and deliver data across Bronze, Silver, and Gold layers.
- Implement Data Lakehouse Architecture on Azure Data Lake Gen2 with partitioning, schema management, and performance optimization.
- Develop data models (dimensional/star schema) for reporting in Synapse and Power BI.
- Integrate Databricks with Azure services – ADF, Key Vault, Event Hub, Synapse Analytics, Purview, and Logic Apps.
- Build and manage CI/CD pipelines in Azure DevOps (YAML, Git repos, pipelines).
- Optimize performance through cluster tuning, caching, Z-ordering, Delta optimization, and job parallelization.
- Ensure data security and compliance (row-level security, PII masking, GDPR/HIPAA, audit logging).
- Collaborate with data architects and analysts to translate business needs into technical solutions.

Required Skills
- Strong experience in Azure Databricks (Python, PySpark, SQL).
- Proficiency with Delta Lake (ACID transactions, schema evolution, incremental loads).
- Hands-on with the Azure ecosystem – Data Factory, ADLS Gen2, Key Vault, Event Hub, Synapse.
- Knowledge of data governance & lineage tools (Purview; Unity Catalog is a plus).
- Strong understanding of data warehouse design and star schema.
- Azure DevOps (YAML, Git repos, pipelines) experience.
- Good debugging skills for performance tuning & schema drift issues.

Good to Have
- Experience with healthcare or financial data.
- Familiarity with FHIR, OMOP, OpenEHR (for healthcare projects).
- Exposure to AI/ML integration using Databricks ML runtime.
- Experience with Unity Catalog for governance across workspaces.

Deliverables / Outcomes
- Automated snapshot + incremental pipelines in Databricks.
- Delta Lake architecture with partitioning, Z-ordering, and schema drift handling.
- Metadata-driven ingestion framework (YAML configs).
- Power BI datasets connected to the Gold layer.
- CI/CD pipelines for deployment across.

Are you ready to take the lead in building scalable data solutions with Azure Databricks? Apply now!

Job Type: Full-time
Pay: From ₹20,000.34 per month
Work Location: In person
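A minimal sketch of the Bronze-to-Silver flow described in the posting above, assuming a Databricks cluster where the SparkSession and Delta Lake are pre-configured; the ADLS Gen2 paths, column names, and schema are illustrative placeholders, not project specifics.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks the session already exists

# Illustrative ADLS Gen2 locations -- replace with the project's real containers
landing = "abfss://landing@yourlake.dfs.core.windows.net/orders/*.csv"
bronze_path = "abfss://bronze@yourlake.dfs.core.windows.net/sales/orders"
silver_path = "abfss://silver@yourlake.dfs.core.windows.net/sales/orders"

# Bronze: land the raw files as-is in Delta, preserving source fidelity
raw = spark.read.option("header", True).csv(landing)
raw.write.format("delta").mode("append").save(bronze_path)

# Silver: type, clean, deduplicate, and partition for downstream reporting
silver = (
    spark.read.format("delta").load(bronze_path)
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_ts"))
    .dropDuplicates(["order_id"])
    .filter(F.col("amount").isNotNull())
)
(silver.write.format("delta")
       .mode("overwrite")
       .partitionBy("order_date")
       .save(silver_path))
```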
Location: Bhopal, MP, India
Experience: 7+ years

Key Responsibilities
- Design and build ETL pipelines in Azure Databricks (PySpark, Delta Lake) to load, clean, and deliver data across Bronze, Silver, and Gold layers.
- Implement Data Lakehouse Architecture on Azure Data Lake Gen2 with partitioning, schema management, and performance optimization.
- Develop data models (dimensional/star schema) for reporting in Synapse and Power BI.
- Integrate Databricks with Azure services – ADF, Key Vault, Event Hub, Synapse Analytics, Purview, and Logic Apps.
- Build and manage CI/CD pipelines in Azure DevOps (YAML, Git repos, pipelines).
- Optimize performance through cluster tuning, caching, Z-ordering, Delta optimization, and job parallelization.
- Ensure data security and compliance (row-level security, PII masking, GDPR/HIPAA, audit logging).
- Collaborate with data architects and analysts to translate business needs into technical solutions.

Required Skills
- Strong experience in Azure Databricks (Python, PySpark, SQL).
- Proficiency with Delta Lake (ACID transactions, schema evolution, incremental loads).
- Hands-on with the Azure ecosystem – Data Factory, ADLS Gen2, Key Vault, Event Hub, Synapse.
- Knowledge of data governance & lineage tools (Purview; Unity Catalog is a plus).
- Strong understanding of data warehouse design and star schema.
- Azure DevOps (YAML, Git repos, pipelines) experience.
- Good debugging skills for performance tuning & schema drift issues.

Good to Have
- Experience with healthcare or financial data.
- Familiarity with FHIR, OMOP, OpenEHR (for healthcare projects).
- Exposure to AI/ML integration using Databricks ML runtime.
- Experience with Unity Catalog for governance across workspaces.

What You Will Deliver
- Automated snapshot + incremental pipelines in Databricks.
- Delta Lake architecture with partitioning, Z-ordering, and schema drift handling.
- Metadata-driven ingestion framework (YAML configs).
- Power BI datasets connected to the Gold layer.
- CI/CD pipelines for deployment across.

Are you ready to take the lead in building scalable data solutions with Azure Databricks? Apply now!
You will be responsible for designing and building ETL pipelines in Azure Databricks (PySpark, Delta Lake) to load, clean, and deliver data across Bronze, Silver, and Gold layers. Your role will also involve implementing Data Lakehouse Architecture on Azure Data Lake Gen2 with partitioning, schema management, and performance optimization. Additionally, you will need to develop data models (dimensional/star schema) for reporting in Synapse and Power BI. Integration of Databricks with Azure services like ADF, Key Vault, Event Hub, Synapse Analytics, Purview, and Logic Apps will be a part of your tasks. Building and managing CI/CD pipelines in Azure DevOps (YAML, Git repos, pipelines) will also fall under your responsibilities. You will be expected to optimize performance through cluster tuning, caching, Z-ordering, Delta optimization, and job parallelization. Ensuring data security and compliance (row-level security, PII masking, GDPR/HIPAA, audit logging) and collaborating with data architects and analysts to translate business needs into technical solutions are vital aspects of this role.

- Strong experience in Azure Databricks (Python, PySpark, SQL).
- Proficiency with Delta Lake (ACID transactions, schema evolution, incremental loads).
- Hands-on with the Azure ecosystem - Data Factory, ADLS Gen2, Key Vault, Event Hub, Synapse.
- Knowledge of data governance & lineage tools (Purview; Unity Catalog is a plus).
- Strong understanding of data warehouse design and star schema.
- Azure DevOps (YAML, Git repos, pipelines) experience.
- Good debugging skills for performance tuning & schema drift issues.

Good to Have:
- Experience with healthcare or financial data.
- Familiarity with FHIR, OMOP, OpenEHR (for healthcare projects).
- Exposure to AI/ML integration using Databricks ML runtime.
- Experience with Unity Catalog for governance across workspaces.

If you are ready to take the lead in building scalable data solutions with Azure Databricks, this full-time position in Bhopal, MP, India awaits you!
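The incremental-load requirement above is commonly met on Databricks with a Delta MERGE upsert. This is a hedged sketch assuming a Delta-enabled runtime; the storage paths and the `order_id` key column are invented for illustration.

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()  # pre-configured on Databricks

# Illustrative paths -- replace with the real Bronze/Silver locations
silver_path = "abfss://silver@yourlake.dfs.core.windows.net/sales/orders"
increment_path = "abfss://bronze@yourlake.dfs.core.windows.net/sales/orders_increment"

updates = spark.read.format("delta").load(increment_path)
target = DeltaTable.forPath(spark, silver_path)

# Upsert: update rows whose key already exists, insert the rest
(target.alias("t")
       .merge(updates.alias("s"), "t.order_id = s.order_id")  # invented key column
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```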
Location: Bhopal, MP, India
Experience: 7+ years

Key Responsibilities
- Design and build ETL pipelines in Azure Databricks (PySpark, Delta Lake) to load, clean, and deliver data across Bronze, Silver, and Gold layers.
- Implement Data Lakehouse Architecture on Azure Data Lake Gen2 with partitioning, schema management, and performance optimization.
- Develop data models (dimensional/star schema) for reporting in Synapse and Power BI.
- Integrate Databricks with Azure services – ADF, Key Vault, Event Hub, Synapse Analytics, Purview, and Logic Apps.
- Build and manage CI/CD pipelines in Azure DevOps (YAML, Git repos, pipelines).
- Optimize performance through cluster tuning, caching, Z-ordering, Delta optimization, and job parallelization.
- Ensure data security and compliance (row-level security, PII masking, GDPR/HIPAA, audit logging).
- Collaborate with data architects and analysts to translate business needs into technical solutions.

Required Skills
- Strong experience in Azure Databricks (Python, PySpark, SQL).
- Proficiency with Delta Lake (ACID transactions, schema evolution, incremental loads).
- Hands-on with the Azure ecosystem – Data Factory, ADLS Gen2, Key Vault, Event Hub, Synapse.
- Knowledge of data governance & lineage tools (Purview; Unity Catalog is a plus).
- Strong understanding of data warehouse design and star schema.
- Azure DevOps (YAML, Git repos, pipelines) experience.
- Good debugging skills for performance tuning & schema drift issues.

Good to Have
- Experience with healthcare or financial data.
- Familiarity with FHIR, OMOP, OpenEHR (for healthcare projects).
- Exposure to AI/ML integration using Databricks ML runtime.
- Experience with Unity Catalog for governance across workspaces.

Deliverables / Outcomes
- Automated snapshot + incremental pipelines in Databricks.
- Delta Lake architecture with partitioning, Z-ordering, and schema drift handling.
- Metadata-driven ingestion framework (YAML configs).
- Power BI datasets connected to the Gold layer.
- CI/CD pipelines for deployment across.

Are you ready to take the lead in building scalable data solutions with Azure Databricks? Apply now!

Job Type: Full-time
Pay: From ₹20,000.00 per month
Work Location: In person
Location: Bhopal, MP, India
Experience: 5+ years

Key Responsibilities
- Design and build ETL pipelines in Azure Databricks (PySpark, Delta Lake) to load, clean, and deliver data across Bronze, Silver, and Gold layers.
- Implement Data Lakehouse Architecture on Azure Data Lake Gen2 with partitioning, schema management, and performance optimization.
- Develop data models (dimensional/star schema) for reporting in Synapse and Power BI.
- Integrate Databricks with Azure services – ADF, Key Vault, Event Hub, Synapse Analytics, Purview, and Logic Apps.
- Build and manage CI/CD pipelines in Azure DevOps (YAML, Git repos, pipelines).
- Optimize performance through cluster tuning, caching, Z-ordering, Delta optimization, and job parallelization.
- Ensure data security and compliance (row-level security, PII masking, GDPR/HIPAA, audit logging).
- Collaborate with data architects and analysts to translate business needs into technical solutions.

Required Skills
- Strong experience in Azure Databricks (Python, PySpark, SQL).
- Proficiency with Delta Lake (ACID transactions, schema evolution, incremental loads).
- Hands-on with the Azure ecosystem – Data Factory, ADLS Gen2, Key Vault, Event Hub, Synapse.
- Knowledge of data governance & lineage tools (Purview; Unity Catalog is a plus).
- Strong understanding of data warehouse design and star schema.
- Azure DevOps (YAML, Git repos, pipelines) experience.
- Good debugging skills for performance tuning & schema drift issues.

Good to Have
- Experience with healthcare or financial data.
- Familiarity with FHIR, OMOP, OpenEHR (for healthcare projects).
- Exposure to AI/ML integration using Databricks ML runtime.
- Experience with Unity Catalog for governance across workspaces.

What You Will Deliver
- Automated snapshot + incremental pipelines in Databricks.
- Delta Lake architecture with partitioning, Z-ordering, and schema drift handling.
- Metadata-driven ingestion framework (YAML configs).
- Power BI datasets connected to the Gold layer.
- CI/CD pipelines for deployment across.

Are you ready to take the lead in building scalable data solutions with Azure Databricks? Apply now!
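The Z-ordering and Delta optimization called out in these Databricks postings are typically run as SQL commands from a notebook or scheduled job. The table path and the chosen Z-order column below are assumptions for illustration, and the commands assume a Databricks/Delta runtime.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # Databricks / Delta runtime assumed

table = "delta.`abfss://silver@yourlake.dfs.core.windows.net/sales/orders`"  # illustrative path

# Compact small files and co-locate rows on a frequently filtered column
spark.sql(f"OPTIMIZE {table} ZORDER BY (customer_id)")

# Clean up files no longer referenced by the Delta log (default 7-day retention window)
spark.sql(f"VACUUM {table} RETAIN 168 HOURS")
```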
Location: London, United Kingdom (Hybrid)
Experience: 4+ years

Key Responsibilities
- Deliver end-to-end Microsoft Fabric solutions for NHS data platforms, ensuring compliance with NHS DSPT, IG Toolkit, and healthcare data standards.
- Lead migration of complex on-premises data systems into Fabric Lakehouse and Warehouse, integrating with Power BI, Purview, and secure pipelines.
- Define and enforce best practices in Medallion architecture, data modelling, governance, and lineage.
- Optimise large-scale ETL/ELT workloads using Fabric, Synapse, and Databricks, ensuring resilience, scalability, and high performance.

Key Requirements
- 4+ years of data engineering experience, with hands-on experience in the Azure Data ecosystem (Fabric, Synapse, Data Factory, Databricks).
- Mastery of:
  - Microsoft Fabric Lakehouse/Warehouse, Delta Lake, Power BI integration.
  - Azure Purview for cataloguing, lineage, and governance.
  - CI/CD pipelines with YAML, GitHub/DevOps.
  - Advanced SQL, Python, PySpark for large-scale data processing.
  - Networking & security: VPN, private endpoints, access control.
- Proven expertise delivering NHS or public-sector platforms, with a strong understanding of healthcare interoperability (FHIR, HL7, NHS data models).
- Excellent stakeholder engagement skills, with the ability to shape NHS digital strategy.
- Valid UK work visa required.

Contract: Outside IR35.
As an experienced data engineer with 4+ years of experience, your role will involve delivering end-to-end Microsoft Fabric solutions for NHS data platforms. You will be responsible for ensuring compliance with NHS DSPT, IG Toolkit, and healthcare data standards. Additionally, you will lead the migration of complex on-premises data systems into Fabric Lakehouse and Warehouse, integrating with Power BI, Purview, and secure pipelines. Your expertise will be crucial in defining and enforcing best practices in Medallion architecture, data modelling, governance, and lineage. Furthermore, you will optimize large-scale ETL/ELT workloads using Fabric, Synapse, and Databricks to ensure resilience, scalability, and high performance.

Key Responsibilities:
- Deliver end-to-end Microsoft Fabric solutions for NHS data platforms, ensuring compliance with NHS DSPT, IG Toolkit, and healthcare data standards.
- Lead migration of complex on-premises data systems into Fabric Lakehouse and Warehouse, integrating with Power BI, Purview, and secure pipelines.
- Define and enforce best practices in Medallion architecture, data modelling, governance, and lineage.
- Optimise large-scale ETL/ELT workloads using Fabric, Synapse, and Databricks, ensuring resilience, scalability, and high performance.

Key Requirements:
- 4+ years of data engineering experience, with hands-on experience in the Azure Data ecosystem (Fabric, Synapse, Data Factory, Databricks).
- Mastery of Microsoft Fabric Lakehouse/Warehouse, Delta Lake, Power BI integration.
- Proficiency in Azure Purview for cataloguing, lineage, and governance.
- Experience with CI/CD pipelines using YAML, GitHub/DevOps.
- Advanced skills in SQL, Python, and PySpark for large-scale data processing.
- Familiarity with networking & security concepts such as VPN, private endpoints, and access control.
- Proven expertise in delivering NHS or public-sector platforms, with a strong understanding of healthcare interoperability (FHIR, HL7, NHS data models).
- Excellent stakeholder engagement skills and the ability to shape NHS digital strategy.
- Valid UK work visa required.

In this role, you will have the opportunity to work on challenging projects within a dynamic and innovative environment. The contract is outside IR35, providing you with additional flexibility and autonomy in your work.
PySpark Developer – Bhopal (Onsite – no hybrid)
Experience: 5+ years

Key Responsibilities
- Develop and maintain PySpark pipelines in Azure Databricks for batch and streaming data.
- Implement ETL/ELT workflows integrating Azure Data Lake, Synapse, and Delta Lake.
- Write efficient Python and SQL code for transformations, joins, aggregations, and validations.
- Optimise Spark jobs with partitioning, caching, and tuning to reduce runtime and cost.
- Integrate pipelines into Azure Data Factory with CI/CD deployment via Git/DevOps.
- Apply monitoring, logging, and documentation standards for reliable production pipelines.

Key Requirements (Technical & Delivery Skills)
- 5+ years’ hands-on experience in PySpark development within the Azure ecosystem.
- Mastery of:
  - Azure Databricks (PySpark, Delta Lake, Spark SQL)
  - Azure Data Factory for orchestration
  - Azure Synapse Analytics for data integration/consumption
- Experience with Delta Lake/Parquet for scalable storage and analytics.
- Skilled in debugging and optimising Spark jobs for performance and cost efficiency.
- Familiar with CI/CD using Azure DevOps, Git versioning, and Agile delivery.
- Proven track record delivering production-grade pipelines in enterprise environments.

Job Type: Full-time
Work Location: In person
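A small, self-contained sketch of the optimisation techniques this posting lists: broadcasting the small side of a join to avoid shuffling the large table, caching a result only when it is reused, and partition-aware writes. The in-memory tables, column names, and output path are invented for the example.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

# Generated-in-memory stand-ins: a large "fact" table and a tiny dimension table
orders = spark.range(0, 1_000_000).withColumn("country_id", F.col("id") % 50)
countries = spark.createDataFrame(
    [(i, f"country_{i}") for i in range(50)], ["country_id", "name"]
)

# Broadcast the small side so the large table is not shuffled for the join
joined = orders.join(F.broadcast(countries), "country_id")

# Cache only when the result is reused by several downstream actions
joined.cache()
print(joined.count())

# Repartition by the write key so output files line up with the partition layout
(joined.repartition("country_id")
       .write.mode("overwrite")
       .partitionBy("country_id")
       .parquet("/tmp/orders_by_country"))
```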
.NET Developer – Bhopal
Experience: 5+ years
Work Arrangement: Onsite (no hybrid)

Key Responsibilities
- Deliver secure and scalable .NET 8/ASP.NET Core solutions end-to-end, from coding to production hardening.
- Design and implement API-first solutions (REST/GraphQL) with versioning and reliability best practices.
- Build cloud-ready services using microservices, containerisation, and caching, with CI/CD pipelines and automated testing.
- Contribute to AI/innovation initiatives using Azure OpenAI/ML where applicable, ensuring measurable business outcomes.

Key Requirements (Technical & Delivery Skills)
- 5+ years of proven expertise in .NET development across design, build, test, release, and run.
- Mastery of:
  - ASP.NET Core, .NET 8
  - Azure services (App Service, Containers, Key Vault, Storage, Azure SQL/PostgreSQL)
  - Azure DevOps/GitHub for CI/CD, IaC (Bicep/Terraform), pipelines, and environment promotion
- Advanced skills in C#, T-SQL, REST.
- Strong experience in shipping production-grade, high-availability systems with resilience and rollback strategies.
Azure Fabric Specialist – London (Hybrid)
Experience: 4+ years
Client: NHS
Contract: Outside IR35

Key Responsibilities
- Deliver end-to-end Microsoft Fabric OneLake solutions for NHS data platforms, ensuring compliance with NHS DSPT, IG Toolkit, and healthcare data standards.
- Lead migration of complex on-premises data systems into Fabric Lakehouse and Warehouse, integrating with Power BI, Purview, and secure pipelines.
- Define and enforce Medallion architecture (Bronze → Silver → Gold) with robust data modelling, governance, lineage, and audit.
- Engineer high-throughput ETL/ELT across Fabric, Synapse, and Databricks; achieve measurable targets for latency, cost, and reliability.
- Implement FHIR/HL7 data ingestion and interoperability patterns; standardise schemas and terminologies for clinical analytics.

Key Requirements (Technical & Delivery Skills)
- 4+ years of data engineering experience, with hands-on experience in the Azure Data ecosystem (Microsoft Fabric, Synapse, Data Factory, Databricks).
- Mastery of:
  - Microsoft Fabric Lakehouse/Warehouse, Delta Lake, Power BI integration.
  - Azure Purview for cataloguing, lineage, and governance.
  - CI/CD pipelines with YAML, GitHub/DevOps.
  - Advanced SQL, Python, PySpark for large-scale data processing.
- Strong understanding of healthcare interoperability (FHIR, HL7, NHS data models).
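As a hedged sketch of the FHIR ingestion pattern mentioned above: FHIR bulk exports are commonly delivered as newline-delimited JSON resources, which Spark can read directly and flatten into a tabular layer. The storage paths, the Delta write target, and the handful of Patient fields selected below are assumptions for illustration, not an NHS specification.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # Databricks / Fabric Spark session assumed

# Illustrative path to newline-delimited FHIR Patient resources (NDJSON export)
patients_raw = spark.read.json(
    "abfss://bronze@yourlake.dfs.core.windows.net/fhir/Patient/*.ndjson"
)

# Flatten a few commonly used fields into a tabular Silver-layer shape
patients = patients_raw.select(
    F.col("id").alias("patient_id"),
    F.col("gender"),
    F.to_date("birthDate").alias("birth_date"),
    F.col("name")[0]["family"].alias("family_name"),  # first HumanName entry
)

patients.write.format("delta").mode("overwrite").save(
    "abfss://silver@yourlake.dfs.core.windows.net/fhir/patient"
)
```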