🧪 Clinical Biometry IT Administrator
🚨 We’re Hiring – Contract-Based Clinical IT Role (Remote, India)
📍 Location: India (Remote)
📅 Duration: 12-Month Contract | 1-Month Notice
🕐 Shift Preference: Flexibility to support UK hours
📌 Preferred Joining: Immediate to 15 Days
💼 Experience Required: 5+ Years

We are looking for an experienced Clinical Biometry IT Administrator to take ownership of administering, supporting, and optimizing the Biometry systems and tools used in clinical trials. The role involves technical management of platforms such as SAS, RStudio, CDISC tools, and Qlik Sense, ensuring seamless system operations, compliance, and performance tuning.

🔧 Key Responsibilities:
Administer and maintain SAS Server/PC and RStudio environments
Write, debug, and validate SAS and R programs
Support and maintain CDISC compliance tools such as Pinnacle 21 and RYZE
Perform system upgrades, patching, user access management, and performance optimization
Support data migration and integration between Biometry tools and external systems
Ensure compliance with ICH-GCP and FDA 21 CFR Part 11 standards
Troubleshoot technical issues and implement preventive maintenance strategies
Document system configurations, validation scripts, and SOPs per regulatory guidelines

🧠 Required Skills & Experience:
Deep knowledge of SAS and R programming (writing/debugging/validation)
Proficiency in RStudio (Workbench, Connect, Package Manager)
Understanding of Unix scripting, Robocopy, and Python scripting
Familiarity with CDISC standards – ADaM, SDTM
Hands-on experience with system installation, upgrades, and enhancements
Exposure to Jenkins, AWS CodeCommit, or other automation tools (nice to have)

📧 Interested? Send the following to connect@infosprucetech.com:
• Updated resume
• Job title you are applying for
• Current CTC
• Expected CTC
• Notice period

We’re looking for professionals who are ready to contribute from day one and help drive critical clinical technology initiatives forward.

#Hiring #ClinicalIT #Biometry #SAS #RStudio #CTMS #LifeSciencesIT #RemoteJobs #InfospruceTechnologies #ContractJobs #IndiaJobs #ImmediateJoiners
🚀 We’re Hiring: Senior Data Engineer (Remote – India | Full-Time)

We are helping our client hire a Senior Data Engineer with over 10 years of experience in modern data platforms. This remote role is open across India and is available on both a full-time and a contract basis.

💼 Position: Senior Data Engineer
🌍 Location: Remote (India)
📅 Type: Full-Time / Contract
📊 Experience: 10+ Years

🔧 Must-Have Skills:
Data Engineering, Data Warehousing, ETL
Azure Databricks & Azure Data Factory (ADF)
PySpark, SparkSQL
Python, SQL

👀 What We’re Looking For:
A strong background in building and managing data pipelines
Hands-on experience with cloud platforms, especially Azure
Ability to work independently and collaborate in distributed teams

📩 How to Apply:
Please send your resume to connect@infosprucetech.com with the subject line: "Senior Data Engineer – Remote India"

⚠️ Along with your resume, kindly include the following details:
Full Name
Mobile Number
Total Experience
Relevant Experience
Current CTC
Expected CTC
Notice Period
Current Location
Do you have a PF account? (Yes/No)

#DataEngineer #AzureDatabricks #ADF #PySpark #SQL #RemoteJobsIndia #HiringNow #Strive4X #FullTimeJobs #IndiaJobs
🚨 Hiring for Our Client: Senior Program Manager – IT-OT Transformation (Offshore Role – Supporting U.S. Client)
📍 Location: Remote (India-based, with overlap to U.S. Eastern Time)
🕒 Type: Full-Time | Long-Term Project
💼 Experience: 16+ Years
💡 Start: Immediate to 15 Days Preferred

Are you a senior program leader who thrives in complex transformation environments? We’re looking for a Senior Program Manager to lead a high-impact IT-OT transformation initiative for a leading U.S.-based enterprise client. This offshore leadership role requires deep experience with finance and operations systems such as Oracle, SAP, NetSuite, or Workday Financials, and strong program execution within global delivery models.

🔧 What You’ll Do:
✅ Lead offshore delivery of a multi-phase transformation program
✅ Manage governance, roadmaps, KPIs, and milestone tracking
✅ Coordinate with U.S.-based stakeholders across business, IT, and finance
✅ Oversee financial system integration and data consolidation
✅ Ensure compliance with SOX and SEC requirements, and maintain audit readiness
✅ Drive stakeholder communication, risk mitigation, and change adoption

You Bring:
✔ 16+ years in program/project management
✔ 5+ years of experience in large-scale transformation programs
✔ Hands-on experience with Oracle, SAP, NetSuite, or Workday Financials
✔ Familiarity with compliance and regulatory frameworks (SOX, SEC)
✔ Strong communication and leadership presence in offshore setups
✔ PMP / PgMP certification (preferred)

📩 Interested or know someone who fits? Send profiles to connect@infosprucetech.com with salary expectations and notice period.
🌐 Learn more about us: www.infosprucetech.com
🔗 Follow us on LinkedIn: https://www.linkedin.com/company/infosprucetech/

#Hiring #SeniorProgramManager #Oracle #SAP #NetSuite #ITOT #FinanceTransformation #SOXCompliance #RemoteJobs #IndiaJobs #ProjectLeadership #ClientOpportunity
🚨 Hiring: IT Project Manager – .NET / Java Background (WFO – Gurgaon)
📍 Location: Gurgaon (Work From Office, Monday to Friday)
🕒 Employment Type: Full-Time
📅 Experience: 10+ Years
⏱ Notice Period: Immediate to 15 Days

🔍 About the Role
On behalf of a client, we are hiring a seasoned IT Project Manager with a strong foundation in .NET or Java technologies. The role demands sharp technical acumen combined with the leadership to drive full-cycle project delivery, coordinate cross-functional teams, and ensure alignment between business objectives and technical execution.

🛠 Key Responsibilities
Lead the end-to-end project lifecycle: Initiation → Planning → Execution → Monitoring → Closure
Mentor and manage development teams in .NET or Java ecosystems
Liaise between business stakeholders and tech teams to ensure seamless delivery
Create and manage project plans, timelines, budgets, risks, and documentation
Ensure delivery of high-quality outcomes within timelines and budgets
Drive Agile/Scrum or Waterfall delivery models based on project needs
Conduct governance meetings and communicate progress to sponsors
Handle scope changes, risks, and issues proactively
Support pre-sales, estimation, and solution design as needed

✅ Required Skills & Experience
5+ years hands-on with either:
→ .NET: C#, ASP.NET, MVC, Web API
→ Java: Spring Boot, Microservices, REST APIs
3+ years in project management
Proven track record of delivering software projects on time and within scope
Solid grasp of the SDLC, Agile (Scrum/Kanban), and DevOps practices
Proficiency with tools like JIRA, Azure DevOps, MS Project, etc.
Excellent communication, leadership, and stakeholder engagement skills
Experience managing distributed teams

⭐️ Preferred Qualifications
PMP / PRINCE2 / CSM certification
Cloud experience: Azure / AWS / GCP
Familiarity with CI/CD pipelines and modern delivery models
Domain exposure to the airline or similar industries

📩 Email your profile to connect@infosprucetech.com along with salary and notice period details
🌐 Learn more about us: www.infosprucetech.com
🔗 Follow us on LinkedIn: https://www.linkedin.com/company/infosprucetech/

#ProjectManager #Java #DotNet #GurgaonJobs #WFO #HiringNow #ImmediateJoiners #TechLeadership
🚀 Hiring for Our Client: Microsoft Fabric Developer
📍 Location: Bangalore / Chennai
💼 Employment Type: Full-Time

🌟 Role Overview
Our client is looking for an experienced Microsoft Fabric Developer to design, build, and optimize a modern data platform using Microsoft Fabric. The role covers the full stack of Fabric capabilities — OneLake, Lakehouse, Data Pipelines, Dataflows Gen2, SQL Endpoint, Semantic Models, and Power BI (Direct Lake) — to enable scalable and governed analytics solutions.

🔑 Key Responsibilities

Data Ingestion & Transformation
Build batch and streaming pipelines using Data Pipelines, Dataflows Gen2, and PySpark/Spark notebooks
Integrate data from APIs, databases, event streams, and file-based sources
Implement the Medallion architecture with Delta Lake, Parquet, and partitioning

Modeling & Serving
Develop Lakehouses, Warehouses, and semantic models
Design optimized Power BI models (DAX, RLS/OLS, calculation groups)
Fine-tune performance for Direct Lake, Import, and DirectQuery modes

Orchestration & CI/CD
Configure schedules, dependencies, and alerts
Implement CI/CD pipelines via Azure DevOps / GitHub
Automate processes using the Fabric CLI / REST APIs

Governance, Quality & Monitoring
Ensure data quality with testing and monitoring tools
Implement lineage, governance, and security (Purview, sensitivity labels, workspace controls)

Collaboration & Documentation
Work with business and analytics teams to deliver value-driven solutions
Prepare clear documentation and technical playbooks

✅ Candidate Profile
5–6 years in Data Engineering/BI, including 1–2 years on Microsoft Fabric (or strong Synapse + Fabric experience)
Strong expertise in:
SQL (T-SQL), DAX, Python/PySpark
Medallion architecture & Delta Lake concepts (ACID, schema evolution, time travel)
CI/CD with Azure DevOps / GitHub
Power BI optimization (Direct Lake, Import, DirectQuery)
Good to Have:
Real-time dashboards using KQL/Eventstreams
Exposure to ML feature stores & scoring

📢 How to Apply
If you’re ready to be part of a client’s digital transformation journey, share your profile with us at connect@infosprucetech.com with the following details:
👉 Name | Mobile | Current & Expected CTC | Notice Period | Current Location | Total & Relevant Experience

#MicrosoftFabric #DataEngineering #PowerBI #Lakehouse #HiringNow #FabricDeveloper #Strive4X #JobAlert #BangaloreJobs #ChennaiJobs #DataPlatform #MicrosoftCareers #LinkedInJobs
💡 Looking for your next consulting challenge in Google Cloud?

We have an exciting opportunity for a Senior Consultant with expertise in Google Cloud and Apptio log-file analysis to support one of our German clients. This is a contract role requiring full-time commitment and alignment with German working hours. The consultant will play a key role in analysing log data, identifying cost savings, and engaging directly with senior stakeholders.

📌 Project Details
✅ Start: Mid-September 2025
✅ Duration: 3 months (extension possible)
✅ Contract: Contractor, remote (aligned with the German time zone)
✅ Language: English (German is a plus)

🎯 What You’ll Do
🔹 Analyse and interpret Apptio log files to identify cost-saving opportunities in Google Cloud
🔹 Optimise network and server usage for workloads moving between regions
🔹 Collaborate with the CIO to derive optimisation measures
🔹 Provide reports and documentation for decision-making
🔹 Act as a trusted advisor with strong client interaction

✅ What We’re Looking For
✨ Strong experience in Apptio (cost analysis & optimisation)
✨ Solid knowledge of Google Cloud infrastructure (servers & cross-region traffic)
✨ Background in FinOps / TBM (an advantage)
✨ Excellent communication and client-handling skills (senior-level consulting)
✨ Ability to engage confidently with C-level stakeholders

📧 How to Apply
Send your CV to connect@infosprucetech.com with the following details:
Full Name
Mobile Number
Current CTC & Expected CTC
Notice Period / Availability
Current Location
Total Exp & Relevant Exp
🚀 Senior Data Engineer – Data Vault 2.0 (Contract-to-Hire | Remote – India)
Duration: Oct–Dec 2025 (3 months) with a strong possibility of extension through 2026
Type: Contract-to-Hire
Location: Remote (India only)

About the Project
We are driving a high-impact initiative to automate data subject rights processes and expand our core Data Product. The project focuses on data privacy, compliance (GDPR), and large-scale data integrations into our enterprise data platform.

What You’ll Work On
🚦 Automate information disclosure and objection handling (data subject rights)
🔗 Integrate multiple data sources (LAR Monitor, Broker Channel, Voice Transcripts, CSC, Schufa, GridX, etc.) into the Core Data Product
🏗️ Design, adapt, and implement database architecture for our next-gen Core Data Warehouse
📊 Migrate existing data and structures into the new Data Vault 2.0–based platform
🧩 Model the Raw Vault and Business Vault layers
⚙️ Build complex SQL business transformations based on real-world requirements
🌐 Enable integration of JSON feedback, outbound contact history, and customer interest data

Must-Have Skills
✅ Hands-on expertise in Data Vault 2.0 modeling and implementation
✅ Experience with DPT (Data Product Tooling)
✅ Advanced SQL (business transformations, optimization)
✅ Data integration and migration from legacy to modern data warehouses

Nice-to-Have Skills
☁️ Snowflake (data warehouse design, tuning, transformations)
🔧 Familiarity with Data Vault automation frameworks (WhereScape, dbt, etc.)
🛡️ Experience with GDPR / compliance-driven data projects

What We’re Looking For
We need a Senior Data Engineer / Data Vault Specialist who can work independently, own EPICs end-to-end, and translate business requirements into technical solutions. Strong communication and problem-solving skills are essential, as you will be working on business-critical, compliance-driven initiatives.

Why Join Us?
🌍 Remote – India (flexibility to work from anywhere in India)
💼 Be part of a high-impact data governance project
⏳ Initial 3-month engagement with a strong extension possibility through 2026
🚀 Contract-to-Hire: a path to a long-term role

👉 Interested? Apply now or reach out to explore this opportunity!
📩 Email your profile to connect@infosprucetech.com
🌐 Learn more about us: www.infosprucetech.com
🔗 Follow us on LinkedIn: https://www.linkedin.com/company/infosprucetech/

#Hiring #JobOpportunity #ContractToHire #RemoteJobsIndia #NowHiring #DataEngineer #DataVault #SQLDeveloper #DataVault20 #DataProduct #Snowflake #DataIntegration #DataWarehouse #GDPRCompliance #DataGovernance #DataPrivacy #DataAutomation #EnterpriseData #InfospruceJobs #InfospruceTechnologies
🚀 We’re Hiring: Offshore Project Lead – Insider Personalization Implementation 🚀
📍 Location: Remote – India (or a similar time-zone overlap preferred)
⏳ Duration: 6 months (with potential extension)
🚀 Start Date: Immediate

We are looking for an experienced Project Lead with direct hands-on expertise in the Insider omnichannel personalization platform to support a leading global retail client. This is a client-facing role with high visibility and impact.

📌 Role Overview
As the Offshore Project Lead, you will manage the end-to-end implementation of the Insider platform — driving discovery, roadmap definition, technical integrations, and successful deployment. You’ll collaborate closely with marketing, digital, and technology teams to deliver business-critical personalization and customer engagement initiatives.

🛠️ Key Responsibilities
Act as the Insider platform expert and project lead for all phases of implementation
Run stakeholder discovery sessions to identify goals, use cases, and customer journeys
Build and manage detailed project plans, timelines, and communication cadences
Collaborate with client teams and Insider engineers on technical setup and integration
Support integration across web/mobile SDKs, CRM, ESP, and analytics tools
Oversee campaign configuration, UAT, training, and enablement
Provide ongoing documentation, updates, and risk management

✅ Required Qualifications
Hands-on experience with Insider platform implementation (must-have)
5+ years leading digital transformation / martech projects
Strong expertise in omnichannel personalization, journey orchestration, and segmentation
Solid knowledge of digital ecosystems (web/mobile, CRM, ESP, analytics)
Proven client-facing leadership and excellent communication skills

⭐ Preferred
Experience with US-based retail / wellness clients
Background in growth marketing, digital strategy, or CX programs
Familiarity with platforms like Salesforce Marketing Cloud, Klaviyo, etc.

If this sounds like you or someone in your network — let’s connect!
📩 Email your profile to connect@infosprucetech.com
🌐 Learn more about us: www.infosprucetech.com
🔗 Follow us on LinkedIn: https://www.linkedin.com/company/infosprucetech/

#Hiring #InsiderPlatform #ProjectLead #DigitalTransformation #Personalization #MarTech #CustomerExperience #InfospruceJobs
📢 We’re Hiring: GCP Data Engineer – BFSI Domain (Chennai)
Employment Type: Full-Time
Experience: 4+ Years
CTC: Market Standards
Location: Chennai – Work From Office (Client Location) | 5 Days a Week

🚀 About the Role
We are seeking a GCP Data Engineer to join our client's project in the BFSI domain. In this role, you will collaborate with data science teams, build scalable data pipelines, and develop models that empower business decision-making through advanced analytics.

🎯 Key Responsibilities
Collaborate with data science teams to implement complex algorithms and analytical solutions.
Use agile development practices to improve back-end systems.
Model front-end and back-end data flows to enable deeper business insights.
Build and maintain ETL pipelines that clean, transform, and aggregate data.
Develop predictive models to drive actionable insights for the BFSI business.

✅ Required Skills & Qualifications
Minimum 4 years of experience with Python, SQL, and data visualization/exploration tools.
Hands-on experience with Google Cloud big data tools – BigQuery, Airflow, CI/CD.
Strong knowledge of Spark, Hive, Core Java, Linux, and scripting languages.
Experience in ETL design, build, and maintenance.
Familiarity with analytics tools such as Tableau or R.
Strong communication skills for collaborating with technical and non-technical stakeholders.
Bachelor's degree in Computer Science, IT, Engineering, or equivalent.
Professional certification as a GCP Data Engineer or in related fields (preferred).
BFSI domain experience is highly desirable.

⭐ Why Join Us?
Opportunity to work on a prestigious BFSI project with global impact.
Collaborative work environment with cutting-edge cloud and big data technologies.
Career growth through professional certifications and upskilling programs.
Be part of a strong team delivering real-time business value.

📩 How to Apply
If this sounds like you or someone in your network — let’s connect!
📩 Email your profile to connect@infosprucetech.com
🌐 Learn more about us: www.infosprucetech.com
🔗 Follow us on LinkedIn: www.linkedin.com/company/infosprucetech/

#Hiring #ChennaiJobs #DataEngineer #GCP #BigQuery #Airflow #ETL #Python #Spark #Hive #BFSI #CareerGrowth
🚀 We’re Hiring: Sr. Cloud Engineer (AWS) 🌐

Are you a cloud expert with a passion for building scalable, secure, and cost-efficient AWS environments? Join us and be part of a fast-growing team delivering enterprise-grade cloud solutions.

📌 Role: Sr. Cloud Engineer
📌 Experience: 8+ Years
📌 Location: Remote
📌 Compensation: As per market standards
📌 Interview Process: Multiple rounds with the client
📌 Notice Period: Immediate joiners preferred

Key Skills & Responsibilities:
✅ Design, build, and maintain AWS cloud infrastructure.
✅ Proficiency in AWS services – EC2, S3, RDS, Redshift, ECS, VPC, Route 53, Load Balancers, Auto Scaling, and more.
✅ Hands-on experience with ECS/EKS clusters, Docker images, and AMIs.
✅ Automate infrastructure using Terraform and CloudFormation.
✅ Configure IAM roles, policies, and user groups.
✅ Set up and manage RDS databases (MySQL, PostgreSQL, MariaDB) and Redshift.
✅ Strong expertise in VPCs, subnets, security groups, NAT/IGW, Route 53, WAF, etc.
✅ Implement CI/CD pipelines with GitHub/Bitbucket and work with tools like ServiceNow and Jira.
✅ Optimize cloud costs using Trusted Advisor, Cost Explorer, and tagging strategies.
✅ Apply AMI upgrades and patches, and plan maintenance windows.
✅ Monitor environments with New Relic, Splunk, Prometheus, and Grafana.
✅ Document SOPs and knowledge bases, and ensure cloud security best practices.
✅ Strong communication and independent problem-solving skills.

If you’re ready to take ownership of large-scale cloud environments and grow with a dynamic team, we’d love to hear from you!

📩 Email your profile to connect@infosprucetech.com
🌐 Learn more about us: www.infosprucetech.com
🔗 Follow us on LinkedIn: www.linkedin.com/company/infosprucetech/

#Hiring #CloudEngineer #AWS #DevOps #RemoteJobs #CloudComputing #Terraform #EKS #SeniorRole #InfospruceJobs #InfospruceTechnologies
🚀 Hiring: Java Developer – Hybrid (Chennai/Bangalore)

We are looking for an experienced Java Developer to join our client on a full-time basis. This is a hybrid role requiring work from the client office in Chennai or Bangalore.

📍 Location: Chennai / Bangalore (Hybrid – work from the client office)
🕒 Experience: 4+ years
💼 Employment Type: Full-Time (FTE)
⏳ Notice Period: Immediate to 15 days
💰 Budget: Limited (only shortlisted candidates will be contacted)

🔑 Key Responsibilities
Develop high-performance Java applications with JDK 21+ compliance
Build scalable systems using Java, web services, Microservices, Kafka, and Spring Boot
Work with PostgreSQL and SQL (nice to have)
Use frameworks/tools: Spring Core/MVC, Maven/Gradle, JUnit/Mockito, Swagger, Git, Jenkins, uDeploy, Docker
Automate CI/CD pipelines and DevOps processes
Apply modern security practices to safeguard applications
Contribute to API test automation (Cucumber) and quality-first engineering
Collaborate with global teams and work US-overlap hours

✅ Requirements
4+ years in Java development (JDK 21+)
Strong experience with Microservices, Kafka, and Spring Boot
Exposure to CI/CD automation and security practices
Solid problem-solving and ownership mindset
Nice to have: Angular

✨ This is a unique opportunity to work with a leading global client on a cutting-edge trading platform in the crypto space.

📩 Apply by sending your CV to connect@infosprucetech.com
🌐 Learn more: www.infosprucetech.com
🔗 Follow us: https://lnkd.in/gGvkMmDP

#JavaDeveloper #SpringBoot #Kafka #Microservices #ChennaiJobs #BangaloreJobs #CryptoTrading #Hiring #Infospruce
🚀 Hiring: C++ Developer (8+ Years) – Kernel Programming and Memory Management
📍 Remote (India) | 💼 Full-Time

About the Role
We’re looking for an experienced C++ Developer with strong expertise in kernel programming and memory management. The ideal candidate will have hands-on experience in system-level programming, kernel internals, and performance optimization.

✅ Key Responsibilities
• Design, develop, and maintain kernel modules, drivers, and OS-level components
• Work on memory management techniques (allocation, garbage collection, optimization)
• Debug, optimize, and enhance the performance of multi-threaded and distributed systems
• Collaborate with cross-functional teams on system integration
• Perform root cause analysis, performance tuning, and system debugging
• Stay current with C++, OS internals, and kernel development trends

🌟 Required Skills & Experience
• Strong proficiency in C/C++ (C++11/14/17 or higher)
• Hands-on experience with kernel programming (Linux/Windows)
• Expertise in memory management (allocators, fragmentation, optimization)
• Knowledge of multi-threading, concurrency, and synchronization
• Experience with debugging/profiling tools (GDB, Valgrind, perf, WinDbg)
• Familiarity with IPC, sockets, and system calls

💡 Good to Have
• Experience with embedded systems / RTOS
• Knowledge of CPU/GPU internals and computer architecture
• Exposure to performance profiling and security concepts
• Scripting knowledge (Python, Bash, shell)

📌 Important Notes
• Immediate / 15-day notice period preferred
• PF account mandatory for full-time employment
• Budget is limited and based on your experience and expertise

📬 Ready to Apply?
Send your resume to connect@infosprucetech.com
📌 Subject line: C++ Developer

Include in your email:
Full Name
Mobile Number
Current Location
Total Experience (Years)
Relevant Experience (Years)
Current Company
Current CTC (LPA)
Expected CTC (LPA)
Notice Period (Days)
PF account (Yes/No)
Remote (Yes/No)
UK shift availability (Yes/No)
Full-time confirmation (Yes/No)

🌐 Learn more: www.infosprucetech.com
🔗 Follow us: www.linkedin.com/company/infosprucetech/

#Hiring #CPlusPlus #CPlusPlusJobs #KernelProgramming #MemoryManagement #SystemProgramming #RemoteJobsIndia #TechJobs #DeveloperJobs #FullTimeJobs #SoftwareEngineering #LinuxKernel #WindowsKernel #HiringNow #JobOpportunity #InfospruceJobs #InfospruceTechnologies
🚀 We’re Hiring: Data Engineer / MLOps Engineer (5+ Years Experience)
📍 Location: Remote
📅 Contract: 6 Months (with possible extension)
⚡ Immediate Joiners Only (0–15 Days Preferred)
✅ BGV Clearance Mandatory
🤝 Full-time commitment required (no parallel employment or freelance work)

Are you passionate about building and scaling ML/data pipelines in the cloud? Join our team to work on high-impact, production-grade data and ML platforms!

✅ Required Experience (5+ Years in any of the below)
Software Engineering
Data Engineering
MLOps / ML Platform Engineering

🔹 Must-Have Skills
✔ Strong experience with ML pipeline orchestration (Airflow, Kubeflow, MLflow)
✔ Proficiency in Python
✔ Familiarity with ML frameworks (TensorFlow, PyTorch)
✔ Cloud platforms (AWS / Azure / GCP)
✔ Infrastructure as Code (Terraform)
✔ Solid understanding of DevOps and observability principles
✔ Experience with Docker and Kubernetes

📩 Interested? Send your resume to connect@infosprucetech.com with:
Job Title
Current CTC
Expected CTC
Notice Period
BGV clearance confirmation

🔁 Know someone who’s a great fit? Please share or refer!
🌐 Learn more: www.infosprucetech.com
🔗 Follow us: www.linkedin.com/company/infosprucetech/

#MLOps #DataEngineer #MLflow #Kubeflow #Airflow #Python #TensorFlow #PyTorch #AWS #Azure #GCP #Terraform #Docker #Kubernetes #DevOps #RemoteJobs #Hiring #InfospruceTech
🚀 We’re Hiring: Azure Data Engineer | 5+ Years Experience | Full-Time (FTE)
📍 Locations (Hybrid Work Model): Chennai | Mumbai | Pune | Ahmedabad | Gurgaon | Noida
⏰ Experience: 5+ Years
🧠 Role Type: Full-Time | Hybrid
📅 Notice Period: Immediate to 15 Days preferred

💼 About the Role
We are looking for a results-driven Azure Data Engineer with proven experience in designing, developing, and deploying end-to-end data engineering, BI, and cloud-based ETL/ELT solutions. The ideal candidate will have deep expertise in Azure Synapse, Azure Data Factory (ADF), and SQL Server, along with hands-on experience in Oracle Cloud data migration projects involving domains such as Item Master, Accounts Receivable (AR), and On-Hand Inventory.

✅ Key Responsibilities
Design and build scalable data engineering and BI solutions across cloud and on-premise environments
Develop and orchestrate ADF and Synapse pipelines for large-scale data movement
Perform data modeling, transformation, and optimization for analytics/reporting
Execute Oracle Cloud migrations (Item Master, AR, On-Hand Inventory)
Optimize ETL/ELT processes and complex SQL queries for performance
Collaborate with cross-functional and onshore teams to deliver high-quality solutions

🔧 Required Skills & Expertise
Azure Synapse Analytics, Azure Data Factory (ADF)
SQL Server and performance tuning
Data modeling and transformation
Oracle Cloud migration experience
Strong problem-solving and analytical skills
Excellent communication and stakeholder collaboration

🎯 Why Join Us?
Work on large-scale cloud data projects
Collaborative hybrid work environment
Opportunity to influence technology decisions
Growth-oriented, dynamic team

📩 Interested? Apply Now!
Share your CV with connect@infosprucetech.com
👉 Mention Current CTC | Expected CTC | Notice Period | Location Preference
💡 Know someone perfect for this role? Refer & make an impact!

#AzureDataEngineer #DataEngineering #AzureSynapse #ADF #ETL #ELT #SQLServer #OracleCloud #DataMigration #HybridJobs #HiringNow #InfospruceTech #ChennaiJobs #MumbaiJobs #PuneJobs #GurgaonJobs #NoidaJobs #AhmedabadJobs #BI #CloudEngineering #ImmediateJoiners

🌐 Learn more: www.infosprucetech.com
🔗 Follow us: www.linkedin.com/company/infosprucetech/
🚀 Hiring: Databricks Data Engineer | 6-Month Contract (Extendable) | Remote – India | Immediate Start | German Shift
📍 Location: India (Remote)
🕐 Shift: German Business Hours
⏳ Contract Duration: 6 Months (extendable based on performance)
🧾 Engagement: Contract | BGV Clearance Mandatory

We’re seeking a Data Engineer with strong experience in Databricks and PySpark to support an ongoing cloud data transformation program. The ideal candidate is hands-on in building, optimizing, and validating scalable data pipelines on modern cloud platforms.

✅ Key Skills & Experience
5+ years of hands-on experience in data engineering roles
Strong expertise in Databricks, Python, PySpark, SQL, and Spark SQL
Experience with cloud data platforms (Azure / AWS / GCP)
Solid understanding of distributed computing and big-data technologies
Familiarity with CI/CD pipelines and version control (Git / Bitbucket)

🚀 Immediate joiners preferred
🔎 BGV clearance is mandatory

📬 Ready to Apply?
Send your resume to connect@infosprucetech.com
📌 Subject line: Databricks Data Engineer

Include in your email:
Full Name
Mobile Number
Current Location
Total Experience (Years)
Relevant Experience (Years)
Current Company
Current CTC (LPA)
Expected CTC (LPA)
Notice Period (Days)
PF account (Yes/No)
German shift availability (Yes/No)

🌐 Learn more: www.infosprucetech.com
🔗 Follow us: www.linkedin.com/company/infosprucetech/

#Databricks #DataEngineer #PySpark #Azure #BigData #ContractJobs #RemoteJobs #HiringNow #GermanShift #InfospruceTechnologies #DataEngineering #ImmediateJoiner
✅ Contract Opportunity – Tester for a Data Migration Project (On-Prem to Cloud)
⏳ Duration: 6 Months | 🌍 Location: India (Remote) | ⚡ Start: ASAP
🕒 Work Mode: Remote with PST overlap

We are looking for a QA Tester with hands-on experience testing data migration projects (on-prem to cloud) and validating ETL pipelines and data quality. The ideal candidate is skilled in automation testing for UI and backend workflows, with a strong foundation in functional and regression testing. If that sounds like you, this role is for you! 🚀

Key Responsibilities
Perform end-to-end validation of data migration from on-premise systems to cloud platforms (Azure/AWS/GCP).
Conduct ETL pipeline testing — covering transformation logic, mapping, reconciliation, and lineage validation.
Perform data validation and reconciliation across staging, transformation, and target layers.
Design and execute UI test automation using tools like Selenium, Cypress, or Playwright.
Develop and maintain automation frameworks (Selenium/BDD/Cucumber) for functional and regression test suites.
Prepare and maintain test plans, test cases, and defect reports for assigned modules.
Integrate automated scripts into CI/CD pipelines (Azure DevOps, Jenkins, or GitHub Actions).
Collaborate with developers, data engineers, and business analysts to ensure end-to-end testing quality.
Deliver test results, reconciliation reports, and migration readiness sign-offs to stakeholders.

Must-Have Skills
Proven experience in data migration testing (on-prem to cloud) — including ETL pipeline validation, data quality, and record reconciliation.
Strong knowledge of ETL tools such as ADF, Informatica, Talend, or SSIS.
Hands-on experience in UI test automation using Selenium/Cypress/Playwright.
Proficiency in functional and regression testing.
Strong SQL skills for data validation and reconciliation.
Exposure to CI/CD tools such as Azure DevOps, GitHub Actions, or Jenkins.
Familiarity with Cosmos DB or other NoSQL databases is an advantage.

Nice-to-Have
Knowledge of Healthcare/Life Sciences or similar data-sensitive domains.
Experience in API testing (Postman, REST Assured).
Exposure to Python/Java/JavaScript for automation scripting.

📦 Deliverables
📌 Test Plans & Test Cases
📌 Automated Test Suites
📌 Defect Reports
📌 ETL Validation Results (Reconciliation & Lineage)
📌 Migration Readiness Sign-off

📩 Interested / Available?
Send your CV to connect@infosprucetech.com with:
✔ Notice Period
✔ Current Location
✔ Expected Rate

🌐 Learn more: www.infosprucetech.com
🔗 Follow us: www.linkedin.com/company/infosprucetech/

#ETLTesting #DataMigration #AutomationTesting #ContractJobs #RemoteJobs #TestingJobs #ADF #Informatica #Selenium #Azure #Dataverse #InfospruceJobs