Mumbai Metropolitan Region
Not disclosed
On-site
Full Time
Job Title: Data Scientist
Location: Pune / Mumbai (On-site)
Experience: 3 to 4 Years
Notice Period: Immediate Joiners Only
Working Days: 5 Days a Week (On-site)

Company Overview
Optimum Data Analytics is a strategic technology partner committed to delivering reliable, turnkey AI solutions. With a streamlined development approach, we ensure quality results and client satisfaction. Our mission is to empower human decision-making through analytics and AI. Our diverse team includes statisticians, computer science engineers, data scientists, and product managers. We combine technical expertise with cultural alignment and business understanding to drive innovation in AI/ML across the service sector.

Role Summary
We are seeking a proactive and experienced Data Scientist who can contribute from day one. You'll work on high-impact projects using machine learning and big data technologies to solve complex business problems and drive actionable insights.

Key Responsibilities
Apply classical machine learning techniques (e.g., regression, classification) to real-world problems
Perform time series analysis and forecasting for business-critical applications
Work with large-scale datasets in distributed computing environments
Develop, optimize, and deploy models using Databricks and PySpark
Collaborate with cross-functional teams to understand business requirements and translate them into analytical solutions
Present insights and recommendations clearly to both technical and non-technical stakeholders

Must-Have Skills
Strong hands-on experience with classical machine learning techniques
Proficient in Databricks and PySpark
Solid knowledge of time series analysis and forecasting
Proven ability to work with large datasets and distributed systems
Excellent problem-solving, communication, and team collaboration skills

Why Join Us?
Be part of a fast-growing company redefining AI/ML in the service industry
Work alongside passionate and skilled professionals in a dynamic environment
Opportunity to work on impactful, real-world problems and cutting-edge solutions

Skills: data scientist, classical machine learning techniques, large-scale datasets handling, databricks, time series analysis, distributed systems, pyspark, machine learning, forecasting
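The responsibilities above pair classical forecasting with PySpark on Databricks. A minimal sketch of that workflow, assuming an active Spark session and a hypothetical daily_sales Delta table with ds (date) and y (metric) columns; all names are illustrative, not part of the posting:

```python
# Minimal Databricks-style forecasting sketch: aggregate at scale with PySpark,
# then fit a classical time series model on the reduced series.
import pandas as pd
from pyspark.sql import SparkSession, functions as F
from statsmodels.tsa.holtwinters import ExponentialSmoothing

spark = SparkSession.builder.getOrCreate()

# Hypothetical Delta table with ds (date) and y (value) columns.
pdf = (
    spark.table("daily_sales")
    .groupBy("ds").agg(F.sum("y").alias("y"))
    .orderBy("ds")
    .toPandas()
)
pdf["ds"] = pd.to_datetime(pdf["ds"])
series = pdf.set_index("ds")["y"].asfreq("D")

# Classical Holt-Winters model with weekly seasonality.
fit = ExponentialSmoothing(series, trend="add", seasonal="add", seasonal_periods=7).fit()
print(fit.forecast(14))  # two-week forecast
```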
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Company Overview
Optimum Data Analytics is a strategic technology partner committed to delivering reliable, turnkey AI solutions. With a streamlined development approach, we ensure quality results and client satisfaction. Our mission is to empower human decision-making through analytics and AI. Our diverse team includes statisticians, computer science engineers, data scientists, and product managers. We combine technical expertise with cultural alignment and business understanding to drive innovation in AI/ML across the service sector.

Job Details
Job Title: Tableau BI Tech Lead
Location: Pune (On-site)
Experience: 12+ Years
Shift Timings: 2pm to 10pm IST (overlap with US time)

Role Summary
We are looking for a tech lead with an architect's mindset and hands-on experience building solutions. The client is not looking for a typical lead/senior developer who only coordinates or manages the team, but a candidate with knowledge beyond building reports. The candidate should have experience working on a product where Tableau reports are served to clients via a web application (embedded dashboards), i.e.:
Experience in embedded analytics using Tableau, and integrating Tableau with React JS
Exposure to plugin development or integration with Tableau
Seamless integration with AI models and other components

Skills: plugin, plugin development, ai models, embedded dashboard, web application development, tableau, ai integration, embedded analytics, react js, integrating
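Embedded Tableau dashboards are usually authenticated server-side before the view renders inside the web app. A hedged sketch of Tableau's trusted-authentication handshake, assuming a Tableau Server already configured to issue trusted tickets; the host name, username, and view path are placeholders:

```python
# Request a trusted ticket from Tableau Server, then build an embed URL that a
# React front end could load in an embedding component or iframe.
import requests

TABLEAU_SERVER = "https://tableau.example.com"   # placeholder host

def trusted_embed_url(username: str, view_path: str) -> str:
    # Tableau returns a one-time ticket string, or "-1" on failure.
    resp = requests.post(f"{TABLEAU_SERVER}/trusted", data={"username": username})
    ticket = resp.text.strip()
    if ticket == "-1":
        raise RuntimeError("Tableau refused to issue a trusted ticket")
    return f"{TABLEAU_SERVER}/trusted/{ticket}/{view_path}"

print(trusted_embed_url("analyst@example.com", "views/Sales/Overview"))
```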
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Company Description
Maantic Inc is a global strategy and solutions integration firm headquartered in California. Maantic specializes in the implementation of business applications including BPM, CRM, RPA, and AI/ML. Maantic's core focus is on Pega, Salesforce, Appian, Robotic Process Automation, ServiceNow, and Digital Marketing. Maantic also works with many product companies, helping them in both Engineering and IT.

Job Details
Position: QA Automation Lead
Work Mode: Hybrid
Experience: 11+ Years
Payroll: Maantic
Location: Hyderabad (Hitech City - 500081)
Notice Period: Immediate
Interview: First test round, L1 (virtual round), and final round F2F (EA Office)

Key Skills
QA Automation
Selenium/Java
JavaScript
API Testing
Functional Testing
Manual Testing

Job Description
10+ years' overall experience in testing enterprise web and responsive applications
3+ years as a QA Lead
At least 6+ years of experience developing test automation frameworks and scripts (Java/JavaScript)
Work with stakeholders to understand business requirements and translate them into QE deliverables
Good experience with Agile, QA standards, and processes
Excellent verbal and written communication skills
Good to have: exposure to performance testing and any cloud platform (AWS/Google Cloud/Azure)
Should be flexible to operate in dual mode as manual/automation QA based on project needs

Skills: functional testing, qa automation, javascript, cloud, api testing, selenium, testing, manual testing, java, api
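The frameworks described above are in Java/JavaScript; purely as an illustration of the page-object pattern such frameworks automate, here is a minimal Selenium sketch in Python (the URL and element IDs are hypothetical):

```python
# Tiny page-object-style Selenium check; locators and URL are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

class LoginPage:
    def __init__(self, driver):
        self.driver = driver

    def login(self, user: str, password: str) -> None:
        self.driver.find_element(By.ID, "username").send_keys(user)
        self.driver.find_element(By.ID, "password").send_keys(password)
        self.driver.find_element(By.ID, "submit").click()

driver = webdriver.Chrome()
driver.get("https://app.example.com/login")   # placeholder URL
LoginPage(driver).login("qa_user", "secret")
assert "Dashboard" in driver.title            # simple functional assertion
driver.quit()
```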
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Job Description for IT Support Analyst
Job Location: DGS, Kharadi, Pune
Work Mode: 5 days work from office

Key Responsibilities
Provides an interface between the business and technology, monitoring customer engagement channels and responding within set targets
Provides customers with an exceptional experience
Adheres to standard processes to ensure contacts are accurately prioritised, recorded, assigned, updated, and resolved within service targets
Builds technical knowledge and ensures it is effectively applied to support the business with a wide range of foundational, occasionally complex, requests
Applies effective and timely escalation

Skills: customer service, technical support, time management, problem solving, escalation, customer engagement, communication, technology
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Note: Candidates with a career gap of no more than 6 months will be considered.

Key Skills
QA Automation
Selenium/Java
JavaScript
API Testing
Functional Testing
Manual Testing

Job Description
10+ years' overall experience in testing enterprise web and responsive applications
3+ years as a QA Lead
At least 6+ years of experience developing test automation frameworks and scripts (Java/JavaScript)
Work with stakeholders to understand business requirements and translate them into QE deliverables
Good experience with Agile, QA standards, and processes
Excellent verbal and written communication skills
Good to have: exposure to performance testing and any cloud platform (AWS/Google Cloud/Azure)
Should be flexible to operate in dual mode as manual/automation QA based on project needs

Skills: selenium, functional testing, selenium/java, manual testing, java, qa automation, javascript, api testing
India
Not disclosed
Remote
Contractual
Technical Skills & Expertise
Programming: Expert-level Python (5+ years) – pandas, NumPy, Scikit-learn, FastAPI; strong in SQL and NoSQL (MongoDB, DynamoDB)
Machine Learning & Deep Learning: Hands-on with Scikit-learn, PyTorch, and TensorFlow; fine-tuning LLMs using frameworks like HuggingFace Transformers; experience with agentic AI workflows (e.g., LangChain, AutoGPT)
Cloud Platforms: AWS (SageMaker, Bedrock), GCP (Vertex AI, Gemini), Azure AI Studio; model deployment using managed services and containers (Docker, ECS, Cloud Run); CI/CD pipelines for ML (GitHub Actions, SageMaker Pipelines, Vertex Pipelines)
OpenAI & Generative AI: Prompt engineering, retrieval-augmented generation (RAG), embeddings; integration with OpenAI APIs and model fine-tuning (Davinci, GPT-4-turbo, Gemini)

Soft Skills & Mindset
Strong research mindset and hunger to explore emerging trends (RLHF, SLMs, foundation models)
Proven experience working in fast-paced, agile remote teams
Architecture mindset – capable of designing scalable AI/ML systems
Business-first thinking – aligns AI solutions with real-world outcomes

Skills: sagemaker pipelines, ecs, pytorch, python, nosql, openai apis, sql, aws, vertex pipelines, prompt engineering, langchain, huggingface transformers, docker, tensorflow, gcp, retrieval-augmented generation, azure, embeddings, scikit-learn, cloud run, autogpt, github actions
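This posting centers on RAG with embeddings and the OpenAI APIs. A minimal sketch of the retrieve-then-generate loop, assuming the openai Python SDK (v1 style) and a tiny in-memory corpus; the documents and model names are examples only:

```python
# Minimal retrieval-augmented generation loop over an in-memory corpus.
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
docs = ["Refunds are processed in 5 days.", "Support hours are 9am-6pm IST."]

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

doc_vecs = embed(docs)

def answer(question: str) -> str:
    q = embed([question])[0]
    # Cosine similarity to pick the most relevant document as context.
    scores = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q))
    context = docs[int(scores.argmax())]
    chat = client.chat.completions.create(
        model="gpt-4-turbo",
        messages=[
            {"role": "system", "content": f"Answer using this context: {context}"},
            {"role": "user", "content": question},
        ],
    )
    return chat.choices[0].message.content

print(answer("How long do refunds take?"))
```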
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Company Description
Optimum Data Analytics is a strategic technology partner delivering reliable turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics & AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details
Position: Python Full Stack Developer
Experience: 6+ years
Work Mode: On-site
Location: Viman Nagar Rd, Pune
Must-Have Skills: Python, NumPy & Pandas, Angular, SQL, Flask/FastAPI

Job Responsibilities
Design, develop, and maintain scalable Python applications and systems
Collaborate with cross-functional teams to design and implement features and functionality
Utilize NumPy & Pandas for data manipulation and analysis
Develop, test, and deploy full-stack web applications using Angular for the frontend
Integrate with SQL databases, writing optimized queries to interact with large data sets
Perform data processing and transformation tasks using Python libraries and ensure data accuracy and consistency
Troubleshoot and debug issues in code, optimizing performance as necessary
Adhere to best coding practices and participate in code reviews to ensure high-quality deliverables
Collaborate with team members to deliver features in an agile environment
Ensure the smooth deployment of applications and continuous integration/continuous delivery (CI/CD)

Key Skills
Python: Strong proficiency in Python programming and its application in both backend and data processing
NumPy & Pandas: Expertise in data manipulation, cleaning, and transformation using NumPy and Pandas
Web Development: Experience with frontend technologies like Angular and ReactJS to build responsive and dynamic web applications
SQL: Strong understanding of SQL to design and query databases efficiently, with knowledge of database optimization
Problem Solving: Ability to think critically and develop efficient, scalable solutions
Version Control: Familiarity with Git or other version control systems
Teamwork: Excellent communication and collaboration skills to work in an agile team environment

Skills: python, pandas, numpy, fastapi, sql, angular, flask
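The role pairs a Flask/FastAPI backend and pandas-based data processing with an Angular frontend. A minimal sketch of one such endpoint, assuming a hypothetical sales.csv with region and amount columns; file and route names are illustrative:

```python
# Minimal FastAPI endpoint that serves a pandas aggregation as JSON.
import pandas as pd
from fastapi import FastAPI, HTTPException

app = FastAPI()
df = pd.read_csv("sales.csv")  # hypothetical dataset loaded at startup

@app.get("/totals/{region}")
def region_total(region: str):
    subset = df[df["region"] == region]
    if subset.empty:
        raise HTTPException(status_code=404, detail="unknown region")
    return {"region": region, "total": float(subset["amount"].sum())}

# Run with: uvicorn main:app --reload
```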
India
Not disclosed
Remote
Contractual
Job Title: SAP Business Analyst with SAP S/4HANA (Policies and Procedures)
Location: Remote
Experience: 6+ Years in a BA role

Key Responsibilities
Define, document, and align new policies and procedures to support S/4HANA functionality, ensuring compliance with industry standards and governance frameworks
Assist in managing organizational change, developing training materials, and providing user training to ensure smooth adoption of new policies and post-migration processes
Assess risks related to policies and procedures, collaborate on risk mitigation strategies, and resolve issues that arise during the migration process
Ensure data governance during migration, support data mapping from legacy systems to S/4HANA, and validate data integrity against defined policies and procedures
Create detailed documentation for policies and procedures, and report on migration progress, issues, and compliance to senior management
Work closely with SAP consultants to ensure system configuration aligns with business requirements, so that policies and procedures are well supported by S/4HANA

Qualifications
Bachelor's degree in Business Administration, Information Systems, Finance, or a related field
Experience in business analysis or policy development, preferably in SAP S/4HANA environments
Proficiency in Signavio for process modeling, documentation, and analysis is required
Demonstrated experience in policy and procedure development
Sufficient knowledge of SAP S/4HANA functionality

Skills: risk assessment, data governance, business requirements, signavio, training materials development, sap s/4 hana, process modeling, sap, hana, policy development, documentation, business analysis
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Company Description
Optimum Data Analytics is a strategic technology partner delivering reliable turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics & AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details
Position: Data Engineer (Azure)
Experience: 5-8 Years
Work Mode: On-site
Location: Pune or Mohali

Must Have
Data Warehousing, Data Lake, Azure Cloud Services, Azure DevOps
ETL (SSIS), ADF, Synapse, SQL Server, Azure SQL
Data transformation, modelling, ingestion, and integration
Microsoft Certified: Azure Data Engineer Associate

Required Skills and Experience
5-8 years of experience as a Data Engineer, focusing on Azure cloud services
Bachelor's degree in Computer Science, Information Technology, or a related field
Strong hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, and Azure Storage
Strong SQL skills, including experience with data modeling, complex queries, and performance optimization
Ability to work independently and manage multiple tasks simultaneously
Familiarity with version control systems (e.g., Git) and CI/CD pipelines (Azure DevOps)
Knowledge of Data Lake architecture, data warehousing, and data modeling principles
Experience with RESTful APIs, Data APIs, and event-driven architecture
Familiarity with data governance, lineage, security, and privacy best practices
Strong problem-solving, communication, and collaboration skills

Skills: event-driven architecture, data transformation, modeling, data governance, azure devops, azure, azure cloud services, restful apis, sql, data warehousing, azure sql, azure data factory, data modeling, data security, ingestion, data lake architecture, data privacy, synapse, etl-ssis, data apis, integration, data, data lake, adf, sql server, data lineage
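Much of ADF work happens in the portal, but pipeline runs are typically triggered and monitored from code in CI/CD. A hedged sketch using the azure-mgmt-datafactory SDK; the subscription ID, resource group, factory, and pipeline names are all placeholders:

```python
# Trigger an Azure Data Factory pipeline run and poll its status.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUB, RG, FACTORY = "<subscription-id>", "rg-data", "adf-demo"  # placeholders

client = DataFactoryManagementClient(DefaultAzureCredential(), SUB)
run = client.pipelines.create_run(RG, FACTORY, "CopySalesPipeline")  # assumed pipeline name

while True:
    status = client.pipeline_runs.get(RG, FACTORY, run.run_id).status
    print("pipeline status:", status)
    if status in ("Succeeded", "Failed", "Cancelled"):
        break
    time.sleep(15)
```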
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Company Description
Optimum Data Analytics is a strategic technology partner delivering reliable turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics & AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details
Position: Sr. Data Engineer (MS Fabric)
Experience: 5-7 Years
Work Mode: On-site
Location: Pune or Mohali

Must Have
Power BI, Azure Cloud Services, MS Fabric, SQL, ETL, Python
Data transformation, modelling, ingestion, and integration

Required Skills and Experience
Collaborate with cross-functional teams, including data analysts, data scientists, and business stakeholders, to understand their data requirements and deliver effective solutions
Leverage Fabric Lakehouse for data storage, governance, and processing to support Power BI and automation initiatives
Expertise in data modeling, with a strong focus on data warehouse and lakehouse design
Design and implement data models, warehouses, and databases using MS Fabric, Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services
Develop ETL (Extract, Transform, Load) processes using SQL Server Integration Services (SSIS), Azure Synapse Pipelines, or similar tools to prepare data for analysis and reporting
Implement data quality checks and governance practices to ensure accuracy, consistency, and security of data assets
Monitor and optimize data pipelines and workflows for performance, scalability, and cost efficiency, utilizing Microsoft Fabric for real-time analytics and AI-powered workloads
Strong proficiency in Business Intelligence (BI) tools such as Power BI, Tableau, and other analytics platforms
Experience with data integration and ETL tools like Azure Data Factory
Proven expertise in Microsoft Fabric or similar data platforms
In-depth knowledge of the Azure cloud platform, particularly in data warehousing and storage solutions

Skills: sql, etl, datalake, data transformation, ssis, python, data integration, data ingestion, azure cloud services, data modeling, business intelligence, azure synapse analytics, azure data factory, azure data lake storage, power bi, azure, ms fabric
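The ETL and data-quality responsibilities above map naturally onto PySpark in a Fabric or Synapse notebook. A minimal sketch, assuming an active Spark session; the source path, columns, quality threshold, and target table are illustrative:

```python
# Minimal lakehouse ETL step: ingest raw CSV, enforce a quality rule, write Delta.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw = spark.read.option("header", True).csv("Files/raw/orders.csv")  # assumed path

clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())          # simple DQ rule
       .withColumn("ingested_at", F.current_timestamp())
)

# Fail fast if the quality rule rejected too many rows.
if clean.count() < raw.count() * 0.95:
    raise ValueError("More than 5% of rows failed data quality checks")

clean.write.format("delta").mode("append").saveAsTable("silver_orders")
```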
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Company Description
Optimum Data Analytics is a strategic technology partner delivering reliable turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics & AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details
Position: Support Engineer – AI & Data
Experience: 5-8 Years
Work Mode: On-site
Location: Pune or Mohali

Job Overview
We are seeking a motivated and talented Support Engineer to join our AI & Data team. This role offers a unique opportunity to gain hands-on experience with the latest tooling, quality documentation preparation, and Software Development Lifecycle responsibilities. If you are passionate about technology and eager to apply your knowledge in a real-world setting, this role is perfect for you.

Key Responsibilities
Collaborate with the AI & Data team to support various projects
Utilize MS Office tools for documentation and project management tasks
Assist in the development, testing, deployment, and support of BI solutions
Take part in ITIL process management
Prepare and maintain high-quality documentation for various processes and projects
Stay updated with the latest industry trends and technologies to contribute innovative ideas

Essential Requirements
Experience in SQL, Azure Data Factory (ADF), and data modeling is a must
Experience in Logic Apps and Azure integrations is nice to have
Good communication skills; will connect with stakeholders directly
Strong critical thinking and problem-solving skills
Certification in any industry-relevant skills is an advantage

Preferred Skills and Qualifications
Strong understanding of software development and testing principles
Familiarity with data warehousing concepts and technologies
Excellent written and verbal communication skills
Ability to work both independently and as part of a team
Attention to detail and strong organizational skills

What We Offer
Hands-on experience with the latest digital tools and technologies
Exposure to real-world projects and industry best practices
Opportunities to prepare and contribute to quality documentation
Experience in SDET responsibilities, enhancing your software testing and development skills
Mentorship from experienced professionals in the field

Skills: management, development, ai, ms office, data modeling, azure, testing, data, software development lifecycle, documentation, itil process management, azure data factory, itil, sql, data warehousing, logic apps, azure integrations
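Support work on SQL and ADF often boils down to routine data checks. Purely as a hedged illustration (the server, database, table, and audit column are placeholders), a small pyodbc script that flags a table that has never loaded:

```python
# Simple support check: alert if a target table has no recorded load.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sql-demo.database.windows.net;DATABASE=dw;"   # placeholder server/db
    "Authentication=ActiveDirectoryInteractive;"
)
row = conn.cursor().execute(
    "SELECT MAX(load_date) FROM dbo.fact_sales"           # hypothetical audit column
).fetchone()

print("last load:", row[0])
if row[0] is None:
    print("ALERT: table has never loaded - escalate per ITIL process")
```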
Mohali district, India
Not disclosed
On-site
Full Time
Company Description
Optimum Data Analytics is a strategic technology partner delivering reliable turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics & AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details
Position: Support Engineer – AI & Data
Experience: 5-8 Years
Work Mode: On-site
Location: Pune or Mohali

Job Overview
We are seeking a motivated and talented Support Engineer to join our AI & Data team. This role offers a unique opportunity to gain hands-on experience with the latest tooling, quality documentation preparation, and Software Development Lifecycle responsibilities. If you are passionate about technology and eager to apply your knowledge in a real-world setting, this role is perfect for you.

Key Responsibilities
Collaborate with the AI & Data team to support various projects
Utilize MS Office tools for documentation and project management tasks
Assist in the development, testing, deployment, and support of BI solutions
Take part in ITIL process management
Prepare and maintain high-quality documentation for various processes and projects
Stay updated with the latest industry trends and technologies to contribute innovative ideas

Essential Requirements
Experience in SQL, Azure Data Factory (ADF), and data modeling is a must
Experience in Logic Apps and Azure integrations is nice to have
Good communication skills; will connect with stakeholders directly
Strong critical thinking and problem-solving skills
Certification in any industry-relevant skills is an advantage

Preferred Skills and Qualifications
Strong understanding of software development and testing principles
Familiarity with data warehousing concepts and technologies
Excellent written and verbal communication skills
Ability to work both independently and as part of a team
Attention to detail and strong organizational skills

What We Offer
Hands-on experience with the latest digital tools and technologies
Exposure to real-world projects and industry best practices
Opportunities to prepare and contribute to quality documentation
Experience in SDET responsibilities, enhancing your software testing and development skills
Mentorship from experienced professionals in the field

Skills: management, development, ai, ms office, data modeling, azure, testing, data, software development lifecycle, documentation, itil process management, azure data factory, itil, sql, data warehousing, logic apps, azure integrations
Mohali district, India
Not disclosed
On-site
Full Time
Company Description
Optimum Data Analytics is a strategic technology partner delivering reliable turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics & AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details
Position: Sr. Data Engineer (MS Fabric)
Experience: 5-7 Years
Work Mode: On-site
Location: Pune or Mohali

Must Have
Power BI, Azure Cloud Services, MS Fabric, SQL, ETL, Python
Data transformation, modelling, ingestion, and integration

Required Skills and Experience
Collaborate with cross-functional teams, including data analysts, data scientists, and business stakeholders, to understand their data requirements and deliver effective solutions
Leverage Fabric Lakehouse for data storage, governance, and processing to support Power BI and automation initiatives
Expertise in data modeling, with a strong focus on data warehouse and lakehouse design
Design and implement data models, warehouses, and databases using MS Fabric, Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services
Develop ETL (Extract, Transform, Load) processes using SQL Server Integration Services (SSIS), Azure Synapse Pipelines, or similar tools to prepare data for analysis and reporting
Implement data quality checks and governance practices to ensure accuracy, consistency, and security of data assets
Monitor and optimize data pipelines and workflows for performance, scalability, and cost efficiency, utilizing Microsoft Fabric for real-time analytics and AI-powered workloads
Strong proficiency in Business Intelligence (BI) tools such as Power BI, Tableau, and other analytics platforms
Experience with data integration and ETL tools like Azure Data Factory
Proven expertise in Microsoft Fabric or similar data platforms
In-depth knowledge of the Azure cloud platform, particularly in data warehousing and storage solutions

Skills: sql, etl, datalake, data transformation, ssis, python, data integration, data ingestion, azure cloud services, data modeling, business intelligence, azure synapse analytics, azure data factory, azure data lake storage, power bi, azure, ms fabric
Jaipur, Rajasthan, India
Not disclosed
On-site
Full Time
Must Have: ServiceNow data models, data extraction architecture, API/MID integration, metadata capture

Job Summary
Deeply understands how ServiceNow data is organized across ITSM, CMDB, and change domains. Designs data extraction strategies using available ServiceNow interfaces (REST APIs, MID Server, Export Sets). Builds real-time and event-driven data feeds into Azure, ensuring pre-ingestion DQ checks, metadata tagging, and lineage capture. Previous work includes CMDB reconciliation exports, change-event monitoring, and service request automation integrations.

Relevant Tools And Certifications
ServiceNow Certified System Administrator, Azure Data Factory, Logic Apps

Skills: mid integration, rest apis, metadata capture, api integration, logic apps, api/mid integration, data extraction architecture, servicenow data models, azure data factory
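Extraction strategies like those described here usually start with the ServiceNow Table API. A minimal paged-extraction sketch against a hypothetical instance (instance URL and credentials are placeholders); each page could then be landed in Azure storage for pre-ingestion DQ checks:

```python
# Page through ServiceNow incident records via the Table API.
import requests

INSTANCE = "https://dev00000.service-now.com"   # placeholder instance
AUTH = ("api_user", "api_password")             # placeholder credentials

def extract(table: str, page_size: int = 1000):
    offset = 0
    while True:
        resp = requests.get(
            f"{INSTANCE}/api/now/table/{table}",
            params={"sysparm_limit": page_size, "sysparm_offset": offset},
            auth=AUTH,
            headers={"Accept": "application/json"},
        )
        resp.raise_for_status()
        rows = resp.json()["result"]
        if not rows:
            return
        yield rows                      # land each page in ADLS/Blob here
        offset += page_size

for page in extract("incident"):
    print(f"fetched {len(page)} incident records")
```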
Jaipur, Rajasthan, India
Not disclosed
On-site
Full Time
Must Have: ServiceNow schema knowledge, data modelling, Delta Lake, governance tagging

Job Summary
Specializes in modeling ServiceNow data for analytics. Transforms raw extracts into curated lakehouse formats, including partitioned layers, business domains, and semantic enrichment for consumption by GenAI and BI tools. Ensures schema enforcement, lineage capture, and data classification are built into the transformation logic. Experienced in building Silver and Gold zones for structured ITSM and CMDB use cases.

Relevant Tools And Certifications
Databricks, Delta Lake, Azure Data Engineer Associate

Skills: data modelling, databricks, delta lake, azure data engineer associate, servicenow schema knowledge, governance tagging, servicenow schema
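A hedged sketch of the Bronze-to-Silver step this role describes: enforcing an explicit schema on raw ServiceNow extracts and writing a partitioned, classification-tagged Delta table. The paths, columns, and table names are illustrative, and the silver database is assumed to exist:

```python
# Bronze -> Silver: enforce a schema on raw ServiceNow incident extracts.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.getOrCreate()

schema = StructType([
    StructField("sys_id", StringType(), nullable=False),
    StructField("number", StringType()),
    StructField("priority", StringType()),
    StructField("opened_at", TimestampType()),
])

silver = (
    spark.read.schema(schema).json("/lake/bronze/incident/")   # assumed raw zone
    .dropDuplicates(["sys_id"])
    .withColumn("load_date", F.to_date("opened_at"))
    .withColumn("data_classification", F.lit("internal"))      # governance tag
)

(silver.write.format("delta")
       .mode("overwrite")
       .partitionBy("load_date")
       .saveAsTable("silver.incident"))
```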
Jaipur, Rajasthan, India
Not disclosed
On-site
Full Time
Must Have: Azure platform, data architecture, IT4IT framework, governance integration

Job Summary
Owns the IT4IT lakehouse blueprint. Aligns Azure services with the customer's enterprise architecture. Designs layered architecture (raw, curated, semantic), metadata governance, lifecycle policies, and non-ServiceNow system integration (e.g., observability/logs). Delivered multi-zone lakehouses and governed mesh architectures in prior engagements.

Relevant Tools And Certifications
Azure Solutions Architect Expert, TOGAF, Microsoft Purview

Skills: lifecycle policies, metadata governance, multi-zone lake houses, governance integration, governed mesh architectures, azure platform, it4it framework, data architecture
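Purely as an illustration of the layered blueprint described above, a small configuration sketch of raw/curated/semantic zones with lifecycle and governance attributes; the storage paths, retention values, and tags are assumptions, not part of the posting:

```python
# Illustrative lakehouse zone blueprint: one dict per layer, to be consumed by
# provisioning / governance automation (all values are placeholders).
LAKEHOUSE_ZONES = {
    "raw": {
        "path": "abfss://raw@lakedemo.dfs.core.windows.net/",
        "retention_days": 30,            # short-lived landing data
        "classification": "restricted",
        "sources": ["servicenow", "observability_logs"],
    },
    "curated": {
        "path": "abfss://curated@lakedemo.dfs.core.windows.net/",
        "retention_days": 365,
        "classification": "internal",
        "format": "delta",
    },
    "semantic": {
        "path": "abfss://semantic@lakedemo.dfs.core.windows.net/",
        "retention_days": 730,
        "classification": "internal",
        "consumers": ["power_bi", "genai_apps"],
    },
}

for name, zone in LAKEHOUSE_ZONES.items():
    print(f"{name}: {zone['path']} (retain {zone['retention_days']}d)")
```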
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Must Have: Azure platform, data architecture, IT4IT framework, governance integration

Job Summary
Owns the IT4IT lakehouse blueprint. Aligns Azure services with the customer's enterprise architecture. Designs layered architecture (raw, curated, semantic), metadata governance, lifecycle policies, and non-ServiceNow system integration (e.g., observability/logs). Delivered multi-zone lakehouses and governed mesh architectures in prior engagements.

Relevant Tools And Certifications
Azure Solutions Architect Expert, TOGAF, Microsoft Purview

Skills: lifecycle policies, metadata governance, multi-zone lake houses, governance integration, governed mesh architectures, azure platform, it4it framework, data architecture
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Must Have: ServiceNow schema knowledge, data modelling, Delta Lake, governance tagging

Job Summary
Specializes in modeling ServiceNow data for analytics. Transforms raw extracts into curated lakehouse formats, including partitioned layers, business domains, and semantic enrichment for consumption by GenAI and BI tools. Ensures schema enforcement, lineage capture, and data classification are built into the transformation logic. Experienced in building Silver and Gold zones for structured ITSM and CMDB use cases.

Relevant Tools And Certifications
Databricks, Delta Lake, Azure Data Engineer Associate

Skills: data modelling, databricks, delta lake, azure data engineer associate, servicenow schema knowledge, governance tagging, servicenow schema
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Must Have: Azure platform, data architecture, IT4IT framework, governance integration

Job Summary
Owns the IT4IT lakehouse blueprint. Aligns Azure services with the customer's enterprise architecture. Designs layered architecture (raw, curated, semantic), metadata governance, lifecycle policies, and non-ServiceNow system integration (e.g., observability/logs). Delivered multi-zone lakehouses and governed mesh architectures in prior engagements.

Relevant Tools And Certifications
Azure Solutions Architect Expert, TOGAF, Microsoft Purview

Skills: lifecycle policies, metadata governance, multi-zone lake houses, governance integration, governed mesh architectures, azure platform, it4it framework, data architecture
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Must Have: ServiceNow data models, data extraction architecture, API/MID integration, metadata capture

Job Summary
Deeply understands how ServiceNow data is organized across ITSM, CMDB, and change domains. Designs data extraction strategies using available ServiceNow interfaces (REST APIs, MID Server, Export Sets). Builds real-time and event-driven data feeds into Azure, ensuring pre-ingestion DQ checks, metadata tagging, and lineage capture. Previous work includes CMDB reconciliation exports, change-event monitoring, and service request automation integrations.

Relevant Tools And Certifications
ServiceNow Certified System Administrator, Azure Data Factory, Logic Apps

Skills: mid integration, rest apis, metadata capture, api integration, logic apps, api/mid integration, data extraction architecture, servicenow data models, azure data factory