Rapidera Technologies Pvt Ltd

13 Job openings at Rapidera Technologies Pvt Ltd
GCP Data Architect Baner, Pune, Maharashtra 8 years INR Not disclosed On-site Full Time

Role Overview:
We are seeking a highly skilled GCP Data Architect with 6–8 years of experience in designing, developing, and managing enterprise data solutions on Google Cloud Platform (GCP). The ideal candidate will have a strong background in cloud data architecture, data warehousing, big data processing, and data integration, with proven expertise in delivering scalable, secure, and efficient data platforms.

Key Responsibilities:
- Design and architect end-to-end data solutions on GCP, aligning with business and technical requirements.
- Define data models, storage strategies, and data ingestion, processing, and consumption frameworks.
- Implement data lakes, data warehouses, and data marts using services like BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Composer.
- Collaborate with business stakeholders, data scientists, and engineering teams to understand data needs and translate them into scalable architectures.
- Design and implement data governance, security, and compliance frameworks for cloud-based data platforms.
- Optimize data workflows, query performance, and storage costs in the GCP environment.
- Lead data migration and modernization initiatives from on-premise or other cloud platforms to GCP.
- Stay updated on GCP services, features, and industry best practices to recommend improvements and innovation.
- Provide technical leadership and mentoring to data engineering teams.

Required Skills & Experience:
- 6–8 years of experience in data architecture and engineering roles, with at least 3 years hands-on on GCP.
- Strong expertise in GCP data services: BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, Cloud Composer, Data Catalog.
- Proficiency in data modeling, data warehousing concepts, ETL/ELT pipelines, and big data processing frameworks.
- Experience with SQL, Python, and Terraform (preferred) for infrastructure as code.
- Hands-on experience with data security, encryption, access control, and governance on GCP.
- Experience integrating with real-time data pipelines and event-driven architectures.
- Strong understanding of DevOps, CI/CD pipelines for data workflows, and cloud cost optimization.
- GCP Professional Data Engineer / Cloud Architect certification is a plus.

Good to Have:
- Exposure to AI/ML workflows and data preparation for ML models.
- Experience with third-party tools like Apache Airflow, Looker, or Dataplex.
- Knowledge of other cloud platforms (AWS, Azure) for hybrid/multi-cloud strategies.

Educational Qualification:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Engineering, or a related field.

Why Join Us:
- Work on cutting-edge data transformation programs at scale.
- Opportunity to architect high-impact solutions in a collaborative, innovation-driven environment.
- Engage with a fast-growing team focused on data-driven business value.

Job Type: Full-time
Work Location: In person
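
As a hedged illustration of the hands-on BigQuery work this role involves, here is a minimal sketch using the google-cloud-bigquery Python client to run an aggregation. The project, dataset, and table names are hypothetical, and application-default credentials are assumed; this is a sketch of the pattern, not a prescribed implementation.

```python
# Minimal sketch: running an aggregation in BigQuery from Python.
# Assumes `pip install google-cloud-bigquery` and application-default
# credentials; the project/dataset/table names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

query = """
    SELECT event_type, COUNT(*) AS event_count
    FROM `my-analytics-project.raw_events.clickstream`
    GROUP BY event_type
    ORDER BY event_count DESC
    LIMIT 10
"""

# client.query() submits the job; .result() blocks until it completes.
for row in client.query(query).result():
    print(f"{row.event_type}: {row.event_count}")
```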

Senior Golang Developer India 3 years INR 5.53038 - 17.90366 Lacs P.A. On-site Full Time

Exp: 6 to 7+ yrs

- Strong proficiency in Go (Golang) for backend development
- Experience building RESTful APIs and microservices
- Familiarity with PostgreSQL and GORM (or a similar ORM)
- Good understanding of concurrency, goroutines, and performance optimization
- Experience with cloud platforms (AWS, Azure, GCP) and API integrations
- Proficient with Git, Docker, and basic CI/CD workflows
- Ability to write clean, testable code and debug effectively
- Strong problem-solving skills and ability to work in agile teams
- 3+ years of hands-on experience with Golang in production environments
- Strong understanding of RESTful API design, microservices architecture, and distributed systems
- Experience with relational databases (PostgreSQL preferred)
- Familiarity with cloud platforms (AWS, Azure, GCP) and cost management concepts
- Comfortable with Git, Docker, CI/CD, and modern development workflows
- Experience working with APIs for billing, monitoring, or infrastructure management is a plus
- Solid understanding of software engineering principles and best practices

Nice to have:
- Knowledge of FinOps, cloud cost optimization, or billing data analysis
- Experience with Kafka, RabbitMQ, or other messaging systems
- Familiarity with Infrastructure as Code tools (Terraform, Pulumi)
- Exposure to open-source LLM/AI integrations is a plus

Job Type: Full-time
Pay: ₹553,037.71 - ₹1,790,365.67 per year
Work Location: In person

GCP Data Architect India 6 - 8 years INR Not disclosed On-site Full Time

Role Overview:
We are seeking a highly skilled GCP Data Architect with 6–8 years of experience in designing, developing, and managing enterprise data solutions on Google Cloud Platform (GCP). The ideal candidate will have a strong background in cloud data architecture, data warehousing, big data processing, and data integration, with proven expertise in delivering scalable, secure, and efficient data platforms.

Key Responsibilities:
- Design and architect end-to-end data solutions on GCP, aligning with business and technical requirements.
- Define data models, storage strategies, and data ingestion, processing, and consumption frameworks.
- Implement data lakes, data warehouses, and data marts using services like BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Composer.
- Collaborate with business stakeholders, data scientists, and engineering teams to understand data needs and translate them into scalable architectures.
- Design and implement data governance, security, and compliance frameworks for cloud-based data platforms.
- Optimize data workflows, query performance, and storage costs in the GCP environment.
- Lead data migration and modernization initiatives from on-premise or other cloud platforms to GCP.
- Stay updated on GCP services, features, and industry best practices to recommend improvements and innovation.
- Provide technical leadership and mentoring to data engineering teams.

Required Skills & Experience:
- 6–8 years of experience in data architecture and engineering roles, with at least 3 years hands-on on GCP.
- Strong expertise in GCP data services: BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, Cloud Composer, Data Catalog.
- Proficiency in data modeling, data warehousing concepts, ETL/ELT pipelines, and big data processing frameworks.
- Experience with SQL, Python, and Terraform (preferred) for infrastructure as code.
- Hands-on experience with data security, encryption, access control, and governance on GCP.
- Experience integrating with real-time data pipelines and event-driven architectures.
- Strong understanding of DevOps, CI/CD pipelines for data workflows, and cloud cost optimization.
- GCP Professional Data Engineer / Cloud Architect certification is a plus.

Good to Have:
- Exposure to AI/ML workflows and data preparation for ML models.
- Experience with third-party tools like Apache Airflow, Looker, or Dataplex.
- Knowledge of other cloud platforms (AWS, Azure) for hybrid/multi-cloud strategies.

Educational Qualification:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Engineering, or a related field.

Why Join Us:
- Work on cutting-edge data transformation programs at scale.
- Opportunity to architect high-impact solutions in a collaborative, innovation-driven environment.
- Engage with a fast-growing team focused on data-driven business value.

Job Type: Full-time
Work Location: In person
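
To complement the batch-analytics sketch under the first GCP Data Architect listing, here is a minimal illustration of the streaming side of such a platform: a Pub/Sub subscriber using the google-cloud-pubsub client. The project and subscription names are placeholders, not details from this posting.

```python
# Minimal sketch: consuming an event-driven pipeline via Pub/Sub.
# Assumes `pip install google-cloud-pubsub` and default credentials;
# the project and subscription names are hypothetical.
from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(
    "my-analytics-project", "clickstream-sub"  # hypothetical names
)

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # In a real pipeline this is where events would be validated,
    # enriched, and written to BigQuery or Cloud Storage.
    print(f"Received: {message.data.decode('utf-8')}")
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # listen for 30 seconds, then stop
except TimeoutError:
    streaming_pull.cancel()
```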

Senior Salesforce Developer Pune, Maharashtra 0 - 3 years INR 12.0 - 20.0 Lacs P.A. On-site Not specified

Looking for Salesforce developers with a minimum of 3-5 years of relevant experience, available to join on an immediate basis. Should be hands-on developers with Apex, Lightning components, Visualforce, Agentforce, Java, and integration skills. Excellent communication, coding best practices, and debugging/troubleshooting skills are required. This is a full-time position requiring work from the office in Pune. If you are interested and your profile meets the expectations of the job, please respond with your detailed CV, current CTC, expected CTC, and notice period. Applications will be considered from immediate joiners only.

Job Type: Permanent
Pay: ₹1,200,000.00 - ₹2,000,000.00 per year
Benefits:
- Flexible schedule
- Health insurance
- Paid sick time
- Paid time off
- Provident Fund
Ability to commute/relocate:
- Pune, Maharashtra: Reliably commute or planning to relocate before starting work (Required)
Experience:
- Salesforce: 3 years (Required)
Work Location: In person

Gen AI Data Scientist Pune, Maharashtra 4 years INR 15.0 - 24.0 Lacs P.A. On-site Full Time

Key Responsibilities:
- Fine-tune LLMs using techniques like LoRA and QLoRA
- Evaluate and improve RAG (Retrieval-Augmented Generation) pipelines for groundedness, accuracy, and relevance
- Apply transfer learning and transformer architectures in model development
- Validate model accuracy and performance using appropriate metrics
- Collaborate with product teams and communicate insights to senior leadership
- Participate in problem-solving sessions and contribute innovative ideas
- Maintain an experimental mindset and continuously explore new approaches
- Identify and integrate relevant data sources to build meaningful datasets
- Automate data collection and preprocessing for structured and unstructured data
- Handle large-scale data to feed analytical and predictive models
- Build and optimize machine learning and deep learning models, including NLP solutions

Requirements:

Education & Experience
- Bachelor’s degree in a quantitative field (Computer Science, Engineering, Physics, Mathematics, Operations Research) or equivalent experience
- 1–4 years of hands-on experience in Gen AI and NLP
- Prior experience in startups or high-growth environments is a plus

Technical Skills
- Deep expertise in NLP techniques: text generation, sentiment analysis, NER, and language modeling
- Hands-on experience with LLMs and RAG pipelines
- Proficiency in neural network frameworks: TensorFlow, PyTorch
- Familiarity with transformer architectures and transfer learning
- Fluency in at least one programming language: Python, R, or Julia
- Experience with Gen AI libraries: Hugging Face, OpenAI, etc.
- Strong foundation in ML algorithms: supervised, unsupervised, reinforcement learning, Bayesian inference
- Fine-tuning and transfer learning with PyTorch and TensorFlow

Analytical & Communication Skills
- Strong math skills: statistics, linear algebra, probability
- Proven problem-solving aptitude with real-world application of Gen AI and NLP
- Excellent communication skills to translate complex technical concepts for non-technical stakeholders
- Collaborative mindset with the ability to work across teams

Job Type: Full-time
Pay: ₹1,500,000.00 - ₹2,400,000.00 per year
Work Location: In person
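
As a rough illustration of the first responsibility above, here is a minimal sketch of attaching LoRA adapters to a small causal language model with Hugging Face's peft library. The gpt2 base model, the target module name, and the hyperparameters are illustrative choices, not requirements from this posting.

```python
# Minimal LoRA sketch with Hugging Face peft + transformers
# (`pip install peft transformers`). Downloads gpt2 on first run.
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained("gpt2")  # illustrative base model
tokenizer = AutoTokenizer.from_pretrained("gpt2")

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                        # adapter rank (illustrative)
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["c_attn"],  # GPT-2's fused attention projection
)

# Wrap the frozen base model with small trainable adapter matrices;
# training would then proceed normally, e.g. via transformers.Trainer.
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all weights
```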

Senior Salesforce Developer Pune, Maharashtra 0 - 3 years INR 10.0 - 18.0 Lacs P.A. On-site Not specified

Looking for Salesforce developers with a minimum of 3-5 years of relevant experience, available to join on an immediate basis. Should be hands-on developers with Apex, Lightning components, Visualforce, Agentforce, dashboard, and integration skills. Gen AI skills preferred. Excellent communication, coding best practices, and debugging/troubleshooting skills are required. This is a full-time position requiring work from the office in Pune. If you are interested and your profile meets the expectations of the job, please respond with your detailed CV, current CTC, expected CTC, and notice period. Applications will be considered from immediate joiners only.

Job Type: Permanent
Pay: ₹1,000,000.00 - ₹1,800,000.00 per year
Benefits:
- Flexible schedule
- Health insurance
- Paid sick time
- Paid time off
- Provident Fund
Ability to commute/relocate:
- Pune, Maharashtra: Reliably commute or planning to relocate before starting work (Required)
Experience:
- Salesforce: 3 years (Required)
Work Location: In person

Gen AI Software Engineer Bengaluru 0 years INR 18.0 - 48.0 Lacs P.A. On-site Full Time

Multiple positions at different seniority levels, from developer to architect.

Key Responsibilities:
- Develop and optimize AI services for the Gen AI HUB platform, Agentic RAG, and MCP servers/clients.
- Support prompt orchestration, model integration, model fine-tuning, observability, and toolchain automation.
- Collaborate with data scientists to operationalize AI models.

Core Skills:
- Azure .NET stack with Microsoft AI ecosystem familiarity.
- Strong programming in .NET/Python.
- Experience with REST APIs, model endpoints, and SDK usage.
- Familiarity with vector databases and LLM integration workflows.
- Hands-on with embedding stores, orchestrators, and RAG techniques.
- AI Foundry, Azure stack, .NET, and REST API skills.
- Willingness to learn and explore Gen AI technologies.
- Clean coding, Git workflows, and integration testing basics.

Nice to Have:
- Experience using LangChain, Semantic Kernel, or similar orchestration frameworks.
- Exposure to Azure OpenAI, the Microsoft Copilot stack, and open-source LLMs.

Job Type: Full-time
Pay: ₹1,800,000.00 - ₹4,800,000.00 per year
Work Location: In person
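
Since the role centers on RAG and embedding stores, here is a minimal, self-contained sketch of the retrieval step: ranking documents against a query by cosine similarity. TF-IDF stands in for a learned embedding model, and there is no vector database here; a production system would use both. The documents and query are made up for illustration.

```python
# Toy retrieval sketch for the RAG pattern: rank documents against a
# query by cosine similarity. TF-IDF is a stand-in for real embeddings.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "Pub/Sub delivers events to downstream subscribers.",
    "LoRA fine-tunes large language models cheaply.",
    "Kubernetes schedules containers across nodes.",
]
query = "How do I fine-tune a large language model?"

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(docs)   # one row per document
query_vec = vectorizer.transform([query])

scores = cosine_similarity(query_vec, doc_matrix)[0]
best = scores.argmax()
print(f"Top document: {docs[best]!r} (score={scores[best]:.3f})")
# The retrieved text would then be placed into the LLM prompt as context.
```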

Agentic AI Developer Pune, Maharashtra 3 - 7 years INR Not disclosed On-site Full Time

As an Agentic AI Developer at our company, you will be responsible for designing and delivering scalable applications in Python and building Agentic AI systems. You will collaborate with product management to understand their requirements and challenges and develop potential solutions. Additionally, you will stay updated with the latest tools, technologies, and methodologies and share your knowledge with key decision makers.

Key Responsibilities:
- Design and deliver scalable applications in Python and build Agentic AI systems
- Collaborate with product management to understand their requirements and challenges, and develop potential solutions
- Stay current with the latest tools, technology ideas, and methodologies; share knowledge by clearly articulating results and ideas to key decision makers

Qualifications Required:
- 3+ years of strong experience in Python
- Agentic AI and RAG experience required
- Experience with AWS services is required
- Strong database skills
- BS/MS in Computer Science or equivalent
- Strong problem-solving skills, data structures, and algorithm design
- Strong experience in Object-Oriented Design and developing highly scalable applications
- Ability to deliver code quickly from a wireframe model in a fast-paced startup environment
- Attention to detail
- Strong communication and collaboration skills
- Ability to lead a small team

Join our highly energetic and innovative team that believes in the power of creativity and hard work to achieve the impossible.

Note: This job is Full-time and Permanent, with benefits including a flexible schedule, health insurance, paid sick time, paid time off, and a work-from-home option.

Experience:
- Total work: 3 years (Required)
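
For illustration, here is a minimal sketch of the agentic pattern this role describes: a loop that picks a tool, runs it, and feeds the observation back until it can answer. The rule-based choose_action() is a stand-in for an LLM call, and both tools are hypothetical; a real system would prompt a model and parse its tool-call output.

```python
# Minimal agentic-loop sketch (tool selection -> execution -> observation).
from typing import Callable

def search_docs(query: str) -> str:          # hypothetical tool
    return "FastAPI supports async endpoints natively."

def calculator(expr: str) -> str:            # hypothetical tool
    # Toy only; never eval untrusted input in real code.
    return str(eval(expr, {"__builtins__": {}}))

TOOLS: dict[str, Callable[[str], str]] = {"search": search_docs, "calc": calculator}

def choose_action(task: str, history: list[str]) -> tuple[str, str]:
    """Stand-in policy: route arithmetic to calc, everything else to search."""
    if any(ch.isdigit() for ch in task):
        return "calc", task
    return "search", task

def run_agent(task: str, max_steps: int = 3) -> str:
    history: list[str] = []
    for _ in range(max_steps):
        tool, arg = choose_action(task, history)
        observation = TOOLS[tool](arg)
        history.append(f"{tool}({arg!r}) -> {observation}")
        if observation:                       # stand-in stopping rule
            return observation
    return "No answer found."

print(run_agent("2 + 2 * 10"))   # -> 22
```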

Lead Python Django Karnataka 3 - 7 years INR Not disclosed On-site Full Time

As a Lead Developer at our company, you will play a crucial role in designing and delivering scalable backend solutions using Python Django with microservices. You will work closely with the team to understand requirements, suggest design changes for a better user experience, and develop reusable code following design patterns and component architecture. Additionally, you will lead a small team, conduct code reviews, and ensure adherence to design standards.

Key Responsibilities:
- Design and deliver scalable backend solutions in Python Django with microservices
- Collaborate with product management to understand requirements and challenges, and develop potential solutions
- Stay updated with the latest tools and technologies; share knowledge with key decision makers
- Lead a small team, conduct code reviews, and ensure design standards are followed

Qualifications Required:
- 3+ years of backend development experience with microservices
- Experience with AWS services
- Strong database skills
- BS/MS in Computer Science or equivalent
- Proficiency in problem-solving, data structures, and algorithm design
- Strong experience in Object-Oriented Design and developing highly scalable applications
- Ability to deliver code quickly in a fast-paced startup environment
- Attention to detail and ability to lead a small team
- Strong communication and collaboration skills

In addition, the job offers benefits such as a flexible schedule, health insurance, paid sick time, paid time off, and the option to work from home. This is a full-time, permanent position requiring a total of 3 years of work experience. Work location is in person.
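
As a small sketch of the Django-microservice shape this role works with, here is a minimal pair of JSON endpoints with their routing. The endpoint names and the hard-coded response are hypothetical; a real service would live in a full Django project with settings, authentication, and real ORM queries (and likely Django REST Framework serializers).

```python
# Minimal Django microservice sketch: two JSON views plus URL routing
# (normally split across views.py and urls.py inside a Django project).
from django.http import JsonResponse
from django.urls import path
from django.views.decorators.http import require_GET

@require_GET
def health(request):
    # Lightweight liveness probe for the service.
    return JsonResponse({"status": "ok"})

@require_GET
def order_status(request, order_id: int):
    # Stand-in for an ORM lookup, e.g. Order.objects.get(pk=order_id).
    return JsonResponse({"order_id": order_id, "state": "SHIPPED"})

urlpatterns = [
    path("healthz", health),
    path("orders/<int:order_id>/status", order_status),
]
```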

Python Backend Developer India 4 years INR 12.0 - 20.0 Lacs P.A. Remote Full Time

We are looking for an experienced Python & FastAPI developer (4+ yrs) to join our team and work on high-performance REST API development. The candidate must have strong expertise in Python programming, REST API design, database connectivity, exception handling, and modern development best practices. The developer will be responsible for building scalable, secure, and maintainable backend services and contributing to deployment automation.

Key Responsibilities:
- Design, build, and maintain RESTful APIs using FastAPI.
- Develop clean, efficient, and reusable Python code following OOP principles and best practices.
- Integrate APIs with SQL databases, write optimized queries, and handle DB transactions.
- Implement robust exception handling, logging, and error-response standards.
- Collaborate with front-end and DevOps teams to deliver end-to-end features.
- Optimize application performance, scalability, and security.
- Participate in code reviews and ensure high-quality coding standards.
- Work with deployment pipelines (Docker, Azure Kubernetes, CI/CD) for application deployment.
- Document API endpoints, workflows, and architectural components.

Required Skills & Expertise:

Core Technical Skills
- Strong hands-on experience in Python (3.x): syntax, data structures, file handling, modules, decorators, generators, error handling, etc.
- Solid understanding of Object-Oriented Programming (OOP): classes, inheritance, polymorphism, abstraction, encapsulation.
- Expertise in building REST APIs using FastAPI, including:
  - Routers & dependency injection
  - Middleware
  - Background tasks
  - Pydantic models & validation
- Experience working with at least one database (SQL Server, PostgreSQL, MySQL, etc.).
- Ability to write efficient queries and handle DB connections using ORM frameworks (e.g., SQLAlchemy) or raw queries.
- Strong knowledge of exception handling, logging frameworks, and API error models.
- Experience with authentication/authorization (JWT, OAuth2, API keys).
- Knowledge of asynchronous programming (async/await) in Python.

Best Practices & Architecture
- Familiarity with clean code principles, modular architecture, layered architecture, and reusable components.
- Understanding of API versioning, rate limiting, and security practices.
- Experience writing test cases (pytest/unittest) is preferred.

Deployment & DevOps
- Knowledge of deployment using:
  - Docker & containerization
  - CI/CD pipelines
  - Cloud platforms (Azure preferred; AWS/GCP acceptable)
- Experience with environment management (venv, pipenv, poetry) and version control (Git).

Qualifications:
- BE/B.Tech/MCA from a government-recognized university/institute.
- Total 4–6 years of IT experience, with at least 4 years in Python development, preferably in cloud-enabled environments.

Soft Skills:
- Strong written and verbal communication skills.
- Adaptability to dynamic client requirements and evolving business challenges.
- Proven ability to collaborate in a team environment and lead by example.
- Self-driven, proactive, and committed to delivering high-quality solutions.

Job Types: Full-time, Permanent
Pay: ₹1,200,000.00 - ₹2,000,000.00 per year
Benefits:
- Flexible schedule
- Health insurance
- Paid sick time
- Paid time off
- Work from home
Experience:
- Total work: 3 years (Required)
Work Location: In person
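
As a minimal sketch of the FastAPI patterns named above (a versioned router, Pydantic validation, and a structured error response), consider the following. The service, endpoint, and field names are hypothetical, and an in-memory dict stands in for the SQL database the posting mentions.

```python
# Minimal FastAPI sketch: router, Pydantic model, consistent 404 error.
# Run with: uvicorn app:app --reload   (`pip install fastapi uvicorn`)
from fastapi import APIRouter, FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="orders-service")          # hypothetical service
router = APIRouter(prefix="/api/v1")           # API versioning via prefix

class Order(BaseModel):
    order_id: int
    item: str
    quantity: int = 1

FAKE_DB: dict[int, Order] = {}  # stand-in for a real SQL database

@router.post("/orders", response_model=Order, status_code=201)
async def create_order(order: Order) -> Order:
    FAKE_DB[order.order_id] = order            # a real app would use a DB session
    return order

@router.get("/orders/{order_id}", response_model=Order)
async def get_order(order_id: int) -> Order:
    if order_id not in FAKE_DB:
        # HTTPException produces a structured JSON error response.
        raise HTTPException(status_code=404, detail="order not found")
    return FAKE_DB[order_id]

app.include_router(router)
```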

Data Scientist Mumbai, Maharashtra 7 years INR 20.0 - 36.0 Lacs P.A. On-site Full Time

We are looking for a strong Data Scientist (5-7 yrs of experience) with hands-on experience in Elasticsearch analytics, machine learning, data engineering fundamentals, and exposure to AIOps/observability ecosystems. This role will be part of an engineering-heavy team helping onboard, manage, and optimize critical applications for a major Indian banking client. You will work with high-volume data pipelines, observability platforms, and agentic AI systems, and build ML-driven insights that improve reliability, performance, and automation.

Key Responsibilities:

Data Analysis & Modelling
- Perform advanced exploratory analysis and anomaly detection using data stored in enterprise Elasticsearch clusters.
- Build ML models (supervised & unsupervised) for event correlation, incident prediction, log intelligence, and capacity forecasting.
- Develop automated pipelines for feature engineering, model training, deployment, and monitoring.

AIOps & Observability Engineering
- Leverage Elasticsearch, Kafka, Dynatrace, Grafana, and related tools to build data-driven insights.
- Build rule-based and ML-based detectors for alerting, pattern analysis, and log/metric correlation.
- Integrate models with AIOps platforms to drive automated incident insights.

Agentic AI & Automation
- Experiment with LLMs and agentic AI frameworks for:
  - Automated root-cause suggestions
  - Log summarization
  - Knowledge retrieval
  - ChatOps workflows

Engineering & Platform Work
- Work with data engineers to design scalable pipelines using Kafka, REST APIs, Python, and Elasticsearch DSL.
- Develop tools, scripts, and services to support model inference, dashboards, and automation workflows.

Required Skills:

Technical Skills
- Strong Python for ML + data engineering (NumPy, pandas, scikit-learn; MLflow preferred).
- Good experience working with Elasticsearch query DSL, index modelling, and aggregations.
- Understanding of Kafka for streaming data consumption/production.
- Experience with AIOps or observability platforms such as Grafana, Dynatrace, Prometheus, Kibana.
- Hands-on familiarity with LLMs/agentic AI, embeddings, vector search, or RAG (preferred).
- Knowledge of Docker, microservices, and CI/CD is a plus.

Soft Skills
- Strong analytical mindset and problem-solving ability.
- Comfortable working in a client-facing or collaborative environment.
- Ability to adapt quickly to new tools, platforms, and cloud environments.

Education & Experience
- Bachelor’s/Master’s in Computer Science, Data Science, Engineering, or related fields.
- 5–7 years of experience in data science or machine learning engineering roles.

Job Type: Full-time
Pay: ₹2,000,000.00 - ₹3,600,000.00 per year
Work Location: In person
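
As a small, self-contained illustration of the anomaly-detection work described above, here is a sketch using scikit-learn's IsolationForest on synthetic latency metrics. In practice the features would come out of Elasticsearch aggregations rather than a random generator, and the contamination rate would be tuned to the data.

```python
# Unsupervised anomaly detection sketch with IsolationForest.
# Synthetic request-latency data stands in for metrics that would
# normally be pulled from Elasticsearch aggregations.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
normal = rng.normal(loc=120, scale=15, size=(500, 1))   # ~120 ms latencies
spikes = rng.normal(loc=900, scale=50, size=(5, 1))     # incident-like spikes
latencies = np.vstack([normal, spikes])

model = IsolationForest(contamination=0.01, random_state=42)
labels = model.fit_predict(latencies)                   # -1 = anomaly, 1 = normal

anomalies = latencies[labels == -1].ravel()
print(f"Flagged {len(anomalies)} anomalous samples, e.g. {anomalies[:3].round(1)}")
```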

Lead Data Scientist Mumbai 7 years INR 36.0 - 48.0 Lacs P.A. On-site Full Time

Overview:
We are seeking a Senior Data Scientist / Lead (7-10 yrs exp) who can architect data-driven solutions, lead ML/AI initiatives, and guide a team working in a mission-critical banking environment. The ideal candidate has deep expertise in Elasticsearch analytics, AIOps, machine learning, and agentic AI, and can drive the end-to-end model lifecycle and platform integration. This role requires strong engineering depth and the ability to work across cross-functional teams and customer stakeholders.

Key Responsibilities:

Technical Leadership & Architecture
- Architect ML pipelines for high-volume logs/metrics/traces ingested via Elasticsearch & Kafka.
- Define standards for data modelling, index strategy, retention, aggregation, and search optimization.
- Design end-to-end AI/ML solutions integrated with AIOps workflows and customer operations.

Advanced ML & AIOps
- Build advanced models for:
  - Incident prediction & noise reduction
  - Event correlation & causal clustering
  - Predictive capacity planning
  - Automated RCA
- Lead design of reinforcement learning or agentic AI systems to automate incident triage.

Agentic AI / LLM Innovation
- Build and deploy agentic AI solutions for:
  - Automated RCA assistants
  - Observability copilots
  - Log/metric narrative generation
  - Knowledge graph + RAG systems for SRE & Ops intelligence
- Evaluate appropriate foundation models & vector search approaches using Elasticsearch/OpenSearch/similar tools.

Stakeholder Management & Delivery
- Collaborate with architects, SMEs, and client teams to translate requirements into scalable ML solutions.
- Lead a small team of DS/ML/DE members; conduct code reviews, mentor, and ensure engineering quality.
- Manage project delivery, roadmaps, PoCs, and continuous improvement initiatives.

Required Skills:

Technical Expertise
- Deep hands-on experience with Elasticsearch query DSL, aggregations, anomaly detection modules, index management, tuning & scaling.
- Strong Kafka experience: stream processing, consumers, producers, and integration with ML pipelines.
- Expert-level Python for ML engineering; experience with PyTorch/TensorFlow preferred.
- Advanced experience with AIOps ecosystems: Dynatrace, Grafana, Prometheus, Kibana, or similar.
- Strong exposure to LLMs, agentic AI, embeddings, vector search, and retrieval pipelines.
- Hands-on with designing scalable microservices for inference and automation workflows.
- Experience working with distributed systems and performance optimization.

Leadership Skills
- Ability to guide DS/ML teams while still being hands-on.
- Strong communication: able to explain complex data concepts to non-technical stakeholders.
- Demonstrated experience driving end-to-end delivery in enterprise environments.

Education & Experience
- Bachelor’s/Master’s degree in Computer Science, Data Science, or related fields.
- 7–10 years of experience in Data Science / ML Engineering.
- Prior experience in enterprise-scale environments (Banking/NBFC/Telecom preferred).

Job Type: Full-time
Pay: ₹3,600,000.00 - ₹4,800,000.00 per year
Work Location: In person
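
To give a flavor of the Kafka-to-ML integration this role calls for, here is a minimal consumer sketch using the kafka-python client. The broker address, topic name, and score() function are all hypothetical, and running it requires a reachable Kafka cluster; a production pipeline would add batching, retries, and monitoring.

```python
# Minimal Kafka -> model-scoring sketch (`pip install kafka-python`).
# Broker, topic, and score() are hypothetical stand-ins.
import json
from kafka import KafkaConsumer

def score(event: dict) -> float:
    """Stand-in for model inference on one log/metric event."""
    return 1.0 if event.get("latency_ms", 0) > 500 else 0.0

consumer = KafkaConsumer(
    "app-metrics",                                   # hypothetical topic
    bootstrap_servers="localhost:9092",              # hypothetical broker
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="aiops-scorer",
)

for msg in consumer:
    event = msg.value
    if score(event) > 0.5:
        # Downstream this could open an incident or push to a dashboard.
        print(f"Anomaly at offset {msg.offset}: {event}")
```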

Data Scientist Mumbai 5 years INR 20.0 - 36.0 Lacs P.A. On-site Full Time

We are looking for a strong Data Scientist (5-7 yrs of experience) with hands-on experience in Elasticsearch analytics, machine learning, data engineering fundamentals, and exposure to AIOps/observability ecosystems. This role will be part of an engineering-heavy team helping onboard, manage, and optimize critical applications for a major Indian banking client. You will work with high-volume data pipelines, observability platforms, and agentic AI systems, and build ML-driven insights that improve reliability, performance, and automation.

Key Responsibilities:

Data Analysis & Modelling
- Perform advanced exploratory analysis and anomaly detection using data stored in enterprise Elasticsearch clusters.
- Build ML models (supervised & unsupervised) for event correlation, incident prediction, log intelligence, and capacity forecasting.
- Develop automated pipelines for feature engineering, model training, deployment, and monitoring.

AIOps & Observability Engineering
- Leverage Elasticsearch, Kafka, Dynatrace, Grafana, and related tools to build data-driven insights.
- Build rule-based and ML-based detectors for alerting, pattern analysis, and log/metric correlation.
- Integrate models with AIOps platforms to drive automated incident insights.

Agentic AI & Automation
- Experiment with LLMs and agentic AI frameworks for:
  - Automated root-cause suggestions
  - Log summarization
  - Knowledge retrieval
  - ChatOps workflows

Engineering & Platform Work
- Work with data engineers to design scalable pipelines using Kafka, REST APIs, Python, and Elasticsearch DSL.
- Develop tools, scripts, and services to support model inference, dashboards, and automation workflows.

Required Skills:

Technical Skills
- Strong Python for ML + data engineering (NumPy, pandas, scikit-learn; MLflow preferred).
- Good experience working with Elasticsearch query DSL, index modelling, and aggregations.
- Understanding of Kafka for streaming data consumption/production.
- Experience with AIOps or observability platforms such as Grafana, Dynatrace, Prometheus, Kibana.
- Hands-on familiarity with LLMs/agentic AI, embeddings, vector search, or RAG (preferred).
- Knowledge of Docker, microservices, and CI/CD is a plus.

Soft Skills
- Strong analytical mindset and problem-solving ability.
- Comfortable working in a client-facing or collaborative environment.
- Ability to adapt quickly to new tools, platforms, and cloud environments.

Education & Experience
- Bachelor’s/Master’s in Computer Science, Data Science, Engineering, or related fields.
- 5–7 years of experience in data science or machine learning engineering roles.

Job Type: Full-time
Pay: ₹2,000,000.00 - ₹3,600,000.00 per year
Work Location: In person
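
As a complement to the IsolationForest sketch under the earlier Data Scientist listing, here is a minimal example of the Elasticsearch query-DSL side of this role: a date-histogram aggregation over error logs using the official Python client. The host, index, and field names are placeholders, and a reachable cluster is assumed.

```python
# Elasticsearch aggregation sketch with the official Python client
# (`pip install elasticsearch`). Host, index, and field names are
# hypothetical; this counts ERROR-level logs per 5-minute bucket.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # hypothetical cluster

resp = es.search(
    index="app-logs-*",
    size=0,  # aggregations only, no raw hits
    query={"term": {"level": "ERROR"}},
    aggs={
        "errors_over_time": {
            "date_histogram": {"field": "@timestamp", "fixed_interval": "5m"}
        }
    },
)

for bucket in resp["aggregations"]["errors_over_time"]["buckets"]:
    print(bucket["key_as_string"], bucket["doc_count"])
```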