We are looking for a highly skilled Analytics & Data Engineering professional with a strong background in Machine Learning, MLOps, and DevOps. The ideal candidate will have experience designing and implementing scalable data and analytics pipelines, enabling production-grade ML systems, and supporting agent-based development leveraging MCP/OpenAPI-to-MCP wrappers and A2A protocols. This role combines hands-on technical work with solution design and requires close collaboration with data scientists, product teams, and engineering stakeholders.

Key Responsibilities
- Design, build, and maintain scalable data pipelines and ETL/ELT processes for analytics and ML workloads.
- Implement MLOps frameworks to manage the model lifecycle (training, deployment, monitoring, and retraining).
- Apply DevOps best practices (CI/CD, containerization, infrastructure as code) to ML and data engineering workflows.
- Develop and optimize data models, feature stores, and ML serving architectures.
- Collaborate with AI/ML teams to integrate models into production environments.
- Support agent development using MCP/OpenAPI-to-MCP wrappers and A2A (Agent-to-Agent) communication protocols.
- Ensure data quality, governance, and compliance with security best practices.
- Troubleshoot and optimize data workflows for performance and reliability.

Required Skills & Experience

Core:
- 6+ years in analytics and data engineering roles.
- Proficiency in SQL, Python, and data pipeline orchestration tools (e.g., Airflow, Prefect).
- Experience with distributed data processing frameworks (e.g., Spark, Databricks).

ML/MLOps:
- Experience deploying and maintaining ML models in production.
- Knowledge of MLOps tools (e.g., MLflow, Kubeflow, SageMaker, Vertex AI).

DevOps:
- Hands-on experience with CI/CD (Jenkins, GitHub Actions, GitLab CI).
- Proficiency with Docker, Kubernetes, and cloud-based deployment (AWS, Azure, GCP).

Specialized:
- Experience with MCP/OpenAPI-to-MCP wrapper integrations.
- Experience working with A2A protocols in agent development.
- Familiarity with agent-based architectures and multi-agent communication patterns.

Preferred Qualifications
- Master's degree in Computer Science, Data Engineering, or a related field.
- Experience with real-time analytics and streaming data pipelines (Kafka, Kinesis, Pub/Sub).
- Exposure to LLM-based systems or intelligent agents.
- Strong problem-solving skills and the ability to work in cross-functional teams.