Role: Lead Engineer – AI-Powered Platforms
Experience: 5–8 years
Job Type: On-Site
Location: Bengaluru, Karnataka, India

Company Description
Ideafloats Technologies Private Limited is a professional technology services company based in Bengaluru, India. We offer end-to-end solutions in design, development, consulting, and cloud services. Our team consists of experts in product consulting, design, development, and management. We work closely with entrepreneurs and business owners to help them make the right decisions and achieve their goals.

Role Overview
We’re looking for a Lead Engineer to architect, scale, and lead the development of intelligent, AI-powered systems. The role combines backend excellence, cloud-native infrastructure, and the integration of LLMs and multi-agent architectures into real-world applications. If you’re passionate about technical leadership, LLM-based automation, and building future-forward platforms, this role is for you.

Key Responsibilities

System Architecture & Development
● Design and maintain scalable microservices and APIs using Java, Python, or Node.js
● Architect high-performance systems with modular, maintainable components
● Own the technical roadmap and ensure system scalability and security

Engineering Leadership
● Mentor developers and enforce high engineering standards
● Lead technical discussions, code reviews, and architectural decisions
● Guide the team in adopting best practices across development and deployment

AI Integration & Agent Systems
● Integrate LLMs (OpenAI, Claude, Mistral, etc.) into production workflows
● Work with AI/ML teams on fine-tuning, embeddings, and RAG pipelines
● Develop and manage multi-agent orchestration using LangGraph, AutoGen, or similar tools

DevOps & Infrastructure
● Build CI/CD pipelines using GitHub Actions, Jenkins, or similar tools
● Manage deployments in AWS, GCP, or Azure environments
● Implement Docker-based containerization and Kubernetes orchestration

Monitoring, Security & Performance
● Set up observability tools (Datadog, Prometheus, Grafana, ELK)
● Ensure security compliance (SOC 2, GDPR, etc.)
● Troubleshoot performance bottlenecks and resolve critical issues promptly

Cross-functional Collaboration
● Collaborate with product, design, and AI teams
● Participate in sprint ceremonies and lead technical delivery
● Translate product needs into scalable engineering systems

Requirements

Education & Experience
● B.Tech/M.Tech in Computer Science or a related field
● 5–8 years of experience in backend, cloud, or AI product development

Core Technical Skills
● Expertise in at least two of: Java, Python, Node.js
● Strong foundation in microservices, OOP, and system design
● Practical experience with LLMs, prompt engineering, and fine-tuning

AI & DevOps Tools
● Hands-on experience with LangGraph, CrewAI, OpenAI tools, or similar agent frameworks
● Docker, Kubernetes, and cloud-native CI/CD pipelines
● AWS (Lambda, S3, ECS, DynamoDB), GCP, or Azure experience

Soft Skills
● Experience leading and mentoring engineering teams
● Strong problem-solving, communication, and technical decision-making skills

Position: Data Engineer (Python, Databricks, Azure)
Work Mode: Remote
Experience: 3–4 Years
Package: 6–7.2 LPA

Key Responsibilities
● Design, develop, and maintain clean, reusable Python code following best practices.
● Build and optimize scalable data pipelines and services using Databricks, Apache Spark, and Azure.
● Contribute to the design, implementation, and improvement of CI/CD pipelines.
● Collaborate closely with stakeholders, QA, DataOps, and Infrastructure teams across the software development lifecycle.
● Troubleshoot, debug, and enhance application performance.
● Ensure compliance with client-specific coding standards, testing practices, and documentation.
● Drive continuous improvement, innovation, and automation in an Agile, globally distributed team environment.

Required Technical Skills (3–4 years)
● Strong proficiency in Python (including Pandas).
● Hands-on experience with Databricks and Apache Spark.
● Solid understanding of Azure services (Pipelines, AKS, Service Bus, Key Vault, Monitoring).
● Background in data engineering and service-oriented development.

Nice-to-Have Skills
● Exposure to cloud platforms such as AWS or GCP.
● Familiarity with Kubernetes (AKS).
● Experience with Airflow, dbt, Power BI, and Parquet.
● Basic knowledge of React.

Soft Skills
● Independent and ownership-driven, with a strong focus on delivery.
● Excellent communication skills, effective in cross-cultural and global team environments.
● Analytical thinker with strong problem-solving and decision-making abilities.
● Experience with unit, integration, and regression testing.
● Proactive, innovative, and committed to continuous learning and knowledge sharing.