We are seeking a passionate and skilled Senior Software Engineer who excels at backend development in Python and has a strong foundation in data engineering, cloud platforms (AWS/Azure), and modern application design. You will work on scalable microservices, real-time data pipelines, and AI-driven reporting platforms that use LLMs and agentic AI in product development.
Roles and Responsibilities
- Design, develop, and maintain backend services and REST APIs using Python (FastAPI preferred).
- Build and optimize data pipelines, ETL workflows, and data lake/lakehouse architectures.
- Work with cloud infrastructure (AWS/Azure) for deployment, scaling, and data services.
- Develop and manage Superset-based dashboards and reporting workflows.
- Implement microservice-based architectures and containerized deployments (Docker/Kubernetes).
- Apply database design principles, write performant queries, and manage relational/columnar DBs.
- Integrate LLM-based intelligence into product features using OpenAI or open-source models.
- Architect and orchestrate agentic AI systems that perform reasoning, action selection, and multi-step workflows.
- Collaborate with cross-functional teams, including product managers, analysts, and frontend developers.
- Follow best practices in CI/CD, testing, monitoring, and documentation.
Desired Skills & Background
- Strong expertise in Python backend development (FastAPI, Flask, etc.).
- Experience with SQL, data modeling, and relational/columnar databases (PostgreSQL, Snowflake, etc.)
- Experience building ETL/data pipelines using tools like Pandas, PySpark, or Apache Airflow.
- Good to have: hands-on experience with Superset for reporting and custom dashboard development.
- Working knowledge of AWS (Lambda, S3, RDS, EC2) or Azure (Functions, Blob Storage, Cosmos DB).
- Solid understanding of microservices architecture and containerization (Docker, Kubernetes).
- Good grounding in LLMs, embeddings, vector stores, and AI agent orchestration frameworks (LangChain, LlamaIndex, etc.).
- Familiarity with software development best practices (unit testing, logging, code reviews, Git).
- Knowledge of event-driven architectures (Kafka, Redis Streams, etc.).
- Exposure to CI/CD tooling such as GitHub Actions or Jenkins.
- Experience deploying AI services to production at scale.
- Open-source contributions or blog posts on AI/ML/Data topics.