
Aralis Infoworks (opc)

3 Job openings at Aralis Infoworks (opc)
Python + Generative AI Engineer | Gurugram | 6 - 10 years | INR 15.0 - 22.5 Lacs P.A. | Work from Office | Full Time

Note: This opportunity is on behalf of one of our esteemed clients; we are assisting them with the hiring process. They are looking to fill this position immediately (latest by next week). Only candidates from Gurugram or nearby locations will be considered.

Location: Gurugram / Gurgaon (On-Site)
Job Type: Full-time
Field: Python & Generative Artificial Intelligence
Education: Bachelor’s / Master’s degree in Computer Science, AI/ML, Data Science, or a related field (or equivalent experience)
Experience: 5+ years minimum
CTC: Up to 20 LPA

Job Overview
We are seeking a highly skilled Python + Generative AI Engineer to design, develop, and deploy cutting-edge AI solutions. The ideal candidate will have strong Python programming expertise combined with hands-on experience in building, fine-tuning, and integrating Generative AI models into scalable applications.

Key Responsibilities:
- Design and implement AI-driven applications using Python and Generative AI frameworks.
- Fine-tune and optimize LLMs (e.g., GPT, LLaMA, Falcon) for domain-specific use cases.
- Develop APIs and microservices to integrate AI models with enterprise systems.
- Implement retrieval-augmented generation (RAG) pipelines, embeddings, and vector databases (e.g., Pinecone, FAISS, Chroma); a minimal illustrative sketch appears after this posting.
- Build and maintain CI/CD pipelines for AI/ML deployments.
- Collaborate with cross-functional teams (Data Science, DevOps, Product) to deliver scalable AI solutions.
- Ensure AI model performance, security, and compliance standards are met.
- Research and adopt emerging AI/ML techniques for continuous improvement.

Required Skills & Qualifications:
- Programming: Strong expertise in Python (FastAPI / Flask preferred).
- Generative AI: Experience with the OpenAI API, Hugging Face Transformers, LangChain, LLaMA, or similar.
- Machine Learning: Solid understanding of NLP, embeddings, prompt engineering, and fine-tuning LLMs.
- Infrastructure: Experience with containerization (Docker, Kubernetes) and cloud platforms (AWS / GCP / Azure).
- Databases: Familiarity with vector DBs (Pinecone, FAISS, ChromaDB) and relational DBs (PostgreSQL / MySQL).
- DevOps: Experience with CI/CD pipelines, Git, and model deployment best practices.
- Strong problem-solving skills and the ability to work in fast-paced, agile environments.

Good to Have:
- Knowledge of agentic AI workflows and multi-agent orchestration.
- Exposure to security and compliance aspects of AI model deployment.
- Contributions to open-source AI projects.
- Experience integrating AI with third-party tools/APIs.

Why Join Us?
- Opportunity to work on next-gen AI products impacting real-world industries.
- Collaborative and innovative work culture.
- Continuous learning with access to the latest AI research and tools.
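The posting does not prescribe a specific stack, but as a rough illustration of the RAG responsibility above, here is a minimal sketch assuming the OpenAI embeddings/chat APIs and a local FAISS index. The model names, documents, and prompts are placeholder assumptions, not part of the job description.

```python
# Minimal RAG sketch: embed documents, index them in FAISS, retrieve, and answer.
# Assumptions (not from the posting): openai>=1.x, faiss-cpu, numpy installed,
# and OPENAI_API_KEY set in the environment. Model names are illustrative.
import numpy as np
import faiss
from openai import OpenAI

client = OpenAI()

documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support is available Monday to Friday, 9 AM to 6 PM IST.",
    "Enterprise plans include a dedicated account manager.",
]

def embed(texts: list[str]) -> np.ndarray:
    """Return an (n, d) float32 matrix of embeddings for the given texts."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in resp.data], dtype="float32")

# Build a flat (exact) inner-product index over the normalized document embeddings.
doc_vectors = embed(documents)
faiss.normalize_L2(doc_vectors)                  # normalized so inner product ~ cosine
index = faiss.IndexFlatIP(doc_vectors.shape[1])
index.add(doc_vectors)

def answer(question: str, k: int = 2) -> str:
    """Retrieve the top-k documents and ask the LLM to answer using only them."""
    query_vec = embed([question])
    faiss.normalize_L2(query_vec)
    _, idx = index.search(query_vec, k)
    context = "\n".join(documents[i] for i in idx[0])
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return completion.choices[0].message.content

print(answer("When can I return a product?"))
```

In a production setting the in-memory FAISS index would typically be replaced by a managed vector store such as Pinecone or Chroma, as named in the posting.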

Python Developer (FastAPI + RDBMS) | Bengaluru | 8 - 13 years | INR 15.0 - 20.0 Lacs P.A. | Work from Office | Full Time

Note: This opportunity is on behalf of one of our esteemed clients; we are assisting them with the hiring process. They are looking to fill this position immediately (within this week). Only candidates from Bengaluru or nearby (commutable) locations will be considered.

About the Role
We are seeking a strong Python Developer with hands-on expertise in FastAPI and solid experience working with relational databases (MS SQL, Oracle). The ideal candidate should have a deep understanding of backend development, API design, and database optimization. Experience with AI/ML-based solutions will be considered a strong plus.

Key Responsibilities:
- Design, develop, and maintain backend services and APIs using FastAPI (a minimal endpoint sketch appears after this posting).
- Work extensively with relational databases (MS SQL, Oracle), including schema design, query optimization, and stored procedures.
- Collaborate with cross-functional teams (front-end, QA, DevOps, product) to deliver scalable, high-quality solutions.
- Ensure best practices in code quality, performance, and security for backend services.
- Integrate APIs with enterprise applications and external systems.
- Participate in design discussions, code reviews, and performance tuning.
- If you have AI experience: contribute to integrating AI/ML models into production services.

Required Skills:
- 8+ years of experience in backend development with Python.
- Strong expertise in the FastAPI framework.
- Strong working knowledge of MS SQL and Oracle (NoSQL not preferred).
- Solid understanding of REST API design, microservices, and distributed systems.
- Proficiency in unit testing, debugging, and CI/CD pipelines.
- Strong analytical and problem-solving skills.

Good to Have:
- Exposure to AI/ML frameworks (PyTorch, TensorFlow, scikit-learn).
- Experience with cloud platforms (AWS, Azure, GCP).
- Knowledge of containerization (Docker, Kubernetes).
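As a rough illustration of the FastAPI-plus-relational-database combination this role calls for, here is a minimal sketch. It uses SQLite via SQLAlchemy purely as a stand-in for MS SQL/Oracle, and the table, model, and endpoint names are illustrative assumptions rather than anything specified in the posting.

```python
# Minimal FastAPI + SQLAlchemy sketch. Assumptions (not from the posting):
# fastapi, uvicorn, and sqlalchemy>=2.0 installed; SQLite is used here only as a
# stand-in for MS SQL / Oracle (swap the connection URL and driver in production).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from sqlalchemy import create_engine, Integer, String
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, Session

engine = create_engine("sqlite:///./app.db")  # e.g. "mssql+pyodbc://..." for MS SQL

class Base(DeclarativeBase):
    pass

class Customer(Base):
    __tablename__ = "customers"
    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    name: Mapped[str] = mapped_column(String(100))

Base.metadata.create_all(engine)

class CustomerIn(BaseModel):
    name: str

class CustomerOut(BaseModel):
    id: int
    name: str

app = FastAPI(title="Customer API (illustrative)")

@app.post("/customers", response_model=CustomerOut)
def create_customer(payload: CustomerIn) -> CustomerOut:
    # Persist a new row and return it with its generated primary key.
    with Session(engine) as session:
        customer = Customer(name=payload.name)
        session.add(customer)
        session.commit()
        session.refresh(customer)
        return CustomerOut(id=customer.id, name=customer.name)

@app.get("/customers/{customer_id}", response_model=CustomerOut)
def get_customer(customer_id: int) -> CustomerOut:
    # Look up a single row by primary key; return 404 if it does not exist.
    with Session(engine) as session:
        customer = session.get(Customer, customer_id)
        if customer is None:
            raise HTTPException(status_code=404, detail="Customer not found")
        return CustomerOut(id=customer.id, name=customer.name)
```

If the file is saved as main.py, the service can be started with `uvicorn main:app --reload`.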

AI Data Architect | Bengaluru | 7 - 12 years | INR 15.0 - 27.5 Lacs P.A. | Work from Office | Full Time

Note: This opportunity is on behalf of one of our esteemed clients; we are assisting them with the hiring process. They are looking to fill this position immediately (within this week). Only candidates from Bengaluru or nearby (commutable) locations will be considered.

Role Overview
We are looking for an experienced AI Data Architect to design and implement robust, scalable, and secure data architectures that power AI/ML solutions. The role involves defining data strategies, enabling advanced analytics, ensuring data quality and governance, and optimizing infrastructure to support modern AI-driven applications.

Key Responsibilities:
- Design and implement end-to-end data architectures to support AI/ML workloads.
- Define data strategy, governance, and frameworks for structured, semi-structured, and unstructured data.
- Architect scalable data pipelines, warehouses, and lakehouses optimized for AI/ML (a minimal pipeline sketch appears after this posting).
- Collaborate with Data Scientists, ML Engineers, and business teams to translate requirements into data architecture solutions.
- Ensure data security, compliance, lineage, and metadata management.
- Optimize data platforms for performance, scalability, and cost efficiency.
- Guide teams on best practices for integrating data platforms with AI/ML model training, deployment, and monitoring.
- Evaluate emerging tools and technologies in the Data & AI ecosystem.

Required Skills & Experience:
- Proven experience as a Data Architect with a focus on AI/ML workloads.
- Strong expertise in cloud platforms (AWS, Azure, GCP) and cloud-native data services.
- Hands-on experience with data lakehouse architectures (Databricks, Snowflake, Delta Lake, BigQuery, Synapse).
- Proficiency in data pipeline frameworks (Apache Spark, Kafka, Airflow, dbt).
- Strong understanding of the ML data lifecycle: feature engineering, data versioning, training pipelines, and MLOps.
- Knowledge of data governance frameworks, security, and compliance standards.
- Experience with SQL, Python, and distributed data systems.
- Familiarity with AI/ML platforms (SageMaker, Vertex AI, Azure ML) is a plus.
- Excellent problem-solving and stakeholder management skills.
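As a rough illustration of the kind of AI/ML-oriented data pipeline this role would architect, here is a minimal PySpark sketch that reads raw events, derives a few per-user features, and writes a Parquet feature table. The paths, column names, and aggregation logic are illustrative assumptions, not part of the job description.

```python
# Minimal PySpark feature-pipeline sketch. Assumptions (not from the posting):
# pyspark installed; raw event data in Parquet at a hypothetical S3 path with
# columns user_id, event_type, amount, and event_ts.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("feature-pipeline-sketch").getOrCreate()

# Read raw event data (placeholder path).
events = spark.read.parquet("s3://example-bucket/raw/events/")

# Derive simple per-user features for downstream model training.
features = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("user_id")
    .agg(
        F.count("*").alias("event_count"),
        F.sum("amount").alias("total_amount"),
        F.countDistinct("event_date").alias("active_days"),
        F.max("event_ts").alias("last_seen"),
    )
)

# Write the feature table for consumption by training pipelines (placeholder path).
features.write.mode("overwrite").parquet("s3://example-bucket/features/user_features/")

spark.stop()
```

In practice a pipeline like this would be orchestrated by Airflow and land in a lakehouse format such as Delta Lake, as the tools named in the posting suggest.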