Quaxigma

Quaxigma is a technology company focusing on innovative solutions in data analytics and software development.

2 Job openings at Quaxigma
Technical Project Manager | Tirupati | 6 - 11 years | INR 5.0 - 15.0 Lacs P.A. | Work from Office | Full Time

About the Role
We are seeking an experienced and driven Technical Project Manager / Technical Delivery Manager to lead complex, high-impact data analytics and data science projects for global clients. This role demands a unique blend of project management expertise, technical depth in cloud and data technologies, and the ability to collaborate across cross-functional teams. You will be responsible for ensuring the successful delivery of data platforms, data products, and enterprise analytics solutions that drive business value.

Key Responsibilities

Project & Delivery Management
- Lead the full project lifecycle for enterprise-scale data platforms, including requirement gathering, development, testing, deployment, and post-production support.
- Own the delivery of Data Warehousing and Data Lakehouse solutions on cloud platforms (Azure, AWS, or GCP).
- Prepare and maintain detailed project plans (Microsoft Project Plan) and align them with the Statement of Work (SOW) and client expectations.
- Use hybrid project methodologies (Agile + Waterfall) to manage scope, budget, and timelines.
- Monitor key project KPIs (e.g., SLA, MTTR, MTTA, MTBF) and ensure adherence using tools like ServiceNow (see the illustrative sketch after the Key Responsibilities section).

Data Platform & Architecture Oversight
- Collaborate with data engineers and architects to guide the implementation of scalable Data Warehouses (e.g., Redshift, Synapse) and Data Lakehouse architectures (e.g., Databricks, Delta Lake).
- Ensure data platform solutions meet performance, security, and governance standards.
- Understand and help manage data integration pipelines, ETL/ELT processes, and BI/reporting requirements.

Client Engagement & Stakeholder Management
- Serve as the primary liaison for US/UK clients; manage regular status updates, escalation paths, and expectations across stakeholders.
- Conduct WSRs, MSRs, and QBRs with clients and internal teams to drive transparency and performance reviews.
- Facilitate team meetings, highlight risks or blockers, and ensure consistent stakeholder alignment.

Technical Leadership & Troubleshooting
- Provide hands-on support and guidance in data infrastructure troubleshooting using tools like Splunk, AppDynamics, and Azure Monitor.
- Lead incident, problem, and change management processes with data platform operations in mind.
- Identify automation opportunities and propose technical process improvements across data pipelines and workflows.

Governance, Documentation & Compliance
- Create and maintain SOPs, runbooks, implementation documents, and architecture diagrams.
- Manage project compliance related to data privacy, security, and internal/external audits.
- Initiate and track Change Requests (CRs) and look for revenue expansion opportunities with clients.

Continuous Improvement & Innovation
- Participate in and lead at least three internal process optimization or innovation initiatives annually.
- Work with engineering, analytics, and DevOps teams to improve CI/CD pipelines and data delivery workflows.
- Monitor production environments to reduce deployment issues and improve time-to-insight.
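The reliability KPIs named above have standard definitions: MTTA is the average time from an incident being opened to being acknowledged, MTTR is the average time from open to resolution, and MTBF is the operational time in the reporting window divided by the number of failures. The snippet below is a minimal illustrative sketch of those calculations over a hypothetical incident log; the field names and sample data are invented for the example and are not tied to ServiceNow's data model.

```python
from datetime import datetime, timedelta

# Hypothetical incident records; in practice these would be exported from an
# incident-management tool such as ServiceNow.
incidents = [
    {"opened": datetime(2024, 1, 3, 9, 0), "acknowledged": datetime(2024, 1, 3, 9, 12), "resolved": datetime(2024, 1, 3, 11, 0)},
    {"opened": datetime(2024, 1, 9, 14, 0), "acknowledged": datetime(2024, 1, 9, 14, 5), "resolved": datetime(2024, 1, 9, 15, 30)},
]

observation_window = timedelta(days=30)  # period covered by the report
n = len(incidents)

mtta = sum((i["acknowledged"] - i["opened"] for i in incidents), timedelta()) / n
mttr = sum((i["resolved"] - i["opened"] for i in incidents), timedelta()) / n
downtime = sum((i["resolved"] - i["opened"] for i in incidents), timedelta())
mtbf = (observation_window - downtime) / n  # mean operational time between failures

print(f"MTTA: {mtta}  MTTR: {mttr}  MTBF: {mtbf}")
```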
Must-Have Qualifications
- 10+ years of experience in technical project delivery, with a strong focus on data analytics, BI, and cloud data platforms.
- Strong hands-on experience with SQL and data warehouse technologies such as Snowflake, Synapse, Redshift, BigQuery, etc.
- Proven experience delivering Data Warehouse and Data Lakehouse solutions.
- Familiarity with tools such as Redshift, Synapse, BigQuery, Databricks, and Delta Lake.
- Strong cloud knowledge of Azure, AWS, or GCP.
- Proficiency in project management tools such as Microsoft Project Plan (MPP), JIRA, Confluence, and ServiceNow.
- Expertise in Agile project methodologies.
- Excellent verbal and written communication skills, with no MTI (Mother Tongue Influence) or grammatical errors.
- Hands-on experience working with global delivery models (onshore/offshore).

Preferred Qualifications
- PMP or Scrum Master certification.
- Understanding of ITIL processes and DataOps practices.
- Experience managing end-to-end cloud data transformation projects.
- Experience in project estimation, proposal writing, and RFP handling.

Desired Skills & Competencies
- Deep understanding of SDLC, data architecture, and data governance principles.
- Strong leadership, decision-making, and conflict-resolution abilities.
- High attention to detail and accuracy in documentation and reporting.
- Ability to handle multiple concurrent projects in a fast-paced, data-driven environment.
- A passion for data-driven innovation and business impact.

Why Join Us?
- Be part of a collaborative and agile team driving cutting-edge AI and data engineering solutions.
- Work on impactful projects that make a difference across industries.
- Opportunities for professional growth and continuous learning.
- Competitive salary and benefits package.

Python Developer | Tirupati | 5 - 10 years | INR 6.0 - 13.0 Lacs P.A. | Work from Office | Full Time

Job Summary:
We are seeking a highly skilled Python Developer with deep expertise in Generative AI (GenAI) systems. This role demands strong experience in designing, developing, deploying, and maintaining full-stack, production-grade GenAI solutions. The ideal candidate will bring 5+ years of Python development, a solid foundation in software engineering principles, and a passion for experimenting with the latest advancements in LLMs and AI tooling.

Key Responsibilities:
- Design and build Python-based applications powered by LLMs (OpenAI, Claude, Mistral, etc.).
- Develop, serve, and maintain robust APIs for GenAI model interaction.
- Integrate and orchestrate components using frameworks like LangChain, LangGraph, and LlamaIndex.
- Implement techniques such as prompt engineering, RAG (Retrieval-Augmented Generation), and LLM fine-tuning (see the sketch after this list).
- Integrate with front-end applications and embed solutions in platforms like Microsoft Teams, Slack, and web dashboards.
- Manage and preprocess large volumes of structured and unstructured data.
- Optimize application latency, response time, and model accuracy.
- Implement user authentication, logging, monitoring, and API security.
- Build secure, production-ready APIs and maintain CI/CD pipelines.
- Leverage Azure AI Search for intelligent information retrieval in RAG workflows.
- Use Cosmos DB or a similar service for scalable, low-latency, globally distributed data storage.
- Perform rigorous testing and evaluation of GenAI outputs for accuracy, bias, and performance.
- Stay up to date with GenAI research and incorporate emerging best practices into the development cycle.
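To make the RAG responsibility above concrete, here is a minimal illustrative sketch of a retrieval-augmented endpoint: documents are embedded into an in-memory index, the closest match to the query is retrieved, and it is passed to an LLM as grounding context. It assumes the openai (v1+), fastapi, and numpy packages plus an OPENAI_API_KEY environment variable; the model names, toy corpus, and endpoint path are placeholders, and a production version would use a managed vector store (e.g., Azure AI Search, FAISS, or Pinecone) along with authentication, logging, and error handling.

```python
import numpy as np
from fastapi import FastAPI
from openai import OpenAI
from pydantic import BaseModel

client = OpenAI()  # reads OPENAI_API_KEY from the environment
app = FastAPI()

# Toy document store; a real system would ingest documents via a data pipeline
# and keep their embeddings in a vector database.
DOCS = [
    "Quaxigma delivers data analytics and software development solutions.",
    "The Python Developer role focuses on production-grade GenAI systems.",
]

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

DOC_VECTORS = embed(DOCS)

class Question(BaseModel):
    query: str

@app.post("/ask")
def ask(q: Question) -> dict:
    # Retrieve: rank documents by cosine similarity to the query embedding.
    qv = embed([q.query])[0]
    sims = DOC_VECTORS @ qv / (np.linalg.norm(DOC_VECTORS, axis=1) * np.linalg.norm(qv))
    context = DOCS[int(np.argmax(sims))]

    # Generate: pass the retrieved context to the LLM as grounding material.
    chat = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": f"Answer using this context:\n{context}"},
            {"role": "user", "content": q.query},
        ],
    )
    return {"answer": chat.choices[0].message.content, "context": context}
```

Run locally with, for example, `uvicorn app:app --reload` and POST a JSON body such as {"query": "..."} to /ask.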
Must Have:
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Software Engineering, Data Science, or a related field.
- 5+ years of Python development with strong software engineering and logical problem-solving skills.
- Proficiency with LLMs and GenAI libraries (e.g., Hugging Face Transformers, LangChain, LlamaIndex, LangGraph).
- Solid experience with vector databases (e.g., FAISS, ChromaDB, Pinecone).
- Strong knowledge of FastAPI, Flask, REST, and/or GraphQL API development.
- Hands-on experience deploying applications on Azure (Functions, AKS, Storage, App Services, etc.).
- Deep understanding of performance tuning, caching, and response optimization.
- Familiarity with user authentication, role-based access, and secure API practices.
- Exposure to monitoring/logging tools (e.g., Azure Monitor, Prometheus, Grafana, ELK).
- Experience working with CI/CD pipelines (e.g., GitHub Actions, Azure DevOps).
- Strong grasp of data structures, algorithms, and software design patterns.

Good-to-Have:
- Hands-on experience with OpenAI, Anthropic APIs, or open-source LLMs.
- Understanding of fine-tuning, prompt tuning, and RLHF.
- Knowledge of containerization and orchestration (Docker, Kubernetes).
- Experience with rapid prototyping tools like Streamlit or Gradio.
- Familiarity with versioning tools like Git, DVC, and MLflow.

Competencies:
- Tech Savvy - Anticipating and adopting innovations in business-building digital and technology applications.
- Self-Development - Actively seeking new ways to grow and be challenged, using both formal and informal development channels.
- Action Oriented - Taking on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm.
- Customer Focus - Building strong customer relationships and delivering customer-centric solutions.
- Optimize Work Processes - Knowing the most effective and efficient processes to get things done, with a focus on continuous improvement.

Why Join Us?
- Be part of a collaborative and agile team driving cutting-edge AI and data engineering solutions.
- Work on impactful projects that make a difference across industries.
- Opportunities for professional growth and continuous learning.
- Competitive salary and benefits package.

Application Details
Ready to make an impact? Apply today and become part of the QX Impact team!