About the Role:
We are looking for a hands-on Data Engineer to join our team and take full ownership of scraping pipelines and data quality. You'll be working on data from 60+ websites involving PDFs, processed via OCR and stored in MySQL/PostgreSQL. You'll build robust, self-healing pipelines and fix common data issues (missing fields, duplication, formatting errors).

Responsibilities:
- Own and optimize Airflow scraping DAGs for 60+ sites
- Implement validation checks, retry logic, and error alerts (see the sketch after this posting)
- Build pre-processing routines to clean OCR'd text
- Create data normalization and deduplication workflows
- Maintain data integrity across MySQL and PostgreSQL
- Collaborate with the ML team on downstream AI use cases

Requirements:
- 2–5 years of experience in Python-based data engineering
- Experience with Airflow, Pandas, and OCR (Tesseract or AWS Textract)
- Solid SQL and schema design skills (MySQL/PostgreSQL)
- Familiarity with CSV processing and data pipelines
- Bonus: experience with scraping using Scrapy or Selenium

Location: Delhi (in-office only)
Salary Range: 50-80k/month
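The responsibilities above call for Airflow DAGs with validation checks, retry logic, and error alerts. A minimal sketch of that pattern, assuming Airflow 2.x's TaskFlow API; the site list, scrape logic, and required fields are placeholders, since the posting does not specify them:

```python
# Hedged sketch only: a scraping DAG with retries, validation, and failure
# alerts, assuming Airflow 2.x TaskFlow. Sites, fields, and scrape logic are
# placeholders -- the posting does not specify them.
from datetime import datetime, timedelta

from airflow.decorators import dag, task

SITES = ["site-a.example", "site-b.example"]  # stand-in for the 60+ sources

default_args = {
    "retries": 3,                           # per-task retry logic
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,               # error alerts (needs SMTP configured)
}

@dag(
    schedule="@daily",                      # Airflow >= 2.4 parameter name
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args=default_args,
)
def scraping_pipeline():
    @task
    def scrape(site: str) -> list:
        # Placeholder: fetch and parse one site's pages/PDFs into row dicts.
        return [{"site": site, "title": "example notice"}]

    @task
    def validate(rows: list) -> list:
        # Validation check: reject rows with missing or empty required fields.
        required = ("site", "title")
        clean = [r for r in rows if all(r.get(k) for k in required)]
        if not clean:
            # Raising fails the task; once retries are exhausted, the alert fires.
            raise ValueError("no valid rows scraped")
        return clean

    for site in SITES:
        validate(scrape(site))

scraping_pipeline()
```

Putting retries and email_on_failure in default_args applies them to every task, so each per-site scrape gets the same retry and alerting behaviour without repeating configuration.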
Key Responsibilities:
- Design, develop, and deploy agentic AI systems capable of autonomous decision-making, reasoning, and executing multi-step tasks.
- Build and fine-tune LLM-powered applications using frameworks like LangChain, LlamaIndex, or Semantic Kernel.
- Integrate LLMs into real-world workflows via APIs and tools (e.g., OpenAI, Hugging Face, Anthropic, Mistral).
- Architect and implement scalable pipelines for model orchestration, retrieval-augmented generation (RAG), and memory/long-context strategies (a minimal retrieval sketch follows this posting).
- Develop robust interfaces between LLM agents and external tools, APIs, and databases.
- Experiment with prompt engineering, few-shot learning, and model evaluation techniques to optimise agent behaviour.
- Collaborate with cross-functional teams to understand business use cases and deliver end-to-end solutions.

Required Skills & Experience:
- 3+ years of experience in AI/ML, with a strong track record of freelance or consulting work.
- Proven experience with LLMs, prompt engineering, and agentic frameworks.
- Proficient in Python and familiar with modern LLM toolchains (e.g., LangChain, OpenAI Function Calling, ReAct, AutoGPT, AgentOps).
- Experience building autonomous or semi-autonomous agents capable of reasoning and planning.
- Solid understanding of vector stores (e.g., Pinecone, FAISS, Weaviate) and RAG pipelines.
- Knowledge of cloud platforms (Azure, AWS, GCP) and containerization (Docker, Kubernetes).
- Ability to work independently, manage clients, and deliver on fast-paced timelines.

Job Types: Full-time, Permanent, Contractual / Temporary
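The role above centres on RAG pipelines backed by vector stores such as FAISS. A minimal retrieval sketch under that assumption; embed() is a random-vector stand-in for a real embedding model (OpenAI, Hugging Face, etc.), which the posting does not name:

```python
# Hedged sketch only: FAISS-backed retrieval for a RAG pipeline. embed() is a
# placeholder for a real embedding model; DIM and the documents are illustrative.
import faiss
import numpy as np

DIM = 384  # assumed embedding dimension

def embed(texts):
    # Placeholder embedding: deterministic random vectors instead of a model.
    rng = np.random.default_rng(0)
    return rng.standard_normal((len(texts), DIM)).astype("float32")

docs = [
    "Airflow schedules and retries DAG tasks.",
    "FAISS performs nearest-neighbour vector search.",
    "RAG grounds LLM answers in retrieved context.",
]

index = faiss.IndexFlatL2(DIM)   # exact L2 index; fine at small scale
index.add(embed(docs))           # index the document vectors

def retrieve(query, k=2):
    # Return the k documents whose embeddings are closest to the query's.
    _, ids = index.search(embed([query]), k)
    return [docs[i] for i in ids[0]]

context = retrieve("How do I ground an LLM answer?")
prompt = "Answer using only this context:\n" + "\n".join(context)
# The prompt would then be sent to the chosen LLM (OpenAI, Anthropic, Mistral, ...).
```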
About the Role:
We are looking for a hands-on Data Engineer to join our team and take full ownership of scraping pipelines and data quality. You'll be working on data from 60+ websites involving PDFs, processed via OCR and stored in MySQL/PostgreSQL. You'll build robust, self-healing pipelines and fix common data issues (missing fields, duplication, formatting errors).

Responsibilities:
- Own and optimize Airflow scraping DAGs for 60+ sites
- Implement validation checks, retry logic, and error alerts
- Build pre-processing routines to clean OCR'd text
- Create data normalization and deduplication workflows (see the sketch after this posting)
- Maintain data integrity across MySQL and PostgreSQL
- Collaborate with the ML team on downstream AI use cases

Requirements:
- 2–5 years of experience in Python-based data engineering
- Experience with Airflow, Pandas, and OCR (Tesseract or AWS Textract)
- Solid SQL and schema design skills (MySQL/PostgreSQL)
- Familiarity with CSV processing and data pipelines
- Bonus: experience with scraping using Scrapy or Selenium

Location: Delhi (in-office only)

Additional Requirements:
- Minimum 3 years of experience
- Must be a graduate: B.Tech preferred / BCA / MCA / BSc / MSc

Mandatory Keywords (must-have skills):
- Scraping
- Python
- Selenium
- NumPy
- Pandas

Optional Keywords (good-to-have skills):
- BeautifulSoup
- MySQL
- Large Language Models (LLMs)
- Machine Learning
- Natural Language Processing (NLP)
- GitHub
- Django
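Both Data Engineer postings mention normalization and deduplication workflows built on Pandas. A small illustrative sketch; the column names and normalization rules are hypothetical, not taken from the postings:

```python
# Hedged sketch only: a pandas normalization + deduplication pass. Column
# names and rules are hypothetical, not taken from the posting.
import pandas as pd

rows = pd.DataFrame({
    "title":  ["Notice 12 ", "notice 12", "Circular 7"],
    "site":   ["a.example", "a.example", "b.example"],
    "amount": ["1,200", "1200", "700"],
})

# Normalize: trim and casefold text, strip thousands separators from numbers.
rows["title"] = rows["title"].str.strip().str.casefold()
rows["amount"] = rows["amount"].str.replace(",", "", regex=False).astype(int)

# Deduplicate on the normalized natural key, keeping the first occurrence.
deduped = rows.drop_duplicates(subset=["title", "site"], keep="first")
print(deduped)  # two rows remain: "notice 12" and "circular 7"
```

Normalizing before deduplicating matters here: without the strip/casefold pass, "Notice 12 " and "notice 12" would survive as distinct rows.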
Location: Remote
Department: Executive Leadership
Employment Type: Full-time

About the Role
We are seeking a visionary and experienced Chief Technology Officer (CTO) to lead our technology strategy, oversee product development, and ensure the scalability and security of our platforms. The CTO will play a key role in driving innovation, building high-performing teams, and aligning technology initiatives with overall business goals.

Key Responsibilities
- Define and implement the company's overall technology vision and roadmap.
- Lead end-to-end product development, architecture, and infrastructure.
- Manage engineering, product, and IT teams, ensuring alignment with business objectives.
- Evaluate and adopt emerging technologies to drive innovation and maintain a competitive edge.
- Establish best practices for coding, architecture, deployment, and security.
- Build scalable systems, ensuring high availability, performance, and data integrity.
- Oversee cloud infrastructure, DevOps, and cybersecurity frameworks.
- Collaborate with the executive team to define KPIs, budgets, and resource allocation.
- Represent the company's technology vision to partners, investors, and stakeholders.

Required Skills & Experience
- 6+ years of experience in software engineering/technology leadership, with at least 3 years in senior management roles.
- Proven track record in scaling technology teams and platforms.
- Strong expertise in cloud platforms (AWS, Azure, GCP) and DevOps practices.
- Deep understanding of system architecture, microservices, APIs, and security.
- Experience in AI/ML, data analytics, or emerging technologies (preferred).
- Excellent leadership, decision-making, and communication skills.
- Ability to align technology vision with business strategy.