GeakMinds, Inc

8 job openings at GeakMinds, Inc
C# Developer (3-4 yrs experience) Chennai, Tamil Nadu, India 2 - 4 years Not disclosed On-site Full Time

About the Role:

Key Responsibilities:
• Design, develop, and maintain robust C# Web APIs
• Build and enhance user interfaces using WPF
• Collaborate with cross-functional teams to define, design, and deliver new features
• Write clean, scalable, and maintainable code
• Troubleshoot and debug applications to optimize performance
• Participate in code reviews and technical discussions
• Stay updated with new technologies and propose improvements

Requirements:
• 2-4 years of hands-on experience in C# .NET development
• Strong experience with RESTful Web API development
• Proficiency in WPF (Windows Presentation Foundation)
• Solid understanding of object-oriented programming and design patterns
• Familiarity with version control systems like Git
• Exposure to CI/CD pipelines and Agile development environments
• Good verbal and written communication skills in English
• Strong problem-solving abilities and a growth mindset

Good to Have:
• Experience with TypeScript or front-end technologies (e.g., Angular/React)
• Familiarity with SQL and database development

Senior Architect – Data Platforms Chennai, Tamil Nadu, India 10 years Not disclosed On-site Full Time

Position Overview:
The Senior Architect will lead the design and development of modern data architectures leveraging Snowflake, Kafka, Splunk, Airflow, AWS, Apache Iceberg, and Presto. This role requires deep expertise in distributed data platforms, cloud infrastructure, real-time streaming, and scalable analytics solutions. The ideal candidate will drive both technical leadership and architectural vision for enterprise-scale systems.

Key Responsibilities:
• Lead end-to-end architecture for secure, scalable, and high-performance data platforms utilizing Snowflake, Kafka, and AWS services.
• Architect, implement, and optimize real-time and batch data pipelines using Airflow, Apache Iceberg, Presto, and Kafka Streams.
• Drive integration of Splunk for operational analytics, logging, and monitoring across data platforms.
• Develop and enforce architecture best practices for metadata management, governance, streaming ingestion, and cloud-native data warehousing.
• Collaborate with data engineers and DevOps teams to ensure efficient APIs, data models, connectors, and cloud microservices.
• Optimize cost, performance, and reliability of solutions across AWS infrastructure and underlying data services.
• Lead technology strategies for advanced analytics, data lake implementation, self-service tools, and machine learning enablement.
• Mentor engineers, review code and architectures, communicate technical concepts to stakeholders, and create technical documentation.

Experience and Skills:
• 10+ years' experience in data architecture, cloud platforms, and distributed systems.
• Proven expertise in Snowflake data warehousing, including migration, data sharing, and performance optimization.
• Extensive hands-on experience with Kafka (CDC, streams, real-time use cases), Airflow (ETL orchestration), and Splunk (monitoring/log analysis).
• Deep knowledge of AWS data services and infrastructure (EC2, S3, Glue, Lambda, etc.).
• Practical experience architecting solutions with Apache Iceberg and Presto for large-scale analytics.
• Strong collaboration, problem-solving, and communication skills, with a track record in mentoring and technical leadership.
• Bachelor's degree in Computer Science, Engineering, or a related field; Master's preferred.

Preferred Skills:
• Experience with additional big data tools (Spark, Databricks) and modern DevOps practices.
• Knowledge of security, compliance, and multi-geo deployment for global enterprise data platforms.
• Certification in AWS or relevant cloud architecture is valued.
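This role centers on real-time ingestion with Kafka feeding warehouse and lakehouse targets. Purely as a hedged illustration of the kind of plumbing involved (the topic, bootstrap server, and load_batch helper are invented, and the kafka-python client is assumed), a minimal consumer that batches events for a downstream load might look like this:

```python
# Minimal, illustrative Kafka consumer that batches events for a downstream
# warehouse load. Assumes the kafka-python package; the topic and server are
# placeholders, and load_batch stands in for a real Snowflake/Iceberg sink.
import json
from kafka import KafkaConsumer

def load_batch(rows):
    # Hypothetical sink: a real pipeline would write to Snowflake,
    # an Iceberg table, or S3 staging instead of printing.
    print(f"loading {len(rows)} rows")

consumer = KafkaConsumer(
    "orders",                          # placeholder topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    enable_auto_commit=False,
    auto_offset_reset="earliest",
)

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 500:              # flush in fixed-size batches
        load_batch(batch)
        consumer.commit()              # commit offsets only after a successful load
        batch = []
```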

Senior Python Developer Chennai, Tamil Nadu, India 5 years Not disclosed On-site Full Time

Role: Senior Python Developer
Experience: 5 - 9 Years
Education Qualification: Any Graduate
Work Location: Chennai
Work Type: Hybrid
Employment: Full-time
Key Skills: API Development, FastAPI, Pandas, Polars, NumPy, Multi-Processing, Multi-Threading

Job Description:
We are seeking a skilled Python Developer to join our development team. The ideal candidate will have strong expertise in building secure, scalable web applications and APIs using modern Python frameworks, and will be responsible for designing, developing, and maintaining backend services while ensuring high security standards and following best practices in software architecture.

Key Responsibilities:
• Design, develop, and maintain Python-based APIs and applications.
• Lead and manage development projects from inception to deployment.
• Collaborate with cross-functional teams to define software requirements.
• Write clean, documented, and maintainable Python code.
• Ensure software quality through code reviews and adherence to best testing practices.
• Mentor and support junior developers in best coding practices.

Required Skills:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 5+ years of experience in Python development.
• Experience with Python programming and various frameworks like Django and FastAPI.
• Proficiency in version control systems (Git) and testing frameworks like Pytest.
• APIs and services: experience with RESTful APIs, microservices, and integration with third-party web services; familiarity with OAuth 2.0 and API key management systems.
• Experience with Polars and JWT tokens.
• Knowledge of DevOps practices and containerization technologies such as Docker and Kubernetes.
• Familiarity with cloud computing platforms like AWS, Azure, or GCP is preferred.
• Good analytical, problem-solving, and communication skills.
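The role above pairs FastAPI with dataframe libraries such as Polars. As a hedged sketch only (the endpoint, file path, and column names are invented, not company code), a minimal service of that shape could look like this:

```python
# Minimal FastAPI + Polars sketch: one endpoint that aggregates a CSV.
# The file path, column names, and endpoint are hypothetical examples.
from pathlib import Path

import polars as pl
from fastapi import FastAPI, HTTPException

app = FastAPI(title="sales-metrics")          # hypothetical service name
DATA_PATH = Path("data/sales.csv")            # placeholder data source

@app.get("/metrics/revenue-by-region")
def revenue_by_region():
    if not DATA_PATH.exists():
        raise HTTPException(status_code=404, detail="dataset not found")
    df = pl.read_csv(DATA_PATH)
    summary = (
        df.group_by("region")                 # assumes 'region' and 'revenue' columns
          .agg(pl.col("revenue").sum().alias("total_revenue"))
          .sort("total_revenue", descending=True)
    )
    return summary.to_dicts()                 # JSON-serializable list of rows
```

Served with an ASGI server such as uvicorn (module and app names here are assumed): uvicorn main:app --reload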

Developer – Data Engineering & Cloud Analytics Chennai, Tamil Nadu, India 4 - 6 years Not disclosed On-site Full Time

Developer – Data Engineering & Cloud Analytics

Role Overview:
Responsible for building, maintaining, and optimizing large-scale data pipelines and analytics solutions leveraging Snowflake, Kafka, Splunk, Airflow, AWS, Apache Iceberg, and Presto. The candidate will bring hands-on development skills and collaborate with architects, analysts, and DevOps teams to deliver reliable, efficient, and scalable data services.

Key Responsibilities:
• Develop and maintain ETL/ELT pipelines using Airflow, orchestrating data movement across Snowflake, AWS, Iceberg, and Kafka systems.
• Implement and optimize real-time data ingestion and streaming solutions using Apache Kafka, ensuring high throughput and fault tolerance.
• Integrate Apache Iceberg and Presto for interactive analytics on large-scale data lakes.
• Write SQL, Python, and/or Scala code for complex data transformations, metric calculations, and business logic deployment.
• Collaborate with data architects to evolve data models and ensure alignment with enterprise best practices.
• Utilize Splunk for operational monitoring, log analysis, and incident troubleshooting within data workflows.
• Deploy and manage infrastructure on AWS (S3, EC2, Glue, Lambda, IAM), focusing on automation, scalability, and security.
• Document pipelines, produce clear runbooks, and share technical knowledge with team members.

Required Skills & Experience:
• 4 to 6 years of hands-on development experience with modern data stack components: Snowflake, Apache Kafka, Airflow, and AWS.
• Strong working knowledge of scalable SQL (preferably Snowflake, Presto) and scripting (Python/Scala).
• Experience implementing data lake solutions with Apache Iceberg.
• Familiarity with Splunk for monitoring and event management.
• Proven history of building, deploying, and troubleshooting ETL/ELT data flows and real-time streaming jobs.
• Knowledge of IAM, networking, and security concepts on AWS.

Preferred Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• Experience in cloud-native data warehousing, cost optimization, and compliance.
• Certifications in AWS, Snowflake, or other relevant technologies.

This role is ideal for candidates who enjoy end-to-end work on cloud-native analytics platforms, working with cutting-edge data streaming and lakehouse technologies in production environments.
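Since this posting leans heavily on Airflow-orchestrated ETL/ELT, here is a minimal, hedged sketch of a daily pipeline using the Airflow 2.x TaskFlow API; the DAG id, schedule, and task bodies are illustrative placeholders, not GeakMinds code.

```python
# Minimal Airflow DAG sketch (TaskFlow API): extract -> transform -> load.
# DAG id, schedule, and the bodies of the tasks are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["example"])
def daily_sales_pipeline():

    @task
    def extract() -> list[dict]:
        # Placeholder: would pull from an API, a Kafka topic, or an S3 landing zone.
        return [{"region": "south", "revenue": 120.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Placeholder transformation: filter and tag records.
        return [{**r, "currency": "USD"} for r in rows if r["revenue"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: would write to Snowflake or an Iceberg table.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))

daily_sales_pipeline()
```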

AI Engineer Chennai, Tamil Nadu, India 4 years Not disclosed On-site Full Time

We are seeking an AI Engineer with hands-on experience in Azure AI services, Large Language Models (LLMs), and Retrieval-Augmented Generation (RAG) to join our team. The ideal candidate will design, develop, and deploy intelligent solutions using generative AI, AI agents, and conversational systems. This role involves working closely with data scientists, architects, and business stakeholders to deliver scalable AI-powered applications and integrate them into enterprise systems.

Roles and Responsibilities:
• Develop AI solutions using Azure OpenAI and Cognitive Services.
• Build and optimize LLM-based applications with RAG, prompt engineering, and fine-tuning.
• Develop and deploy AI agents for task automation, conversational AI, and knowledge management.
• Integrate AI systems with enterprise applications, APIs, and data sources.
• Collaborate with data engineers and scientists to prepare and manage training datasets.
• Ensure AI solutions follow security, compliance, and responsible AI guidelines.

Experience:
• 2–4 years of experience in AI/ML engineering, with at least 1 year in LLM and GenAI projects.
• Experience with the Azure AI stack or open-source models using ollama or llama.cpp.
• Hands-on expertise in building RAG pipelines and integrating vector databases (Azure AI Search, Milvus, Qdrant, etc.).
• Experience developing and orchestrating AI agents using frameworks like LangChain.
• Strong Python programming skills; experience with APIs (FastAPI) for deployment.
• Familiarity with containerization (Docker).
• Understanding of prompt engineering, embeddings, and fine-tuning approaches.
• Knowledge of MLOps practices, model lifecycle management, and monitoring.
• Knowledge of enterprise security, data privacy, and responsible AI principles.
• Strong communication and collaboration skills to work with cross-functional teams.
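This role is built around RAG on top of Azure OpenAI. Purely as a hedged sketch (the deployment names, documents, and in-memory "vector store" are invented, and the openai Python SDK's AzureOpenAI client is assumed), the core embed-retrieve-generate loop looks roughly like this; a real system would use a vector database such as Azure AI Search or Qdrant instead of a list.

```python
# Toy RAG loop with Azure OpenAI: embed documents, retrieve the closest one,
# and ground the chat completion in it. Deployment names and the document
# corpus are placeholders.
import os

import numpy as np
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",                       # assumed API version
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

docs = [
    "Invoices are processed within 5 business days.",
    "Refund requests require a signed approval form.",
]                                                   # placeholder corpus

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

doc_vectors = embed(docs)

def answer(question: str) -> str:
    q_vec = embed([question])[0]
    scores = doc_vectors @ q_vec / (
        np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q_vec)
    )
    context = docs[int(np.argmax(scores))]          # top-1 retrieval
    resp = client.chat.completions.create(
        model="gpt-4o-mini",                        # placeholder deployment name
        messages=[
            {"role": "system", "content": f"Answer using only this context: {context}"},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content
```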

Azure AI Engineer Chennai, Tamil Nadu, India 0 years Not disclosed On-site Full Time

Job Summary:
We are seeking a highly skilled and motivated Data Scientist to join our Data & AI team. The candidate should have strong expertise in data analytics, machine learning, and Gen AI, with hands-on experience building and operationalizing ETL pipelines, LLM-based applications, and agentic AI workflows.

Key Responsibilities:

1. Data Engineering & ETL Development:
• Design, build, and maintain end-to-end ETL pipelines for data ingestion, transformation, and storage using tools like Azure Data Factory, Databricks, PySpark, and SQL.
• Work with structured and unstructured data from diverse sources including APIs, data lakes, and streaming platforms (e.g., Kafka, Event Hubs).

2. AI, LLM, and Machine Learning Development:
• Design and implement machine learning models for classification, regression, clustering, NLP, and recommendation systems using Python, scikit-learn, TensorFlow, or PyTorch.
• Build agentic AI workflows using frameworks such as LangChain, Autogen, etc. to orchestrate autonomous reasoning, decision-making, and task automation.
• Integrate retrieval-augmented generation (RAG) pipelines using vector databases like FAISS, Pinecone, or Azure Cognitive Search for knowledge-grounded AI systems.
• Utilize Azure AI Services, including Azure OpenAI, Cognitive Services, and Azure Machine Learning, to build and deploy LLM-powered solutions.

3. Azure Cloud Platform Experience:
• Strong hands-on experience with the Microsoft Azure ecosystem: Azure Data Factory (ADF), Azure Databricks, Azure Synapse Analytics, Azure OpenAI Service, Azure Cognitive Services, Azure Blob Storage, Azure Key Vault, etc.

4. Client & Stakeholder Collaboration:
• Partner with business and technical stakeholders to gather requirements, define KPIs, and develop data-driven solutions aligned with client goals.
• Present findings, visualizations, and model outputs clearly to both technical and non-technical audiences.
• Support solution design discussions for data and AI-driven projects.

Note: Candidates with Azure certifications will be given preference.
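The machine learning side of this role names classification with scikit-learn among the expected skills. A compact, generic example of that workflow (synthetic data and default hyperparameters, nothing project-specific) is shown below.

```python
# Generic scikit-learn classification sketch: synthetic data, train/test split,
# a scaling + logistic regression pipeline, and a standard evaluation report.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1_000))
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
```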

Senior Python Developer Chennai, Tamil Nadu, India 5 years Not disclosed On-site Full Time

Role: Senior Python Developer
Experience: 5 - 9 Years
Education Qualification: Any Graduate
Work Location: Chennai
Work Type: Hybrid
Employment: Full-time
Key Skills: API Development, FastAPI, Pandas, Polars, NumPy, Multi-Processing, Multi-Threading

Job Description:
We are seeking a skilled Python Developer to join our development team. The ideal candidate will have strong expertise in building secure, scalable web applications and APIs using modern Python frameworks, and will be responsible for designing, developing, and maintaining backend services while ensuring high security standards and following best practices in software architecture.

Key Responsibilities:
• Design, develop, and maintain Python-based APIs and applications.
• Lead and manage development projects from inception to deployment.
• Collaborate with cross-functional teams to define software requirements.
• Write clean, documented, and maintainable Python code.
• Ensure software quality through code reviews and adherence to best testing practices.
• Mentor and support junior developers in best coding practices.

Required Skills:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 5+ years of experience in Python development.
• Experience with Python programming and the FastAPI framework.
• Proficiency in version control systems (Git) and testing frameworks like Pytest.
• APIs and services: experience with RESTful APIs, microservices, and integration with third-party web services; familiarity with OAuth 2.0 and API key management systems.
• Experience with Polars and JWT tokens.
• Knowledge of DevOps practices and containerization technologies such as Docker and Kubernetes.
• Familiarity with cloud computing platforms like AWS, Azure, or GCP is preferred.
• Good analytical, problem-solving, and communication skills.
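Both Senior Python Developer postings list multi-processing and multi-threading among the key skills. As a neutral illustration of when each applies (the work functions and URLs here are invented placeholders), a concurrent.futures sketch:

```python
# concurrent.futures sketch: processes for CPU-bound work, threads for I/O-bound
# work. The functions and inputs are illustrative placeholders.
import math
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor
from urllib.request import urlopen

def cpu_bound(n: int) -> int:
    # CPU-heavy placeholder: sum of integer square roots.
    return sum(math.isqrt(i) for i in range(n))

def io_bound(url: str) -> int:
    # I/O-heavy placeholder: fetch a page and return its length in bytes.
    with urlopen(url, timeout=10) as resp:
        return len(resp.read())

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:              # bypasses the GIL for CPU work
        print(list(pool.map(cpu_bound, [10**6, 2 * 10**6])))

    urls = ["https://example.com", "https://example.org"]
    with ThreadPoolExecutor(max_workers=8) as pool:  # threads overlap network waits
        print(list(pool.map(io_bound, urls)))
```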

Python Developer Chennai, Tamil Nadu, India 3 - 7 years Not disclosed On-site Full Time

Job Description for Python Developer

About the Role:
We are looking for a Python Developer who is passionate about writing clean, high-performance code and is excited by data-driven applications. The ideal candidate will bring strong expertise in Python (3.12), modern data processing libraries such as NumPy and Pandas, API development, and SQL, with a proven track record of building and scaling backend systems. This role is perfect for someone who is proactive, quick to learn, and thrives in a fast-paced environment.

Key Responsibilities:
• Design, develop, and maintain backend systems and APIs using Python and FastAPI.
• Work with data-centric libraries like pandas, polars, and numpy to build scalable data pipelines and services.
• Implement and manage asynchronous task scheduling using FastAPI schedulers or similar tools.
• Contribute to architectural decisions and mentor junior developers.
• Participate in code reviews, testing, and troubleshooting.
• Work closely with data teams and business stakeholders to understand requirements and deliver effective solutions.
• Continuously learn and adapt to new technologies and best practices.

Must-Have Qualifications:
• 3-7 years of professional experience with Python.
• Proficiency in pandas, polars, and numpy.
• Strong experience building APIs with FastAPI or similar frameworks.
• Hands-on experience with task scheduling and background job management.
• Excellent problem-solving and communication skills.
• A passion for learning and the ability to pick up new technologies quickly.

Good to Have:
• Excellent English communication skills (written and oral).
• Experience with C# in enterprise environments.
• Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, SQL Server).
• Familiarity with Docker, data versioning, job orchestration tools, or cloud platforms.
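This posting highlights asynchronous task scheduling and background job management alongside FastAPI. As a small, assumption-laden sketch (the report-generation job is invented), FastAPI's built-in BackgroundTasks can hand slow work off so the endpoint returns immediately:

```python
# FastAPI BackgroundTasks sketch: the endpoint responds right away while the
# report job runs after the response is sent. The job itself is a placeholder.
import time

from fastapi import BackgroundTasks, FastAPI

app = FastAPI()

def generate_report(report_id: str) -> None:
    # Placeholder for a slow job: querying a database, crunching dataframes, etc.
    time.sleep(5)
    print(f"report {report_id} written")

@app.post("/reports/{report_id}")
def queue_report(report_id: str, background_tasks: BackgroundTasks):
    background_tasks.add_task(generate_report, report_id)
    return {"status": "queued", "report_id": report_id}
```

For recurring jobs, a separate scheduler library such as APScheduler is a common pairing; the posting's mention of "FastAPI schedulers" does not specify a particular tool.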