About Us
At BizCloud Experts, we ignite digital transformation by architecting, automating, and accelerating cloud solutions that empower businesses to scale, optimize, and innovate with confidence.

Job Title: Database Administrator (Redshift, MongoDB, SQL, Oracle, Data Migration)
Location: India
Employment Type: Full-Time

Overview:
We are seeking an experienced Database Administrator (DBA) with expertise in managing and migrating data across multiple database platforms, including Amazon Redshift, MongoDB, SQL Server, and Oracle. The ideal candidate will lead data migration initiatives, optimize database performance, and ensure reliability, scalability, and data integrity in both on-premises and cloud environments.

Key Responsibilities:
* Administer and maintain Amazon Redshift, MongoDB, SQL Server, and Oracle databases.
* Lead data migration projects across heterogeneous databases (on-prem to cloud, cross-platform).
* Design and implement ETL pipelines for data extraction, transformation, and loading into target systems.
* Use AWS DMS, Glue, or similar tools to plan and execute migrations with minimal downtime.
* Monitor and optimize database performance, ensuring efficient query execution and data throughput.
* Implement robust backup, recovery, and disaster recovery strategies.
* Manage database access control, encryption, and compliance with security policies.
* Troubleshoot complex database and data migration issues to ensure accuracy and consistency.
* Automate database administration and migration tasks using scripting (Python, Shell, PowerShell); a minimal sketch of this kind of automation follows this posting.
* Collaborate with data engineering and application teams to support new data initiatives and analytics workloads.
* Document all database configurations, migration plans, and post-migration validation procedures.

Required Skills & Qualifications:
* Bachelor's degree in Computer Science, Information Systems, or a related field.
* 5+ years of hands-on experience as a DBA managing multi-platform database environments.
* Proven experience in data migration between SQL Server, Oracle, MongoDB, and cloud-based systems (AWS Redshift).
* Expertise in AWS DMS, Schema Conversion Tool, Data Pipeline, or similar data migration technologies.
* Strong proficiency in performance tuning, indexing, partitioning, and replication.
* Familiarity with AWS services (S3, Glue, Lambda, CloudWatch).
* Strong SQL scripting and automation experience (Python, Shell, PowerShell).
* Excellent problem-solving, troubleshooting, and communication skills.

Preferred:
* AWS Certified Database – Specialty or equivalent certification.
* Experience with real-time data replication or streaming (Kafka, Kinesis).
* Knowledge of data governance, lineage, and compliance frameworks (GDPR, HIPAA).
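As a rough illustration of the scripted migration automation this role involves, here is a minimal Python sketch that polls an AWS DMS replication task with boto3 and reports its status. The task identifier and polling interval are assumptions for illustration only; a real migration runbook would be project-specific.

import time
import boto3

# Hypothetical task identifier -- replace with your actual DMS replication task.
TASK_ID = "oracle-to-redshift-task"

dms = boto3.client("dms")

def wait_for_task(task_id: str, poll_seconds: int = 60) -> str:
    """Poll a DMS replication task until it reaches a terminal state."""
    while True:
        resp = dms.describe_replication_tasks(
            Filters=[{"Name": "replication-task-id", "Values": [task_id]}]
        )
        task = resp["ReplicationTasks"][0]
        status = task["Status"]
        print(f"{task_id}: {status}")
        if status in ("stopped", "failed", "ready"):
            return status
        time.sleep(poll_seconds)

if __name__ == "__main__":
    print(f"Final status: {wait_for_task(TASK_ID)}")

The same pattern (describe, check status, sleep) extends naturally to post-migration validation, such as comparing source and target row counts before cutover.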
Job Title: AI Engineer (AWS & Generative AI)
Experience Level: 2 to 5 years (with at least 2 years of hands-on experience in Generative AI solutions on AWS)

Role Overview
We are looking for a hands-on AI Engineer with strong experience in building production-ready Generative AI solutions on AWS. The ideal candidate will have prior experience implementing RAG (Retrieval-Augmented Generation) solutions, working with agentic frameworks, and deploying AI/ML workloads using AWS-native services. This is a core engineering role that requires deep technical expertise, problem-solving skills, and the ability to build end-to-end AI solutions in real-world enterprise environments. You will work closely with architects, data engineers, and product teams to translate business requirements into scalable AI solutions, with a focus on autonomous agents and production-grade generative AI systems.

Key Responsibilities
* Design and implement RAG-based architectures leveraging AWS services, vector databases, and LLMs with advanced retrieval strategies (hybrid search, reranking, document processing pipelines); a minimal sketch follows this posting.
* Develop and deploy agentic workflows using frameworks such as CrewAI, LangGraph, Bedrock Strands Agents, or similar, including multi-agent orchestration and tool-calling implementations.
* Build agent memory systems (short-term, long-term, semantic) and implement context management strategies for complex workflows.
* Build, train, and deploy models on AWS Bedrock, SageMaker, AWS Transcribe/Translate/Comprehend, Kendra, and related services.
* Implement guardrails, content filters, and safety mechanisms for LLM outputs, including PII detection and bias mitigation.
* Design and maintain evaluation frameworks for agent performance, accuracy, and reliability using metrics like RAGAS and human-in-the-loop evaluation.
* Build observability and monitoring for agentic workflows, including tracing, logging, and debugging using tools like CloudWatch, X-Ray, LangSmith, or similar.
* Develop prompt engineering strategies and prompt management systems for production environments.
* Work closely with architects and engineering teams to integrate AI capabilities into existing enterprise systems using APIs, event-driven patterns, and data pipelines.
* Develop Python-based pipelines and APIs for AI workflows and automation, including streaming responses and real-time inference.
* Optimize inference, cost, and performance of deployed models on AWS through A/B testing and continuous improvement.
* Evaluate new LLMs, embeddings, and agent frameworks, and recommend best-fit approaches for projects.
* Collaborate with cross-functional teams and stakeholders to design MVPs, prototypes, and production-scale solutions from business requirements.

Required Skills & Experience
* Hands-on AWS AI/ML stack: Bedrock, SageMaker, Transcribe, Translate, Comprehend, Kendra, Lambda, API Gateway, Step Functions, EventBridge.
* RAG implementation experience (at least one production project) using embeddings, vector databases, and hybrid search strategies.
* Agentic AI frameworks: CrewAI, LangGraph, Bedrock Strands Agents, or equivalent, with experience in tool-calling/function-calling and multi-agent orchestration.
* Experience with agent orchestration patterns (ReAct, Plan-and-Execute, reflection) and autonomous agent design.
* Strong prompt engineering skills and experience with prompt management at scale.
* Knowledge of LLM evaluation metrics and agent performance monitoring.
* Strong Python programming skills with experience in ML/AI libraries (LangChain, HuggingFace; PyTorch/TensorFlow optional but good to have).
* Experience with vector/semantic databases such as OpenSearch, Weaviate, Pinecone, or equivalent.
* Understanding of observability tools for AI workflows (CloudWatch, X-Ray, LangSmith, Phoenix, or similar).
* Familiarity with enterprise integration patterns (APIs, data pipelines, event-driven architectures, security, observability).
* Experience with document processing pipelines for unstructured data (PDFs, images, etc.).
* Strong problem-solving and debugging skills in production environments.

Preferred (Good to Have)
* Exposure to MLOps practices (CI/CD for ML, model monitoring, retraining pipelines).
* Experience with fine-tuning or RLHF on SageMaker or Bedrock.
* Familiarity with data processing tools (Glue, Athena, Redshift, or similar).
* Understanding of cost optimization and scaling on AWS, including inference optimization.
* Knowledge of governance frameworks for AI (model cards, bias detection, responsible AI practices).
* Prior experience in multi-tenant SaaS or enterprise platforms.
* Experience with context window management and advanced chunking strategies.
* Familiarity with reranking models and retrieval optimization techniques.

Qualifications
* Bachelor's/Master's degree in Computer Science, Engineering, or a related field (or equivalent practical experience).
* 2+ years of hands-on AI/ML development with Generative AI solutions.
* Demonstrated ability to translate business requirements into technical AI solutions.
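To give a flavor of the RAG work referenced in the responsibilities above, here is a minimal Python sketch of a retrieve-then-generate loop on Bedrock. The model IDs, the in-memory corpus, and the embed/answer helpers are illustrative assumptions rather than a prescribed design; a production system would use a real vector database (OpenSearch, Pinecone, etc.) plus hybrid search and reranking.

import json
import boto3

bedrock = boto3.client("bedrock-runtime")

# Assumed model IDs for illustration; substitute whatever your account has enabled.
EMBED_MODEL = "amazon.titan-embed-text-v2:0"
CHAT_MODEL = "anthropic.claude-3-haiku-20240307-v1:0"

def embed(text: str) -> list[float]:
    """Embed text with a Titan embedding model via Bedrock."""
    resp = bedrock.invoke_model(modelId=EMBED_MODEL, body=json.dumps({"inputText": text}))
    return json.loads(resp["body"].read())["embedding"]

def top_k(query_vec, docs, k=3):
    """Toy cosine-similarity retrieval over an in-memory corpus (stand-in for a vector DB)."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return dot / norm
    return [d for d, _ in sorted(((d, cos(query_vec, v)) for d, v in docs), key=lambda p: -p[1])[:k]]

def answer(question: str, docs) -> str:
    """Retrieve relevant passages, then ask the chat model to answer from them only."""
    context = "\n\n".join(top_k(embed(question), docs))
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user",
                      "content": f"Answer using only this context:\n{context}\n\nQuestion: {question}"}],
    }
    resp = bedrock.invoke_model(modelId=CHAT_MODEL, body=json.dumps(body))
    return json.loads(resp["body"].read())["content"][0]["text"]

# Usage: docs = [(passage, embed(passage)) for passage in corpus]; print(answer("...", docs))

Guardrails, evaluation (e.g. RAGAS), and tracing would wrap around this core loop in a production deployment.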
3+ Years of Cloud/Developer Experience? Rebuild Your Career with an AWS Premier Partner!

We are hosting an Open House for talented professionals eager to accelerate their careers in the cloud ecosystem, only on 17th November '25 (Monday). If you have 3+ years of experience and are looking for a fresh start, new challenges, or a path to specialize, come meet us! We might have the perfect opportunity waiting for you.

Who Should Attend?
Professionals with experience in:
* Database Administration (DBA)
* Security Engineering
* Operating Systems / Systems Engineering
* DevOps & Automation
* Contact Center Technologies
* Agentic AI / Applied Generative AI

Why Visit Our Open House?
* Explore opportunities with a rapidly growing AWS Partner
* Learn how we help experienced engineers re-skill and re-launch their careers
* Discover exciting cloud, AI, and modernization projects
* Meet our leadership, solution architects, and hiring team
* Understand our training & onboarding pathways designed for success

Join Us: Top Floor, C9XP+PH2 RS Silicon Park, Cyber Hills Colony, VIP Hills, Silicon Valley, Madhapur, Hyderabad, Telangana 500081
Location: BizCloud Experts
Event: Open House Recruitment Session
Purpose: Meet, network, explore, and kickstart your next chapter

If you're motivated, curious, and ready to take the next leap, come see us in one of the following time slots on 17th November '25:
* 9:00 AM - 10:30 AM
* 10:30 AM - 12:00 Noon
* 1:00 PM - 3:00 PM

Your future may begin here. Immediate joiners (notice period under 30 days) will be preferred.
Responsibilities:
* Manage office supplies inventory
* Maintain facility cleanliness & organization
* Coordinate meetings & events
* Provide administrative support to team members
* Maintain attendance and time sheets

Benefits: Provident Fund