Posted: 3 months ago
Work from Office
Full Time
We are seeking a highly skilled AI Engineer with expertise in Large Language Models (LLMs) and AWS-based AI solutions. The ideal candidate will provide expert guidance on designing, fine-tuning, and deploying LLMs while implementing scalable, cost-efficient AI solutions. This role requires a deep understanding of AWS cloud services, model optimization techniques, and the integration of AI capabilities into enterprise applications.

Key Responsibilities:
- Provide expert guidance on designing, fine-tuning, and deploying Large Language Models (LLMs) for various AI applications.
- Advise on and implement scalable, cost-efficient AI solutions using AWS services such as SageMaker, Lambda, EC2, S3, Bedrock, DynamoDB, API Gateway, and CloudFormation.
- Develop and recommend APIs and microservices to integrate LLMs into existing applications using AWS-native solutions.
- Optimize model performance for inference speed, accuracy, and cost-effectiveness by leveraging AWS Auto Scaling and distributed computing solutions.
- Collaborate with data engineers to preprocess, label, and manage large datasets for training and fine-tuning using AWS Glue and AWS Data Pipeline.
- Provide insights on monitoring and maintaining AI applications in production using Amazon CloudWatch, AWS X-Ray, and AWS security services.
- Research and experiment with the latest advancements in LLMs and cloud computing to improve performance within AWS infrastructure.
- Develop robust methodologies for evaluating model accuracy, bias detection, and performance benchmarking.
- Implement automated testing and validation pipelines to ensure consistent model accuracy and reliability using SageMaker Clarify and SageMaker Model Monitor.
- Work closely with cross-functional teams, including data scientists, software engineers, and product managers, to design robust AWS-based AI architectures.

Required Technical Skills:
- Strong experience with Large Language Models (LLMs) and AI model deployment.
- Hands-on expertise with AWS AI/ML services, including SageMaker, Bedrock, and Lambda.
- Experience developing and optimizing APIs and microservices for AI applications.
- Proficiency in Python and ML frameworks such as TensorFlow or PyTorch.
- Knowledge of distributed computing and auto-scaling strategies for AI workloads.
- Familiarity with MLOps practices, model monitoring, and validation pipelines.
- Experience with data engineering tools such as AWS Glue, AWS Data Pipeline, and DynamoDB.
- Strong understanding of cloud security, cost optimization, and AI governance.
Gadgeon Smart Systems
Salary: 10.0 - 15.0 Lacs P.A.