3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
CACI India, RMZ Nexity, Tower 30, 4th Floor, Survey No. 83/1, Knowledge City, Raidurg Village, Silpa Gram Craft Village, Madhapur, Serilingampalle (M), Hyderabad, Telangana 500081, India
Req #1097 | 02 May 2025

CACI International Inc is an American multinational professional services and information technology company headquartered in Northern Virginia. CACI provides expertise and technology to enterprise and mission customers in support of national security missions and government transformation for defense, intelligence, and civilian customers. CACI has approximately 23,000 employees worldwide.

Headquartered in London, CACI Ltd is a wholly owned subsidiary of CACI International Inc, a publicly listed company on the NYSE with annual revenue in excess of US $6.2bn. Founded in 2022, CACI India is an exciting, growing and progressive business unit of CACI Ltd. CACI Ltd currently has over 2,000 intelligent professionals and is now adding many more from our Hyderabad and Pune offices. Through a rigorous emphasis on quality, CACI India has grown considerably to become one of the UK's most well-respected technology centres.

About the Data Platform

The Data Platform will be built and managed "as a Product" to support a Data Mesh organization. It focuses on enabling decentralized management, processing, analysis and delivery of data, while enforcing corporate-wide federated governance of data and project environments across business domains. The goal is to empower multiple teams to create and manage high-integrity data and data products that are analytics- and AI-ready and consumed internally and externally.

What does a Data Infrastructure Engineer do?

A Data Infrastructure Engineer is responsible for developing, maintaining and monitoring the data platform infrastructure and operations. The infrastructure and pipelines you build will support data processing, data analytics, data science and data management across the CACI business.
The data platform infrastructure will conform to a zero-trust, least-privilege architecture, with strict adherence to data and infrastructure governance and control in a multi-account, multi-region AWS environment. You will use Infrastructure as Code and CI/CD to continuously improve, evolve and repair the platform, and you will design architectures and create reusable solutions that reflect the business needs.

Responsibilities Will Include
- Collaborating across CACI departments to develop and maintain the data platform
- Building infrastructure and data architectures in CloudFormation and SAM
- Designing and implementing data processing environments and integrations using AWS PaaS such as Glue, EMR, SageMaker, Redshift, Aurora and Snowflake
- Building data processing and analytics pipelines as code, using Python, SQL, PySpark, Spark, CloudFormation, Lambda, Step Functions and Apache Airflow
- Monitoring and reporting on the data platform's performance, usage and security
- Designing and applying security and access control architectures to secure sensitive data

You Will Have
- 3+ years of experience in a Data Engineering role
- Strong experience and knowledge of data architectures implemented in AWS using native AWS services such as S3, DataZone, Glue, EMR, SageMaker, Aurora and Redshift
- Experience administering databases and data platforms
- Good coding discipline in terms of style, structure, versioning, documentation and unit tests
- Strong proficiency in CloudFormation, Python and SQL
- Knowledge and experience of relational databases such as Postgres and Redshift
- Experience using Git for code versioning and lifecycle management
- Experience operating to Agile principles and ceremonies
- Hands-on experience with CI/CD tools such as GitLab
- Strong problem-solving skills and the ability to work independently or in a team environment
- Excellent communication and collaboration skills
- A keen eye for detail, and a passion for accuracy and correctness in numbers

Whilst not essential, the following skills would also be useful:
- Experience using Jira or other agile project management and issue tracking software
- Experience with Snowflake
- Experience with spatial data processing

More About The Opportunity

The Data Engineer role is an excellent opportunity, and CACI Services India rewards its staff well with a competitive salary and an impressive benefits package, which includes:
- Learning: budget for conferences, training courses and other materials
- Health benefits: family plan with 4 children and parents covered
- Future you: matched pension and health care package

We understand the importance of getting to know your colleagues. Company meetings are held every quarter, and a training/work brief weekend is held once a year, amongst many other social events.

CACI is an equal opportunities employer. We embrace diversity and are committed to a working environment where no one will be treated less favourably on the grounds of their sex, race, disability, sexual orientation, religion, belief or age. We have a Diversity & Inclusion Steering Group, and we always welcome new people with fresh perspectives from any background to join the group. An inclusive and equitable environment enables us to draw on expertise and unique experiences and bring out the best in each other. We champion diversity, inclusion and wellbeing, and we are supportive of veterans and people from a military background. We believe that by embracing diverse experiences and backgrounds, we can collaborate to create better outcomes for our people, our customers and our society.

Other details: Pay Type: Salary
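The role above calls for building data processing pipelines as code with Python, Lambda and Step Functions. As a purely illustrative sketch (the event shape, field names, and `normalise_record` helper are invented for this example, not taken from the posting), a Lambda handler that a Step Functions state might invoke to validate a batch of records could look like this:

```python
# Hypothetical example: a Lambda-style handler that cleans a batch of
# records and separates out the ones that fail validation. All names
# and the event format are assumptions for illustration.

def normalise_record(record: dict) -> dict:
    """Trim string fields and coerce the 'reading' field to float."""
    out = {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}
    out["reading"] = float(out["reading"])
    return out

def handler(event: dict, context=None) -> dict:
    """Validate and normalise records, collecting failures for a retry queue."""
    ok, failed = [], []
    for rec in event.get("records", []):
        try:
            ok.append(normalise_record(rec))
        except (KeyError, ValueError, TypeError):
            failed.append(rec)
    return {"processed": ok, "errors": failed}
```

Keeping such steps as small, stateless functions is what makes them easy to wire into a state machine and to unit-test in isolation.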
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About the Job

We are seeking a highly skilled AI/ML Engineer with expertise in AWS AI/ML services and a strong understanding of Generative AI using Amazon Bedrock. The ideal candidate will have experience in building, deploying, and optimizing AI/ML models on AWS, integrating LLMs into applications, and leveraging AWS services for scalable AI solutions.

Experience Required: 4+ years

Key Responsibilities
- Design, develop, and deploy AI/ML models on AWS, leveraging SageMaker, Bedrock, and related services.
- Build LLM-based applications using Amazon Bedrock and fine-tune models for specific use cases.
- Implement RAG (Retrieval-Augmented Generation) and integrate vector databases such as OpenSearch, Pinecone, or FAISS.
- Develop scalable, production-ready ML pipelines using AWS services (Lambda, Step Functions, S3, DynamoDB, etc.).
- Utilize Bedrock, SageMaker, and custom fine-tuned models to deliver business-driven AI solutions.
- Work with cross-functional teams to integrate ML models into real-world applications.
- Ensure AI solutions adhere to best practices for security, compliance, and cost optimization.
- Stay updated on the latest trends in GenAI, prompt engineering, and AI model optimization.

Required Skills
- Strong expertise in the AWS AI/ML stack: Amazon Bedrock, SageMaker, Lambda, Step Functions, S3, DynamoDB, etc.
- Experience with Generative AI models (GPT, Claude, Mistral, LLaMA, etc.) and fine-tuning techniques.
- Hands-on experience in Python, TensorFlow, PyTorch, or Hugging Face.
- Knowledge of vector databases and embedding models.
- Experience in building secure and scalable AI applications on AWS.
- Familiarity with MLOps practices, CI/CD for ML models, and cloud automation.
- Strong problem-solving skills and the ability to work in a fast-paced environment.

Good to Have
- Experience with LangChain, prompt engineering, and RAG techniques.
- Understanding of data governance, AI ethics, and responsible AI practices.
- Certification in AWS Machine Learning Specialty/Associate or relevant AI certifications.

Relevant Skills: vector databases, models, MLOps, ML, PyTorch, S3, Lambda, TensorFlow, CI/CD, Amazon
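The RAG responsibility above combines retrieval over a vector database with prompt construction. A minimal, self-contained sketch of the retrieval step is below; the toy three-dimensional "embeddings" and document names are invented for illustration, where a real system would use an embedding model and a store such as OpenSearch, Pinecone, or FAISS:

```python
import math

# Illustrative only: top-k retrieval by cosine similarity over tiny
# hand-made vectors, then prompt assembly. Document names and vectors
# are fabricated for this sketch.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

DOCS = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "warranty terms": [0.8, 0.2, 0.1],
}

def retrieve(query_vec, k=2):
    """Return the k document keys most similar to the query vector."""
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:k]

def build_prompt(question, query_vec):
    """Augment the question with retrieved context before calling an LLM."""
    context = "\n".join(f"- {d}" for d in retrieve(query_vec))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

The augmented prompt, rather than the bare question, is what would then be sent to a Bedrock-hosted model.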
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems, either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must-have skills: Python (Programming Language)
Good-to-have skills: AWS Administration
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS and be responsible for the creation of scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python Programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 5 years of experience in Python Programming
2. This position is based at our Hyderabad office
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; master's degree preferred)
Posted 1 week ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Responsibilities:
- Evaluate and source appropriate cloud infrastructure solutions for machine learning needs, ensuring cost-effectiveness and scalability based on project requirements.
- Automate and manage the deployment of machine learning models into production environments, ensuring version control for models and datasets using tools like Docker and Kubernetes.
- Set up monitoring tools to track model performance and data drift, conduct regular maintenance, and implement updates for production models.
- Work closely with data scientists, software engineers, and stakeholders to align on project goals, facilitate knowledge sharing, and communicate findings and updates to cross-functional teams.
- Design, implement, and maintain scalable ML infrastructure, optimizing cloud and on-premise resources for training and inference.
- Document ML processes, pipelines, and best practices while preparing reports on model performance, resource utilization, and system issues.
- Provide training and support for team members on MLOps tools and methodologies, and stay updated on industry trends and emerging technologies.
- Diagnose and resolve issues related to model performance, infrastructure, and data quality, implementing solutions to enhance model robustness and reliability.

Education, Technical Skills & Other Critical Requirements:
- 10+ years of relevant experience in AI/analytics product and solution delivery.
- Bachelor's/master's degree in information technology, computer science, engineering, or equivalent field experience.
- Proficiency in frameworks such as TensorFlow, PyTorch, or Scikit-learn.
- Strong skills in Python and/or R; familiarity with Java, Scala, or Go is a plus.
- Experience with cloud services such as AWS, Azure, or Google Cloud Platform, particularly their ML services (e.g., AWS SageMaker, Azure ML).
- CI/CD tools (e.g., Jenkins, GitLab CI), containerization (e.g., Docker), and orchestration (e.g., Kubernetes).
- Experience with databases (SQL and NoSQL), data pipelines, ETL processes, and ML pipeline orchestration (e.g., Airflow).
- Familiarity with monitoring and logging tools such as Prometheus, Grafana, or the ELK stack.
- Proficient in using Git for version control.
- Strong analytical and troubleshooting abilities to diagnose and resolve issues effectively.
- Good communication skills for working with cross-functional teams and conveying technical concepts to non-technical stakeholders.
- Ability to manage multiple projects and prioritize tasks in a fast-paced environment.
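The monitoring responsibility above includes tracking data drift in production. As a toy illustration only (production setups would use purpose-built monitors and Prometheus/Grafana alerting rather than a hand-rolled check, and the threshold here is an arbitrary choice), a simple drift test on one numeric feature might flag a live batch whose mean strays too far from the training baseline:

```python
import statistics

# Illustrative sketch: flag drift when the batch mean of a feature is
# more than `threshold` standard errors from the baseline mean. The
# 3-sigma default is an assumption for this example.

def drift_detected(baseline, batch, threshold=3.0):
    """Return True when the batch mean deviates beyond the threshold."""
    mu = statistics.fmean(baseline)
    sigma = statistics.stdev(baseline)
    standard_error = sigma / (len(batch) ** 0.5)
    z = abs(statistics.fmean(batch) - mu) / standard_error
    return z > threshold
```

A check like this would typically run on a schedule against each monitored feature, with alerts routed to the on-call MLOps engineer.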
Posted 1 week ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
🚨 We are Hiring 🚨
https://grhombustech.com/jobs/job-description-senior-test-automation-lead-playwright-ai-ml-focus/

Job Title: Senior Test Automation Lead – Playwright (AI/ML Focus)
Location: Hyderabad
Experience: 10-12 years
Job Type: Full-Time

Company Overview:
GRhombus Technologies Pvt Ltd is a pioneer in software solutions, especially test automation, cyber security, full-stack development, DevOps, Salesforce, performance testing and manual testing. GRhombus delivery centres in India are located in Hyderabad, Chennai, Bengaluru and Pune. In the Middle East, we are located in Dubai. Our partner offices are located in the USA and the Netherlands.

About the Role:
We are seeking a passionate and technically skilled Senior Test Automation Lead with deep experience in Playwright-based frameworks and a solid understanding of AI/ML-driven applications. In this role, you will lead the automation strategy and quality engineering practices for next-generation AI products that integrate large-scale machine learning models, data pipelines, and dynamic, intelligent UIs. You will define, architect, and implement scalable automation solutions across AI-enhanced features such as recommendation engines, conversational UIs, real-time analytics, and predictive workflows, ensuring both functional correctness and consistent intelligent behaviour.

Key Responsibilities:

Test Automation Framework Design & Implementation
- Design and implement robust, modular, and extensible Playwright automation frameworks using TypeScript/JavaScript.
- Define automation design patterns and utilities that can handle complex AI-driven UI behaviours (e.g., dynamic content, personalization, chat interfaces).
- Implement abstraction layers for easy test data handling, reusable components, and multi-browser/platform execution.

AI/ML-Specific Testing Strategy
- Partner with data scientists and ML engineers to understand model behaviours, inference workflows, and output formats.
- Develop strategies for testing non-deterministic model outputs (e.g., chat responses, classification labels) using tolerance ranges, confidence intervals, or golden datasets.
- Design tests to validate ML integration points: REST/gRPC API calls, feature flags, model versioning, and output accuracy.
- Include bias, fairness, and edge-case validations in test suites where applicable (e.g., fairness in recommendation engines or NLP sentiment analysis).

End-to-End Test Coverage
- Lead the implementation of end-to-end automation for:
  - Web interfaces (React, Angular, or other SPA frameworks)
  - Backend services (REST, GraphQL, WebSockets)
  - ML model integration endpoints (real-time inference APIs, batch pipelines)
- Build test utilities for mocking, stubbing, and simulating AI inputs and datasets.

CI/CD & Tooling Integration
- Integrate automation suites into CI/CD pipelines using GitHub Actions, Jenkins, GitLab CI, or similar.
- Configure parallel execution, containerized test environments (e.g., Docker), and test artifact management.
- Establish real-time dashboards and historical reporting using tools like Allure, ReportPortal, TestRail, or custom Grafana integrations.

Quality Engineering & Leadership
- Define KPIs and QA metrics for AI/ML product quality: functional accuracy, model regression rates, test coverage %, time-to-feedback, etc.
- Lead and mentor a team of automation and QA engineers across multiple projects.
- Act as the quality champion across the AI platform by influencing engineering, product, and data science teams on quality ownership and testing best practices.

Agile & Cross-Functional Collaboration
- Work in Agile/Scrum teams; participate in backlog grooming, sprint planning, and retrospectives.
- Collaborate across disciplines (frontend, backend, DevOps, MLOps, and product management) to ensure complete testability.
- Review feature specs, AI/ML model update notes, and data schemas for impact analysis.
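The responsibilities above mention testing non-deterministic model outputs against golden datasets with tolerance ranges. A minimal sketch of that pattern follows; the golden cases, scores, tolerance value, and the `fake_model` stand-in are all invented for illustration, where a real suite would call the actual inference endpoint:

```python
# Illustrative only: compare a classifier's outputs against a golden
# dataset, requiring an exact label match but allowing the confidence
# score to drift within a tolerance. The model here is a fake stand-in.

GOLDEN = [
    {"text": "great product", "label": "positive", "score": 0.92},
    {"text": "arrived broken", "label": "negative", "score": 0.88},
]

def fake_model(text):
    """Stand-in for a real inference endpoint (hypothetical outputs)."""
    table = {"great product": ("positive", 0.94), "arrived broken": ("negative", 0.90)}
    return table[text]

def check_against_golden(model, golden, score_tol=0.05):
    """Return the inputs whose label differs or whose score drifts past tolerance."""
    failures = []
    for case in golden:
        label, score = model(case["text"])
        if label != case["label"] or abs(score - case["score"]) > score_tol:
            failures.append(case["text"])
    return failures
```

Tightening or loosening `score_tol` is how such a suite trades off sensitivity to model regressions against flakiness from normal run-to-run variation.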
Required Skills and Qualifications:

Technical Skills:
- Strong hands-on expertise with Playwright (TypeScript/JavaScript).
- Experience building custom automation frameworks and utilities from scratch.
- Proficiency in testing AI/ML-integrated applications: inference endpoints, personalization engines, chatbots, or predictive dashboards.
- Solid knowledge of HTTP protocols and API testing (Postman, Supertest, RestAssured).
- Familiarity with MLOps and model lifecycle management (e.g., via MLflow, SageMaker, Vertex AI).
- Experience in testing data pipelines (ETL, streaming, batch), synthetic data generation, and test data versioning.

Domain Knowledge:
- Exposure to NLP, CV, recommendation engines, time-series forecasting, or tabular ML models.
- Understanding of key ML metrics (precision, recall, F1-score, AUC), model drift, and concept drift.
- Knowledge of bias/fairness auditing, especially in UI/UX contexts where AI decisions are shown to users.

Leadership & Communication:
- Proven experience leading QA/automation teams (4+ engineers).
- Strong documentation, code review, and stakeholder communication skills.
- Experience collaborating in Agile/SAFe environments with cross-functional teams.

Preferred Qualifications:
- Experience with AI explainability frameworks such as LIME, SHAP, or the What-If Tool.
- Familiarity with test data management platforms (e.g., Tonic.ai, Delphix) for ML training/inference data.
- Background in performance and load testing for AI systems using tools like Locust, JMeter, or k6.
- Experience with GraphQL, Kafka, or event-driven architecture testing.
- QA certifications (ISTQB, Certified Selenium Engineer) or cloud certifications (AWS, GCP, Azure).

Education:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related technical discipline.
- Certifications or formal training in Machine Learning, Data Science, or MLOps are a bonus.

Why Join Us?
At GRhombus, we are redefining quality assurance and software testing with cutting-edge methodologies and a commitment to innovation. As a Test Automation Lead, you will play a pivotal role in shaping the future of automated testing, optimizing frameworks, and driving efficiency across our engineering ecosystem. Be part of a workplace that values experimentation, learning, and professional growth, and contribute to an organisation where your ideas drive innovation and make a tangible impact.
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems, either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must-have skills: Python (Programming Language)
Good-to-have skills: AWS Architecture
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS and be responsible for the creation of scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python Programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 5 years of experience in Python Programming
2. This position is based at our Hyderabad office
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; master's degree preferred)
Posted 1 week ago
7.5 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems, either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS and be responsible for the creation of scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python Programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 5 years of experience in Python Programming
2. This position is based at our Hyderabad office
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; master's degree preferred)
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
India
Remote
Role: Data Science Developer
Location: Remote

Responsibilities:
- Develop and productionize cloud-based services and full-stack applications utilizing NLP solutions, including GenAI models.
- Implement and manage CI/CD pipelines to ensure efficient and reliable software delivery.
- Automate cloud infrastructure using Terraform.
- Write unit tests, integration tests and performance tests.
- Work in a team environment using agile practices.
- Support administration of the data science experimentation environment, including AWS SageMaker and Nvidia GPU servers.
- Monitor and optimize application performance and infrastructure costs.
- Collaborate with data scientists and other developers to integrate and deploy data science models into production environments.
- Educate others to improve coding standards, code quality, test coverage and documentation.
- Work closely with cross-functional teams to ensure seamless integration and operation of services.

What We're Looking For:

Basic Required Qualifications:
- 5-8 years of experience in software engineering.
- Proficiency in Python and JavaScript for full-stack development.
- Experience in writing and maintaining high-quality code, utilizing techniques like unit testing and code reviews.
- Strong understanding of object-oriented design and programming concepts.
- Strong experience with AWS cloud services, including EKS, Lambda, and S3.
- Knowledge of Docker containers and orchestration tools, including Kubernetes.
- Experience with monitoring, logging, and tracing tools (e.g., Datadog, Kibana, Grafana).
- Knowledge of message queues and event-driven architectures (e.g., AWS SQS, Kafka).
- Experience with CI/CD pipelines in Azure DevOps and GitHub Actions.

Additional Preferred Qualifications:
- Experience writing front-end web applications using JavaScript and React.
- Familiarity with infrastructure as code (IaC) using Terraform.
- Experience with Azure or GCP cloud services.
- Proficiency in C# or Java.
- Experience with SQL and NoSQL databases.
- Knowledge of machine learning concepts.
- Experience with large language models.
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS and be responsible for the creation of scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python Programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 3 years of experience in Python Programming
2. This position is based at our Hyderabad office
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; master's degree preferred)
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: Bachelor of Engineering in Electronics or any related stream

Summary: As a Sr. Full Stack Engineer, you will develop data-driven applications on AWS for the client and be responsible for the creation of scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. At least 5 years of experience in Python Programming, with web framework expertise (Django, Flask, or FastAPI)
2. Exposure to database technologies (SQL and NoSQL) and API development
3. Significant experience working with AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR) and Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
4. Exposure to Test-Driven Development (TDD)
5. Practices DevOps in software solutions and is well-versed in Agile methodologies
6. AWS certification is a plus
7. Well-developed analytical skills; rigorous but pragmatic, able to justify decisions with solid rationale

Additional Information:
1. The candidate should have a minimum of 5 years of experience in Python Programming
2. This position is based at our Hyderabad office
3. 15 years of full-time education is required (bachelor's degree in Computer Science, Software Engineering, or a related field)
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: Bachelor of Engineering in Electronics or any related stream

Summary: As an IoT Engineer with Python expertise, you will develop data-driven applications on AWS IoT for the client, creating scalable data pipelines and algorithms that process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Collaborate with data scientists and analysts to implement data processing pipelines
4. Participate in architecture discussions and contribute to technical decision-making
5. Ensure the scalability, reliability, and performance of Python applications on AWS
6. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. At least 3 years of experience in Python programming, including integration with AWS IoT Core
2. Exposure to database technologies (SQL and NoSQL) and API development
3. Significant experience with AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR) and Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
4. Exposure to Test-Driven Development (TDD)
5. Practices DevOps in software delivery and is well-versed in Agile methodologies
6. AWS certification is a plus
7. Well-developed analytical skills: rigorous but pragmatic, able to justify decisions with solid rationale

Additional Information:
1. The candidate should have a minimum of 3 years of experience in Python programming
2. This position is based at our Hyderabad office
3. 15 years of full-time education is required (bachelor's degree in Computer Science, Software Engineering, or a related field)
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS, creating scalable data pipelines and algorithms that process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 3 years of experience in Python programming
2. This position is based at our Hyderabad office
3. 15 years of full-time education is required (bachelor's degree in Computer Science or a related stream; master's degree preferred)
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: Bachelor of Engineering in Electronics or any related stream

Summary: As a Sr. Backend Engineer, you will develop data-driven applications on AWS for the client, creating scalable data pipelines and algorithms that process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. At least 3 years of experience in Python programming, with web framework expertise (Django, Flask, or FastAPI)
2. Exposure to database technologies (SQL and NoSQL) and API development
3. Significant experience with AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR) and Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
4. Exposure to Test-Driven Development (TDD)
5. Practices DevOps in software delivery and is well-versed in Agile methodologies
6. AWS certification is a plus
7. Well-developed analytical skills: rigorous but pragmatic, able to justify decisions with solid rationale

Additional Information:
1. The candidate should have a minimum of 3 years of experience in Python programming
2. This position is based at our Hyderabad office
3. 15 years of full-time education is required (bachelor's degree in Computer Science, Software Engineering, or a related field)
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS, creating scalable data pipelines and algorithms that process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 3 years of experience in Python programming
2. This position is based at our Hyderabad office
3. 15 years of full-time education is required (bachelor's degree in Computer Science or a related stream; master's degree preferred)
Posted 1 week ago
7.0 years
0 Lacs
India
On-site
WhizzHR is hiring a Media Solution Architect – AI/ML & Automation Focus

Role Summary: We are seeking a Media Solution Architect to lead the strategic design of AI-driven and automation-centric solutions across digital media operations. This role involves architecting intelligent, scalable systems that enhance efficiency across campaign setup, trafficking, reporting, QA, and billing processes. The ideal candidate will bring a strong blend of automation, AI/ML, and digital marketing expertise to drive innovation and operational excellence.

Key Responsibilities:
- Identify and assess opportunities to apply AI/ML and automation across media operations workflows (e.g., intelligent campaign setup, anomaly detection in QA, dynamic taxonomy validation).
- Design scalable, intelligent architectures using a combination of machine learning models, RPA, Python-based automation, and media APIs (e.g., Meta, DV360, YouTube).
- Develop or integrate machine learning models for use cases such as performance prediction, media mix modeling, and anomaly detection in reporting or billing.
- Ensure adherence to best practices in data governance, compliance, and security, particularly around AI system usage.
- Partner with business stakeholders to prioritize high-impact AI/automation use cases and define clear ROI and success metrics.
- Stay informed on emerging trends in AI/ML and translate innovations into actionable media solutions.

Ideal Profile:
- 7+ years of experience in automation, AI/ML, or data science, including 3+ years in marketing, ad tech, or digital media.
- Strong understanding of machine learning frameworks for predictive modeling, anomaly detection, and NLP-based insight generation.
- Proficiency in Python and libraries such as scikit-learn, TensorFlow, pandas, or PyTorch.
- Experience with cloud-based AI platforms (e.g., Google Vertex AI, Azure ML, AWS SageMaker) and media API integrations.
- Ability to architect AI-enhanced automations that improve forecasting, QA, and decision-making in media operations.
- Familiarity with RPA tools (e.g., UiPath, Automation Anywhere); AI-first automation experience is a plus.
- Demonstrated success in developing or deploying ML models for campaign optimization, fraud detection, or process intelligence.
- Familiarity with digital media ecosystems such as Google Ads, Meta, TikTok, DSPs, and ad servers.
- Excellent communication and stakeholder management skills, with the ability to translate technical solutions into business value.

Kindly share your resume at Hello@whizzhr.com
Posted 1 week ago
7.5 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: AWS Glue
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As part of a Data Transformation programme, you will join the Data Marketplace team. You will be responsible for the architecture and design of automated data management compliance validation, monitoring, and reporting through rule-based and AI-driven mechanisms, integrating with metadata repositories and governance tools for real-time policy enforcement. You will also deliver design specifications for real-time metadata integration, enhanced automation, audit logging, monitoring capabilities, and lifecycle management (including version control, decommissioning, and rollback). Experience implementing and adapting data management and data governance controls around Data Product implementations, preferably on AWS, is preferred; experience with AI is appreciated. Example skills: Data Architecture, Data Marketplace, Data Governance, Data Engineering, AWS DataZone, AWS SageMaker Unified Studio.

As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process. Your role will require a balance of technical expertise and leadership skills to drive project success and foster a collaborative team environment.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Facilitate knowledge-sharing sessions to enhance team capabilities
- Monitor project progress and implement necessary adjustments to meet deadlines

Professional & Technical Skills:
- Must-have skills: Proficiency in AWS Glue
- Strong understanding of data integration and ETL processes
- Experience with cloud computing platforms and services
- Familiarity with data warehousing concepts and best practices
- Ability to troubleshoot and optimize data workflows

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in AWS Glue
- This position is based in Pune
- 15 years of full-time education is required
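As a purely illustrative aside on the rule-based compliance validation this posting describes: the core idea can be sketched in plain Python as a list of named rules evaluated against a metadata record. Everything here (rule names, record fields) is hypothetical, not any governance tool's actual API.

```python
# Toy rule-based compliance check: each rule is a (name, predicate) pair,
# and validation returns the names of all rules a record violates.
RULES = [
    ("owner_assigned", lambda rec: bool(rec.get("owner"))),
    ("retention_set", lambda rec: rec.get("retention_days", 0) > 0),
]

def validate(record):
    """Return the names of all rules the record violates."""
    return [name for name, check in RULES if not check(record)]

# A record with an owner but no retention policy fails one rule.
print(validate({"owner": "data-team", "retention_days": 0}))
```

In a real platform these predicates would be driven by metadata from a catalog or governance service rather than hard-coded lambdas.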
Posted 1 week ago
20.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Staff AI Engineer - MLOps
Company: Rapid7
Team: AI Center of Excellence

Team Overview:
- Cross-functional team of Data Scientists and AI Engineers
- Mission: Leverage AI/ML to protect customer attack surfaces
- Partners with Detection and Response teams, including MDR
- Encourages creativity, collaboration, and research publication
- Uses 20+ years of threat analysis and a growing patent portfolio

Tech Stack:
- Cloud/Infra: AWS (SageMaker, Bedrock), EKS, Terraform
- Languages/Tools: Python, Jupyter, NumPy, Pandas, Scikit-learn
- ML Focus: Anomaly detection, unlabeled data

Role Summary:
- Build and deploy ML production systems
- Manage end-to-end data pipelines and ensure data quality
- Implement ML guardrails and robust monitoring
- Deploy web apps and REST APIs with strong data security
- Share knowledge, mentor engineers, collaborate cross-functionally
- Embrace agile, iterative development

Requirements:
- 8–12 years in Software Engineering (3+ in ML deployment on AWS)
- Strong in Python, Flask/FastAPI, API development
- Skilled in CI/CD, Docker, Kubernetes, MLOps, cloud AI tools
- Experience in data pre-processing, feature engineering, model monitoring
- Strong communication and documentation skills
- Collaborative mindset, growth-oriented problem-solving

Preferred Qualifications:
- Experience with Java
- Background in the security industry
- Familiarity with AI/ML model operations, LLM experimentation
- Knowledge of model risk management (drift monitoring, hyperparameter tuning, registries)

About Rapid7: Rapid7 is committed to securing the digital world through passion, collaboration, and innovation. With over 10,000 customers globally, it offers a dynamic, growth-focused workplace and tackles major cybersecurity challenges with diverse teams and a mission-driven approach.
Posted 1 week ago
5.0 years
0 Lacs
India
On-site
Machine Learning Engineer (Python, AWS)

We are seeking an experienced Machine Learning Engineer with 5+ years of hands-on experience in developing and deploying ML solutions. The ideal candidate will have strong Python programming skills and a proven track record working with AWS services for machine learning.

Responsibilities:
- Design, develop, and deploy scalable machine learning models
- Implement and optimize ML algorithms using Python
- Leverage AWS services (e.g., SageMaker, EC2, S3, Lambda) for ML model training, deployment, and monitoring
- Collaborate with data scientists and other engineers to bring ML solutions to production
- Ensure the performance, reliability, and scalability of ML systems

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 5+ years of professional experience as a Machine Learning Engineer
- Expertise in Python programming for machine learning
- Strong experience with AWS services for ML (SageMaker, EC2, S3, Lambda, etc.)
- Solid understanding of machine learning algorithms and principles
- Experience with MLOps practices is a plus
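As an illustrative aside on the kind of model-fitting work such roles involve: the simplest supervised model, a least-squares line fit, can be written in standard-library Python. This is a generic sketch, not tied to any employer's stack; in practice a library such as scikit-learn would be used.

```python
import statistics

def fit_line(xs, ys):
    """Fit y = a*x + b by ordinary least squares and return (a, b)."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var            # slope: covariance over variance of x
    b = my - a * mx          # intercept passes through the mean point
    return a, b

# Noisy samples around y = 2x: the fit recovers slope ~1.97, intercept ~0.15.
a, b = fit_line([1, 2, 3, 4, 5], [2.1, 4.0, 6.2, 8.1, 9.9])
print(round(a, 2), round(b, 2))
```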
Posted 1 week ago
2.0 - 5.0 years
0 Lacs
Delhi, India
Remote
Job Title: AI Engineer
Location: Remote
Employment Type: Full-time

About the Role: We are seeking a skilled and motivated AI Engineer to help us build intelligent, agentic systems that drive real-world impact. In this role, you will develop, deploy, and maintain AI models and pipelines, working with large language models (LLMs), vector databases, and orchestration frameworks like Langchain. You will collaborate across teams to build robust, scalable AI-driven solutions.

Key Responsibilities:
- Design and develop intelligent systems using LLMs, retrieval-augmented generation (RAG), and agentic frameworks
- Build and deploy AI pipelines using Langchain, vector stores, and custom tools
- Integrate models with production APIs and backend systems
- Monitor, fine-tune, and improve the performance of deployed AI systems
- Collaborate with data engineers, product managers, and UX designers to deliver AI-first user experiences
- Stay up to date with advancements in generative AI, LLMs, and orchestration frameworks

Required Qualifications:
- 2–5 years of experience in building and deploying machine learning or AI-based systems
- Hands-on experience with Langchain in building agent workflows or RAG pipelines
- Proficiency in Python and frameworks such as PyTorch, TensorFlow, or Scikit-learn
- Experience with cloud platforms (AWS, GCP, Azure) and containerization (Docker, Kubernetes)
- Strong understanding of prompt engineering, embeddings, and vector database operations (e.g., FAISS, Pinecone, Weaviate)
- Familiarity with MLOps tools such as MLflow, SageMaker, or Vertex AI

Preferred Qualifications:
- Experience with large language models (e.g., GPT, Claude, LLaMA) and GenAI platforms (e.g., OpenAI, Bedrock, Anthropic)
- Background in NLP, RAG architectures, or autonomous agents
- Experience in deploying AI applications via APIs and microservices
- Contributions to open-source Langchain or GenAI ecosystems

Why Join Us?
- Remote-first company working on frontier AI systems
- Opportunity to shape production-grade AI experiences used globally
- Dynamic, collaborative, and intellectually curious team
- Competitive compensation with fast growth potential
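For readers unfamiliar with the RAG pattern this posting mentions, the retrieval step can be sketched in plain Python. Here a bag-of-words cosine similarity stands in for real embeddings, and prompt assembly stands in for the LLM call; all function names are illustrative, not Langchain's or any vector database's actual API.

```python
from collections import Counter
import math

def vectorize(text):
    # Crude stand-in for an embedding model: term-frequency counts.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    # Rank documents by similarity to the query; keep the top k.
    qv = vectorize(query)
    ranked = sorted(docs, key=lambda d: cosine(qv, vectorize(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    # Augment the question with retrieved context before the (omitted) LLM call.
    context = "\n".join(retrieve(query, docs, k=1))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Vector databases store embeddings for similarity search.",
    "Kubernetes orchestrates containerized workloads.",
]
print(build_prompt("how do vector databases work", docs))
```

A production pipeline would swap `vectorize` for a real embedding model and `docs` for a vector store, but the retrieve-then-augment flow is the same.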
Posted 1 week ago
12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About the Role
We are seeking a Director of Software Engineering to lead our engineering team. This role requires a strategic and hands-on leader with deep expertise in Java and Amazon Web Services (AWS), with experience in modernizing platforms, cloud-native migrations, and hybrid strategies. The ideal candidate will have a strong product mindset, extensive experience in building scalable cloud-native applications, and the ability to drive engineering excellence in a fast-paced environment.

Key Responsibilities
• Technical Leadership: Define and implement best practices for Java-based architectures and scalable backend systems.
• Team Management: Lead, mentor, and grow a high-performing team of software engineers and engineering managers.
• Cloud & Infrastructure: Design, deploy, and optimize AWS-based solutions, leveraging services like EC2, Lambda, S3, RDS, DynamoDB.
• Performance & Scalability: Ensure high availability, security, and performance of distributed systems on AWS and in our data centers.
• APIs: Architect, design, and document RESTful APIs as a product for both internal and external customers.
• Agile Development: Foster an engineering culture of excellence with a focus on product delivery with quality and technological advantage.
• Technology Roadmap: Stay ahead of industry trends, identifying opportunities for modernization and innovation.
• Stakeholder Collaboration: Work closely with leadership, product, and operations teams to align engineering efforts with business goals.

Required Qualifications
• Experience: 12+ years in software engineering, with at least 5 years in a leadership role.
• Technical Expertise:
  - Strong background in Java, the JDK, and its ecosystem
  - Hands-on expertise in both data center and AWS architectures, deployments, and automation
  - Strong experience with SQL/NoSQL databases (Oracle, PostgreSQL, MySQL, DynamoDB)
  - Proficiency in RESTful APIs, event-driven architecture (Kafka, SNS/SQS), and service design
  - Strong grasp of security best practices, IAM roles, and compliance standards on AWS
• Leadership & Strategy: Proven track record of scaling engineering teams and aligning technology with business goals.
• Problem-Solving Mindset: Ability to diagnose complex technical issues and optimize outcomes.

Preferred Qualifications
• Experience in high-scale SaaS applications using Java and AWS.
• Knowledge of AI/ML services on AWS (SageMaker, Bedrock) and data engineering pipelines.
• Agile & DevOps: Experience implementing DevOps pipelines, CI/CD, and Infrastructure as Code (Terraform, CloudFormation).
• Background in fintech, e-commerce, or enterprise software is a plus.
Posted 1 week ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are seeking a highly driven Client Services Manager – Data Science Lead to lead and deliver end-to-end data science solutions for pharmaceutical clients. This is a consulting-focused role requiring hands-on technical expertise, leadership, and the ability to translate business problems into data science solutions.

Key Responsibilities:
- Lead a team of data scientists, analysts, and engineers on complex healthcare and pharma data projects.
- Partner with senior client stakeholders and contribute to technical solutioning and proposal discussions.
- Deliver full-lifecycle data science projects, including data cleaning, feature engineering, model development, validation, and deployment.
- Apply advanced ML/statistical techniques such as regression models, neural networks, and NLP (including BERT).
- Stay up to date on AI/ML advancements and guide the team on new approaches.
- Ensure models are scalable, optimized, and implemented within sustainable frameworks.

Preferred Experience & Skills:
- Strong programming skills in Python and SQL.
- Familiarity with relational and NoSQL databases such as Postgres and Redshift.
- Hands-on experience with AWS and Azure, and tools like SageMaker and Athena.
- Experience working with healthcare data (e.g., HEOR, RWE, claims data).
- Exposure to large language models (LLMs) like ChatGPT, and prompt engineering.
- Knowledge of visualization tools such as Tableau or Power BI is a plus.
- Previous experience in a consulting environment and building data science products is highly desirable.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field from a reputed institution
- Experience: 7–10 years
- Preferred Industry: Healthcare, Pharma, Life Sciences
Posted 1 week ago
0.0 - 18.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
Indore, Madhya Pradesh, India

Qualification:
- BTech degree in computer science, engineering, or a related field of study, or 12+ years of related work experience
- 7+ years of design and implementation experience with large-scale, data-centric distributed applications
- Professional experience architecting and operating cloud-based solutions, with a good understanding of core disciplines like compute, networking, storage, security, and databases
- Good understanding of data engineering concepts like storage, governance, cataloging, data quality, and data modeling
- Good understanding of architecture patterns like data lake, data lakehouse, and data mesh
- Good understanding of data warehousing concepts; hands-on experience with tools like Hive, Redshift, Snowflake, and Teradata
- Experience migrating or transforming legacy customer solutions to the cloud
- Experience working with services like AWS EMR, Glue, DMS, Kinesis, RDS, Redshift, DynamoDB, DocumentDB, SNS, SQS, Lambda, EKS, and DataZone
- Thorough understanding of Big Data ecosystem technologies like Hadoop, Spark, Hive, and HBase, and other competent tools and technologies
- Understanding of designing analytical solutions leveraging AWS cognitive services like Textract, Comprehend, and Rekognition in combination with SageMaker is good to have
- Experience working with modern development workflows, such as git, continuous integration/continuous deployment pipelines, static code analysis tooling, infrastructure-as-code, and more
- Experience with a programming or scripting language: Python/Java/Scala
- AWS Professional/Specialty certification or relevant cloud expertise

Skills Required: AWS, Big Data, Spark, Technical Architecture

Role:
- Drive innovation within the Data Engineering domain by designing reusable and reliable accelerators, blueprints, and libraries
- Capable of leading a technology team, inculcating an innovative mindset, and enabling fast-paced deliveries
- Able to adapt to new technologies, learn quickly, and manage high ambiguity
- Work with business stakeholders and attend/drive architectural, design, and status calls with multiple stakeholders
- Exhibit good presentation skills, with a high degree of comfort speaking with executives, IT management, and developers
- Drive technology/software sales or pre-sales consulting discussions
- Ensure end-to-end ownership of all assigned tasks
- Ensure high-quality software development with complete documentation and traceability
- Fulfil organizational responsibilities (sharing knowledge and experience with other teams/groups)
- Conduct technical trainings/sessions and write whitepapers, case studies, blogs, etc.

Experience: 10 to 18 years
Job Reference Number: 12895
Posted 1 week ago
0.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Noida, Uttar Pradesh, India; Bangalore, Karnataka, India; Gurugram, Haryana, India; Indore, Madhya Pradesh, India; Pune, Maharashtra, India; Hyderabad, Telangana, India

Qualification:
- Strong experience in Python
- 2+ years' experience working on feature/data pipelines using PySpark
- Understanding of and experience with data science
- Exposure to AWS cloud services such as SageMaker, Bedrock, and Kendra
- Experience with machine learning model lifecycle management tools, and an understanding of MLOps principles and best practices
- Experience with statistical models, e.g., multinomial logistic regression
- Technical architecture, design, deployment, and operational-level knowledge
- Exploratory Data Analysis
- Knowledge of model building, hyperparameter tuning, and model performance metrics
- Statistics knowledge (probability distributions, hypothesis testing)
- Time series modelling, forecasting, image/video analytics, and Natural Language Processing (NLP)

Good to have:
- Experience researching and applying large language and Generative AI models
- Experience with LangChain, LlamaIndex, foundation model tuning, data augmentation, and performance evaluation frameworks
- Able to provide analytical expertise in the process of model development, refining, and implementation across a variety of analytics problems
- Knowledge of Docker and Kubernetes

Skills Required: Machine Learning, Natural Language Processing, AWS SageMaker, Python

Role:
- Generate actionable insights for business improvements
- Understand business requirements
- Write clean, efficient, and reusable code following best practices
- Troubleshoot and debug applications to ensure optimal performance
- Write unit test cases
- Collaborate with cross-functional teams to define and deliver new features
- Use-case derivation and solution creation from structured/unstructured data
- Actively drive a culture of knowledge-building and sharing within the team
- Experience applying theoretical models in an applied environment
- MLOps, data pipelines, data engineering
- Statistics knowledge (probability distributions, hypothesis testing)

Experience: 4 to 5 years
Job Reference Number: 13027
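As a small illustrative aside on the hypothesis-testing knowledge this posting asks for: a two-sided one-sample z-test fits in a few lines of standard-library Python. This is a generic textbook sketch (it assumes a known population standard deviation), not anything specific to the role.

```python
import math

def z_test_p_value(sample_mean, pop_mean, pop_sd, n):
    """Two-sided p-value for a one-sample z-test with known population sd."""
    z = (sample_mean - pop_mean) / (pop_sd / math.sqrt(n))
    # Standard normal CDF at |z|, via the error function.
    cdf = 0.5 * (1 + math.erf(abs(z) / math.sqrt(2)))
    return 2 * (1 - cdf)

# Sample mean 5.2 vs hypothesized 5.0, sd 0.5, n=100 -> z = 4.0,
# so the null is rejected at any conventional significance level.
p = z_test_p_value(sample_mean=5.2, pop_mean=5.0, pop_sd=0.5, n=100)
print(round(p, 4))
```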
Posted 1 week ago
0.0 - 3.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Noida, Uttar Pradesh, India; Gurugram, Haryana, India; Indore, Madhya Pradesh, India; Bengaluru, Karnataka, India; Pune, Maharashtra, India; Hyderabad, Telangana, India

Qualification:
- 2–4 years of experience in designing, developing, and training machine learning models using diverse algorithms and techniques, including deep learning, NLP, computer vision, and time series analysis
- Proven ability to optimize model performance through experimentation with architectures, hyperparameter tuning, and evaluation metrics
- Hands-on experience processing large datasets, including preprocessing, feature engineering, and data augmentation
- Demonstrated ability to deploy trained AI/ML models to production using frameworks like Kubernetes and cloud-based ML platforms
- Solid understanding of monitoring and logging for performance tracking
- Experience exploring new AI/ML methodologies and documenting the development and deployment lifecycle, including performance metrics
- Familiarity with AWS services, particularly SageMaker, is expected
- Excellent communication, presentation, and interpersonal skills are essential

Good to have:
- Knowledge of GenAI (LangChain, foundation model tuning, and GPT-3)
- AWS Certified Machine Learning – Specialty certification

Skills Required: Machine Learning, LangChain, AWS SageMaker, Python

Role:
- Explore different models and transform data science prototypes for a given problem
- Analyze datasets; perform data enrichment, feature engineering, and model training
- Able to write code using Python, Pandas, and DataFrame APIs
- Develop machine learning applications according to requirements
- Perform statistical analysis and fine-tuning using test results
- Collaborate with data engineers and architects to implement and deploy scalable solutions
- Encourage continuous innovation and out-of-the-box thinking
- Experience applying theoretical models in an applied environment

Experience: 1 to 3 years
Job Reference Number: 13047
Posted 1 week ago