Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
What we're looking for:
● At least 5 years of experience designing and building AI applications for customers and deploying them into production.
● Software engineering experience building secure, scalable, and performant applications for customers.
● Experience with AI-based document extraction, conversational AI, vision AI, NLP, or generative AI.
● Design, develop, and operationalize existing ML models by fine-tuning and personalizing them.
● Evaluate machine learning models and perform the necessary tuning.
● Develop prompts that instruct LLMs to generate relevant and accurate responses.
● Collaborate with data scientists and engineers to analyze and preprocess datasets for prompt development, including data cleaning, transformation, and augmentation.
● Conduct thorough analysis to evaluate LLM responses and iteratively modify prompts to improve LLM performance.
● Hands-on customer experience with RAG solutions or fine-tuning of LLM models.
● Build and deploy scalable machine learning pipelines on GCP or an equivalent cloud platform, involving data warehouses, machine learning platforms, dashboards, or CRM tools.
● Experience with the end-to-end workflow, including but not limited to data cleaning, exploratory data analysis, handling outliers and class imbalances, analyzing data distributions (univariate, bivariate, multivariate), transforming numerical and categorical data into features, feature selection, model selection, model training, and deployment.
● Proven experience building and deploying machine learning models in production environments for real-life applications.
● Good understanding of natural language processing, computer vision, or other deep learning techniques.
● Expertise in Python, NumPy, Pandas, and various ML libraries (e.g., XGBoost, TensorFlow, PyTorch, scikit-learn, LangChain).
● Familiarity with Google Cloud (or another cloud platform) and its machine learning services.
● Excellent communication, collaboration, and problem-solving skills.

Good to Have:
● Google Cloud Certified Professional Machine Learning Engineer or TensorFlow Developer certification, or equivalent.
● Experience working with one or more public cloud platforms, namely GCP, AWS, or Azure.
● Experience with AutoML and vision techniques.
● Master's degree in statistics, machine learning, or a related field.
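The prompt-development and RAG items above can be illustrated with a minimal, dependency-free sketch of assembling a grounded prompt from retrieved passages. The function name, template wording, and character budget are illustrative assumptions, not a description of any specific production stack:

```python
def build_prompt(question, passages, max_chars=2000):
    """Assemble a grounded (RAG-style) prompt from retrieved passages.

    Passages are numbered so the model can cite them, and the context
    is truncated to a character budget to respect the LLM's input limit.
    """
    context_lines, used = [], 0
    for i, passage in enumerate(passages, start=1):
        line = f"[{i}] {passage}"
        if used + len(line) > max_chars:
            break  # stop adding context once the budget is exhausted
        context_lines.append(line)
        used += len(line)
    return (
        "Answer the question using ONLY the numbered context below. "
        "Cite passages like [1]. If the context is insufficient, say so.\n\n"
        "Context:\n" + "\n".join(context_lines) + "\n\n"
        f"Question: {question}\nAnswer:"
    )

print(build_prompt(
    "When was the warehouse migrated?",
    ["The warehouse was migrated to BigQuery in 2022.",
     "Dashboards refresh hourly."],
))
```

Iterating on answer quality then becomes iterating on this template and on which passages are retrieved, which is the evaluate-and-modify loop the posting describes.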
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
We are seeking a highly motivated and experienced ML Engineer/Data Scientist to join our growing ML/GenAI team. You will play a key role in designing, developing, and productionizing ML applications by evaluating, training, and/or fine-tuning models, and a crucial role in developing GenAI-based solutions for our customers. As a senior member of the team, you will take ownership of projects, collaborating with engineers and stakeholders to ensure successful project delivery.

What we're looking for:
● At least 3 years of experience designing and building AI applications for customers and deploying them into production.
● At least 5 years of software engineering experience building secure, scalable, and performant applications for customers.
● Experience with AI-based document extraction, conversational AI, vision AI, NLP, or generative AI.
● Design, develop, and operationalize existing ML models by fine-tuning and personalizing them.
● Evaluate machine learning models and perform the necessary tuning.
● Develop prompts that instruct LLMs to generate relevant and accurate responses.
● Collaborate with data scientists and engineers to analyze and preprocess datasets for prompt development, including data cleaning, transformation, and augmentation.
● Conduct thorough analysis to evaluate LLM responses and iteratively modify prompts to improve LLM performance.
● Hands-on customer experience with RAG solutions or fine-tuning of LLM models.
● Build and deploy scalable machine learning pipelines on GCP or an equivalent cloud platform, involving data warehouses, machine learning platforms, dashboards, or CRM tools.
● Experience with the end-to-end workflow, including but not limited to data cleaning, exploratory data analysis, handling outliers and class imbalances, analyzing data distributions (univariate, bivariate, multivariate), transforming numerical and categorical data into features, feature selection, model selection, model training, and deployment.
● Proven experience building and deploying machine learning models in production environments for real-life applications.
● Good understanding of natural language processing, computer vision, or other deep learning techniques.
● Expertise in Python, NumPy, Pandas, and various ML libraries (e.g., XGBoost, TensorFlow, PyTorch, scikit-learn, LangChain).
● Familiarity with Google Cloud (or another cloud platform) and its machine learning services.
● Excellent communication, collaboration, and problem-solving skills.

Good to Have:
● Google Cloud Certified Professional Machine Learning Engineer or TensorFlow Developer certification, or equivalent.
● Experience working with one or more public cloud platforms, namely GCP, AWS, or Azure.
● Experience with Amazon Lex, Google Dialogflow CX, or Microsoft Copilot Studio for CCAI agent workflows.
● Experience with AutoML and vision techniques.
● Master's degree in statistics, machine learning, or a related field.
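As a concrete instance of the outlier-handling step in the list above, the classic Tukey IQR fence can be written in a few lines of standard-library Python; the 1.5 multiplier is the conventional default, not a requirement of the role:

```python
import statistics

def iqr_outliers(values, k=1.5):
    """Return values outside Tukey's fences [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, _, q3 = statistics.quantiles(values, n=4)  # the three quartiles
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < low or v > high]

print(iqr_outliers([1, 2, 3, 4, 5, 6, 7, 8, 9, 100]))  # [100]
```

In practice the same univariate screen feeds the feature-engineering steps the posting lists: flagged rows are capped, dropped, or transformed before model training.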
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Egen is a fast-growing and entrepreneurial company with a data-first mindset. We bring together the best engineering talent working with the most advanced technology platforms, including Google Cloud and Salesforce, to help clients drive action and impact through data and insights. We are committed to being a place where the best people choose to work so they can apply their engineering and technology expertise to envision what is next for how data and platforms can change the world for the better. We are dedicated to learning, thrive on solving tough problems, and continually innovate to achieve fast, effective results.

Job Summary
We are seeking a talented and passionate Python Developer to join our dynamic team. In this role, you will be instrumental in designing, developing, and deploying scalable and efficient applications on the Google Cloud Platform. You will have the opportunity to work on exciting projects and contribute to the growth and innovation of our products and services. You will also provide mentorship to other engineers and engage with clients to understand their needs and deliver effective solutions.

Responsibilities:
● Design, develop, and maintain robust and scalable applications using Python.
● Build and consume RESTful APIs using FastAPI.
● Deploy and manage applications on the Google Cloud Platform (GCP).
● Collaborate effectively with cross-functional teams, including product managers, designers, and other engineers.
● Write clean, well-documented, and testable code.
● Participate in code reviews to ensure code quality and adherence to best practices.
● Troubleshoot and debug issues in development and production environments.
● Create clear and effective documentation.
● Stay up to date with the latest industry trends and technologies.
● Assist junior team members.

Required Skills and Experience:
● 5+ years of relevant work experience in software development using Python.
● Solid understanding of and practical experience with the FastAPI framework.
● Hands-on experience with the Google Cloud Platform (GCP) and its core services.
● Experience with CI/CD pipelines.
● Ability to write and execute unit test cases.
● Ability to discuss and propose architectural changes.
● Knowledge of security best practices.
● Strong problem-solving and analytical skills.
● Excellent communication and collaboration abilities.
● Bachelor's degree in Computer Science or a related field (or equivalent practical experience).

Optional Skills (a Plus):
● Experience with a front-end framework such as Angular, React, or Vue.js.
● Familiarity with DevOps principles and practices.
● Experience with infrastructure-as-code tools like Terraform.
● Knowledge of containerization technologies such as Docker and Kubernetes.
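The "testable code" and "unit test cases" items above can be made concrete with a small, self-contained example. The `chunked` helper is hypothetical, chosen only to show the shape of a unittest suite:

```python
import unittest

def chunked(items, size):
    """Split a list into consecutive chunks of at most `size` items."""
    if size < 1:
        raise ValueError("size must be >= 1")
    return [items[i:i + size] for i in range(0, len(items), size)]

class ChunkedTests(unittest.TestCase):
    def test_even_split(self):
        self.assertEqual(chunked([1, 2, 3, 4], 2), [[1, 2], [3, 4]])

    def test_remainder_goes_in_last_chunk(self):
        self.assertEqual(chunked([1, 2, 3], 2), [[1, 2], [3]])

    def test_invalid_size_raises(self):
        with self.assertRaises(ValueError):
            chunked([1], 0)

if __name__ == "__main__":
    # Run the suite explicitly rather than via unittest.main(), so the
    # snippet also works when executed inside another process.
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(ChunkedTests)
    unittest.TextTestRunner(verbosity=2).run(suite)
```

In day-to-day work the same suite would run under `python -m unittest` in the CI/CD pipeline mentioned in the requirements.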
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Experience Level: 4 to 6 years of relevant IT experience

Job Overview: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.

Key Responsibilities:
● Design, develop, test, and maintain scalable ETL data pipelines using Python.
● Work extensively with Google Cloud Platform (GCP) services such as:
○ Dataflow for real-time and batch data processing
○ Cloud Functions for lightweight serverless compute
○ BigQuery for data warehousing and analytics
○ Cloud Composer for orchestration of data workflows (based on Apache Airflow)
○ Google Cloud Storage (GCS) for managing data at scale
○ IAM for access control and security
○ Cloud Run for containerized applications
● Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
● Implement and enforce data quality checks, validation rules, and monitoring.
● Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
● Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
● Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
● Document pipeline designs, data flow diagrams, and operational support procedures.

Required Skills:
● 4–6 years of hands-on experience in Python for backend or data engineering projects.
● Strong understanding of and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
● Solid understanding of data pipeline architecture, data integration, and transformation techniques.
● Experience working with version control systems like GitHub and knowledge of CI/CD practices.
● Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).

Good to Have (Optional Skills):
● Experience working with the Snowflake cloud data platform.
● Hands-on knowledge of Databricks for big data processing and analytics.
● Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.

Additional Details:
● Excellent problem-solving and analytical skills.
● Strong communication skills and ability to collaborate in a team environment.
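The SQL extraction-and-validation bullet can be sketched with the standard library's sqlite3 module standing in for SQL Server, Oracle, or PostgreSQL; the `customers` table and its two quality rules (no NULL emails, no duplicate ids) are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory stand-in for a real database
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [(1, "a@example.com"), (2, None), (2, "b@example.com")],
)

# Data-quality validation queries: count the rows violating each rule.
null_emails = conn.execute(
    "SELECT COUNT(*) FROM customers WHERE email IS NULL"
).fetchone()[0]
duplicate_ids = conn.execute(
    "SELECT COUNT(*) FROM "
    "(SELECT id FROM customers GROUP BY id HAVING COUNT(*) > 1)"
).fetchone()[0]

print(null_emails, duplicate_ids)  # 1 1
```

In a pipeline, non-zero counts would fail the load or route offending rows to a quarantine table, which is one common way to "implement and enforce data quality checks".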
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Job Summary: We are seeking a highly skilled and experienced Senior Infrastructure Engineer to join our dynamic team. The ideal candidate will be passionate about building and maintaining complex systems, with a holistic approach to architecture. You will play a key role in designing, implementing, and managing cloud infrastructure, ensuring scalability, availability, security, and optimal performance. You will also provide mentorship to other engineers and engage with clients to understand their needs and deliver effective solutions.

Responsibilities:
● Design, architect, and implement scalable, highly available, and secure infrastructure solutions, primarily on Amazon Web Services (AWS).
● Develop and maintain Infrastructure as Code (IaC) using Terraform or the AWS CDK for enterprise-scale maintainability and repeatability.
● Implement robust access control via IAM roles and policy orchestration, ensuring least privilege and auditability across multi-environment deployments.
● Contribute to secure, scalable identity and access patterns, including OAuth2-based authorization flows and dynamic IAM role mapping across environments.
● Support deployment of infrastructure Lambda functions.
● Troubleshoot issues and collaborate with cloud vendors on managed-service reliability and roadmap alignment.
● Use Kubernetes deployment tools such as Helm and Kustomize in combination with GitOps tools such as ArgoCD for container orchestration and management.
● Design and implement CI/CD pipelines using platforms like GitHub, GitLab, Bitbucket, Cloud Build, Harness, etc., with a focus on rolling, canary, and blue/green deployments.
● Ensure auditability and observability of pipeline states.
● Implement security best practices and audit and compliance requirements within the infrastructure.
● Engage with clients to understand their technical and business requirements, and provide tailored solutions.
● If needed, lead agile ceremonies and project planning, including developing agile boards and backlogs with support from our Service Delivery Leads.
● Troubleshoot and resolve complex infrastructure issues.

Qualifications:
● 6+ years of experience in infrastructure engineering or a similar role.
● Extensive experience with Amazon Web Services (AWS).
● Proven ability to architect for scale, availability, and high-performance workloads.
● Deep knowledge of Infrastructure as Code (IaC) with Terraform.
● Strong experience with Kubernetes and related tools (Helm, Kustomize, ArgoCD).
● Solid understanding of Git, branching models, CI/CD pipelines, and deployment strategies.
● Experience with security, audit, and compliance best practices.
● Excellent problem-solving and analytical skills.
● Strong communication and interpersonal skills, with the ability to engage both technical and non-technical stakeholders.
● Experience in technical mentoring, team-forming, and fostering self-organization and ownership.
● Experience with client relationship management and project planning.

Certifications and Additional Skills:
● Relevant certifications (e.g., Certified Kubernetes Administrator, AWS Certified Machine Learning Engineer - Associate, AWS Certified Data Engineer - Associate, AWS Certified Developer - Associate).
● Software development experience (e.g., Terraform, Python).
● Experience with or exposure to machine learning infrastructure.

Education:
● B.Tech/BE in computer science or a related field, or equivalent experience.
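The rolling/canary/blue-green item above is a deployment strategy rather than code, but the traffic-shifting schedule behind a progressive canary can be sketched in a few lines; the 5% starting weight and doubling factor are illustrative defaults, not a standard:

```python
def canary_steps(start=5, factor=2, cap=100):
    """Traffic percentages for a progressive canary rollout.

    The new version starts at `start`% of traffic and its share is
    multiplied by `factor` at each promotion until it reaches `cap`%.
    """
    steps, pct = [], start
    while pct < cap:
        steps.append(pct)
        pct *= factor
    steps.append(cap)  # final promotion: all traffic on the new version
    return steps

print(canary_steps())  # [5, 10, 20, 40, 80, 100]
```

A pipeline pauses at each step, checks error-rate and latency metrics, and either promotes to the next percentage or rolls back; this is the loop that GitOps-style rollout tooling automates.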
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Job Summary: We are seeking a highly skilled and experienced Lead Infrastructure Engineer to join our dynamic team. The ideal candidate will be passionate about building and maintaining complex systems, with a holistic approach to architecting infrastructure that survives and thrives in production. You will play a key role in designing, implementing, and managing cloud infrastructure, ensuring scalability, availability, security, and optimal performance versus spend. You will also provide technical leadership and mentorship to other engineers and engage with clients to understand their needs and deliver effective solutions.

Responsibilities:
● Design, architect, and implement scalable, highly available, and secure infrastructure solutions, primarily on Amazon Web Services (AWS).
● Develop and maintain Infrastructure as Code (IaC) using Terraform or the AWS CDK for enterprise-scale maintainability and repeatability.
● Implement robust access control via IAM roles and policy orchestration, ensuring least privilege and auditability across multi-environment deployments.
● Contribute to secure, scalable identity and access patterns, including OAuth2-based authorization flows and dynamic IAM role mapping across environments.
● Support deployment of infrastructure Lambda functions.
● Troubleshoot issues and collaborate with cloud vendors on managed-service reliability and roadmap alignment.
● Use Kubernetes deployment tools such as Helm and Kustomize in combination with GitOps tools such as ArgoCD for container orchestration and management.
● Design and implement CI/CD pipelines using platforms like GitHub, GitLab, Bitbucket, Cloud Build, Harness, etc., with a focus on rolling, canary, and blue/green deployments.
● Ensure auditability and observability of pipeline states.
● Implement security best practices and audit and compliance requirements within the infrastructure.
● Provide technical leadership, mentorship, and training to engineering staff.
● Engage with clients to understand their technical and business requirements, and provide tailored solutions.
● If needed, lead agile ceremonies and project planning, including developing agile boards and backlogs with support from our Service Delivery Leads.
● Troubleshoot and resolve complex infrastructure issues.
● Potentially participate in pre-sales activities and provide technical expertise to sales teams.

Qualifications:
● 10+ years of experience in infrastructure engineering or a similar role.
● Extensive experience with Amazon Web Services (AWS).
● Proven ability to architect for scale, availability, and high-performance workloads.
● Ability to plan and execute zero-disruption migrations.
● Experience with enterprise IAM and familiarity with authentication technologies such as OAuth2 and OIDC.
● Deep knowledge of Infrastructure as Code (IaC) with Terraform and/or the AWS CDK.
● Strong experience with Kubernetes and related tools (Helm, Kustomize, ArgoCD).
● Solid understanding of Git, branching models, CI/CD pipelines, and deployment strategies.
● Experience with security, audit, and compliance best practices.
● Excellent problem-solving and analytical skills.
● Strong communication and interpersonal skills, with the ability to engage both technical and non-technical stakeholders.
● Experience in technical leadership, mentoring, team-forming, and fostering self-organization and ownership.
● Experience with client relationship management and project planning.

Certifications and Additional Skills:
● Relevant certifications (e.g., Certified Kubernetes Administrator, AWS Certified Solutions Architect - Professional, AWS Certified DevOps Engineer - Professional).
● Software development experience (e.g., Terraform, Python).
● Experience with machine learning infrastructure.

Education:
● B.Tech/BE in computer science or a related field, or equivalent experience.
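The least-privilege IAM item can be made concrete with a toy evaluator that mimics the broad shape of IAM policy logic (an explicit Deny overrides any Allow; everything else is implicitly denied). This is a deliberately simplified model: real AWS evaluation also involves resources, conditions, wildcards, and multiple policy types:

```python
def is_allowed(policies, action):
    """Toy IAM-style check: explicit Deny wins, then any Allow, else deny."""
    allowed = False
    for policy in policies:
        for stmt in policy["Statement"]:
            if action in stmt["Action"]:
                if stmt["Effect"] == "Deny":
                    return False  # an explicit deny always wins
                allowed = True
    return allowed  # implicit deny unless some statement allowed the action

role_policies = [{"Statement": [
    {"Effect": "Allow", "Action": ["s3:GetObject", "s3:ListBucket"]},
    {"Effect": "Deny", "Action": ["s3:DeleteObject"]},
]}]

print(is_allowed(role_policies, "s3:GetObject"))     # True
print(is_allowed(role_policies, "s3:DeleteObject"))  # False
print(is_allowed(role_policies, "s3:PutObject"))     # False (implicit deny)
```

Deny-by-default is exactly what "least privilege" asks for: a role can do only what a statement explicitly grants.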
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Egen is seeking a proactive and versatile Technical Project Manager to lead multiple internal IT Applications teams. The IT Applications Technical Project Manager is responsible for managing the critical applications, systems, and tools used by 500+ consultants delivering 100+ concurrent AI and Data Analytics professional services projects. This role requires the ability to act as a Product Owner, interfacing with business owners, understanding business use cases, and defining appropriate upgrades and new features; as a Technical Project Manager, providing technical oversight, directing software design and development, and guiding technical direction; and as a Scrum Master, planning and managing multiple teams implementing biweekly sprints and releases to production.

Key Responsibilities:

Product Ownership:
● Understand how critical business applications function to meet business needs.
● Work closely with internal stakeholders and business owners to understand their application needs, translate them into clear and actionable user stories, define acceptance criteria, and drive innovation and continuous improvement.
● Design new processes within existing business applications to meet key business priorities.

Scrum Master:
● Lead multiple scrum teams using agile best practices in requirements gathering, user story documentation, quality assurance, and task lifecycle management.
● Plan biweekly sprints to build key functionality to meet business needs.
● Facilitate Scrum ceremonies (e.g., daily stand-ups, sprint planning, retrospectives) to ensure maximum team effectiveness.
● Manage and prioritize application backlogs, including bug fixes, enhancement requests, and operational improvements, ensuring they align with business needs and user impact.

Project Management:
● Create and manage project plans, resource allocation, timelines, and deliverables.
● Oversee project planning, execution, and delivery for longer-term system upgrades and implementations.
● Manage multiple concurrent teams delivering features to multiple business owners.
● Identify and communicate potential risks and issues, develop mitigation strategies, and manage them to closure.
● Create status reports as needed.

Stakeholder Communication:
● Coordinate with different departments (e.g., Operations, Finance, HR, Delivery) to communicate progress, manage expectations, and gather feedback.
● Communicate project status and issues to stakeholders.
● Develop and communicate short- and long-term roadmaps.
● Manage executive prioritization of the backlog.
● Coordinate business users in developing test suites and executing UAT of candidate releases.

Resource Management:
● Allocate resources to projects and balance workload among team members.

Helpdesk Support:
● Oversee the IT helpdesk process and ensure SLAs are met.

Reporting and Analytics:
● Define and measure performance KPIs.
● Create and distribute regular reports on project status, resource utilization, financial performance, etc.
● Analyze data to identify trends, issues, and opportunities for improvement.
● Provide insights and recommendations to senior management.

Process Improvement:
● Identify areas for process improvement within the PSA system.
● Implement best practices and new tools to enhance efficiency.
● Train team members on new processes and tools.

Required Skills and Qualifications:
● Project Management: Strong project management skills and experience with project management tools (e.g., Asana, Smartsheet, Jira, MS Project).
● Agile Development: Strong experience planning and managing agile sprints, defining user stories, managing team capacity, and managing backlogs.
● Analytical Skills: Ability to analyze data, generate reports, and provide actionable insights.
● Technical Proficiency: Experience managing Salesforce implementations; familiarity with PSA software (e.g., Mavenlink, NetSuite OpenAir), FinancialForce PSA (Certinia) preferred; experience using reporting and analytics tools (Looker, Tableau, CRMA); experience with Google Cloud Platform is a plus.
● Communication: Excellent verbal and written communication skills.
● Organizational Skills: Strong organizational skills and attention to detail.
● Problem-Solving: Ability to identify issues and develop effective solutions.
● Collaboration: Ability to work effectively in a team environment and collaborate with various stakeholders.
● Adaptability: Ability to adapt to changing priorities and manage multiple tasks simultaneously.
● Global Collaboration: Ability to work synchronously and asynchronously with stakeholders around the globe.
Hyderabad, Telangana, India
None Not disclosed
On-site
Full Time
We are seeking a highly motivated and experienced ML Engineer/Data Scientist to join our growing ML/GenAI team. You will play a key role in designing, developing, and productionizing ML applications by evaluating, training, and fine-tuning models, and in building Gen AI based solutions for our customers. As a senior member of the team, you will take ownership of projects, collaborating with engineers and stakeholders to ensure successful project delivery.

What we're looking for:
- At least 3 years of experience designing and building AI applications for customers and deploying them into production.
- At least 5 years of software engineering experience building secure, scalable, and performant applications for customers.
- Experience with document extraction using AI, Conversational AI, Vision AI, NLP, or Gen AI.
- Design, develop, and operationalize existing ML models by fine-tuning and personalizing them.
- Evaluate machine learning models and perform the necessary tuning.
- Develop prompts that instruct LLMs to generate relevant and accurate responses.
- Collaborate with data scientists and engineers to analyze and preprocess datasets for prompt development, including data cleaning, transformation, and augmentation.
- Conduct thorough analysis of LLM responses and iteratively modify prompts to improve LLM performance.
- Hands-on customer experience with RAG solutions or fine-tuning of LLM models.
- Build and deploy scalable machine learning pipelines on GCP or an equivalent cloud platform, involving data warehouses, machine learning platforms, dashboards, or CRM tools.
- Experience with the end-to-end workflow, including but not limited to data cleaning, exploratory data analysis, dealing with outliers, handling imbalances, analyzing data distributions (univariate, bivariate, multivariate), transforming numerical and categorical data into features, feature selection, model selection, model training, and deployment.
- Proven experience building and deploying machine learning models in production environments for real-life applications.
- Good understanding of natural language processing, computer vision, or other deep learning techniques.
- Expertise in Python, NumPy, Pandas, and various ML libraries (e.g., XGBoost, TensorFlow, PyTorch, scikit-learn, LangChain).
- Familiarity with Google Cloud or another cloud platform and its machine learning services.
- Excellent communication, collaboration, and problem-solving skills.

Good to have:
- Google Cloud Certified Professional Machine Learning Engineer or TensorFlow Developer certification, or equivalent.
- Experience working with one or more public cloud platforms, namely GCP, AWS, or Azure.
- Experience with Amazon Lex, Google Dialogflow CX, or Microsoft Copilot Studio for CCAI agent workflows.
- Experience with AutoML and vision techniques.
- Master's degree in statistics, machine learning, or related fields.
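The RAG experience asked for above follows a retrieve-then-prompt pattern; as a rough orientation, here is a minimal, dependency-free Python sketch using a toy bag-of-words retriever. Real systems would use embedding models and a vector store, and all names and documents below are illustrative assumptions, not any particular product's API.

```python
import math
from collections import Counter

def bow(text):
    # Bag-of-words term frequencies: a toy stand-in for embeddings.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    # Rank documents by similarity to the query and keep the top k.
    q = bow(query)
    return sorted(docs, key=lambda d: cosine(q, bow(d)), reverse=True)[:k]

def build_prompt(query, docs):
    # Ground the LLM's answer in the retrieved context.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

docs = [
    "Refunds are processed within 5 business days.",
    "Shipping is free on orders over $50.",
    "Support is available 24/7 via chat.",
]
prompt = build_prompt("How long do refunds take?", docs)
```

The same shape carries over when the retriever is a vector database and the prompt is sent to a hosted LLM; only the two ends change.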
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Egen is a fast-growing and entrepreneurial company with a data-first mindset. We bring together the best engineering talent working with the most advanced technology platforms, including Google Cloud and Salesforce, to help clients drive action and impact through data and insights. We are committed to being a place where the best people choose to work so they can apply their engineering and technology expertise to envision what is next for how data and platforms can change the world for the better. We are dedicated to learning, thrive on solving tough problems, and continually innovate to achieve fast, effective results.

Job Summary: We are seeking a talented and passionate Python Developer to join our dynamic team. In this role, you will be instrumental in designing, developing, and deploying scalable and efficient applications on the Google Cloud Platform. You will have the opportunity to work on exciting projects and contribute to the growth and innovation of our products and services. You will also provide mentorship to other engineers and engage with clients to understand their needs and deliver effective solutions.
Responsibilities:
- Design, develop, and maintain robust and scalable applications using Python.
- Build and consume RESTful APIs using FastAPI.
- Deploy and manage applications on the Google Cloud Platform (GCP).
- Collaborate effectively with cross-functional teams, including product managers, designers, and other engineers.
- Write clean, well-documented, and testable code.
- Participate in code reviews to ensure code quality and adherence to best practices.
- Troubleshoot and debug issues in development and production environments.
- Create clear and effective documentation.
- Stay up to date with the latest industry trends and technologies.
- Assist junior team members.

Required Skills and Experience:
- 5+ years of relevant work experience in software development using Python.
- Solid understanding and practical experience with the FastAPI framework.
- Hands-on experience with the Google Cloud Platform (GCP) and its core services.
- Experience with CI/CD pipelines.
- Ability to write and execute unit test cases.
- Able to discuss and propose architectural changes.
- Knowledge of security best practices.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Bachelor's degree in Computer Science or a related field (or equivalent practical experience).

Optional Skills (a plus):
- Experience with a front-end framework such as Angular, React, or Vue.js.
- Familiarity with DevOps principles and practices.
- Experience with infrastructure-as-code tools like Terraform.
- Knowledge of containerization technologies such as Docker and Kubernetes.
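The "testable code" and "unit test cases" expectations above amount to keeping handler logic separable from the framework. A hedged, dependency-free sketch of that idea follows; FastAPI and Pydantic are deliberately left out so it runs standalone, and the resource name and fields are illustrative assumptions. In a real service, the methods below would back `@app.post` / `@app.get` path operations.

```python
from dataclasses import dataclass, field

@dataclass
class ItemStore:
    # In-memory stand-in for a real database layer, so the logic is unit-testable.
    items: dict = field(default_factory=dict)
    next_id: int = 1

    def create(self, name: str, price: float) -> dict:
        # Validate input the way a Pydantic request model would in FastAPI.
        if not name or price < 0:
            raise ValueError("name must be non-empty and price non-negative")
        item = {"id": self.next_id, "name": name, "price": price}
        self.items[self.next_id] = item
        self.next_id += 1
        return item

    def get(self, item_id: int) -> dict:
        # Mirrors a GET /items/{item_id} handler; a KeyError would map to HTTP 404.
        return self.items[item_id]

store = ItemStore()
created = store.create("widget", 9.99)
fetched = store.get(created["id"])
```

Because the store knows nothing about HTTP, the same tests cover it whether it sits behind FastAPI, a CLI, or a batch job.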
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Job Summary: We are seeking a highly skilled and experienced Senior Cloud Infrastructure Engineer (GCP) to join our dynamic team. The ideal candidate is passionate about building and maintaining complex systems, with a holistic approach to architecting resilient infrastructure that thrives in production. You will be responsible for designing, implementing, and managing cloud infrastructure with a strong focus on scalability, availability, security, and cost optimization. Additionally, you will provide technical leadership and mentorship to engineers while engaging with clients to understand their requirements and deliver effective solutions.

Responsibilities:
- Design, architect, and implement scalable, highly available, and secure infrastructure solutions, primarily on Google Cloud Platform (GCP).
- Develop and maintain Infrastructure as Code (IaC) using Terraform for enterprise-scale maintainability and repeatability.
- Utilize Kubernetes deployment tools such as Helm and Kustomize, along with GitOps tools like ArgoCD, for container orchestration and management.
- Design and implement robust CI/CD pipelines using platforms like GitHub, GitLab, Bitbucket, Cloud Build, Harness, etc., with a focus on rolling deployments, canary releases, and blue/green deployments.
- Ensure pipeline auditability and observability throughout the deployment process.
- Implement security best practices and ensure compliance with audit requirements across the infrastructure.
- Provide technical leadership, mentorship, and training to engineering staff.
- Collaborate with clients to understand technical and business needs and provide tailored infrastructure solutions.
- When required, lead Agile ceremonies and project planning efforts, including backlog creation and board management, in collaboration with Service Delivery Leads.
- Troubleshoot and resolve complex infrastructure issues promptly.
- Potentially participate in pre-sales activities, offering technical expertise to support the sales team.
Qualifications:
- 5+ years of experience in an Infrastructure Engineer, DevOps, or similar role.
- Extensive hands-on experience with Google Cloud Platform (GCP).
- Proven ability to architect systems for scalability, high availability, and performance.
- Experience executing zero-downtime migrations.
- Deep expertise in Terraform and Infrastructure as Code practices.
- Strong experience with Kubernetes and ecosystem tools (Helm, Kustomize, ArgoCD).
- Solid understanding of Git workflows, branching strategies, and CI/CD pipeline automation.
- Experience implementing security, audit, and compliance standards within infrastructure.
- Excellent analytical and problem-solving skills.
- Strong communication skills with the ability to engage both technical and non-technical stakeholders.
- Demonstrated leadership in mentoring teams and fostering ownership and self-organization.
- Experience with client engagement, project planning, and delivery management.

Certifications (preferred):
- Google Cloud Certified – Professional Cloud Architect
- Certified Kubernetes Administrator (CKA)
- Google Cloud Networking or Security certifications
- Additional certifications in related areas are a plus.

Bonus experience:
- Software development experience using Terraform, Python, or similar scripting languages.
- Experience with machine learning infrastructure or MLOps tools.

Education: Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.
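The rolling, canary, and blue/green strategies mentioned in this role all reduce to a traffic-shifting decision loop. As a hedged illustration only, here is a dependency-free Python sketch of canary promotion logic; the step sizes and the 1% error budget are invented for the example and are not any tool's defaults.

```python
STEPS = [5, 25, 50, 100]   # percent of traffic sent to the canary, in order
ERROR_BUDGET = 0.01        # abort if the canary's error rate exceeds 1%

def next_traffic_split(current_pct, canary_error_rate):
    """Return the next canary traffic percentage, or 0 to roll back."""
    if canary_error_rate > ERROR_BUDGET:
        return 0               # roll back: send all traffic to the stable version
    for step in STEPS:
        if step > current_pct:
            return step        # promote the canary to the next step
    return 100                 # already fully promoted

# A healthy canary walks up the steps; an unhealthy one rolls back.
split = next_traffic_split(current_pct=5, canary_error_rate=0.002)
```

In practice a delivery tool (e.g., a pipeline stage in Harness or an ArgoCD rollout) evaluates metrics like this between steps; the sketch only shows the shape of the decision.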
Hyderabad, Telangana
Not disclosed
On-site
Full Time
We are looking for an experienced professional with at least 8 years in the industry. The ideal candidate should have proven expertise working with any cloud technology (AWS/Azure/GCP). It is essential for the candidate to have an ownership mindset and be a leader who takes initiative, leads by example, and drives projects and teams toward success. Prior experience managing and leading teams is also required. If you thrive in a dynamic environment and are ready to take on leadership roles, we encourage you to apply. Required Experience: - Minimum of a Bachelor's Degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering, or a related field. - Experience working with and a strong understanding of object-oriented programming and cloud technologies. - End-to-end experience delivering production-ready code with Java 8, Spring Boot, Spring Data, and API libraries. - Familiarity with web application standards and end-to-end implementation with React/Angular/Vue and REST APIs. - Strong experience with unit and integration testing of Spring Boot APIs. - Strong understanding and production experience of RESTful APIs and microservice architecture. - Strong understanding of SQL and NoSQL databases, and experience writing abstraction layers to communicate with them. - Strong understanding and production experience working with Docker container environments and cloud-based CI/CD processes. - Cloud environments: any of AWS, GCP, or Azure. Nice to haves (but not required): - Exposure to JavaScript frameworks and complex web application workflows. - Strong understanding and production experience working with Kafka.
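The "abstraction layers to communicate with the databases" requirement above describes the repository pattern. The posting's stack is Java/Spring, but the idea is language-agnostic; this is a hedged sketch in Python using the stdlib sqlite3 module, with an illustrative table and fields (in Spring this role would be filled by Spring Data repositories).

```python
import sqlite3

class UserRepository:
    # Thin abstraction layer: callers never write SQL themselves.
    def __init__(self, conn):
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, email TEXT UNIQUE)"
        )

    def add(self, email: str) -> int:
        # Parameterized query avoids SQL injection.
        cur = self.conn.execute("INSERT INTO users (email) VALUES (?)", (email,))
        self.conn.commit()
        return cur.lastrowid

    def find_by_email(self, email: str):
        row = self.conn.execute(
            "SELECT id, email FROM users WHERE email = ?", (email,)
        ).fetchone()
        return {"id": row[0], "email": row[1]} if row else None

repo = UserRepository(sqlite3.connect(":memory:"))
uid = repo.add("a@example.com")
found = repo.find_by_email("a@example.com")
```

Swapping SQLite for PostgreSQL or a NoSQL store then only changes the repository's internals, not its callers, which is the point of the abstraction layer.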
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
Job Overview: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.

Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively with Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer (based on Apache Airflow) for orchestration of data workflows
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.

Required Skills:
- 4–6 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding of and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, and Cloud Composer).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).

Good to Have (Optional Skills):
- Experience working with the Snowflake cloud data platform.
- Hands-on knowledge of Databricks for big data processing and analytics.
- Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
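The transformation-and-validation step of an ETL pipeline like the one described above can be sketched in plain Python. This is a hedged, dependency-free illustration; the field names and rules are invented for the example, and in practice the extract and load ends would be GCS, BigQuery, or a relational source rather than in-memory lists.

```python
def transform(rows):
    """Clean raw records: drop rows that fail validation, normalize fields."""
    clean = []
    for row in rows:
        # Data-quality rules: required id present, amount present.
        if not row.get("id") or row.get("amount") is None:
            continue
        # Amount must parse as a number; otherwise the row is rejected.
        try:
            amount = round(float(row["amount"]), 2)
        except (TypeError, ValueError):
            continue
        clean.append({"id": str(row["id"]).strip(), "amount": amount})
    return clean

raw = [
    {"id": " 42 ", "amount": "10.50"},
    {"id": None, "amount": "3.50"},         # dropped: missing id
    {"id": "7", "amount": "not-a-number"},  # dropped: invalid amount
]
loaded = transform(raw)
```

In a Dataflow or Composer pipeline, a function like `transform` would sit inside a `ParDo` or task, with the rejected rows typically routed to a dead-letter destination instead of being silently dropped.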
Hyderabad, Telangana, India
None Not disclosed
On-site
Full Time
Job Summary: We are seeking a highly skilled and experienced Lead Infrastructure Engineer to join our dynamic team. The ideal candidate will be passionate about building and maintaining complex systems, with a holistic approach to architecture. You will play a key role in designing, implementing, and managing cloud infrastructure, ensuring scalability, availability, security, and optimal performance. You will also provide technical leadership and mentorship to other engineers, and engage with clients to understand their needs and deliver effective solutions.

Responsibilities:
- Design, architect, and implement scalable, highly available, and secure infrastructure solutions, primarily on Google Cloud Platform (GCP).
- Develop and maintain Infrastructure as Code (IaC) using Terraform for enterprise-scale deployments.
- Utilize Kubernetes deployment tools such as Helm and Kustomize for container orchestration and management.
- Design and implement CI/CD pipelines using platforms like GitHub, GitLab, Bitbucket, Cloud Build, Harness, etc., with a focus on rolling, canary, and blue/green deployments.
- Ensure auditability and observability of pipeline states.
- Implement security best practices, audit, and compliance requirements within the infrastructure.
- Provide technical leadership, mentorship, and training to engineering staff.
- Engage with clients to understand their technical and business requirements, and provide tailored solutions.
- If needed, lead Agile ceremonies and project planning, including developing Agile boards and backlogs.
- Troubleshoot and resolve complex infrastructure issues.
- Potentially participate in pre-sales activities and provide technical expertise to sales teams.

Qualifications:
- 5+ years of experience in an Infrastructure Engineer or similar role.
- Extensive experience with Google Cloud Platform (GCP) and/or Amazon Web Services (AWS).
- Proven ability to architect for scale, availability, and high-performance workloads.
- Deep knowledge of Infrastructure as Code (IaC) with Terraform.
- Strong experience with Kubernetes and related tools (Helm, Kustomize).
- Solid understanding of CI/CD pipelines and deployment strategies.
- Experience with security, audit, and compliance best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills, with the ability to engage with both technical and non-technical stakeholders.
- Experience in technical leadership and mentoring.
- Experience with client relationship management and project planning.

Certifications (preferred):
- Certified Kubernetes Administrator (CKA)
- Google Cloud Certified – Professional Cloud Architect
- Google Cloud Networking or Security certifications

Bonus experience:
- Software development experience (e.g., Terraform, Python).
- Experience with machine learning infrastructure.

Education: Bachelor's degree in Computer Science, a related field, or equivalent experience.