6.0 - 10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Summary

Position Summary

AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms.
- Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions.
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing As-a-Service offerings for continuous insights and improvements.

Job Title: Senior Generative AI Developer/Team Lead

Job Summary: We are looking for a Generative AI Team Lead with hands-on experience designing, developing, and deploying AI and Generative AI models that generate high-quality content such as text, images, and chatbot responses. The ideal candidate will have expertise in deep learning, natural language processing, and computer vision.

Key Responsibilities:
- Lead and deliver large-scale AI/Gen AI projects across multiple industries and domains.
- Liaise with on-site and client teams to understand business problem statements and project requirements.
- Lead a team of Data Engineers, ML/AI Engineers, Prompt Engineers, and other Data & AI professionals to deliver projects from inception to implementation.
- Brainstorm, build, and improve AI/Gen AI models developed by the team, and identify scope for model improvements and best practices.
- Assist and participate in pre-sales, client pursuits, and proposals.
- Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members.

Qualifications:
- 6-10 years of relevant experience in Generative AI, Deep Learning, or NLP.
- Bachelor's or master's degree in a quantitative field.
- Led a 3-5-member team on multiple end-to-end AI/GenAI projects.
- Excellent communication and client/stakeholder management skills.
- Strong hands-on experience with programming languages such as Python, CUDA, and SQL, and frameworks such as TensorFlow, PyTorch, and Keras.
- Hands-on experience with leading LLMs such as OpenAI GPT-3.5/4, Google Gemini, AWS Bedrock, LLaMA 3, and Mistral, along with RAG and agentic workflows.
- Well versed in GANs and the Transformer architecture, familiar with diffusion models, and up to date with new research and progress in Generative AI.
- Follows research papers and can comprehend, innovate on, and present the best approaches and solutions for Generative AI components.
- Knowledge of hyperscaler offerings (NVIDIA, AWS, Azure, GCP, Oracle) and Gen AI tools (Copilot, Vertex AI).
- Knowledge of vector databases and Neo4j or other relevant graph databases.
- Familiarity with Docker containerization, Git, etc.
- AI/Cloud certification from a premier institute is preferred.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, is a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 303629
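The qualifications above name RAG (Retrieval-Augmented Generation) workflows. For context, the retrieval step can be sketched in a toy form: rank candidate documents against the query and prepend the best match to the prompt. This is an illustrative sketch only, using bag-of-words counts in place of a learned embedding model; all names and documents here are made up, and production systems would use a real embedding model and a vector database.

```python
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; real RAG systems use learned embedding models.
    return Counter(t.strip(".,?!") for t in text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "Diffusion models generate images by iteratively denoising noise.",
    "Transformers use self-attention to model token dependencies.",
    "GANs pit a generator against a discriminator.",
]
question = "How does attention work in transformers?"
context = retrieve(question, docs)
# The retrieved context is prepended to the prompt sent to the LLM.
prompt = f"Context: {context[0]}\n\nQuestion: {question}"
```

The same shape carries over when the count vectors are swapped for model embeddings and the linear scan for an indexed vector search.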
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Senior Generative AI Developer/Team Lead (same role and description as the Gurugram listing above). Requisition code: 303629
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Senior Generative AI Developer/Team Lead (same role and description as the Gurugram listing above). Requisition code: 303629
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Senior Generative AI Developer/Team Lead (same role and description as the Gurugram listing above). Requisition code: 303629
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Senior Generative AI Developer/Team Lead (same role and description as the Gurugram listing above). Requisition code: 303629
Posted 3 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
STAND 8 provides end-to-end IT solutions to enterprise partners across the United States, with offices in Los Angeles, New York, New Jersey, Atlanta, and more, as well as internationally in Mexico and India.

We are seeking a Senior AI Engineer / Data Engineer to join our engineering team and help build the future of AI-powered business solutions. In this role, you'll develop intelligent systems that leverage advanced large language models (LLMs), real-time AI interactions, and cutting-edge retrieval architectures. Your work will directly contribute to products that are reshaping how businesses operate, particularly in recruitment, data extraction, and intelligent decision-making. This is an exciting opportunity for someone who thrives on building production-grade AI systems and working across the full stack of modern AI technologies.

Responsibilities
- Design, build, and optimize AI-powered systems using multi-modal architectures (text, voice, visual).
- Integrate and deploy LLM APIs from providers such as OpenAI, Anthropic, and AWS Bedrock.
- Build and maintain RAG (Retrieval-Augmented Generation) systems with hybrid search, re-ranking, and knowledge graphs.
- Develop real-time AI features using streaming analytics and voice interaction tools (e.g., ElevenLabs).
- Build APIs and pipelines using FastAPI or similar frameworks to support AI workflows.
- Process and analyze unstructured documents with layout and semantic understanding.
- Implement predictive models that power intelligent business recommendations.
- Deploy and maintain scalable solutions using AWS services (EC2, S3, RDS, Lambda, Bedrock, etc.).
- Use Docker for containerization and manage CI/CD workflows and version control via Git.
- Debug, monitor, and optimize performance for large-scale data pipelines.
- Collaborate cross-functionally with product, data, and engineering teams.

Qualifications
- 5+ years of experience in AI/ML or data engineering with Python in production environments.
- Hands-on experience with LLM APIs and frameworks such as OpenAI, Anthropic, Bedrock, or LangChain.
- Production experience with vector databases such as pgvector, Weaviate, FAISS, or Pinecone.
- Strong understanding of NLP, document extraction, and text processing.
- Proficiency in AWS cloud services, including Bedrock, EC2, S3, Lambda, and monitoring tools.
- Experience with FastAPI or similar frameworks for building AI/ML APIs.
- Familiarity with embedding models, prompt engineering, and RAG systems.
- Knowledge of asynchronous programming for high-throughput pipelines.
- Experience with Docker, Git workflows, CI/CD pipelines, and testing best practices.

Preferred
- Background in HRTech or ATS integrations (e.g., Greenhouse, Workday, Bullhorn).
- Experience working with knowledge graphs (e.g., Neo4j) for semantic relationships.
- Real-time AI systems (e.g., WebRTC, OpenAI Realtime API) and voice AI tools (e.g., ElevenLabs).
- Advanced Python development skills using design patterns and clean architecture.
- Large-scale data processing experience (1-2M+ records) with cost-optimization techniques for LLMs.
- Event-driven architecture experience using AWS SQS, SNS, or EventBridge.
- Hands-on experience with fine-tuning, evaluating, and deploying foundation models.
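The qualifications above call for asynchronous programming for high-throughput pipelines. A common pattern is to fan out many I/O-bound calls (e.g., LLM API requests) concurrently while capping in-flight requests with a semaphore to respect provider rate limits. This is a minimal stdlib-only sketch; `call_model` is a hypothetical stand-in for a real API call, not any specific provider's client.

```python
import asyncio

async def call_model(record: str, sem: asyncio.Semaphore) -> str:
    # Hypothetical stand-in for an LLM API call; the semaphore caps
    # concurrent in-flight requests so the pipeline stays within rate limits.
    async with sem:
        await asyncio.sleep(0.01)  # simulate network latency
        return record.upper()      # simulate the model's response

async def run_pipeline(records: list[str], max_concurrency: int = 8) -> list[str]:
    sem = asyncio.Semaphore(max_concurrency)
    # gather preserves input order even though calls complete out of order.
    return await asyncio.gather(*(call_model(r, sem) for r in records))

results = asyncio.run(run_pipeline(["alpha", "beta", "gamma"]))
```

With a bounded semaphore, throughput scales with `max_concurrency` rather than one request at a time, which is the point of the asynchronous-pipeline requirement.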
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms. Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions. Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements. Job Title: Senior Generative AI Developer/Team Lead Job Summary: We are looking for a Generative AI Team Lead with hands-on experience to design, develop and deploy AI and Generative AI models that generate high quality content, such as text, images, chatbots, etc. The Ideal candidate will have expertise in deep learning, natural language processing, and computer vision. Key Responsibilities: Lead and deliver large-scale AI/Gen AI projects across multiple industries and domains. Liaison with on-site and client teams to understand various business problem statements and project requirements. Lead a team of Data Engineers, ML/AI Engineers, Prompt engineers, and other Data & AI professionals to deliver projects from inception to implementation. 
• Brainstorm, build, and improve AI/Gen AI models developed by the team, and identify scope for model improvements and best practices.
• Assist and participate in pre-sales, client pursuits, and proposals.
• Drive a human-led culture of inclusion and diversity by caring deeply for all team members.
Qualifications:
• 6-10 years of relevant experience in Generative AI, Deep Learning, or NLP.
• Bachelor's or master's degree in a quantitative field.
• Has led a 3-5-member team on multiple end-to-end AI/GenAI projects.
• Excellent communication and client/stakeholder management skills.
• Strong hands-on experience with programming languages such as Python, CUDA, and SQL, and frameworks such as TensorFlow, PyTorch, and Keras.
• Hands-on experience with leading LLMs such as OpenAI GPT-3.5/4, Google Gemini, AWS Bedrock, Llama 3, and Mistral, along with RAG and agentic workflows.
• Well versed in GANs and the Transformer architecture, familiar with diffusion models, and up to date with new research and progress in the field of Gen AI.
• Follows research papers, and can comprehend, innovate on, and present the best approaches and solutions for Generative AI components.
• Knowledge of hyperscaler offerings (NVIDIA, AWS, Azure, GCP, Oracle) and Gen AI tools (Copilot, Vertex AI).
• Knowledge of vector databases and of Neo4j or other relevant graph databases.
• Familiar with Docker containerization, Git, etc.
• AI/Cloud certification from a premier institute is preferred.
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively.
It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 303629
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
Mandatory Skills:
• Machine Learning & Deep Learning: strong understanding of LLM architectures, transformers, and fine-tuning techniques.
• MLOps & DevOps: experience with CI/CD pipelines, model deployment, and monitoring.
• Vector Databases: knowledge of storing and retrieving embeddings efficiently.
• Prompt Engineering: ability to craft effective prompts for optimal model responses.
• Retrieval-Augmented Generation (RAG): implementing techniques to enhance LLM outputs with external knowledge.
• Cloud Platforms: familiarity with AWS, Azure, or GCP for scalable deployments.
• Containerization & Orchestration: using Docker and Kubernetes for model deployment.
• Observability & Monitoring: tracking model performance, latency, and drift.
• Security & Ethics: ensuring responsible AI practices and data privacy.
• Programming Skills: strong proficiency in Python, SQL, and API development.
Preferred Skills:
• Knowledge of Open-Source LLMs: familiarity with models like LLaMA, Falcon, and Mistral.
• Fine-Tuning & Optimization: experience with LoRA, quantization, and efficient training techniques.
• LLM Frameworks: hands-on experience with Hugging Face, LangChain, or OpenAI APIs.
• Data Engineering: understanding of ETL pipelines and data preprocessing.
• Microservices Architecture: ability to design scalable AI-powered applications.
• Explainability & Interpretability: techniques for understanding and debugging LLM outputs.
• Graph Databases: knowledge of Neo4j or similar technologies for complex data relationships.
• Collaboration & Communication: ability to work with cross-functional teams and explain technical concepts clearly.
Mandatory Skills: LLM Ops
Experience: 3-5 Years
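The prompt-engineering and RAG skills above usually meet in one concrete artifact: assembling retrieved passages into a grounded prompt under a context budget. A minimal, hedged sketch (the wording of the instruction and the character budget are illustrative choices, not a standard):

```python
def build_rag_prompt(question, passages, max_chars=1000):
    """Assemble a grounded prompt from retrieved passages, truncating to a budget."""
    context, used = [], 0
    for p in passages:
        if used + len(p) > max_chars:
            break  # stop before exceeding the context budget
        context.append(p)
        used += len(p)
    joined = "\n---\n".join(context)
    return (
        "Answer using ONLY the context below. If the answer is not present, say so.\n\n"
        f"Context:\n{joined}\n\nQuestion: {question}\nAnswer:"
    )

print(build_rag_prompt("What is RAG?", ["passage one", "passage two"]))
```

Real systems layer token-based (not character-based) budgeting, passage reranking, and citation markers on top of this basic shape.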
Posted 3 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Coimbatore
Work from Office
Experience: 3+ years
Requirements:
• Implement the tools and processes required for a data processing pipeline.
• Primary responsibilities include implementing ETL (extract, transform, and load) pipelines and monitoring/maintaining data pipeline performance.
• Familiar with key architectures, including the Lambda and Kappa architectures.
• Broad experience across a set of data stores (e.g., SQL, PostgreSQL, MongoDB, InfluxDB, Neo4j, Redis).
• Experience with messaging systems (e.g., Apache Kafka, RabbitMQ).
• Experience with data processing engines (e.g., BigQuery, Redshift, Snowflake).
• Experience working on solutions that collect, process, store, and analyze huge volumes of data, fast-moving data, or data with significant schema variability.
Technologies: SQL, PostgreSQL, BigQuery, Redshift, Snowflake, MariaDB, MongoDB, InfluxDB, Neo4j, Redis, Elasticsearch
Programming/Scripting Languages: Scala, SQL, Python, MapReduce
Cloud: AWS / Azure / GCP
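The ETL responsibilities above follow a standard extract/transform/load shape. A toy sketch using Python generators (the sensor records and the in-memory "store" are invented stand-ins for a real source and sink such as Kafka and PostgreSQL):

```python
def extract(rows):
    # Stand-in for reading from a source system (file, queue, API)
    yield from rows

def transform(rows):
    for r in rows:
        if r.get("value") is None:
            continue  # drop incomplete records during cleansing
        yield {"sensor": r["sensor"].lower(), "value": float(r["value"])}

def load(rows, store):
    # Stand-in for writing to a warehouse or database
    for r in rows:
        store.setdefault(r["sensor"], []).append(r["value"])
    return store

raw = [{"sensor": "Temp", "value": "21.5"}, {"sensor": "Temp", "value": None}]
store = load(transform(extract(raw)), {})
print(store)  # {'temp': [21.5]}
```

Because each stage is a generator, records stream through one at a time, which is the same back-pressure-friendly pattern engines like Kafka consumers and Spark pipelines formalize at scale.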
Posted 3 weeks ago
6.0 - 11.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Detailed job description has been attached to this request.
Must have:
• 6+ years of backend development experience.
• Expertise in backend programming languages and frameworks such as C#/.NET and Web API.
• Experience developing and consuming RESTful APIs.
• Proficiency in database design, indexing, and query optimization for SQL.
• Experience with cloud platforms (Azure).
• Strong knowledge of authentication and authorization mechanisms (OAuth, JWT, etc.).
• Familiarity with CI/CD pipelines and DevOps practices.
• Familiarity with GitHub code repositories and best practices.
• Excellent problem-solving and debugging skills.
• Familiar with Azure DevOps.
Nice to have:
• Minimum 2 years of experience working with Neo4j.
• Strong proficiency in the Cypher query language.
• Experience with containerization (Docker, Kubernetes).
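The JWT mechanism named in the auth requirement above is simple enough to sketch from the standard library: a token is base64url(header).base64url(payload).base64url(HMAC-SHA256 signature). This is a teaching sketch only (HS256, no expiry or claim checks); a real service should use a vetted library such as PyJWT or the .NET equivalent:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT uses base64url without padding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    sig = b64url(hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token: str, secret: bytes) -> bool:
    header, body, sig = token.split(".")
    expected = b64url(hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest())
    # Constant-time comparison prevents timing attacks on the signature
    return hmac.compare_digest(sig, expected)
```

The server never stores the token; it only re-computes the signature with its secret, which is what makes JWTs convenient for stateless API authorization.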
Posted 3 weeks ago
12.0 - 17.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Project description
Join our data engineering team to lead the design and implementation of advanced graph database solutions using Neo4j. This initiative supports the organization's mission to transform complex data relationships into actionable intelligence. You will play a critical role in architecting scalable graph-based systems, driving innovation in data connectivity, and empowering cross-functional teams with powerful tools for insight and decision-making.
Responsibilities
• Graph data modeling and implementation: design and implement complex graph data models using Cypher and Neo4j best practices.
• Leverage APOC procedures, custom plugins, and advanced graph algorithms to solve domain-specific problems.
• Oversee integration of Neo4j with other enterprise systems, microservices, and data platforms.
• Develop and maintain APIs and services in Java, Python, or JavaScript to interact with the graph database.
• Mentor junior developers and review code to maintain high quality standards.
• Establish guidelines for performance tuning, scalability, security, and disaster recovery in Neo4j environments.
• Work with data scientists, analysts, and business stakeholders to translate complex requirements into graph-based solutions.
Skills
Must have:
• 12+ years in software/data engineering, with at least 3-5 years of hands-on experience with Neo4j.
• Lead the technical strategy, architecture, and delivery of Neo4j-based solutions.
• Design, model, and implement complex graph data structures using Cypher and Neo4j best practices.
• Guide the integration of Neo4j with other data platforms and microservices.
• Collaborate with cross-functional teams to understand business needs and translate them into graph-based models.
• Mentor junior developers and ensure code quality through reviews and best practices.
• Define and enforce performance tuning, security standards, and disaster recovery strategies for Neo4j.
• Stay up to date with emerging technologies in the graph database and data engineering space.
• Strong proficiency in the Cypher query language, graph modeling, and data visualization tools (e.g., Neo4j Bloom, Neo4j Browser).
• Solid background in Java, Python, or JavaScript, and experience integrating Neo4j with these languages.
• Experience with APOC procedures, Neo4j plugins, and query optimization.
• Familiarity with cloud platforms (AWS) and containerization tools (Docker, Kubernetes).
• Proven experience leading engineering teams or projects.
• Excellent problem-solving and communication skills.
Nice to have: N/A
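The graph modeling skills above revolve around nodes, typed relationships, and path traversal. A dependency-free sketch of the same idea in Python, with an invented reporting-line dataset; the comments note the rough Cypher equivalents a Neo4j implementation would use:

```python
from collections import deque

# Edges as (source, relationship, target), analogous to Cypher's (a)-[:REL]->(b)
edges = [
    ("alice", "REPORTS_TO", "bob"),
    ("bob", "REPORTS_TO", "carol"),
    ("alice", "KNOWS", "dave"),
]

def neighbors(node, rel):
    # Roughly: MATCH (n {name: $node})-[:REL]->(m) RETURN m
    return [t for s, r, t in edges if s == node and r == rel]

def chain(node, rel):
    """Follow a relationship transitively, like Cypher's variable-length path ()-[:REL*]->()."""
    seen, queue, out = {node}, deque([node]), []
    while queue:
        for nxt in neighbors(queue.popleft(), rel):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
                out.append(nxt)
    return out

print(chain("alice", "REPORTS_TO"))  # ['bob', 'carol']
```

A graph database earns its keep precisely here: variable-length traversals that would need recursive joins in SQL are index-free adjacency lookups in Neo4j.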
Posted 3 weeks ago
6.0 - 8.0 years
7 - 12 Lacs
India, Bengaluru
Work from Office
Hello Visionary!
We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team.
We are looking for a Semantic Web ETL Developer.
Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange, through discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore.
YOU'LL MAKE A DIFFERENCE BY:
• Implementing innovative products and solution development processes and tools by applying your expertise in the field of responsibility.
JOB REQUIREMENTS:
• International experience with global projects and collaboration with intercultural teams is preferred.
• 6-8 years of experience developing software solutions with Python.
• Experience in research and development processes (software-based solutions and products), in commercial topics, and in the implementation of strategies and POCs.
• Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality.
• Provide technical guidance and mentorship to junior developers, fostering their growth and development.
• Design scalable and efficient architectures for web applications, knowledge graphs, and database models.
• Enforce code standards and perform code reviews, ensuring alignment with standard methodologies like PEP 8, DRY, and SOLID principles.
• Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions.
• Strong, expert-level proficiency in Python web frameworks (Django, Flask, FastAPI) and knowledge graph libraries.
• Experience designing and developing complex RESTful APIs and microservices architectures.
• Strong understanding of standard security processes in web applications (e.g., authentication, authorization, and data protection).
• Extensive experience building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar.
• Proficiency in SPARQL for advanced graph data querying.
• Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune.
• Experience in expert functions like software development/architecture and software testing (unit testing, integration testing).
• Excellent DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes).
• Excellent grasp of cloud technologies and architecture; should have exposure to S3, EKS, ECR, and AWS Neptune.
• Exposure to and working experience in a relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) is required.
LEADERSHIP QUALITIES
• Visionary leadership: ability to lead the team towards long-term technical goals while managing immediate priorities.
• Strong communication: good interpersonal skills to work effectively with both technical and non-technical stakeholders.
• Mentorship & coaching: foster a culture of continuous learning, skill development, and collaboration within the team.
• Conflict resolution: ability to manage team conflicts and provide constructive feedback to improve team dynamics.
Create a better #TomorrowWithUs!
This role is in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries, and the craft of things to come. We're Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need.
Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow's reality. Find out more about the Digital world of Siemens here: /digitalminds (http:///digitalminds)
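The RDF and SPARQL skills this role calls for rest on one data model: everything is a (subject, predicate, object) triple, and a query is a triple pattern with variables. A pure-Python sketch of that model (the triples here are invented examples, and `None` plays the role of a SPARQL variable; RDFLib provides the real thing):

```python
# Triples: (subject, predicate, object), the RDF data model behind SPARQL
triples = {
    ("siemens", "hasUnit", "dts"),
    ("dts", "locatedIn", "bangalore"),
    ("dts", "uses", "neo4j"),
}

def query(s=None, p=None, o=None):
    """Match a triple pattern; a None slot matches anything, like a SPARQL variable."""
    return sorted(
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    )

print(query(s="dts"))  # every triple whose subject is 'dts'
```

SPARQL generalizes this with joins across patterns (`?unit :locatedIn ?city . ?unit :uses ?tech`), but each basic pattern match works exactly like the function above.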
Posted 3 weeks ago
6.0 - 8.0 years
5 - 9 Lacs
India, Bengaluru
Work from Office
Hello Visionary!
We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team.
We are looking for a Semantic Web ETL Developer.
Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange, through discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore.
YOU'LL MAKE A DIFFERENCE BY:
• Implementing innovative products and solution development processes and tools by applying your expertise in the field of responsibility.
JOB REQUIREMENTS:
• International experience with global projects and collaboration with intercultural teams is preferred.
• 6-8 years of experience developing software solutions with Python.
• Experience in research and development processes (software-based solutions and products), in commercial topics, and in the implementation of strategies and POCs.
• Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality.
• Provide technical guidance and mentorship to junior developers, fostering their growth and development.
• Design scalable and efficient architectures for web applications, knowledge graphs, and database models.
• Enforce code standards and perform code reviews, ensuring alignment with standard methodologies like PEP 8, DRY, and SOLID principles.
• Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions.
• Strong, expert-level proficiency in Python web frameworks (Django, Flask, FastAPI) and knowledge graph libraries.
• Experience designing and developing complex RESTful APIs and microservices architectures.
• Strong understanding of standard security processes in web applications (e.g., authentication, authorization, and data protection).
• Extensive experience building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar.
• Proficiency in SPARQL for advanced graph data querying.
• Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune.
• Experience in expert functions like software development/architecture and software testing (unit testing, integration testing).
• Excellent DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes).
• Excellent grasp of cloud technologies and architecture; should have exposure to S3, EKS, ECR, and AWS Neptune.
• Exposure to and working experience in a relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) is required.
LEADERSHIP QUALITIES
• Visionary leadership: ability to lead the team towards long-term technical goals while managing immediate priorities.
• Strong communication: good interpersonal skills to work effectively with both technical and non-technical stakeholders.
• Mentorship & coaching: foster a culture of continuous learning, skill development, and collaboration within the team.
• Conflict resolution: ability to manage team conflicts and provide constructive feedback to improve team dynamics.
Create a better #TomorrowWithUs!
This role is in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries, and the craft of things to come. We're Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need.
Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow's reality. Find out more about the Digital world of Siemens here: /digitalminds (http:///digitalminds)
Posted 3 weeks ago
7.0 - 12.0 years
14 - 18 Lacs
India, Bengaluru
Work from Office
Dear Aspirant!
We empower our people to stay resilient and relevant in a constantly changing world. We're looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant international team.
We are looking for a Lead Software Engineer (AI/ML Engineer, NLP & Generative AI).
You'll make an impact by:
• Architecting and leading the development of NLP and Generative AI solutions, including LLM integration, RAG pipelines, and multi-agent frameworks.
• Designing and optimizing retrieval systems using knowledge graphs and vector databases, improving contextual accuracy and semantic relevance in RAG workflows.
• Applying advanced techniques (e.g., document chunking strategies, rerankers, hybrid retrieval, query rewriting, feedback loops) to enhance RAG chain precision and reduce hallucinations.
• Collaborating with ontology/domain experts to integrate structured knowledge bases and semantic relationships into the solution stack.
• Leveraging modern frameworks like LangGraph, LangChain, LlamaIndex, SmolAgents, and others for orchestrating agent-based and tool-augmented pipelines.
• Incorporating AWS Bedrock, SageMaker, Azure ML Studio, Azure OpenAI Service, and Azure AI Foundry for cloud-native scalability and operational efficiency.
• Ensuring high observability and maintainability of AI solutions through robust MLOps practices, logging, and model monitoring.
• Leading code/design reviews, mentoring team members, and helping shape long-term AI strategy and technical roadmaps.
• Collaborating with product, cloud, software, and data engineering teams to deploy impactful AI capabilities in real-world settings.
Use your skills to move the world forward!
• Bachelor's or Master's degree in Computer Science, Machine Learning, AI, or a related field.
• 7+ years of AI/ML experience, with 3-4 years in NLP and 2+ years in Generative AI applications.
• Expertise in designing production-grade RAG systems, including single-agent and multi-agent architectures.
• Solid understanding of LLM internals, prompt engineering, fine-tuning (LoRA, PEFT), and the use of open-source and hosted foundation models.
• Experience with knowledge graphs, graph databases (e.g., Neo4j), and semantic enrichment strategies.
• Proficiency in Python and hands-on experience with frameworks like LangGraph, LlamaIndex, Transformers, and SmolAgents.
• Knowledge of vector databases (e.g., Azure AI Search, FAISS, Weaviate, Pinecone) and search optimization techniques.
• Familiarity with model observability tools, evaluation frameworks, and performance diagnostics.
• Strong experience with AWS and/or Azure managed services for AI development.
• Experience incorporating ontologies, taxonomies, and domain-specific schemas in knowledge-enhanced AI systems.
• Prior exposure to industrial AI or Electrification/Power sector challenges is a strong plus.
• Knowledge of hybrid retrieval techniques combining symbolic and statistical methods.
• Strong stakeholder engagement and mentoring capabilities.
• Familiarity with compliance, safety, and ethical considerations in LLM deployments.
Create a better #TomorrowWithUs!
This role is based in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come. We're Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and imagination and help us shape tomorrow. Find out more about the Digital world of Siemens here: /digitalminds
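The hybrid-retrieval requirement above (combining symbolic and statistical methods) typically means blending a lexical score with a dense-vector score. A hedged toy sketch, with invented two-dimensional "embeddings" assumed to be unit-normalised and a simple word-overlap score standing in for BM25:

```python
def keyword_score(query, doc):
    # Lexical overlap: fraction of query terms present in the document
    q = set(query.lower().split())
    d = set(doc["text"].lower().split())
    return len(q & d) / max(len(q), 1)

def hybrid_rank(query, query_vec, docs, alpha=0.5):
    """Blend dense and lexical similarity; alpha weights the dense score."""
    def dense(d):
        # Dot product works as cosine similarity when embeddings are unit-normalised
        return sum(x * y for x, y in zip(query_vec, d["embedding"]))
    scored = [
        (alpha * dense(d) + (1 - alpha) * keyword_score(query, d), d["id"])
        for d in docs
    ]
    return [doc_id for _, doc_id in sorted(scored, reverse=True)]

docs = [
    {"id": "kg", "text": "knowledge graph retrieval", "embedding": [0.1, 0.9]},
    {"id": "llm", "text": "llm fine tuning", "embedding": [0.9, 0.1]},
]
print(hybrid_rank("knowledge graph", [0.0, 1.0], docs))  # ['kg', 'llm']
```

Production stacks replace the overlap score with BM25 and often apply a cross-encoder reranker to the fused top-k, which is where the hallucination-reduction benefit for RAG usually comes from.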
Posted 3 weeks ago
6.0 - 10.0 years
30 - 35 Lacs
Bengaluru
Work from Office
Job Title: Data Science Engineer, AVP
Location: Bangalore, India
Role Description
We are seeking a seasoned Data Science Engineer to spearhead the development of intelligent, autonomous AI systems. The ideal candidate will have a robust background in agentic AI, LLMs, SLMs, vector databases, and knowledge graphs. This role involves designing and deploying AI solutions that leverage Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications.
What we'll offer you
• 100% reimbursement under the childcare assistance benefit (gender neutral)
• Sponsorship for industry-relevant certifications and education
• Accident and term life insurance
Your key responsibilities
• Design & develop agentic AI applications: utilise frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution.
• Implement RAG pipelines: integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems.
• Fine-tune language models: customise LLMs (e.g., Gemini, ChatGPT, Llama), supported by NLP toolkits (e.g., spaCy, NLTK), using domain-specific data to improve performance and relevance in specialised applications.
• NER models: train OCR- and NLP-based models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP).
• Develop knowledge graphs: construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning.
• Collaborate cross-functionally: work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements.
• Optimise AI workflows: employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring.
Your skills and experience
• 8+ years of professional experience in AI/ML development, with a focus on agentic AI systems.
• Proficiency in Python, Python API frameworks, and SQL, and familiarity with AI/ML frameworks such as TensorFlow or PyTorch.
• Experience deploying AI models on cloud platforms (e.g., GCP, AWS).
• Experience with LLMs (e.g., GPT-4), NLP toolkits (e.g., spaCy), and prompt engineering.
• Understanding of semantic technologies, ontologies, and RDF/SPARQL.
• Familiarity with MLOps tools and practices for continuous integration and deployment.
• Skilled in building and querying knowledge graphs using tools like Neo4j.
• Hands-on experience with vector databases and embedding techniques.
• Familiarity with RAG architectures and hybrid search methodologies.
• Experience developing AI solutions for specific industries such as healthcare, finance, or e-commerce.
• Strong problem-solving abilities and analytical thinking.
• Excellent communication skills for cross-functional collaboration.
• Ability to work independently and manage multiple projects simultaneously.
How we'll support you
About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
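The agentic-AI pattern that frameworks like LangChain, CrewAI, and AutoGen implement is, at its core, a loop in which a model chooses a tool and the runtime executes it. A deliberately toy sketch: the "planner" here is a hard-coded stub standing in for an LLM call, and the tools and task strings are invented for illustration:

```python
# Tool registry: name -> callable. eval is sandboxed here purely for demonstration;
# a real agent would use a proper expression parser, never raw eval on model output.
TOOLS = {
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),
    "lookup": lambda key: {"capital_of_france": "Paris"}.get(key, "unknown"),
}

def fake_planner(task):
    """Stand-in for an LLM deciding which tool to call and with what argument."""
    if any(ch.isdigit() for ch in task):
        return "calculator", task
    return "lookup", task.replace(" ", "_")

def run_agent(task):
    tool, arg = fake_planner(task)
    return TOOLS[tool](arg)  # execute the chosen tool and return its observation

print(run_agent("2+3"))                # '5'
print(run_agent("capital of france"))  # 'Paris'
```

Real frameworks add the pieces this sketch omits: the model emits structured tool calls, observations are fed back into the conversation, and the loop repeats until the model signals completion.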
Posted 3 weeks ago
14.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Unified Infotech is a 14-year-old, multi-award-winning digital transformation partner. We turbocharge business growth for Fortune 500 companies, multinational corporations (MNCs), small and medium-sized enterprises (SMEs), and startups using emerging tech and streamlined digital processes. We're your go-to partner for:
· Digital Transformation, Custom Web, Mobile, and Desktop Software Development
· Digital Customer Experience - UX/UI Research & Design
· SaaS and Software Product Development
· IT Consulting & Staff Augmentation
· Software Modernization & Cloud Migration
· Data and Analytics
· Cloud Engineering
You can get more details about us from our website www.unifiedinfotech.net
Position Overview
We are looking for a highly skilled and experienced Solution Architect to join our team. This role is responsible for delivering both technical and functional expertise to clients across various projects. The ideal candidate will have a strong background in designing, implementing, and optimizing scalable and highly available cloud (SaaS) services and solutions. This role involves collaborating closely with business development, account management, and executive leadership teams to ensure that technical solutions align with business goals and are implemented seamlessly.
Key Responsibilities
• Solution Design & Development: Analyze client requirements and functional specifications, and collaborate with development teams to design and implement scalable, distributed cloud-based solutions (SaaS).
• Cloud Architecture: Lead the design and implementation of highly available, resilient, and efficient cloud architectures. Build complex distributed systems from the ground up with a focus on minimizing downtime, ensuring fail-proof deployments, and maintaining data integrity.
• Stakeholder Collaboration: Work closely with business development, account managers, and executive management to align technical solutions with business goals and increase overall company productivity and profitability. • Database Expertise: Provide expertise in SQL and NoSQL databases such as MySQL, Oracle, MongoDB, Cassandra, Redis, and Neo4J to design efficient data models and schemas. • Continuous Improvement: Evaluate and recommend improvements to current technologies and processes within the organization to drive greater efficiency and performance. • Mentorship & Best Practices: Mentor development teams by guiding them in best practices for coding, architecture design, and software development methodologies. • Version Control & CI/CD: Implement and manage version control systems (e.g., Git) and Continuous Integration/Continuous Deployment (CI/CD) pipelines to ensure smooth, efficient development workflows. • Security & Compliance: Ensure that all solutions adhere to security best practices and comply with relevant standards to protect data and systems. • Agile Methodology: Participate in Agile/Scrum development processes, collaborating with cross-functional teams to deliver high-quality solutions on time. • Strategic Planning: Contribute to long-term architectural strategy, identifying areas for improvement and ensuring solutions meet business requirements and performance goals. Desired Candidate Profile • Experience: Proven experience in solution architecture, design, and implementation of scalable cloud-based solutions (SaaS). Hands-on experience with high availability and distributed systems is essential. • Technical Skills: o Strong proficiency in SQL and NoSQL databases (e.g., MySQL, MongoDB, Cassandra, Neo4J, Redis). o Expertise in cloud architectures, distributed systems, and high-performance computing. o Proficient in version control systems, particularly Git. o Familiarity with CI/CD processes and pipeline automation. 
o Understanding of web application security principles. • Programming & Frameworks: Experience with technologies and frameworks such as NodeJS, Laravel, Spring, Angular, React, or similar frameworks is highly desirable. • Leadership & Mentorship: Strong ability to mentor and guide technical teams in adopting best practices and delivering high-quality solutions. • Methodology: Practical experience in Agile/Scrum development methodologies with a collaborative approach to team success. • Communication: Excellent communication skills, with the ability to effectively present complex technical concepts to both technical and non-technical stakeholders.
Posted 3 weeks ago
3.0 - 4.0 years
1 - 7 Lacs
India
On-site
Job Title: Python Backend Developer (Data Layer)
Location: Mohali, Punjab
Company: RevClerx
About RevClerx: RevClerx Pvt. Ltd., founded in 2017 and based in the Chandigarh/Mohali area (India), is a dynamic Information Technology firm providing comprehensive IT services with a strong focus on client-centric solutions. As a global provider, we cater to diverse business needs including website designing and development, digital marketing, lead generation services (including telemarketing and qualification), and appointment setting.
Job Summary: We are seeking a skilled Python Backend Developer with a strong passion and proven expertise in database design and implementation. This role requires 3-4 years of backend development experience, focusing on building robust, scalable applications and APIs. The ideal candidate will not only be proficient in Python and common backend frameworks but will possess significant experience in designing, modeling, and optimizing various database solutions, including relational databases (like PostgreSQL) and, crucially, graph databases (specifically Neo4j). You will play a vital role in architecting the data layer of our applications, ensuring efficiency, scalability, and the ability to handle complex, interconnected data.
Key Responsibilities:
● Design, develop, test, deploy, and maintain scalable and performant Python-based backend services and APIs.
● Lead the design and implementation of database schemas for relational (e.g., PostgreSQL) and NoSQL databases, with a strong emphasis on graph databases (Neo4j).
● Model complex data relationships and structures effectively, particularly leveraging graph data modeling principles where appropriate.
● Write efficient, optimized database queries (SQL, Cypher, potentially others).
● Develop and maintain data models, ensuring data integrity, consistency, and security.
● Optimize database performance through indexing strategies, query tuning, caching mechanisms, and schema adjustments.
● Collaborate closely with product managers, frontend developers, and other stakeholders to understand data requirements and translate them into effective database designs.
● Implement data migration strategies and scripts as needed.
● Integrate various databases seamlessly with Python backend services using ORMs (like SQLAlchemy, Django ORM) or native drivers.
● Write unit and integration tests, particularly focusing on data access and manipulation logic.
● Contribute to architectural decisions, especially concerning data storage, retrieval, and processing.
● Stay current with best practices in database technologies, Python development, and backend systems.
Minimum Qualifications:
● Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field, OR equivalent practical experience.
● 3-4 years of professional software development experience with a primary focus on Python backend development.
● Strong proficiency in Python and its standard libraries.
● Proven experience with at least one major Python web framework (e.g., Django, Flask, FastAPI).
● Demonstrable, hands-on experience designing, implementing, and managing relational databases (e.g., PostgreSQL).
● Experience with at least one NoSQL database (e.g., MongoDB, Redis, Cassandra).
● Solid understanding of data structures, algorithms, and object-oriented programming principles.
● Experience designing and consuming RESTful APIs.
● Proficiency with version control systems, particularly Git.
● Strong analytical and problem-solving skills, especially concerning data modeling and querying.
● Excellent communication and teamwork abilities.
Preferred (Good-to-Have) Qualifications:
● Graph Database Expertise:
○ Significant, demonstrable experience designing and implementing solutions using graph databases (Neo4j strongly preferred).
○ Proficiency in graph query languages, particularly Cypher.
○ Strong understanding of graph data modeling principles, use cases (e.g., recommendation engines, fraud detection, knowledge graphs, network analysis), and trade-offs.
● Advanced Database Skills:
○ Experience with database performance tuning and monitoring tools.
○ In-depth experience with Object-Relational Mappers (ORMs) like SQLAlchemy or Django ORM.
○ Experience implementing data migration strategies for large datasets.
● Cloud Experience: Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud Platform) and their managed database services (e.g., RDS, Aurora, Neptune, DocumentDB, MemoryStore).
● Containerization & Orchestration: Experience with Docker and Kubernetes.
● Asynchronous Programming: Experience with Python's asyncio and async frameworks.
● Data Pipelines: Familiarity with ETL processes or data pipeline tools (e.g., Apache Airflow).
● Testing: Experience writing tests specifically for database interactions and data integrity.
What We Offer:
● Challenging projects with opportunities to work on cutting-edge technologies, especially in the field of AI.
● Competitive salary and comprehensive benefits package.
● Opportunities for professional development and learning (e.g., conferences, courses, certifications).
● A collaborative, innovative, and supportive work environment.
How to Apply: Interested candidates are invited to submit their resume and a cover letter outlining their relevant experience, specifically highlighting their database design expertise (including relational, NoSQL, and especially graph DB/Neo4j experience) to
Job Type: Full-time
Pay: ₹14,154.00 - ₹65,999.72 per month
Benefits: Food provided; Health insurance
Location Type: In-person
Schedule: Morning shift
Work Location: In person
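The relational-versus-graph modeling emphasis above can be illustrated with a small sketch. The schema and data below are invented for illustration; `sqlite3` stands in for PostgreSQL, and the equivalent Cypher is shown as a string rather than run against a live Neo4j instance:

```python
import sqlite3

# Hypothetical mini-domain: users who follow each other. The relational
# version needs a join table; in a graph database the same relationship
# is a single edge, which is what makes multi-hop queries cheap.
conn = sqlite3.connect(":memory:")  # sqlite3 stands in for PostgreSQL here
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE follows (
        follower_id INTEGER REFERENCES users(id),
        followee_id INTEGER REFERENCES users(id),
        PRIMARY KEY (follower_id, followee_id)
    );
""")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(1, "asha"), (2, "ben"), (3, "chen")])
conn.executemany("INSERT INTO follows VALUES (?, ?)", [(1, 2), (2, 3)])

# Two-hop query in SQL: whom do asha's followees follow?
two_hop = conn.execute("""
    SELECT u.name
    FROM follows f1
    JOIN follows f2 ON f1.followee_id = f2.follower_id
    JOIN users u ON u.id = f2.followee_id
    WHERE f1.follower_id = 1
""").fetchall()
print(two_hop)  # [('chen',)]

# The equivalent Cypher (for Neo4j) expresses the hop count directly,
# with no extra self-join per hop:
cypher = "MATCH (:User {name: $name})-[:FOLLOWS*2]->(u:User) RETURN u.name"
```

Each additional hop costs another self-join in SQL, while the Cypher pattern only changes the `*2` bound, which is the usual argument for a graph store on highly connected data.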
Posted 3 weeks ago
5.0 years
0 Lacs
Haryana
Remote
About Teramind Teramind is the leading platform for user behavior analytics, serving multiple use cases from insider risk mitigation to business process optimization. With our comprehensive suite of solutions, organizations gain unprecedented visibility into user activities while enhancing security, optimizing productivity, and ensuring compliance. Trusted by Fortune 500 companies and businesses of all sizes across industries, our innovative platform helps organizations protect sensitive data, maximize workforce performance, and create safer, more efficient digital workplaces. Through real-time monitoring and advanced analytics, we enable businesses to safeguard their most sensitive information while optimizing employee productivity in both in-office and remote work environments. Our Core Values At Teramind, our values drive everything we do. We embrace innovation as a fundamental principle, constantly pushing boundaries to improve our products, streamline processes, and enhance customer experiences. We foster resourcefulness by empowering our team members with the autonomy and confidence to solve problems independently while providing collaborative support when needed. As a globally inclusive organization, we celebrate diversity and create an adaptable work culture where respect and collaboration thrive across our international teams. Above all, we are committed to excellence, delivering the highest quality in every aspect of our work and consistently exceeding expectations in service to our clients and each other. About the Role As our AI Data Engineering Lead , you will be a pivotal figure in shaping and executing our AI/ML strategy. You will spearhead the design, development, and deployment of cutting-edge AI applications, with a strong focus on building data infrastructure to power Large Language Models (LLMs), agentic frameworks, and multi-agent systems. 
This role demands a blend of hands-on technical expertise, strategic thinking, and leadership to build and scale our AI capabilities on the Google Cloud Platform (GCP). You'll not only architect robust systems but also champion data engineering excellence and build and lead a team of talented AI engineers. What You’ll Do Architect & Build: Design scalable, AI-first data infrastructure on GCP (Vertex AI, BigQuery, Dataflow, Pub/Sub) to power LLMs and agentic systems. Pipeline Mastery: Develop high-performance, real-time data pipelines to process user behavior and drive ML systems. End-to-End AI Systems: Lead the design, development, and deployment of AI/LLM applications using LangChain, HuggingFace, AutoGen, and more. Operationalize ML: Build MLOps pipelines with robust CI/CD, monitoring, testing, and model evaluation — especially for LLM outputs. Drive Evaluation: Create frameworks to assess safety, performance, and quality of generative AI applications. Code & Lead: Write production-grade Python, contribute to core infrastructure, and raise the bar for technical excellence. Collaborate Strategically: Work with Product, AIML leadership, and cross-functional partners to align tech execution with company vision. Help us Grow the Team: Mentor and grow a team of AI engineers as we scale — help us build not just software, but a world-class engineering culture. Requirements 5+ years of experience in Software, Data, and/or ML engineering Expert in Python, with strong knowledge of data engineering tools, GCP (Vertex AI, Dataflow, BigQuery) , and AI pipelines Hands-on with CI/CD, model monitoring, and observability in ML systems. Experience launching production-grade GenAI systems, especially involving LLMs, agentic workflows, or multi-agent coordination. Familiarity with AI/LLM frameworks (Langchain, LlamaIndex, HuggingFace) and modern prompt engineering techniques. 
Bonus Points For Master's or PhD in CS, AI, ML, or related fields Experience with AWS, Azure, or other cloud AI stacks Experience with Graph databases (e.g., Neo4j) and proficiency in SQL and NoSQL databases Background in big data (Spark, Flink) or open-source AI contributions Understanding of responsible AI, ethics, and data governance Benefits This is a remote job. Work from anywhere! We’ve been thriving as a fully-remote team since 2014. To us, remote work means flexibility and having truly diverse, global teams. Additionally: Collaboration with a forward-thinking team where new ideas come to life, experience is valued, and talent is incubated. Competitive salary Career growth opportunities Flexible paid time off Laptop reimbursement Ongoing training and development opportunities About our recruitment process We don’t expect a perfect fit for every requirement we’ve outlined. If you can see yourself contributing to the team, we want to hear your story. You can expect up to 3 interviews. In some scenarios, we’re able to streamline the process to have minimal rounds. Director-level roles and above should expect a more thorough process, with multiple rounds of interviews. All roles require reference and background checks Teramind is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration without regard to race, age, religion, color, marital status, national origin, gender, gender identity or expression, sexual orientation, disability, or veteran status.
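The "Drive Evaluation" responsibility above, building frameworks that score generative AI output for safety and quality, can be sketched minimally in pure Python. All check names, banned terms, and thresholds here are illustrative assumptions, not Teramind's actual stack:

```python
# Minimal LLM-output evaluation harness: each check returns a score in
# [0, 1], and a gate aggregates them into a pass/fail decision.

def check_nonempty(output: str) -> float:
    return 1.0 if output.strip() else 0.0

def check_length(output: str, max_words: int = 200) -> float:
    # Penalize runaway generations linearly past the word budget.
    words = len(output.split())
    return 1.0 if words <= max_words else max(0.0, 1 - (words - max_words) / max_words)

def check_no_banned_terms(output: str, banned=("password", "ssn")) -> float:
    lowered = output.lower()
    return 0.0 if any(term in lowered for term in banned) else 1.0

CHECKS = [check_nonempty, check_length, check_no_banned_terms]

def evaluate(output: str, threshold: float = 0.9) -> dict:
    scores = {fn.__name__: fn(output) for fn in CHECKS}
    mean = sum(scores.values()) / len(scores)
    return {"scores": scores, "mean": mean, "passed": mean >= threshold}

result = evaluate("The quarterly report is attached.")
print(result["passed"])  # True
```

A production framework would add model-graded checks and per-check weights, but the shape (independent scorers plus an aggregation gate) stays the same.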
Posted 3 weeks ago
1.0 years
0 Lacs
Gāndhīnagar
On-site
Software Engineering Intern
Are you interested in being part of a fintech startup that is going to disrupt people's financial lifestyle? If yes, then this is a unique opportunity with plenty of scope for growth and scale. We are passionate about what we do at Argyle Enigma Tech Labs. Our deep-tech AI product helps people cut down time and improve efficiency. Come join us and share your passion.
Duration: 6 months
Education: B.E/B.Tech-IT or MCA/M.Sc-IT (pursuing final year)
Requirements:
1. 6 months of experience in Python or Java frameworks (J2EE, Spring Boot, etc.) preferred.
2. Understanding of OOP and data structures, and knowing when to apply them in daily coding scenarios.
3. Expertise working with and building RESTful APIs.
4. Experience with SOA, JSON/XML, and REST web services.
5. Knowledge of API security frameworks, token management, and user access control, including OAuth, JWT, etc.
6. Experience with message queues (Kafka, RabbitMQ, ZeroMQ, etc.).
7. Knowledge of Postgres/Oracle/MySQL/graph databases (Neo4j, GraphDB Lite, Graph Engine).
8. Experience configuring containerization systems (Docker, Kubernetes, etc.).
Perks and Benefits: Opportunity to work on a live project with Ek Software industry partners.
Eligibility Criteria: All semesters should be cleared without any backlog.
Selection Process: Face-to-face, in-person interview.
Job Type: Internship
Contract length: 6 months
Pay: From ₹7,000.00 per month
Ability to commute/relocate: Gandhinagar, Gujarat: Reliably commute or planning to relocate before starting work (Required)
Experience: total work: 1 year (Preferred); Front End Developers: 1 year (Preferred)
Work Location: In person
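The JWT token-management requirement in the posting above can be sketched with the standard library alone. This is an illustrative HS256 signer/verifier, not production code; a real service should use a vetted library such as PyJWT and also validate claims like expiry:

```python
import base64
import hashlib
import hmac
import json

# Toy HS256 JWT: header.payload.signature, each part base64url-encoded
# without padding, signed with an HMAC-SHA256 over "header.payload".

def _b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(payload: dict, secret: bytes) -> str:
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    msg = f"{header}.{body}".encode()
    sig = _b64url(hmac.new(secret, msg, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify(token: str, secret: bytes) -> bool:
    header, body, sig = token.split(".")
    msg = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(secret, msg, hashlib.sha256).digest())
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(sig, expected)

token = sign({"sub": "user-42"}, b"demo-secret")
print(verify(token, b"demo-secret"))   # True
print(verify(token, b"wrong-secret"))  # False
```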
Posted 3 weeks ago
0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
We at TecMantras Solutions are seeking a skilled AI/ML Engineer to join our innovative team. In this role, you will design, develop, and deploy machine learning models and systems that drive our products and enhance user experiences. You will work closely with cross-functional teams to implement cutting-edge AI solutions, including recommendation engines and large language models.
About the job
Role: AI/ML GenAI Engineer
Required Technical Skill Set: Data Science, Machine Learning, Data Analytics, Deep Learning, Natural Language Processing, Business Intelligence, Computer Vision, Feature Engineering, Data Mining, Data Processing, Data Visualization, SQL, Python, Transformers, Predictive Modelling, Statistics, Text Analytics, MS Excel, Azure/AWS, LLMs, MLOps, Generative AI, Deployment, Prompt Engineering.
Key Responsibilities:
Design and implement robust machine learning models and algorithms, focusing on recommendation systems.
Conduct data analysis to identify trends, insights, and opportunities for model improvement.
Collaborate with data scientists and software engineers to build and integrate end-to-end machine learning systems.
Optimize and fine-tune models for performance and scalability, ensuring seamless deployment.
Work with large datasets using SQL and Postgres to support model training and evaluation.
Implement and refine prompt engineering techniques for large language models (LLMs).
Stay current with advancements in AI/ML technologies, particularly in core ML algorithms like clustering and community detection.
Monitor model performance, conduct regular evaluations, and retrain models as needed.
Document processes, model performance metrics, and technical specifications.
Experience working with vector databases such as ChromaDB, Milvus, Qdrant, FAISS, Pinecone, Weaviate, etc., for efficient similarity search and large-scale data retrieval.
Understanding of large language models (LLMs) and other generative AI (GenAI) models, including prompt engineering, model evaluation, Stable Diffusion, optimization and deployment, LLMOps, and LLM training frameworks/deployment tools like LangChain, LangGraph, CrewAI, AutoGen, etc.
Required Skills and Qualifications:
Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
Strong expertise in Python and experience with machine learning libraries (e.g., TensorFlow, PyTorch, Scikit-learn).
Proven experience with SQL and Postgres for data manipulation and analysis.
Demonstrated experience building and deploying recommendation engines.
Solid understanding of core machine learning algorithms, including clustering and community detection.
Prior experience in building end-to-end machine learning systems.
Familiarity with prompt engineering and working with large language models (LLMs).
Experience working with near-real-time recommendation systems.
Hands-on experience with any graph database, such as Neo4j, Neptune, etc.
Experience with the Flask or FastAPI frameworks.
Experience with SQL to write/modify/understand existing queries and optimize DB connections.
Experience with LLMs, using tools like OpenAI, DeepSeek, etc.
Presentation-building skills.
Experience with agentic AI.
Familiarity with cloud platforms (AWS, Azure, Google Cloud) and MLOps tools.
Understanding of data structures, algorithms, and software engineering principles, and of when to prototype.
Familiarity with version control (Git), CI/CD pipelines, and containerization (Docker, Kubernetes).
Preferred Qualifications: Hands-on experience with real-world AI applications, either through internships, research, or personal projects. Knowledge of large-scale data handling and optimization techniques. Experience with data preprocessing, feature engineering, and model evaluation techniques. Excellent problem-solving skills and the ability to work in a fast-paced, collaborative environment. Why Join Us: Flexible and friendly work environment Opportunity to work on innovative projects Continuous learning and growth Leave enhancement policy and employee recognition
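The vector-database requirement in the posting above (ChromaDB, Milvus, FAISS, and similar) boils down to nearest-neighbour search over embeddings. A brute-force toy version, with made-up three-dimensional vectors, looks like this; real vector stores add approximate indexes (HNSW, IVF) to make the same lookup scale:

```python
import math

# Brute-force cosine-similarity search over a tiny invented corpus of
# document embeddings. All vectors are illustrative placeholders for
# what an embedding model would actually produce.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

corpus = {
    "refund policy":    [0.9, 0.1, 0.0],
    "shipping times":   [0.1, 0.8, 0.2],
    "account deletion": [0.0, 0.2, 0.9],
}

def top_k(query, k=2):
    # Rank every document by similarity to the query vector.
    ranked = sorted(corpus, key=lambda doc: cosine(query, corpus[doc]),
                    reverse=True)
    return ranked[:k]

print(top_k([0.85, 0.15, 0.05]))  # ['refund policy', 'shipping times']
```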
Posted 3 weeks ago
5.0 - 9.0 years
10 - 16 Lacs
Pune, Greater Noida, Delhi / NCR
Work from Office
Responsibilities:
Create and optimize complex SPARQL queries to retrieve and analyze data from graph databases.
Develop graph-based applications and models to solve real-world problems and extract valuable insights from data.
Design, develop, and maintain scalable data pipelines using Python and REST APIs to get data from different cloud platforms.
Study and understand the nodes, edges, and properties in graphs in order to represent and store data in relational databases.
Write clean, efficient, and well-documented code.
Troubleshoot and fix bugs.
Collaborate with other developers and stakeholders to deliver high-quality solutions.
Stay up-to-date with the latest technologies and trends.
Qualifications:
Strong proficiency in SPARQL and RDF query languages.
Strong proficiency in Python and REST APIs.
Experience with database technologies (SQL and SPARQL).
Strong problem-solving and debugging skills.
Ability to work independently and as part of a team.
Preferred Skills:
Knowledge of cloud platforms like AWS, Azure, or GCP.
Experience with version control systems like GitHub.
Understanding of environments, deployment processes, and cloud infrastructure.
Share your resume at Aarushi.Shukla@coforge.com if you are an early or immediate joiner.
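The SPARQL work described above centers on matching triple patterns with variables against RDF-style data. A toy matcher (invented triples; a real stack would use rdflib against a triple store) shows the idea. Note it is deliberately naive and does not enforce consistency for a variable that repeats within one pattern:

```python
# RDF-style (subject, predicate, object) triples, invented for illustration.
TRIPLES = [
    ("alice", "knows", "bob"),
    ("bob", "knows", "carol"),
    ("alice", "worksAt", "acme"),
]

def match(pattern, triples=TRIPLES):
    """Match one basic graph pattern; terms starting with '?' are variables.

    Returns one bindings dict per matching triple.
    """
    results = []
    for triple in triples:
        bindings = {}
        for term, value in zip(pattern, triple):
            if term.startswith("?"):
                bindings[term] = value   # bind variable to this position
            elif term != value:
                break                    # constant mismatch: reject triple
        else:
            results.append(bindings)
    return results

# Analogous SPARQL: SELECT ?who WHERE { :alice :knows ?who }
print(match(("alice", "knows", "?who")))  # [{'?who': 'bob'}]
```

A real SPARQL engine joins the bindings of many such patterns and adds filters, optionals, and property paths on top of this core step.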
Posted 3 weeks ago
8.0 - 13.0 years
0 Lacs
kochi, kerala
On-site
As a Senior Technical Analyst at Maxwell GeoSystems, based in Kochi, Kerala, India, you will play a crucial role in the development and implementation of Company-wide SOA (Service-oriented Architecture) for instrumentation and construction monitoring SaaS (Software as a Service). Your primary focus will be on planning, runtime design, and integration of software services for data handling and transformation. Working under the guidance of the IT Head, you will collaborate with a diverse team of Senior System Developers, Programmers, and Management Executives to ensure the success of projects throughout their life cycle. Leveraging the latest web technologies, you will strive to achieve optimal results and contribute to the company's mission of driving digitalization in the ground engineering industry. Your responsibilities will include developing the logical and physical layout of the overall solution and its components, mediating between business and technology, transforming business operations concepts into IT infrastructure terms, and defining service interface contracts through data and function modeling techniques. You will also work closely with the Architect to create these contracts, investigate service orchestration possibilities, define technical process flows, and create and test software implementations. To excel in this role, you should possess 8-13 years of experience in handling large data and have a strong understanding of technologies such as Cassandra, Neo4J, HDFS, MYSQL, REACTJS, PYTHON, GOLANG, AWS, AZURE, and MongoDB. 
Additionally, you should have knowledge of common web server exploits and their solutions, fundamental design principles for scalable applications, integration of multiple data sources and databases, creation of database schemas supporting business processes, familiarity with SQL/NoSQL databases, proficiency in code versioning tools like Git, and the ability to prioritize and execute tasks effectively in a high-pressure environment. Strong written communication skills are essential for this role. If you are ready to join a market-defining company and contribute to the advancement of ground engineering through innovative technology, we encourage you to send your CV to recruitment@maxwellgeosystems.com. Become a part of Maxwell GeoSystems and help us make a real difference in performance and advancement through our revolutionary software, MissionOS, a powerful data management system for geotechnical and project-related data acquisition, monitoring, and analysis.
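Defining "service interface contracts through data and function modeling", as the role above describes, can be sketched as a frozen schema at the service boundary so producers and consumers can evolve independently. Field names below are illustrative assumptions, not MissionOS's actual API:

```python
import json
from dataclasses import asdict, dataclass

# Sketch of an SOA service contract for instrumentation data: the shape
# of a reading is fixed at the boundary, and violations fail loudly at
# deserialization instead of propagating bad data downstream.

@dataclass(frozen=True)
class SensorReading:
    sensor_id: str
    timestamp: str  # ISO 8601, e.g. "2024-01-01T00:00:00Z"
    value: float
    unit: str

def serialize(reading: SensorReading) -> str:
    return json.dumps(asdict(reading), sort_keys=True)

def deserialize(payload: str) -> SensorReading:
    data = json.loads(payload)
    return SensorReading(**data)  # raises TypeError on contract violations

r = SensorReading("inclinometer-7", "2024-01-01T00:00:00Z", 3.2, "mm")
assert deserialize(serialize(r)) == r  # lossless round trip
```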
Posted 3 weeks ago
2.0 - 4.0 years
11 - 16 Lacs
Bengaluru
Work from Office
Your Impact: As a Python Developer in the Debricked data science team, you will work on enhancing data intake processes and optimizing data pipelines. You will apply many different approaches, depending on the needs of the product and the challenges you encounter. In some cases, we use AI/LLM techniques, and we expect the number of such cases to increase. Your contributions will directly impact Debricked's scope and quality and will help ensure future commercial growth of the product.
What the role offers: As a Python Developer, you will:
Innovative Data Solutions: Develop and optimize data pipelines that improve the efficiency, accuracy, and automation of the Debricked SCA tool's data intake processes.
Collaborative Environment: Work closely with engineers and product managers from Sweden and India to create impactful, data-driven solutions.
Continuous Improvement: Play an essential role in maintaining and improving the data quality that powers Debricked's analysis, improving the product's competitiveness.
Skill Development: Collaborate across teams and leverage OpenText's resources (including an educational budget) to develop your expertise in software engineering, data science, and AI, expanding your skill set in both traditional and cutting-edge technologies.
What you need to succeed:
2-4 years of experience in Python development, with a focus on optimizing data processes and improving data quality.
Proficiency in Python and related tools and libraries like Jupyter, Pandas, and NumPy.
A degree in Computer Science or a related discipline.
An interest in application security.
Asset to have: skills in Go, Java, LLMs (specifically Gemini), GCP, Kubernetes, MySQL, Elastic, Neo4j.
A strong understanding of how to manage and improve data quality in automated systems and pipelines.
Ability to address complex data challenges and develop solutions to optimize systems.
Comfortable working in a distributed team, collaborating across different time zones.
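A data-intake cleaning step of the kind described above can be sketched in a few lines of standard-library Python: normalize incoming records and drop rows that fail validation before they reach analysis. The column names, sample data, and rules are assumptions for illustration, not Debricked's actual pipeline:

```python
import csv
import io

# Invented raw intake: package records with inconsistent casing,
# stray whitespace, and one incomplete row.
RAW = """name,version,license
 Requests ,2.31.0,Apache-2.0
flask,,BSD-3-Clause
numpy,1.26.4,BSD-3-Clause
"""

def clean(rows):
    """Normalize each record; skip rows missing a name or version."""
    for row in rows:
        name = row["name"].strip().lower()
        version = row["version"].strip()
        if not name or not version:  # reject incomplete records
            continue
        yield {"name": name, "version": version,
               "license": row["license"].strip()}

records = list(clean(csv.DictReader(io.StringIO(RAW))))
print(len(records))  # 2 -- the flask row lacks a version and is dropped
```

In a production pipeline the same generator shape slots naturally between an extraction stage and a loader, which keeps each validation rule individually testable.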
Posted 3 weeks ago