Ahmedabad, Gujarat
INR Not disclosed
Remote
Full Time
About the Role:
We are seeking a hands-on Product Manager with strong technical depth who is fluent in both technology and business domains. The ideal candidate began their career as a Developer (Java, Python, or AI/ML) and has since transitioned into Product or Project Management. Technical expertise is a must: you should be comfortable engaging deeply with engineering, AI/ML models, and system architectures. If you dream in product roadmaps but think in data structures and algorithms, we would love to meet you.

Key Responsibilities:
· Own and drive the product lifecycle from ideation to launch, specifically for AI-driven products.
· Collaborate cross-functionally with engineering, design, data science, and marketing teams to deliver exceptional product experiences.
· Understand technical architecture and system integrations; actively engage with engineering teams and challenge decisions logically and constructively.
· Translate business problems into technical solutions, and vice versa, without missing a beat.
· Author detailed Product Requirement Documents (PRDs), user stories, acceptance criteria, and sprint plans.
· Conduct market research, competitor analysis, and customer discovery for AI product initiatives.
· Define and monitor model performance metrics (accuracy, precision, recall) where applicable for AI/ML products.
· Evangelize the product vision both internally and externally.

Qualifications:
· 7+ years of overall experience.
· Started their career as a Developer in Java, Python, or AI/ML domains.
· 2+ years of Product or Project Management experience (ownership of end-to-end delivery).
· Strong technical background: able to understand engineering complexities and make informed decisions.
· Solid experience working with or building AI/ML-powered products.
· Exceptional problem-solving skills; comfortable navigating ambiguity and wearing multiple hats.
· Sharp communication, negotiation, and stakeholder management skills.
· Bachelor’s degree in Computer Science, Engineering, or a related field (Master’s preferred but not mandatory).

Job Type: Full-time
Pay: ₹800,000.00 - ₹1,200,000.00 per year
Benefits: Flexible schedule, Health insurance, Leave encashment, Provident Fund, Work from home
Schedule: Day shift
Application Question(s): Current CTC; Expected CTC; Notice period; Will you be able to join immediately?; Are you comfortable with Hybrid?
Work Location: Hybrid remote in Ahmedabad, Gujarat
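For reference, the model performance metrics named in the responsibilities above (accuracy, precision, recall) all fall out of a confusion matrix. A minimal sketch in plain Python; the labels and predictions here are hypothetical:

```python
# Hypothetical binary-classification ground truth and model predictions.
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

# Confusion-matrix counts, treating 1 as the positive class.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

accuracy = (tp + tn) / len(y_true)  # share of all predictions that are correct
precision = tp / (tp + fp)          # of predicted positives, how many are real
recall = tp / (tp + fn)             # of real positives, how many were found

print(accuracy, precision, recall)  # 0.8 0.8 0.8
```

In production one would typically use a library such as scikit-learn for these metrics, but the definitions are exactly the ratios above.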
Bengaluru, Karnataka
INR Not disclosed
Remote
Full Time
Primary skills: Java, Spring Boot, Microservices, AWS, Kafka

Must-Have:
● Strong Java engineer with AWS and Kafka experience.
● Tech-savvy engineer, willing and able to learn new skills and track industry trends.
● Strong programming skills with 5+ years of experience.
● Good knowledge of service-based architecture.
● Understanding and working experience of:
  o Java, multi-threading
  o Web services (REST or SOAP); microservices; domain-driven architecture
  o Spring framework basics: IoC, DI
  o Spring Boot and other Spring modules
  o Hibernate or another ORM framework
  o Web application fundamentals
  o Front-end technologies such as HTML5, CSS3, and React (an add-on)
  o Git, Jenkins, SonarQube, and other tools
  o SQL and NoSQL databases
● Understanding of design patterns and common concepts such as caching, logging, troubleshooting, and performance tuning.
● Exposure to cloud, containers, search engines, etc. will be considered a plus.

Job Type: Full-time
Pay: ₹1,100,000.00 - ₹1,900,000.00 per year
Benefits: Flexible schedule, Health insurance, Paid sick time, Paid time off, Provident Fund, Work from home
Location Type: Hybrid work
Schedule: Day shift
Ability to commute/relocate: Bengaluru, Karnataka: Reliably commute or planning to relocate before starting work (Preferred)
Application Question(s): CTC; Notice period (days); Expected CTC; Total experience; Experience in AWS; Experience in Kafka
Work Location: Hybrid remote in Bengaluru, Karnataka
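The "IoC, DI" item above refers to inversion of control via dependency injection: a class receives its collaborators from outside instead of constructing them itself, which is the wiring Spring's container automates. A framework-free sketch of the same idea, with hypothetical class names:

```python
class SqlUserRepository:
    """Hypothetical data-access collaborator; a real one would query a database."""
    def find_name(self, user_id: int) -> str:
        return {1: "asha", 2: "ravi"}.get(user_id, "unknown")

class UserService:
    # The repository is injected through the constructor rather than
    # instantiated inside the class, so tests can pass in a fake.
    def __init__(self, repo: SqlUserRepository):
        self.repo = repo

    def greeting(self, user_id: int) -> str:
        return f"Hello, {self.repo.find_name(user_id)}!"

# "Container" code wires the object graph at the edge of the application;
# in Spring, annotations like @Autowired would do this step.
service = UserService(SqlUserRepository())
print(service.greeting(1))  # Hello, asha!
```

The design payoff is that `UserService` never names a concrete database, so swapping in an in-memory fake for tests requires no changes to the class itself.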
Ahmedabad
INR 10.0 - 20.0 Lacs P.A.
Remote
Full Time
Location: Ahmedabad | Hybrid - Remote
Benefits: 5 Day Working | Career Growth | Flexible Working | Medical Insurance
Primary skills: AI/ML, LLM, Python, Deep Learning, GenAI
Experience: 7+ Years

Responsibilities:
· Lead the design and development of cutting-edge AI/ML solutions across NLP, LLM, deep learning, and computer vision.
· Mentor and guide junior engineers and data scientists in best practices and architecture decisions.
· Collaborate with leadership and cross-functional teams (product, design, engineering) to align AI capabilities with business goals.
· Evaluate, implement, and optimize models for scale, performance, and real-world impact.
· Architect reusable AI modules and frameworks that power multiple products.
· Stay ahead of trends by researching advancements in generative AI, LLMs, and multimodal models.
· Oversee data pipeline integrity, governance, and readiness for training and production.
· Contribute to model monitoring, CI/CD for ML, and feedback loops in deployed systems.
· Represent the AI/ML team in technical discussions and reviews.

Required Skills:
· Strong experience in Python and frameworks such as PyTorch, TensorFlow, and Keras.
· Expertise in natural language processing (NLP) and large language models (LLMs).
· Solid foundation in traditional machine learning, deep learning, and AI architecture patterns.
· Proven experience leading or architecting end-to-end ML solutions, from concept to deployment.
· Comfortable with MLOps tools, model versioning, and production-grade infrastructure.
· Excellent communication skills, with a track record of team leadership and stakeholder collaboration.
· Passion for emerging AI tech and a vision for applying it in real-world applications.
Job Type: Full-time
Pay: ₹1,000,000.00 - ₹2,000,000.00 per year
Benefits: Flexible schedule, Health insurance, Paid sick time, Paid time off, Provident Fund, Work from home
Schedule: Day shift
Ability to commute/relocate: Ahmedabad, Gujarat: Reliably commute or planning to relocate before starting work (Preferred)
Application Question(s): Notice period (days); Current CTC; Expected CTC; Total experience; Experience in AI; Experience in LLM
Work Location: In person
Gurgaon
INR 12.0 - 20.0 Lacs P.A.
Remote
Full Time
Position: GCP Data Engineer

Company Info: Prama (HQ: Chandler, AZ, USA)
Prama specializes in AI-powered and Generative AI solutions for Data, Cloud, and APIs. We collaborate with businesses worldwide to develop platforms and AI-powered products that offer valuable insights and drive business growth. Our comprehensive services include architectural assessment, strategy development, and execution to create secure, reliable, and scalable systems. We are experts in creating innovative platforms for various industries, and we help clients overcome complex business challenges. Our team is dedicated to delivering cutting-edge solutions that elevate the digital experience for corporations. Prama is headquartered in the Phoenix area, with offices in the USA, Canada, Mexico, Brazil, and India.

Location: Bengaluru | Gurugram | Hybrid
Benefits: 5 Day Working | Career Growth | Flexible Working | Potential On-site Opportunity
Kindly send your CV or resume to careers@prama.ai
Primary skills: GCP, PySpark, Python, SQL, ETL

Job Description:
We are seeking a highly skilled and motivated GCP Data Engineer to join our team. As a GCP Data Engineer, you will play a crucial role in designing, developing, and maintaining robust data pipelines and data warehousing solutions on the Google Cloud Platform (GCP). You will work closely with data analysts, data scientists, and other stakeholders to ensure the efficient collection, transformation, and analysis of large datasets.

Responsibilities:
· Design, develop, and maintain scalable data pipelines using GCP tools such as Dataflow, Dataproc, and Cloud Functions.
· Implement ETL processes to extract, transform, and load data from various sources into BigQuery.
· Optimize data pipelines for performance, cost-efficiency, and reliability.
· Collaborate with data analysts and data scientists to understand their data needs and translate them into technical solutions.
· Design and implement data warehouses and data marts using BigQuery.
· Model and structure data for optimal performance and query efficiency.
· Develop and maintain data quality checks and monitoring processes.
· Use SQL and Python (PySpark) to analyze large datasets and generate insights.
· Create visualizations using tools like Data Studio or Looker to communicate data findings effectively.
· Manage and maintain GCP resources, including virtual machines, storage, and networking.
· Implement best practices for security, cost optimization, and scalability.
· Automate infrastructure provisioning and management using tools like Terraform.

Qualifications:
· Strong proficiency in SQL, Python, and PySpark.
· Hands-on experience with GCP services, including BigQuery, Dataflow, Dataproc, Cloud Storage, and Cloud Functions.
· Experience with data warehousing concepts and methodologies.
· Understanding of data modeling techniques and best practices.
· Strong analytical and problem-solving skills.
· Excellent communication and collaboration skills.
· Experience with data quality assurance and monitoring.
· Knowledge of cloud security best practices.
· A passion for data and a desire to learn new technologies.

Preferred Qualifications:
· Google Cloud Platform certification.
· Experience with machine learning and AI.
· Knowledge of data streaming technologies (Kafka, Pub/Sub).
· Experience with data visualization tools (Looker, Tableau, Data Studio).

Job Type: Full-time
Pay: ₹1,200,000.00 - ₹2,000,000.00 per year
Benefits: Flexible schedule, Health insurance, Leave encashment, Paid sick time, Provident Fund, Work from home
Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): CTC; Expected CTC; Notice Period (days); Experience in GCP; Total Experience
Work Location: Hybrid remote in Gurugram, Haryana
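The extract-transform-load flow the responsibilities above describe (pull from sources, clean and cast, load into a warehouse table, then aggregate) can be illustrated end to end with Python's built-in sqlite3 standing in for BigQuery. The table, columns, and order data here are invented for the sketch:

```python
import sqlite3

# Extract: rows as they might arrive from a source system (hypothetical data).
raw_orders = [
    ("ORD-1", "  Widget ", "2", "99.50"),
    ("ORD-2", "Gadget", "1", "250.00"),
    ("ORD-3", "  widget", "3", "99.50"),
]

# Transform: normalize strings, cast types, derive a per-order total.
clean = [
    (oid, name.strip().lower(), int(qty), float(price), int(qty) * float(price))
    for oid, name, qty, price in raw_orders
]

# Load: an in-memory sqlite3 database stands in for the warehouse.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id TEXT, product TEXT, qty INT, price REAL, total REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?, ?, ?, ?)", clean)

# Analyze: revenue per product, the kind of aggregate a data mart serves.
rows = db.execute(
    "SELECT product, SUM(total) FROM orders GROUP BY product ORDER BY product"
).fetchall()
print(rows)  # [('gadget', 250.0), ('widget', 497.5)]
```

On GCP the same shape appears at scale: Dataflow or PySpark performs the transform step, and the load and aggregation happen in BigQuery instead of sqlite3.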
Ahmedabad, Gujarat
INR 5.0 - 9.0 Lacs P.A.
Remote
Full Time
We're looking for a sharp, detail-oriented Accounts Executive to manage end-to-end accounting operations, including payroll, financial reporting, and balance sheet preparation. If you're someone who enjoys working with numbers, thrives on accuracy, and can turn data into insights through advanced Excel analysis, we want to hear from you!

Key Responsibilities:
· Prepare and analyze profit & loss statements, provisional balance sheets, and monthly financial reports.
· Manage payroll processing, ensuring compliance with statutory requirements.
· Maintain and update company accounts, ledgers, and journal entries.
· Perform advanced Excel-based analysis for budgeting, forecasting, and performance tracking.
· Generate custom reports as required by management using Paycheck software or similar tools.
· Assist in internal and external audits.
· Ensure timely reconciliation of bank statements, vendor payments, and receivables.

Qualifications & Requirements:
· MBA (Finance) or CA Inter.
· 3 to 5 years of relevant experience in accounting and payroll functions.
· Strong knowledge of accounting principles, taxation, and financial compliance.
· Proficient in Microsoft Excel (VLOOKUP, Pivot Tables; Macros preferred).
· Familiarity with Paycheck software is a big plus.
· Excellent analytical and organizational skills.
· Ability to work independently and meet deadlines.

Job Type: Full-time
Pay: ₹500,000.00 - ₹900,000.00 per year
Benefits: Flexible schedule, Health insurance, Leave encashment, Paid sick time, Provident Fund, Work from home
Schedule: Day shift, Monday to Friday
Ability to commute/relocate: Ahmedabad, Gujarat: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): Notice period; Current CTC; Expected CTC; Communication rating out of 5; Location
Work Location: In person