Rapidera Technologies Pvt Ltd

9 Job openings at Rapidera Technologies Pvt Ltd
GCP Data Architect Baner, Pune, Maharashtra 8 years Not disclosed On-site Full Time

Role Overview:
We are seeking a highly skilled GCP Data Architect with 6–8 years of experience designing, developing, and managing enterprise data solutions on Google Cloud Platform (GCP). The ideal candidate will have a strong background in cloud data architecture, data warehousing, big data processing, and data integration, with proven expertise in delivering scalable, secure, and efficient data platforms.

Key Responsibilities:
- Design and architect end-to-end data solutions on GCP, aligned with business and technical requirements.
- Define data models, storage strategies, and data ingestion, processing, and consumption frameworks.
- Implement data lakes, data warehouses, and data marts using services such as BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Composer.
- Collaborate with business stakeholders, data scientists, and engineering teams to understand data needs and translate them into scalable architectures.
- Design and implement data governance, security, and compliance frameworks for cloud-based data platforms.
- Optimize data workflows, query performance, and storage costs in the GCP environment.
- Lead data migration and modernization initiatives from on-premise or other cloud platforms to GCP.
- Stay current with GCP services, features, and industry best practices to recommend improvements and innovations.
- Provide technical leadership and mentoring to data engineering teams.

Required Skills & Experience:
- 6–8 years of experience in data architecture and engineering roles, with at least 3 years hands-on on GCP.
- Strong expertise in GCP data services: BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, Cloud Composer, Data Catalog.
- Proficiency in data modeling, data warehousing concepts, ETL/ELT pipelines, and big data processing frameworks.
- Experience with SQL, Python, and (preferably) Terraform for infrastructure as code.
- Hands-on experience with data security, encryption, access control, and governance on GCP.
- Experience integrating real-time data pipelines and event-driven architectures (a minimal ingestion sketch follows this listing).
- Strong understanding of DevOps, CI/CD pipelines for data workflows, and cloud cost optimization.
- GCP Professional Data Engineer / Cloud Architect certification is a plus.

Good to Have:
- Exposure to AI/ML workflows and data preparation for ML models.
- Experience with tools such as Apache Airflow, Looker, or Dataplex.
- Knowledge of other cloud platforms (AWS, Azure) for hybrid/multi-cloud strategies.

Educational Qualification: Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.

Why Join Us:
- Work on cutting-edge data transformation programs at scale.
- Architect high-impact solutions in a collaborative, innovation-driven environment.
- Engage with a fast-growing team focused on data-driven business value.

Job Type: Full-time
Work Location: In person
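To make the event-driven ingestion requirement concrete, here is a minimal, hedged Python sketch of a consumer that streams Pub/Sub messages into BigQuery. The project, subscription, table name, and message schema are placeholder assumptions; it uses the google-cloud-pubsub and google-cloud-bigquery client libraries, and is a sketch of the pattern rather than a prescribed implementation.

```python
# Minimal sketch: stream Pub/Sub events into BigQuery.
# Project, subscription, and table names are placeholders; the target
# table is assumed to exist with columns matching the message fields.
import json

from google.cloud import bigquery, pubsub_v1

PROJECT = "my-project"            # placeholder
SUBSCRIPTION = "events-sub"       # placeholder
TABLE = "my-project.lake.events"  # placeholder

bq = bigquery.Client(project=PROJECT)
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    row = json.loads(message.data.decode("utf-8"))
    errors = bq.insert_rows_json(TABLE, [row])  # streaming insert
    if errors:
        message.nack()  # let Pub/Sub redeliver on failure
    else:
        message.ack()

future = subscriber.subscribe(sub_path, callback=handle)
print(f"Listening on {sub_path} ...")
future.result()  # blocks; interrupt to stop
```

In a production design the same flow would typically run behind Dataflow for autoscaling and exactly-once semantics; the snippet shows only the core subscribe-transform-insert loop.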

Senior Golang Developer India 3 years INR 5.53 - 17.90 Lacs P.A. On-site Full Time

Exp: 6 to 7+ yrs

Requirements:
- Strong proficiency in Go (Golang) for backend development
- Experience building RESTful APIs and microservices
- Familiarity with PostgreSQL and GORM (or a similar ORM)
- Good understanding of concurrency, goroutines, and performance optimization
- Experience with cloud platforms (AWS, Azure, GCP) and API integrations
- Proficiency with Git, Docker, and basic CI/CD workflows
- Ability to write clean, testable code and debug effectively
- Strong problem-solving skills and ability to work in agile teams
- 3+ years of hands-on experience with GoLang in production environments
- Strong understanding of RESTful API design, microservices architecture, and distributed systems
- Experience with relational databases (PostgreSQL preferred)
- Familiarity with cloud platforms (AWS, Azure, GCP) and cost-management concepts
- Comfortable with Git, Docker, CI/CD, and modern development workflows
- Experience working with APIs for billing, monitoring, or infrastructure management is a plus
- Solid understanding of software engineering principles and best practices

Nice to have:
- Knowledge of FinOps, cloud cost optimization, or billing data analysis
- Experience with Kafka, RabbitMQ, or other messaging systems
- Familiarity with Infrastructure as Code tools (Terraform, Pulumi)
- Exposure to open-source LLM/AI integrations is a plus

Job Type: Full-time
Pay: ₹553,037.71 - ₹1,790,365.67 per year
Work Location: In person

GCP Data Architect India 6 - 8 years INR Not disclosed On-site Full Time

Role description identical to the GCP Data Architect listing above (Baner, Pune, Maharashtra).

Senior Salesforce Developer Pune, Maharashtra 0 - 3 years INR 12.0 - 20.0 Lacs P.A. On-site Not specified

We are looking for Salesforce developers with a minimum of 3-5 years of relevant experience, available to join immediately. Candidates should be hands-on developers with Apex, Lightning components, Visualforce, Agentforce, Java, and integration skills (a minimal integration sketch follows this listing). Excellent communication, coding best practices, and debugging/troubleshooting skills are required. This is a full-time position requiring work from the office in Pune. If you are interested and your profile meets the expectations of the job, please respond with your detailed CV, current CTC, expected CTC, and notice period. Applications will be considered only from candidates who can join immediately.

Job Type: Permanent
Pay: ₹1,200,000.00 - ₹2,000,000.00 per year
Benefits:
- Flexible schedule
- Health insurance
- Paid sick time
- Paid time off
- Provident Fund
Ability to commute/relocate: Pune, Maharashtra: Reliably commute or planning to relocate before starting work (Required)
Experience: Salesforce: 3 years (Required)
Work Location: In person
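As a hedged illustration of the integration skills mentioned above, here is a minimal Python sketch that reads Salesforce records over its REST API using the simple-salesforce library. The credentials and SOQL query are placeholders, and Python-side integration is just one common pattern alongside Apex-based approaches; this is not a prescribed stack for the role.

```python
# Minimal sketch: query Salesforce over its REST API from Python.
# Credentials and the SOQL query are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",   # placeholder
    password="password",           # placeholder
    security_token="token",        # placeholder
)

# SOQL query: fetch a few Account records.
result = sf.query("SELECT Id, Name FROM Account LIMIT 5")
for record in result["records"]:
    print(record["Id"], record["Name"])
```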

Gen AI Data Scientist Pune, Maharashtra 4 years INR 15.0 - 24.0 Lacs P.A. On-site Full Time

Key Responsibilities:
- Fine-tune LLMs using techniques like LoRA and QLoRA (a minimal fine-tuning sketch follows this listing)
- Evaluate and improve RAG (Retrieval-Augmented Generation) pipelines for groundedness, accuracy, and relevance
- Apply transfer learning and transformer architectures in model development
- Validate model accuracy and performance using appropriate metrics
- Collaborate with product teams and communicate insights to senior leadership
- Participate in problem-solving sessions and contribute innovative ideas
- Maintain an experimental mindset and continuously explore new approaches
- Identify and integrate relevant data sources to build meaningful datasets
- Automate data collection and preprocessing for structured and unstructured data
- Handle large-scale data to feed analytical and predictive models
- Build and optimize machine learning and deep learning models, including NLP solutions

Requirements:

Education & Experience
- Bachelor's degree in a quantitative field (Computer Science, Engineering, Physics, Mathematics, Operations Research) or equivalent experience
- 1–4 years of hands-on experience in Gen AI and NLP
- Prior experience in startups or high-growth environments is a plus

Technical Skills
- Deep expertise in NLP techniques: text generation, sentiment analysis, NER, and language modeling
- Hands-on experience with LLMs and RAG pipelines
- Proficiency in neural network frameworks: TensorFlow, PyTorch
- Familiarity with transformer architectures and transfer learning
- Fluency in at least one programming language: Python, R, or Julia
- Experience with Gen AI libraries: Hugging Face, OpenAI, etc.
- Strong foundation in ML algorithms: supervised, unsupervised, reinforcement learning, Bayesian inference

Analytical & Communication Skills
- Strong math skills: statistics, linear algebra, probability
- Proven problem-solving aptitude with real-world application of Gen AI and NLP
- Excellent communication skills to translate complex technical concepts for non-technical stakeholders
- Collaborative mindset with the ability to work across teams

Job Type: Full-time
Pay: ₹1,500,000.00 - ₹2,400,000.00 per year
Work Location: In person
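As a hedged sketch of the LoRA fine-tuning workflow named above: the snippet below wraps a small causal LM with LoRA adapters using the Hugging Face transformers and peft libraries. The base model, rank, and target modules are illustrative assumptions, not a prescribed recipe for this role.

```python
# Minimal sketch: attach LoRA adapters to a causal LM for fine-tuning.
# Base model and LoRA hyperparameters are illustrative assumptions.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM, AutoTokenizer

base = "gpt2"  # small placeholder base model
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

lora_cfg = LoraConfig(
    r=8,                        # adapter rank
    lora_alpha=16,              # scaling factor
    lora_dropout=0.05,
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # only adapter weights are trainable
# From here, train with transformers.Trainer or a custom loop as usual.
```

The point of the technique is visible in the printed parameter counts: only a small fraction of weights train, which is what makes LoRA (and its quantized variant QLoRA) cheap relative to full fine-tuning.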

Senior Salesforce Developer Pune, Maharashtra 0 - 3 years INR 10.0 - 18.0 Lacs P.A. On-site Not specified

We are looking for Salesforce developers with a minimum of 3-5 years of relevant experience, available to join immediately. Candidates should be hands-on developers with Apex, Lightning components, Visualforce, Agentforce, dashboard, and integration skills. Gen AI skills preferred. Excellent communication, coding best practices, and debugging/troubleshooting skills are required. This is a full-time position requiring work from the office in Pune. If you are interested and your profile meets the expectations of the job, please respond with your detailed CV, current CTC, expected CTC, and notice period. Applications will be considered only from candidates who can join immediately.

Job Type: Permanent
Pay: ₹1,000,000.00 - ₹1,800,000.00 per year
Benefits:
- Flexible schedule
- Health insurance
- Paid sick time
- Paid time off
- Provident Fund
Ability to commute/relocate: Pune, Maharashtra: Reliably commute or planning to relocate before starting work (Required)
Experience: Salesforce: 3 years (Required)
Work Location: In person

Gen AI Software Engineer Bengaluru 0 years INR 18.0 - 48.0 Lacs P.A. On-site Full Time

Multiple positions at different seniority levels, from Developer to Architect.

Key Responsibilities:
- Develop and optimize AI services for the Gen AI HUB platform, Agentic RAG, and MCP servers/clients
- Support prompt orchestration, model integration, model fine-tuning, observability, and toolchain automation
- Collaborate with data scientists to operationalize AI models

Core Skills:
- Azure .NET stack with Microsoft AI ecosystem familiarity
- Strong programming in .NET/Python
- Experience with REST APIs, model endpoints, and SDK usage
- Familiarity with vector databases and LLM integration workflows
- Hands-on experience with embedding stores, orchestrators, and RAG techniques (a minimal retrieval sketch follows this listing)
- AI Foundry, Azure stack, .NET, and REST API skills
- Willingness to learn and explore Gen AI technologies
- Clean coding, Git workflows, and integration-testing basics

Nice to Have:
- Experience using LangChain, Semantic Kernel, or similar orchestration frameworks
- Exposure to Azure OpenAI, the Microsoft Copilot stack, and open-source LLMs

Job Type: Full-time
Pay: ₹1,800,000.00 - ₹4,800,000.00 per year
Work Location: In person
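To ground the RAG terminology above, here is a minimal, hedged Python sketch of the retrieval step: embed a handful of toy documents, embed a query, and rank by cosine similarity. The model name and documents are placeholders, and a production system on this stack would use a vector database (and likely Azure-hosted embeddings) rather than in-memory arrays.

```python
# Minimal sketch: the retrieval half of a RAG pipeline.
# Embeds toy documents, then ranks them against a query by cosine
# similarity. Model choice and documents are placeholders.
import numpy as np
from sentence_transformers import SentenceTransformer

docs = [
    "BigQuery is a serverless data warehouse on GCP.",
    "Goroutines enable lightweight concurrency in Go.",
    "LoRA fine-tunes large models with small adapter matrices.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder model
doc_vecs = model.encode(docs, normalize_embeddings=True)

query = "How do I adapt an LLM cheaply?"
q_vec = model.encode([query], normalize_embeddings=True)[0]

# With normalized vectors, the dot product equals cosine similarity.
scores = doc_vecs @ q_vec
best = int(np.argmax(scores))
print(f"Top match ({scores[best]:.2f}): {docs[best]}")
# The retrieved text would then be placed into the LLM prompt.
```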

Agentic AI Developer Pune, Maharashtra 3 - 7 years INR Not disclosed On-site Full Time

As an Agentic AI Developer at our company, you will be responsible for designing and delivering scalable applications in Python and building Agentic AI systems (a minimal agent-loop sketch follows this listing). You will collaborate with product management to understand their requirements and challenges and develop potential solutions. Additionally, you will stay updated with the latest tools, technologies, and methodologies and share your knowledge with key decision makers.

Key Responsibilities:
- Design and deliver scalable applications in Python and build Agentic AI systems
- Collaborate with product management to understand their requirements and challenges, and develop potential solutions
- Stay current with the latest tools, technology ideas, and methodologies; share knowledge by clearly articulating results and ideas to key decision makers

Qualifications Required:
- 3+ years of strong experience in Python
- Agentic AI and RAG experience required
- Experience with AWS services is required
- Strong database skills
- BS/MS in Computer Science or equivalent
- Strong problem-solving skills, data structures, and algorithm design
- Strong experience in Object-Oriented Design and developing highly scalable applications
- Ability to deliver code quickly from a wireframe model in a fast-paced startup environment
- Attention to detail
- Strong communication and collaboration skills
- Ability to lead a small team

Join our highly energetic and innovative team that believes in the power of creativity and hard work to achieve the impossible.

Note: This job is full-time and permanent, with benefits including a flexible schedule, health insurance, paid sick time, paid time off, and a work-from-home option.

Experience: Total work: 3 years (Required)
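As a hedged illustration of what an agentic system involves, here is a minimal, self-contained Python sketch: a registry of tools and a loop that dispatches tool calls and collects observations. The scripted plan is a hard-coded stand-in for the LLM decision step, which is the piece a real agent would supply.

```python
# Minimal sketch of an agentic tool-calling loop.
# The scripted "plan" stands in for an LLM; a real agent would ask a
# model which tool to call next and with what arguments.
from typing import Callable

def search(query: str) -> str:
    return f"stub results for {query!r}"  # placeholder tool

def calculator(expression: str) -> str:
    return str(eval(expression, {"__builtins__": {}}))  # demo only

TOOLS: dict[str, Callable[[str], str]] = {
    "search": search,
    "calculator": calculator,
}

# (tool, argument) steps standing in for model-chosen actions.
plan = [("search", "LoRA fine-tuning"), ("calculator", "6 * 7")]

def run_agent(plan) -> str:
    observations = []
    for tool_name, arg in plan:      # dispatch each tool call
        result = TOOLS[tool_name](arg)
        observations.append(f"{tool_name} -> {result}")
    return "; ".join(observations)   # would feed a final LLM answer

print(run_agent(plan))
```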

Lead Python Django Karnataka 3 - 7 years INR Not disclosed On-site Full Time

As a Lead Developer at our company, you will play a crucial role in designing and delivering scalable backend solutions using Python Django with microservices (a minimal service sketch follows this listing). You will work closely with the team to understand requirements, suggest design changes for better user experience, and develop reusable code following design patterns and component architecture. Additionally, you will lead a small team, conduct code reviews, and ensure adherence to design standards.

Key Responsibilities:
- Design and deliver scalable backend solutions in Python Django with microservices
- Collaborate with product management to understand requirements and challenges, and develop potential solutions
- Stay updated with the latest tools and technologies, and share knowledge with key decision makers
- Lead a small team, conduct code reviews, and ensure design standards are followed

Qualifications Required:
- 3+ years of backend development experience with microservices
- Experience with AWS services
- Strong database skills
- BS/MS in Computer Science or equivalent
- Proficiency in problem-solving, data structures, and algorithm design
- Strong experience in Object-Oriented Design and developing highly scalable applications
- Ability to deliver code quickly in a fast-paced startup environment
- Attention to detail and ability to lead a small team
- Strong communication and collaboration skills

In addition, the job offers benefits such as a flexible schedule, health insurance, paid sick time, paid time off, and the option to work from home. This is a full-time, permanent position requiring a total of 3 years of work experience. Work location is in person.
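As a hedged sketch of the kind of Django microservice work described above, here is a minimal single-file Django app exposing one JSON endpoint. The settings, endpoint, and file name are illustrative assumptions; a real service would live in a standard Django project layout behind proper configuration.

```python
# Minimal sketch: a single-file Django "microservice" with one JSON
# endpoint. Settings and endpoint are illustrative only.
import sys

from django.conf import settings
from django.http import JsonResponse
from django.urls import path

settings.configure(
    DEBUG=True,
    SECRET_KEY="dev-only-not-for-production",  # placeholder
    ROOT_URLCONF=__name__,
    ALLOWED_HOSTS=["*"],
)

def health(request):
    """Liveness endpoint, the kind every microservice exposes."""
    return JsonResponse({"status": "ok", "service": "demo"})

urlpatterns = [path("health/", health)]

if __name__ == "__main__":
    from django.core.management import execute_from_command_line
    # Run with: python service.py runserver 8000
    execute_from_command_line(sys.argv)
```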