4 Job openings at Rapidera Technologies Pvt Ltd
GCP Data Architect

Baner, Pune, Maharashtra

6 - 8 years

Not disclosed

On-site

Full Time

Role Overview:
We are seeking a highly skilled GCP Data Architect with 6–8 years of experience designing, developing, and managing enterprise data solutions on Google Cloud Platform (GCP). The ideal candidate will have a strong background in cloud data architecture, data warehousing, big data processing, and data integration, with proven expertise in delivering scalable, secure, and efficient data platforms.

Key Responsibilities:
- Design and architect end-to-end data solutions on GCP, aligned with business and technical requirements.
- Define data models, storage strategies, and data ingestion, processing, and consumption frameworks.
- Implement data lakes, data warehouses, and data marts using services such as BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Composer.
- Collaborate with business stakeholders, data scientists, and engineering teams to understand data needs and translate them into scalable architectures.
- Design and implement data governance, security, and compliance frameworks for cloud-based data platforms.
- Optimize data workflows, query performance, and storage costs in the GCP environment.
- Lead data migration and modernization initiatives from on-premise or other cloud platforms to GCP.
- Stay current with GCP services, features, and industry best practices to recommend improvements and innovation.
- Provide technical leadership and mentoring to data engineering teams.

Required Skills & Experience:
- 6–8 years of experience in data architecture and engineering roles, with at least 3 years hands-on with GCP.
- Strong expertise in GCP data services: BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, Cloud Composer, Data Catalog.
- Proficiency in data modeling, data warehousing concepts, ETL/ELT pipelines, and big data processing frameworks.
- Experience with SQL, Python, and Terraform (preferred) for infrastructure as code.
- Hands-on experience with data security, encryption, access control, and governance on GCP.
- Experience integrating with real-time data pipelines and event-driven architectures.
- Strong understanding of DevOps, CI/CD pipelines for data workflows, and cloud cost optimization.
- GCP Professional Data Engineer / Cloud Architect certification is a plus.

Good to Have:
- Exposure to AI/ML workflows and data preparation for ML models.
- Experience with tools such as Apache Airflow, Looker, or Dataplex.
- Knowledge of other cloud platforms (AWS, Azure) for hybrid/multi-cloud strategies.

Educational Qualification:
Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.

Why Join Us:
- Work on cutting-edge data transformation programs at scale.
- Architect high-impact solutions in a collaborative, innovation-driven environment.
- Engage with a fast-growing team focused on data-driven business value.

Job Type: Full-time
Work Location: In person
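For orientation only, here is a deliberately minimal sketch of the simplest version of the streaming stack this role owns: messages pulled from Pub/Sub and written to BigQuery via streaming inserts, using the official Go client libraries. The project, subscription, dataset, and table names are all hypothetical, and a real Dataflow pipeline would add batching, windowing, schema management, and dead-lettering on top of this.

```go
// Minimal Pub/Sub -> BigQuery ingestion sketch. All resource names are
// hypothetical placeholders, not values from the posting.
package main

import (
	"context"
	"log"

	"cloud.google.com/go/bigquery"
	"cloud.google.com/go/pubsub"
)

// event is the assumed row shape of the destination table.
type event struct {
	ID      string `bigquery:"id"`
	Payload string `bigquery:"payload"`
}

func main() {
	ctx := context.Background()
	const projectID = "my-project" // hypothetical

	ps, err := pubsub.NewClient(ctx, projectID)
	if err != nil {
		log.Fatalf("pubsub client: %v", err)
	}
	defer ps.Close()

	bq, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		log.Fatalf("bigquery client: %v", err)
	}
	defer bq.Close()

	// Hypothetical dataset and table.
	inserter := bq.Dataset("analytics").Table("events").Inserter()

	// Receive blocks, invoking the callback concurrently per message.
	sub := ps.Subscription("events-sub") // hypothetical
	err = sub.Receive(ctx, func(ctx context.Context, m *pubsub.Message) {
		row := event{ID: m.ID, Payload: string(m.Data)}
		if err := inserter.Put(ctx, row); err != nil {
			// Nack so Pub/Sub redelivers instead of dropping the message.
			log.Printf("insert failed: %v", err)
			m.Nack()
			return
		}
		m.Ack()
	})
	if err != nil {
		log.Fatalf("receive: %v", err)
	}
}
```

Per-message inserts keep the sketch short; in practice the inserter is fed batches, and Dataflow (Apache Beam) takes over delivery guarantees and autoscaling.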

Senior Golang Developer

India

3 years

INR 5.53038 - 17.90366 Lacs P.A.

On-site

Full Time

Experience: 6 to 7+ years

Key Skills:
- Strong proficiency in Go (Golang) for backend development
- Experience building RESTful APIs and microservices
- Familiarity with PostgreSQL and GORM (or a similar ORM)
- Good understanding of concurrency, goroutines, and performance optimization
- Experience with cloud platforms (AWS, Azure, GCP) and API integrations
- Proficiency with Git, Docker, and basic CI/CD workflows
- Ability to write clean, testable code and debug effectively
- Strong problem-solving skills and the ability to work in agile teams

Requirements:
- 3+ years of hands-on experience with Golang in production environments
- Strong understanding of RESTful API design, microservices architecture, and distributed systems
- Experience with relational databases (PostgreSQL preferred)
- Familiarity with cloud platforms (AWS, Azure, GCP) and cost management concepts
- Comfortable with Git, Docker, CI/CD, and modern development workflows
- Experience working with APIs for billing, monitoring, or infrastructure management is a plus
- Solid understanding of software engineering principles and best practices

Nice to Have:
- Knowledge of FinOps, cloud cost optimization, or billing data analysis
- Experience with Kafka, RabbitMQ, or other messaging systems
- Familiarity with Infrastructure as Code tools (Terraform, Pulumi)
- Exposure to open-source LLM/AI integrations

Job Type: Full-time
Pay: ₹553,037.71 - ₹1,790,365.67 per year
Work Location: In person
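Not part of the posting, just a compact illustration of two of the skills it screens for together, REST endpoints and goroutine-based concurrency: an HTTP handler that validates a JSON payload and hands it to a bounded worker pool over a channel. The endpoint path, pool size, and queue depth are made up, and a real service would add auth, persistence (e.g. via GORM), and graceful shutdown.

```go
// Sketch of a net/http JSON endpoint feeding a bounded goroutine worker
// pool. All names and sizes are illustrative, not from the posting.
package main

import (
	"encoding/json"
	"log"
	"net/http"
	"sync"
)

type job struct {
	ID string `json:"id"`
}

func main() {
	jobs := make(chan job, 64) // buffered queue between HTTP and workers

	// Fixed worker pool: goroutines draining a channel, the core idiom.
	var wg sync.WaitGroup
	for i := 0; i < 4; i++ {
		wg.Add(1)
		go func(worker int) {
			defer wg.Done()
			for j := range jobs {
				log.Printf("worker %d processing job %s", worker, j.ID)
			}
		}(i)
	}

	http.HandleFunc("/jobs", func(w http.ResponseWriter, r *http.Request) {
		if r.Method != http.MethodPost {
			http.Error(w, "method not allowed", http.StatusMethodNotAllowed)
			return
		}
		var j job
		if err := json.NewDecoder(r.Body).Decode(&j); err != nil {
			http.Error(w, "bad request", http.StatusBadRequest)
			return
		}
		select {
		case jobs <- j: // enqueue without blocking the handler
			w.WriteHeader(http.StatusAccepted)
		default: // queue full: shed load instead of stalling
			http.Error(w, "queue full", http.StatusServiceUnavailable)
		}
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

The `select` with a `default` branch is the detail worth noting: a full queue becomes fast back-pressure (HTTP 503) rather than a blocked handler.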

GCP Data Architect

India

6 - 8 years

Not disclosed

On-site

Full Time

Role overview, responsibilities, required skills, and qualifications are the same as for the GCP Data Architect opening above (Baner, Pune).

Job Type: Full-time
Work Location: In person

Senior Salesforce Developer

Pune, Maharashtra

3 - 5 years

INR 12.0 - 20.0 Lacs P.A.

On-site

Full Time

Looking for Salesforce developers with a minimum of 3-5 years of relevant experience, available to join immediately. Candidates should be hands-on developers with Apex, Lightning components, Visualforce, Agentforce, Java, and integration skills. Excellent communication, coding best practices, and debugging/troubleshooting skills are required. This is a full-time position requiring work from the office in Pune. If you are interested and your profile meets the expectations of the job, please respond with your detailed CV, current CTC, expected CTC, and notice period. Applications will be considered only from candidates who can join immediately.

Job Type: Permanent
Pay: ₹1,200,000.00 - ₹2,000,000.00 per year
Benefits:
- Flexible schedule
- Health insurance
- Paid sick time
- Paid time off
- Provident Fund
Ability to commute/relocate: Pune, Maharashtra: Reliably commute or planning to relocate before starting work (Required)
Experience: Salesforce: 3 years (Required)
Work Location: In person
