Job Description
Role: Data Engineer
Role Type: Individual Contributor
Experience: 3-4 years

Who Are We
BimaKavach is reimagining how Indian businesses access protection, with technology, speed, and simplicity at the core of everything we do. We proudly serve 3,000+ companies, including names like BSNL, Daikin, The Whole Truth, and CleverTap, and are backed by top investors like Waterbridge, Blume, Arali, and Eximius.

Our mission: To safeguard every Indian business by 2047.
Our mindset: Bold, fast-moving, and customer-obsessed.

Join us at BimaKavach and be part of a once-in-a-generation opportunity to reshape how insurance works for millions of businesses. Bring your expertise, curiosity, and ambition, and help build the future of SME insurance in India.

Job Overview:
As a Data Engineer at BimaKavach, you will be pivotal in building and maintaining the scalable data infrastructure and pipelines that drive our data-driven decision-making. You will work with large datasets, ensuring data quality, accessibility, and reliability for analytics, reporting, and machine learning initiatives within the insurance domain. This role requires strong expertise in data warehousing, ETL processes, and cloud-based data solutions.

Key Responsibilities:
- Design, build, and maintain robust, scalable data pipelines for ingesting, transforming, and loading data from various sources into our data warehouse.
- Develop and optimize ETL/ELT processes using appropriate tools and technologies.
- Work extensively with PostgreSQL for data storage, querying, and optimization.
- Manage data infrastructure on AWS EC2 and leverage other AWS services (e.g., S3, RDS) for data storage and processing.
- Ensure data quality, consistency, and reliability across all data pipelines and datasets.
- Collaborate with data scientists, analysts, and product teams to understand data requirements and deliver actionable insights.
- Implement monitoring and alerting for data pipelines to ensure data integrity and system health.
- Troubleshoot and resolve data-related issues, optimizing queries and data models for performance.
- Contribute to data governance, security, and compliance best practices.
- (Good to have) Experience with serverless functions (AWS Lambda/Google Cloud Functions) for event-driven data processing.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related quantitative field.
- 3-4 years of professional experience in data engineering.
- Strong proficiency in SQL, especially with PostgreSQL.
- Proven experience building and maintaining data pipelines.
- Hands-on experience with AWS services, particularly EC2, and familiarity with other relevant services (S3, RDS, Glue, Redshift, etc.).
- Experience with scripting languages (e.g., Python, Node.js) for data manipulation and automation.
- Understanding of data warehousing concepts, data modeling, and ETL/ELT processes.
- Experience with big data technologies (e.g., Apache Spark, Hadoop) is a plus.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- (Good to have) Experience with AWS Lambda or Google Cloud Functions for data processing.

Key Details:
- Joining: ASAP
- Compensation: Market-competitive pay along with a variable performance-based component
- Location: Bangalore or Indore