
3 DuckDB Jobs

JobPe aggregates job listings for easy access; you apply directly on the original job portal.

3.0 - 5.0 years

50 - 60 Lacs

Bengaluru

Work from Office

Source: Naukri

Staff Data Engineer

Experience: 3-5 years
Salary: INR 50-60 Lacs per annum
Preferred Notice Period: Within 30 days
Shift: 4:00 PM to 1:00 AM IST
Opportunity Type: Remote
Placement Type: Permanent (Note: this is a requirement for one of Uplers' clients)
Must-have skills: ClickHouse, DuckDB, AWS, Python, SQL
Good-to-have skills: DBT, Iceberg, Kestra, Parquet, SQLGlot

Rill Data (one of Uplers' clients) is looking for a Staff Data Engineer who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Role Overview
Rill is the world's fastest BI tool, designed from the ground up for real-time databases like DuckDB and ClickHouse. The platform combines last-mile ETL, an in-memory database, and interactive dashboards into a full-stack solution that's easy to deploy and manage. With a BI-as-code approach, Rill empowers developers to define and collaborate on metrics using SQL and YAML. Trusted by leading companies in e-commerce, digital marketing, and financial services, Rill provides the speed and scalability needed for operational analytics and partner-facing reporting.

Job Summary
Rill is looking for a Staff Data Engineer to join its Field Engineering team. In this role, you will work closely with enterprise customers to design and optimize high-performance data pipelines powered by DuckDB and ClickHouse. You will also collaborate with the platform engineering team to evolve incremental ingestion architectures and support proof-of-concept sales engagements. The ideal candidate has strong SQL fluency, experience with orchestration frameworks (e.g., Kestra, dbt, SQLGlot), familiarity with data lake table formats (e.g., Iceberg, Parquet), and an understanding of cloud databases (e.g., Snowflake, BigQuery). Most importantly, you should have a passion for solving real-world data engineering challenges at scale.

Key Responsibilities
- Collaborate with enterprise customers to optimize data models for performance and cost efficiency.
- Work with the platform engineering team to enhance and refine incremental ingestion architectures.
- Partner with account executives and solution architects to rapidly prototype solutions for proof-of-concept sales engagements.

Qualifications (required)
- Fluency in SQL and competency in Python.
- Bachelor's degree in a STEM discipline or equivalent industry experience.
- 3+ years of experience in data engineering or a related role.
- Familiarity with major cloud environments (AWS, Google Cloud, Azure).

Benefits
- Competitive salary
- Health insurance
- Flexible vacation policy

How to apply for this opportunity (easy 3-step process):
1. Click "Apply" and register or log in on the portal.
2. Upload an updated resume and complete the screening form.
3. Increase your chances of being shortlisted and meet the client for the interview.

About Our Client
Rill is an operational BI tool that provides fast dashboards your team will actually use. Data teams build fewer, more flexible dashboards for business users, while business users make faster decisions and perform root-cause analysis with fewer ad hoc requests. Rill's unique architecture combines a last-mile ETL service, an in-memory database, and operational dashboards in a single solution. Customers include leading media and advertising platforms such as Comcast's FreeWheel, tvScientific, AT&T's DishTV, and more.

About Uplers
Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help talent find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: there are many more opportunities on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
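The incremental ingestion architectures this role works on typically track a high-water mark, so each run loads only rows newer than the last successful load. A minimal sketch of that idea, using Python's stdlib sqlite3 as a stand-in for the source and destination databases (table and column names are illustrative, not Rill's actual design):

```python
import sqlite3

# Hypothetical source system with timestamped rows.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE events (id INTEGER, ts INTEGER)")
src.executemany("INSERT INTO events VALUES (?, ?)", [(1, 100), (2, 200), (3, 300)])

# Destination keeps a copy of the data plus a single-row watermark table.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE events (id INTEGER, ts INTEGER)")
dst.execute("CREATE TABLE watermark (ts INTEGER)")
dst.execute("INSERT INTO watermark VALUES (0)")

def incremental_load():
    """Copy only rows with ts greater than the stored watermark; return row count."""
    (wm,) = dst.execute("SELECT ts FROM watermark").fetchone()
    rows = src.execute("SELECT id, ts FROM events WHERE ts > ?", (wm,)).fetchall()
    if rows:
        dst.executemany("INSERT INTO events VALUES (?, ?)", rows)
        dst.execute("UPDATE watermark SET ts = ?", (max(ts for _, ts in rows),))
    return len(rows)

print(incremental_load())  # first run loads all 3 rows
src.execute("INSERT INTO events VALUES (4, 400)")
print(incremental_load())  # second run loads only the 1 new row
```

A production version would also handle late-arriving data and make the load plus watermark update atomic, but the watermark pattern is the core of most incremental pipelines.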

Posted 2 weeks ago

Apply

7 - 12 years

25 - 35 Lacs

Kolkata

Hybrid

Source: Naukri

Senior Python/Data Engineer

About the Role
We are seeking a Senior Python/Data Engineer to design, develop, and optimize large-scale data pipelines, transformation workflows, and analytics-ready datasets. This role requires expertise in Python, Apache Airflow, Apache Spark, SQL, and DuckDB, along with strong experience in data quality, data processing, and automation. As a Senior Data Engineer, you will play a key role in building scalable, high-performance data engineering solutions, ensuring data integrity, and supporting real-time and batch data workflows. You will work closely with Data Scientists, Analysts, DevOps, and Engineering teams to build efficient, cost-effective, and reliable data architectures.

Key Responsibilities
- Design, build, and maintain scalable ETL/ELT data pipelines using Apache Airflow, Spark, and SQL.
- Develop Python-based data engineering solutions to automate data ingestion, transformation, and validation.
- Implement data transformation and quality checks for structured and unstructured datasets.
- Work with DuckDB and other in-memory databases to enable fast exploratory data analysis (EDA).
- Optimize data storage and retrieval using Parquet, Apache Iceberg, and S3-based data lakes.
- Develop SQL-based analytics workflows and optimize query performance for large datasets.
- Implement data lineage, governance, and metadata management for enterprise-scale data solutions.
- Ensure high availability, fault tolerance, and security of data pipelines.
- Collaborate with Data Science, AI/ML, and Business Intelligence teams to enable real-time and batch analytics.
- Work with cloud platforms (AWS, Azure, GCP) for data pipeline deployment and scaling.
- Write clean, efficient, and maintainable code following software engineering best practices.

Required Skills & Qualifications
- 7+ years of experience in data engineering, big data processing, and backend development.
- Expertise in Python for data processing and automation.
- Strong experience with Apache Airflow for workflow orchestration.
- Hands-on experience with Apache Spark for big data transformations.
- Proficiency in SQL (PostgreSQL, DuckDB, Snowflake, etc.) for analytics and ETL workflows.
- Experience with data transformation, data validation, and quality assurance frameworks.
- Hands-on experience with DuckDB, Apache Arrow, or Vaex for in-memory data processing.
- Knowledge of data lake architectures (S3, Parquet, Iceberg) and cloud data storage.
- Familiarity with distributed computing, parallel processing, and optimized query execution.
- Experience working with CI/CD, DevOps, containerization (Docker, Kubernetes), and cloud environments.
- Strong problem-solving and debugging skills.
- Excellent written and verbal communication skills.

Preferred Skills (Nice to Have)
- Experience programming on the Java/JEE platform is highly desired.
- Experience with data streaming technologies (Kafka, Flink, Kinesis).
- Familiarity with NoSQL databases (MongoDB, DynamoDB).
- Exposure to AI/ML data pipelines and feature engineering.
- Knowledge of data security, compliance (SOC 2 Type 2, GDPR, HIPAA), and governance best practices.
- Experience building metadata-driven data pipelines for self-service analytics.

Posted 2 months ago

Apply

5 - 8 years

10 - 15 Lacs

Bengaluru

Hybrid

Source: Naukri

Role & Responsibilities
- Develop interactive maps using libraries and technologies such as Leaflet.js, Mapbox, Google Maps API, and OpenLayers.
- Implement H3 indexing for spatial partitioning and optimization to improve data analysis and map-rendering performance.
- Manage and optimize geospatial data querying, storage, and transformation using Snowflake and Databricks.
- Leverage DuckDB for efficient local geospatial querying and real-time analysis.
- Develop and maintain clean, scalable, type-safe code using TypeScript for frontend and backend geospatial solutions.
- Build spatial queries, conduct geospatial analysis, and optimize pipelines for mapping and visualization tasks.
- Collaborate with data engineers and backend developers to integrate geospatial data pipelines into cloud platforms (e.g., Snowflake and Databricks).
- Work with GIS tools (QGIS, ArcGIS) to analyze and visualize large-scale spatial data.
- Integrate mapping tools with cloud platforms and automate data workflows for geospatial analytics.
- Stay up to date with the latest tools and technologies in cloud data platforms, geospatial mapping, and spatial data indexing.

Preferred Candidate Profile
- Bachelor's degree in Computer Science, Geographic Information Systems (GIS), Data Engineering, or a related field.
- Proficiency in mapping libraries/APIs: Google Maps, Mapbox, Leaflet, OpenLayers, or similar.
- Experience with the H3 index for spatial indexing, analysis, and partitioning.
- Strong hands-on experience with Snowflake and Databricks for managing, analyzing, and processing large-scale geospatial data.
- Proficiency with DuckDB for real-time geospatial querying.
- Strong programming skills in TypeScript and modern web technologies (HTML, CSS, JavaScript).
- Experience working with geospatial data formats: GeoJSON, KML, Shapefiles, and GPX.
- Familiarity with GIS software (QGIS, ArcGIS) for spatial data analysis.
- Solid understanding of SQL and experience optimizing spatial queries.
- Ability to collaborate in a cross-functional team and integrate solutions with cloud services.

Perks and Benefits
- Training in Databricks and hands-on exposure to AWS, Azure, and GCP.
- Support for cloud certifications.
- Opportunities to build leadership skills.
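GeoJSON, one of the formats this posting lists, is a small JSON dialect (RFC 7946) where coordinates are written [longitude, latitude]. A minimal stdlib sketch of building a FeatureCollection and computing a bounding box over its points (the two cities are invented sample data):

```python
import json

# A minimal GeoJSON FeatureCollection; per RFC 7946, coordinates are [lon, lat].
feature_collection = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [77.5946, 12.9716]},
            "properties": {"name": "Bengaluru"},
        },
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [88.3639, 22.5726]},
            "properties": {"name": "Kolkata"},
        },
    ],
}

def bounding_box(fc):
    """Return [min_lon, min_lat, max_lon, max_lat] over all Point features."""
    lons = [f["geometry"]["coordinates"][0] for f in fc["features"]]
    lats = [f["geometry"]["coordinates"][1] for f in fc["features"]]
    return [min(lons), min(lats), max(lons), max(lats)]

# Round-trip through JSON text, as a pipeline reading a .geojson file would.
parsed = json.loads(json.dumps(feature_collection))
print(bounding_box(parsed))  # [77.5946, 12.9716, 88.3639, 22.5726]
```

H3 indexing, also mentioned above, goes a step further than bounding boxes: it assigns each point to a fixed hexagonal cell ID so spatial joins and aggregations become plain equality joins on that ID.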

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
