
3 Data Partitioning Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4-8 years | ₹12-22 Lakh | Pune | Hybrid | via Naukri

Performance tuning, table and data partitioning, and Oracle performance tests. Must have worked as an application DBA, with proficiency in setting up health and performance check processes for day-to-day monitoring, and hands-on experience in query optimization. Required candidate profile: exposure to or experience with HDFS/Hadoop, Hive, and Iceberg; capital-markets experience preferred.
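The day-to-day health and performance monitoring this role describes boils down to timing queries and flagging slow ones. A minimal sketch in Python, using the standard library's sqlite3 as a stand-in for a real Oracle connection (the table name, data, and threshold are illustrative, not from the posting):

```python
import sqlite3
import time

def timed_query(conn, sql, slow_threshold_s=1.0):
    """Run a query, measure wall-clock time, and flag it if slow."""
    start = time.perf_counter()
    rows = conn.execute(sql).fetchall()
    elapsed = time.perf_counter() - start
    status = "SLOW" if elapsed > slow_threshold_s else "OK"
    return rows, elapsed, status

# Demo against an in-memory database (stand-in for the production DB).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER, symbol TEXT)")
conn.executemany("INSERT INTO trades VALUES (?, ?)",
                 [(i, "INFY") for i in range(1000)])

rows, elapsed, status = timed_query(conn, "SELECT COUNT(*) FROM trades")
print(rows[0][0], status)  # prints: 1000 OK
```

In a real DBA setup the same idea would wrap Oracle's own instrumentation (AWR reports, `v$` views) rather than ad-hoc timing, but the flag-on-threshold pattern is the core of a daily health check.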

Posted 1 month ago


5-8 years | ₹15-27 Lakh | Hyderabad, Gurgaon, Noida | Work from Office | via Naukri

We're Nagarro, a Digital Product Engineering company that is scaling in a big way. We build products, services, and experiences that inspire, excite, and delight, working at scale across all devices and digital mediums, with 18,000+ experts across 36 countries. Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!

REQUIREMENTS:
- Expert knowledge of databases such as PostgreSQL (preferably cloud-hosted on AWS, Azure, or GCP) and the Snowflake Data Warehouse, with strong SQL programming experience.
- Competence in data preparation and/or ETL tools to build and maintain data pipelines and flows.
- Expertise in Python and experience working on ML models.
- Deep knowledge of databases, stored procedures, and optimization of large data sets.
- In-depth knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning.
- Understanding of index design and performance-tuning techniques.
- Familiarity with SQL security techniques such as column-level data encryption, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions.
- Experience understanding source data from various platforms and mapping it into Entity Relationship (ER) models for data integration and reporting.
- Exposure to source-control tools such as Git and Azure DevOps.
- Understanding of Agile methodologies (Scrum, Kanban).
- Experience with automated testing and coverage tools.
- Experience with CI/CD automation tools (desirable).
- Programming experience in Golang (desirable).

RESPONSIBILITIES:
- Design and implement Snowflake-based data warehouse solutions.
- Develop and optimize complex SQL queries, stored procedures, and views in Snowflake.
- Build ETL/ELT data pipelines for efficient data processing.
- Work with structured and semi-structured data (JSON, Parquet, Avro) for ingestion and processing.
- Implement data partitioning, clustering, and performance-tuning strategies.
- Manage role-based access control (RBAC), security, and data governance in Snowflake.
- Integrate Snowflake with BI tools (Power BI, Tableau, Looker) for reporting and analytics.
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Build pipelines for optimal extraction, transformation, and loading of data from various sources using SQL and cloud database technologies.
- Prepare ML models for data analysis and prediction.
- Work with stakeholders across Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure data separation and security across national boundaries through multiple data centers and regions.
- Collaborate with data and analytics experts to enhance functionality in our data systems.
- Manage exploratory data analysis to support database and dashboard development.
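Snowflake handles partitioning internally via micro-partitions and clustering keys, but the underlying idea this posting alludes to, routing rows into partitions by a key so a query engine can prune the rest, can be sketched in plain Python (the partition count, field names, and records are invented for illustration):

```python
from collections import defaultdict

def hash_partition(records, key, num_partitions=4):
    """Assign each record to a bucket by hashing its key column."""
    partitions = defaultdict(list)
    for rec in records:
        bucket = hash(rec[key]) % num_partitions
        partitions[bucket].append(rec)
    return dict(partitions)

records = [
    {"region": "APAC", "amount": 120},
    {"region": "EMEA", "amount": 300},
    {"region": "APAC", "amount": 80},
]
parts = hash_partition(records, key="region")

# Rows with the same key always land in the same bucket, which is
# what lets a query filtered on that key skip the other buckets.
total = sum(len(v) for v in parts.values())
print(total)  # 3
```

In Snowflake itself the equivalent lever is a clustering key (`ALTER TABLE ... CLUSTER BY (region)`), which co-locates rows so micro-partition pruning works well; the sketch only illustrates the routing idea.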

Posted 2 months ago


5-7 years | ₹7-9 Lakh | Bengaluru | Work from Office | via Naukri

Snowflake Data Engineer (only immediate joiners, 0-7 days)

Key Responsibilities:
- Develop and implement efficient data pipelines and ETL processes to migrate and manage client, investment, and accounting data in Snowflake.
- Work closely with the fund teams to understand data structures and business requirements, ensuring data accuracy and quality.
- Monitor and troubleshoot data pipelines, ensuring high availability and reliability of data systems.
- Optimize Snowflake database performance by designing scalable and cost-effective solutions.
- Design Snowflake data models to effectively handle business needs.
- Work closely with the AI Engineer and build data pipelines where necessary to support AI/ML projects.

Skills Required:
- 5+ years of IT experience on data projects, with 3+ years of experience with Snowflake.
- Proficiency in the Snowflake Data Cloud, including schema design, data partitioning, and query optimization.
- Strong SQL and Python skills, with hands-on experience in Python libraries such as PySpark, pandas, and Beautiful Soup.
- Experience with ETL/ELT tools such as Fivetran, Apache Spark, and dbt.
- Experience with RESTful APIs.
- Familiarity with workload automation and job-scheduling tools such as Control-M or Apache Airflow.
- Familiarity with data governance frameworks.
- Familiarity with the Azure cloud.
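Ensuring "data accuracy and quality" in a migration pipeline like the one described usually includes de-duplication, commonly done by keeping the latest record per business key. A minimal Python sketch of that last-write-wins step (the field names and sample rows are invented for illustration, not from the posting):

```python
def dedupe_latest(records, key_field, version_field):
    """Keep only the most recent record per key (last write wins)."""
    latest = {}
    for rec in records:
        k = rec[key_field]
        if k not in latest or rec[version_field] > latest[k][version_field]:
            latest[k] = rec
    return list(latest.values())

# ISO-8601 date strings compare correctly as plain strings.
raw = [
    {"client_id": 1, "updated_at": "2024-01-01", "nav": 100.0},
    {"client_id": 1, "updated_at": "2024-03-01", "nav": 105.5},
    {"client_id": 2, "updated_at": "2024-02-15", "nav": 98.2},
]
clean = dedupe_latest(raw, "client_id", "updated_at")
print(len(clean))  # 2, one row per client, the newer one wins
```

In Snowflake the same logic is typically expressed with `ROW_NUMBER() OVER (PARTITION BY client_id ORDER BY updated_at DESC)` and a filter on row number 1; the Python version just makes the rule explicit.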

Posted 2 months ago
