
3 Data Frame Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

9.0 - 14.0 years

20 - 30 Lacs

Bengaluru

Hybrid

Source: Naukri

My profile: linkedin.com/in/yashsharma1608
Hiring manager profile: on payroll of https://www.nyxtech.in/
Client: Brillio (payroll)
Role: AWS Architect

Primary skills: AWS (Redshift, Glue, Lambda, ETL and Aurora), advanced SQL, Python, PySpark
Note: Aurora Database is a mandatory skill
Experience: 9+ years
Notice period: Immediate joiners only
Location: Any Brillio location (Bangalore preferred)
Budget: 30 LPA

Job Description:
9+ years of IT experience with deep expertise in S3, Redshift, Aurora, Glue, and Lambda services.
At least one instance of proven experience developing a data platform end to end on AWS.
Hands-on programming experience with Data Frames and Python, including unit testing of Python and Glue code.
Experience with orchestration mechanisms such as Airflow and Step Functions.
Experience working on AWS Redshift is mandatory, including writing stored procedures, understanding the Redshift Data API, and writing federated queries.
Experience in Redshift performance tuning.
Good communication and problem-solving skills; very good stakeholder communication and management.
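For orientation, the Glue/PySpark/Redshift stack this role describes usually looks something like the sketch below: a Glue job that reads from S3, transforms with DataFrames, and loads into Redshift. This is a minimal, hypothetical example, not part of the posting; the bucket paths, Glue connection name, database, and target table are all assumptions.

```python
import sys

from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw data from S3 (path and schema are assumptions).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Basic DataFrame transformation: de-duplicate and drop invalid rows.
clean = orders.dropDuplicates(["order_id"]).filter(orders["amount"] > 0)

# Load into Redshift through a Glue catalog connection; the connection name,
# database, and target table below are placeholders.
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=DynamicFrame.fromDF(clean, glue_context, "orders_clean"),
    catalog_connection="redshift-connection",
    connection_options={"dbtable": "analytics.orders", "database": "dev"},
    redshift_tmp_dir="s3://example-bucket/tmp/",
)

job.commit()
```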

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Gurugram

Hybrid

Source: Naukri

Hi, greetings from GSN! Pleasure connecting with you. GSN has been in corporate search services for the last 20 years, identifying and placing talented professionals for reputed IT and non-IT clients in India. At present, GSN is hiring a PySpark Developer for one of our leading MNC clients. Details below for your better understanding:

~~~~ LOOKING FOR IMMEDIATE JOINERS ~~~~

Work location: Gurugram
Job role: PySpark Developer
Experience: 5-10 years
CTC range: 20-28 LPA
Work type: Hybrid only

JD:
Must be strong in advanced SQL (e.g., joins and aggregations).
Should have good experience in PySpark (at least 4 years).
Good to have: knowledge of AWS services.
Experience across the data lifecycle.
Design and develop ETL pipelines using PySpark on AWS.

If interested, kindly apply for an immediate response.

Thanks & Regards
Sathya K
GSN Consulting
Mob: 8939666794
Mail ID: sathya@gsnhr.net
Web: https://g.co/kgs/UAsF9W
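The "advanced SQL (joins and aggregations)" and PySpark requirements above translate directly into DataFrame code. Below is a minimal, hypothetical sketch of that kind of work; the table names, columns, and S3 paths are illustrative assumptions, not from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/orders/")        # assumed input
customers = spark.read.parquet("s3://example-bucket/customers/")  # assumed input

# Join orders to customers, then aggregate order count and revenue per segment.
revenue_by_segment = (
    orders.join(customers, on="customer_id", how="inner")
          .groupBy("segment")
          .agg(
              F.count("order_id").alias("order_count"),
              F.sum("amount").alias("total_revenue"),
          )
          .orderBy(F.desc("total_revenue"))
)

# Persist the aggregate for downstream consumers (assumed output location).
revenue_by_segment.write.mode("overwrite").parquet(
    "s3://example-bucket/marts/revenue_by_segment/"
)
```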

Posted 3 weeks ago

Apply

7 - 11 years

20 - 25 Lacs

Bengaluru

Remote

Source: Naukri

We are seeking a Data Analyst with strong SQL, Python, and Tableau skills to join our team. You will be responsible for extracting insights from data, automating workflows, and creating dashboards to support business decision-making.

Key Responsibilities:
Write and optimize SQL queries for data extraction and analysis.
Use Python (e.g., Pandas, Airflow) to automate processes and analyze data.
Build and maintain interactive Tableau dashboards.
Communicate data insights effectively to both technical and non-technical stakeholders.

Qualifications:
6+ years of relevant experience.
Strong experience with SQL and Python.
Proficiency in Tableau for creating visual reports.
Excellent analytical and communication skills.
Ability to collaborate with cross-functional teams.
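The "Python (e.g., Pandas, Airflow)" automation mentioned above typically means a scheduled DAG that prepares data for dashboards. The sketch below is a hypothetical illustration assuming Airflow 2.4+; the DAG id, schedule, file paths, and column names are assumptions, not taken from the posting.

```python
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator


def build_daily_summary():
    # Aggregate raw sales into a daily summary that a Tableau dashboard could read.
    sales = pd.read_csv("/data/raw/sales.csv", parse_dates=["order_date"])  # assumed input
    summary = (
        sales.groupby(sales["order_date"].dt.date)["amount"]
             .agg(["count", "sum"])
             .rename(columns={"count": "orders", "sum": "revenue"})
    )
    summary.to_csv("/data/marts/daily_sales_summary.csv")  # assumed output for Tableau


with DAG(
    dag_id="daily_sales_summary",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="build_daily_summary", python_callable=build_daily_summary)
```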

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies