
2 Notebooks Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8 - 10 years

13 - 18 Lacs

Chennai

Hybrid

Source: Naukri

Data Analyst II

Experience: 8 years | Location: Chennai | Budget: up to 23 LPA

JD: This role will be part of the Enterprise Risk data solutions team.
• Strong programming skills in BQ/SQL.
• Strong analytical skills, including the ability to define problems, collect data, establish facts, and draw valid conclusions.
• Familiarity with the big data stack; knowledge of Google Cloud Platform is highly preferred.
• Expertise in data movement techniques and best practices for handling large volumes of data.
• Experience with data warehousing architecture and data modeling best practices.

Candidate Requirements:
• 8 years of experience in data technology is required.
• Strong analytical skills and expertise in Notebooks and BQ are mandatory.
• Bachelor's degree in Computer Science, Information Technology, or a similar field.
• Able to drive collaboration within a matrixed environment.
• Strong interpersonal skills, a results orientation, and an appreciation of diversity in teamwork.

Top 3 required skills:
1. GCP BQ, Notebooks, and analytical skills
2. Technical experience and analysis
3. Understanding of risk and control measures from a data governance standpoint

Posted 1 month ago

Apply

8 - 10 years

25 - 30 Lacs

Hyderabad

Work from Office

Source: Naukri

Position Summary: Data Engineer on the Data Integration team.

Job Description & Responsibilities:
• Work with business and technical leadership to understand requirements.
• Design to the requirements and document the designs.
• Write product-grade, performant code for data extraction, transformation, and loading using Spark/PySpark.
• Perform data modeling as needed for the requirements.
• Write performant queries using Teradata SQL, Hive SQL, and Spark SQL against Teradata and Hive.
• Implement DevOps pipelines to deploy code artifacts onto the designated platforms/servers, such as AWS.
• Troubleshoot issues, provide effective solutions, and monitor jobs in the production environment.
• Participate in sprint planning, refinement/story-grooming sessions, daily scrums, demos, and retrospectives.

Experience Required:
• 8-10 years of overall experience.

Experience Desired:
• Strong development experience in Spark, PySpark, shell scripting, and Teradata.
• Strong experience writing complex and effective SQL (Teradata SQL, Hive SQL, and Spark SQL) and stored procedures.
• Healthcare domain knowledge is a plus.

Primary Skills:
• Excellent work experience with Databricks for data lake implementations.
• Experience in Agile and working knowledge of DevOps tools (Git, Jenkins, Artifactory).
• AWS (S3, EC2, SNS, SQS, Lambda, ECS, Glue, IAM, and CloudWatch).
• Databricks (Delta Lake, Notebooks, pipelines, cluster management, Azure/AWS integration).

Additional Skills:
• Experience with Jira and Confluence.
• Exercises considerable creativity, foresight, and judgment in conceiving, planning, and delivering initiatives.
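The responsibilities above center on SQL-driven extract-transform-load work. A minimal sketch of that pattern, using Python's built-in sqlite3 as a local stand-in for the Teradata/Hive/Spark SQL engines named in the posting (the claims table and all column names are hypothetical, not from the posting):

```python
# ETL sketch: extract rows, transform with SQL, load into a target table.
# sqlite3 stands in for Teradata/Hive; all table/column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: a hypothetical source table of claim records.
cur.execute("CREATE TABLE claims (member_id TEXT, amount REAL, status TEXT)")
cur.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("M1", 120.0, "PAID"), ("M1", 80.0, "PAID"), ("M2", 50.0, "DENIED")],
)

# Transform + load: aggregate paid amounts per member into a target table,
# the same shape of result a Hive SQL or Spark SQL job would produce.
cur.execute(
    """
    CREATE TABLE member_totals AS
    SELECT member_id, SUM(amount) AS total_paid
    FROM claims
    WHERE status = 'PAID'
    GROUP BY member_id
    """
)

print(cur.execute("SELECT * FROM member_totals").fetchall())
# → [('M1', 200.0)]
```

In a real Spark/Teradata pipeline the same SELECT would run against partitioned, distributed tables, with the load step writing to Delta Lake or a Teradata target rather than a local table.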

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies