Senior Consultant Specialist

8 - 9 years

18.0 - 23.0 Lacs P.A.

Hyderabad

Posted: 2 months ago | Platform: Naukri


Skills Required

Unix, Automation, Analytical, Oracle, Continuous improvement, Teradata, Operations, Financial services, SQL, Data extraction

Work Mode

Work from Office

Job Type

Full Time

Job Description

At least 8 years of experience in Big Data, PySpark, GCP, Hadoop Data Platform (HDP), and ETL, and capable of configuring data pipelines.

- Possess the following technical skills: SQL, Python, PySpark, Hive, Unix, ETL, Control-M (or similar).
- Skills and experience to support the upgrade from Hadoop to Cloudera.
- Good knowledge of industry best practices for ETL design, principles, and concepts.
- Data extraction from Teradata, Hive, Oracle, and flat files.
- Ability to work independently on specialized assignments within the context of project deliverables.
- Take ownership of providing solutions and tools that iteratively increase engineering efficiency; designs should help embed standard processes, systems, and operational models into the BAU approach for end-to-end execution of data pipelines.
- Proven problem-solving and analytical abilities, including the ability to critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details, and apply sound business and technical domain knowledge.
- Communicate openly and honestly; advanced oral, written, and visual communication and presentation skills, with the ability to communicate efficiently at a global level being paramount.
- Ability to deliver materials of the highest quality to management against tight deadlines, and to work effectively under pressure with competing and rapidly changing priorities.
- Responsible for automating the continuous integration/continuous delivery (CI/CD) pipeline within a DevOps product/service team, driving a culture of continuous improvement.
- Keep up to date with, and maintain expertise in, current tools and technologies, as well as applicable areas such as cyber security and regulations on data privacy, consent, and data residency.

Requirements

To be successful in this role, you should meet the following requirements:

- 8+ years of IT experience.
- Working knowledge of leading regulatory projects.
- In-depth knowledge and understanding of technical concepts, systems, and processes relevant to the organization.
- Strong working knowledge of PySpark, GCP, Dataproc, and Airflow.
- Hands-on experience with Python, Spark, Hive, SQL, GCP, and Hadoop technologies.
- Good to have: exposure to Cloudera Data Platform (CDP), alongside the Big Data, HDP, ETL, and pipeline-configuration experience listed above.
- Experience conducting Scrum ceremonies and practicing Agile best practices.
- Working knowledge of DevOps best practices and of building CI/CD pipelines with end-to-end automation.
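For illustration only, a minimal sketch of the kind of PySpark extract-transform-load job this role describes (Hive source, transformed output on GCS); all table, column, bucket, and job names below are hypothetical placeholders, not taken from the posting:

```python
# Illustrative sketch: Hive source -> aggregated Parquet output on GCS.
# Every table, column, and path name here is a hypothetical placeholder.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily_extract")        # hypothetical job name
    .enableHiveSupport()             # allows reading managed Hive tables
    .getOrCreate()
)

# Extract: pull one day's partition from a Hive table.
txns = spark.table("finance_db.transactions").where(F.col("ds") == "2024-01-01")

# Transform: a simple per-account aggregate as a stand-in for real business logic.
daily_totals = (
    txns.groupBy("account_id")
        .agg(F.sum("amount").alias("total_amount"),
             F.count("*").alias("txn_count"))
)

# Load: write Parquet to a (hypothetical) GCS bucket, as a Dataproc job might.
daily_totals.write.mode("overwrite").parquet(
    "gs://example-bucket/marts/daily_totals/ds=2024-01-01"
)

spark.stop()
```

In practice, a scheduler such as Airflow or Control-M (both named above) would submit a job like this on a daily cadence, with the date parameterized rather than hard-coded.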
