
21 Hue Jobs

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

6.0 - 11.0 years

0 Lacs

Hyderabad, Pune

Work from Office

Strong technical acumen in AWS, including expert working knowledge of AWS services such as Glue, EC2, ECS, Lambda, Step Functions, IAM, Athena, Hue, Presto, S3, Redshift, etc. Strong technical acumen in Data Engineering enablement, and working knowledge of frameworks/languages such as Python, Spark, etc. Dremio knowledge is a plus.

Posted 1 week ago

Apply

4.0 - 9.0 years

11 - 21 Lacs

Hyderabad

Hybrid

Role & responsibilities / Job Description: Solid experience in Linux administration. Experience with OS-level upgrades and patching, including vulnerability remediation. Strong knowledge of Linux internals, networking, firewalls and system security. Experience working with the AWS cloud platform and infrastructure. Experience with infrastructure as code using Terraform or Ansible. Hands-on experience with Ansible for configuration management and infrastructure automation. Strong experience with Jenkins for building and managing CI/CD pipelines, automating software builds, tests, and deployments. Monitor system performance and ensure high availability and reliability of services. Profi...

Posted 2 weeks ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Navi Mumbai

Work from Office

Job Title: Big Data Developer. Location: Navi Mumbai, India. Experience: 5+ Years. Department: Big Data and Cloud. Job Summary: Smartavya Analytica Private Limited is seeking a skilled Hadoop Developer to join our team and contribute to the development and maintenance of large-scale Big Data solutions. The ideal candidate will have extensive experience in Hadoop ecosystem technologies and a solid understanding of distributed computing, data processing, and data management. Company: Smartavya Analytica Private Limited is a niche Data and AI company. Based in Pune, we are pioneers in data-driven innovation, transforming enterprise data into strategic insights. Established in 2017, our team has experienc...

Posted 2 weeks ago

Apply

6.0 - 10.0 years

11 - 16 Lacs

Mumbai, Pune, Chennai

Work from Office

Strong experience in Apache Hadoop, Spark, Spark SQL, Hive, Impala, Yarn, Talend, Hue. Strong experience in SQL queries/stored procedures/functions/triggers; work experience in banking projects focused on liquidity management. Spark calculators based on business logic/rules. Programming and reverse engineering skills with Scala, Java, Python, SQL, Unix/Linux shell, Spark RDD. Understanding of OOP and functional design approaches. Big Data reporting, querying and analysis. Create T-SQL queries/stored procedures/functions/triggers using SQL Server 2014 and 2017 or any other query language. Should have fair ETL knowledge. Strong writing, communication, time-management, decision-making, and ...

Posted 3 weeks ago

Apply

18.0 - 22.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be working as a C14 (People Manager) in the Citi Analytics & Information Management (AIM) team based in Pune, India. Reporting to the Director/Managing Director, AIM, you will lead a team of 15+ data scientists in the Financial Crimes & Fraud Prevention Analytics modelling team. Your primary responsibility will be to develop and implement Machine Learning (ML)/AI/Gen AI models to mitigate fraud losses and minimize customer impact. As the team lead, you will be expected to work as a Subject Matter Expert (SME) in ML/Generative AI, articulate complex AI and ML concepts to various stakeholders, and stay updated with the latest advancements in the AI and ML space. You will also provide...

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As a Hadoop Admin, you will be responsible for managing and supporting Hadoop clusters and various components such as HDFS, HBase, Hive, Sentry, Hue, Yarn, Sqoop, Spark, Oozie, ZooKeeper, Flume, and Solr. With a minimum of 4 years of experience in Hadoop administration, you will play a crucial role in installing, configuring, maintaining, troubleshooting, and monitoring these clusters to ensure their efficient functioning in production support projects. Your primary duties will include integrating analytical tools like Datameer, Paxata, DataRobot, H2O, MRS, Python, R-Studio, SAS, and Dataiku-Bluedata with Hadoop, along with conducting job level troubleshooting for components such as Yarn, Im...

Posted 1 month ago

Apply

15.0 - 19.0 years

0 Lacs

Pune, Maharashtra

On-site

The Financial Crimes & Fraud Prevention Analytics team at Citi is looking for a skilled individual to join as a C14 (people manager) reporting to the Director/Managing Director, AIM. This role will involve leading a team of data scientists based in Pune/Bangalore, focusing on the development and implementation of Machine Learning (ML)/AI/Gen AI models for Fraud Prevention. The successful candidate will be responsible for designing, developing, and deploying generative AI-based solutions, analyzing data to understand fraud patterns, and developing models to achieve overall business goals. Additionally, the individual will collaborate with the model implementation team, ensure model documenta...

Posted 1 month ago

Apply

15.0 - 19.0 years

0 Lacs

Pune, Maharashtra

On-site

As a part of the Citi Analytics & Information Management (AIM) team in the Financial Crimes & Fraud Prevention Analytics unit within the Fraud Operation team, you will have the opportunity to lead a team of data scientists in Pune/Bangalore. Reporting to the Director/Managing Director, AIM, your primary focus will be to develop and implement Machine Learning (ML)/AI/Gen AI models for fraud prevention. You will analyze data, identify fraud patterns, and work towards achieving overall business goals. Additionally, you will collaborate with the model implementation team, ensure model documentation, and address questions from model risk management (MRM) while adapting to changing business needs....

Posted 1 month ago

Apply

4.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Big Data Architect with 4 years of experience, you will be responsible for designing and implementing scalable solutions using technologies such as Spark, Scala, Hadoop MapReduce/HDFS, PIG, HIVE, and AWS cloud computing. Your role will involve hands-on experience with tools like EMR, EC2, Pentaho BI, Impala, ElasticSearch, Apache Kafka, Node.js, Redis, Logstash, statsD, Ganglia, Zeppelin, Hue, and KETTLE. Additionally, you should have sound knowledge in Machine learning, Zookeeper, Bootstrap.js, Apache Flume, FluentD, Collectd, Sqoop, Presto, Tableau, R, GROK, MongoDB, Apache Storm, and HBASE. To excel in this role, you must have a strong background in development with both Core Java an...

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Applications Development Senior Programmer Analyst position is an intermediate level role where you will be responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities. Your responsibilities will include conducting tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will...

Posted 1 month ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate, We are looking for a Big Data Developer to build and maintain scalable data processing systems. The ideal candidate will have experience handling large datasets and working with distributed computing frameworks. Key Responsibilities: Design and develop data pipelines using Hadoop, Spark, or Flink. Optimize big data applications for performance and reliability. Integrate various structured and unstructured data sources. Work with data scientists and analysts to prepare datasets. Ensure data quality, security, and lineage across platforms. Required Skills & Qualifications: Experience with Hadoop ecosystem (HDFS, Hive, Pig) and Apache Spark. Proficiency in Java, Scala, or Python...

Posted 1 month ago

Apply

6.0 - 11.0 years

15 - 19 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Project description: During the 2008 financial crisis, many big banks failed or faced problems due to liquidity issues. Lack of liquidity can kill any financial institution overnight. That's why it's so critical to constantly monitor liquidity risks and properly maintain collateral. We are looking for a number of talented developers who would like to join our team in Pune, which is building a liquidity risk and collateral management platform for one of the biggest investment banks across the globe. The platform is a set of front-end tools and back-end engines. Our platform helps the bank to increase efficiency and scalability, reduce operational risk and eliminate the majority of manual inte...

Posted 2 months ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Bengaluru

Work from Office

Job Title: ESG Data Sustainability Business Analyst. Corporate Title: Associate. Location: Bangalore, India. Role Description: The Sustainability Data and Technology Program is a bank-wide program to deliver a strategic solution for Environmental, Social and Governance data across Deutsche Bank. The Program is part of the Sustainability Strategy Key Deliverable. As a Business Analyst, you will be part of the Data Team. You will be responsible for reviewing business use cases from stakeholders, gathering and documenting requirements, defining high-level implementation steps and creating business user stories. You will work closely with the Product Owner and development teams and bring business and funct...

Posted 2 months ago

Apply

5.0 - 9.0 years

15 - 19 Lacs

Bengaluru

Work from Office

Project description: During the 2008 financial crisis, many big banks failed or faced problems due to liquidity issues. Lack of liquidity can kill any financial institution overnight. That's why it's so critical to constantly monitor liquidity risks and properly maintain collateral. We are looking for a number of talented developers who would like to join our team in Pune, which is building a liquidity risk and collateral management platform for one of the biggest investment banks across the globe. The platform is a set of front-end tools and back-end engines. Our platform helps the bank to increase efficiency and scalability, reduce operational risk and eliminate the majority of manual inte...

Posted 2 months ago

Apply

8.0 - 11.0 years

45 - 50 Lacs

Noida, Kolkata, Chennai

Work from Office

Dear Candidate, We are hiring a Julia Developer to build computational and scientific applications requiring speed and mathematical accuracy. Ideal for domains like finance, engineering, or AI research. Key Responsibilities: Develop applications and models using the Julia programming language. Optimize for performance, parallelism, and numerical accuracy. Integrate with Python or C++ libraries where needed. Collaborate with data scientists and engineers on simulations and modeling. Maintain well-documented and reusable codebases. Required Skills & Qualifications: Proficient in Julia, with knowledge of multiple dispatch and the type system. Experience in numerical computing or scientific resear...

Posted 2 months ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate, We are hiring a Data Engineer to build and maintain data pipelines for our analytics platform. Perfect for engineers focused on data processing and scalability. Key Responsibilities: Design and implement ETL processes Manage data warehouses and ensure data quality Collaborate with data scientists to provide necessary data Optimize data workflows for performance Required Skills & Qualifications: Proficiency in SQL and Python Experience with data pipeline tools like Apache Airflow Familiarity with big data technologies (Spark, Hadoop) Bonus: Knowledge of cloud data services (AWS Redshift, Google BigQuery) Soft Skills: Strong troubleshooting and problem-solving skills. Ability t...

Posted 2 months ago

Apply

6.0 - 10.0 years

11 - 15 Lacs

Pune

Work from Office

We at Onix Datametica Solutions Private Limited are looking for a Big Data Lead who has a passion for cloud, with knowledge of different on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks and the like. Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators. Job Description: 6+ years of overall experience in developing, testing and implementing Big Data projects using Hadoop, Spark, Hive. Hands-on experience playing a lead role in Big Data projects, responsible for implementing one or more tracks wi...

Posted 3 months ago

Apply

3.0 - 5.0 years

12 - 13 Lacs

Thane, Navi Mumbai, Pune

Work from Office

We at Acxiom Technologies are hiring a PySpark Developer for our Mumbai location. Relevant Experience: 1 to 4 Years. Location: Mumbai. Mode of Work: Work From Office. Notice Period: Up to 20 days. Job Description: Proven experience as a PySpark Developer. Hands-on expertise with AWS Redshift. Strong proficiency in PySpark, Spark, Python, and Hive. Solid experience with SQL. Excellent communication skills. Benefits of working at Acxiom: Statutory Benefits, Paid Leaves, Phenomenal Career Growth, Exposure to Banking Domain. About Acxiom Technologies: Acxiom Technologies is a leading software solutions services company that provides consulting services to global firms and has established...

Posted 3 months ago

Apply

6.0 - 10.0 years

10 - 16 Lacs

Mumbai

Work from Office

Responsibilities: Design and implement Big Data solutions, complex ETL pipelines and data modernization projects. Required Past Experience: 6+ years of overall experience in developing, testing & implementing big data projects using Hadoop, Spark, Hive and Sqoop. Hands-on experience playing a lead role in big data projects, responsible for implementing one or more tracks within projects, identifying and assigning tasks within the team and providing technical guidance to team members. Experience in setting up Hadoop services, implementing Extract, Transform and Load / Extract, Load and Transform (ETL/ELT) pipelines, working with terabytes/petabytes of data ingestion & processing from varied systems ...

Posted 3 months ago

Apply

6.0 - 11.0 years

16 - 31 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

ETL Developer. Data modeling tools like Erwin; Snowflake, Oracle, Amazon RDS (Aurora, Postgres), DB2, SQL Server and Cassandra; Apache Sqoop, AWS S3, Hue, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker, Apache Spark; Autosys, SFTP, Airflow.

Posted Date not available

Apply

4.0 - 9.0 years

17 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

ETL Developer (Extract, Transform, Load). Snowflake, Oracle, Amazon RDS (Aurora, Postgres), DB2, SQL Server and Cassandra; Apache Sqoop, AWS S3, Hue, AWS CLI, Amazon EMR, SageMaker, Apache Spark; Erwin.

Posted Date not available

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
