12.0 - 15.0 years
0 - 3 Lacs
hyderabad
Remote
Job Description: We are looking for a Hadoop resource to run a Hadoop assessment using the internal Hadoop Decoded tool. Hadoop Decoded analyses logs from Hadoop, YARN, and other ecosystem components to gain insight into cluster usage. The tool parses and ingests logs into Elasticsearch via Logstash, and visualizations are built in Kibana. Key Responsibilities - Understand and, where required, enhance the existing Hadoop Decoded codebase and pipeline architecture. Manage data ingestion workflows using Logstash and indexing in Elasticsearch. Optimize Kibana dashboards for visualization and reporting as required. Execute Hadoop Decoded on the Hadoop cluster and deliver ...
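The parse-and-index flow described above can be pictured with a small sketch. This is purely illustrative and not the Hadoop Decoded implementation: the index name, log path, and regex pattern are assumptions, and it uses the official elasticsearch Python client (8.x) in place of the Logstash stage.

```python
import re
from elasticsearch import Elasticsearch

# Hypothetical pattern for a YARN ResourceManager log line, e.g.
# "2024-01-15 10:22:31,042 INFO org.apache.hadoop.yarn...: message"
LOG_LINE = re.compile(
    r"^(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) "
    r"(?P<level>[A-Z]+) (?P<logger>\S+): (?P<message>.*)$"
)

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

def ingest(log_path: str, index: str = "hadoop-decoded-yarn") -> int:
    """Parse a YARN log file and index each recognized line."""
    indexed = 0
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            match = LOG_LINE.match(line)
            if not match:
                continue  # skip stack traces / continuation lines
            es.index(index=index, document=match.groupdict())
            indexed += 1
    return indexed

if __name__ == "__main__":
    print(ingest("yarn-resourcemanager.log"))
```

In the actual assessment the equivalent parsing would typically live in a Logstash filter, with Kibana dashboards built on the resulting index.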
Posted 6 days ago
10.0 - 12.0 years
0 Lacs
bengaluru, karnataka, india
Remote
At BairesDev, we've been leading the way in technology projects for over 15 years. We deliver cutting-edge solutions to giants like Google and the most innovative startups in Silicon Valley. Our diverse 4,000+ team, composed of the world's Top 1% of tech talent, works on roles that drive significant impact worldwide. When you apply for this position, you're taking the first step in a process that goes beyond the ordinary. We aim to align your passions and skills with our vacancies, setting you on a path to exceptional career development and success. Principal Apache Hadoop Engineer at BairesDev We are seeking a Principal Apache Hadoop Engineer with deep expertise in the big data ecosystem, HDFS ...
Posted 1 week ago
6.0 - 9.0 years
27 - 42 Lacs
pune
Work from Office
Job Title: GCP Data Engineer with BigQuery (AIA Kolkata). Location: AIA Kochi. Minimum 6 to 9 years of experience. Mandatory Skills: GCP services – BigQuery, Cloud SQL, Dataflow. Around 7+ years of development experience on data warehousing projects on GCP platforms and a good understanding of dimensional modelling. Expertise in MySQL and SQL/PL. GCP and BigQuery knowledge is a must; GCP certification is an added advantage. Good experience in Cloud Composer, DAGs, and Airflow. REST API development experience. Strong analytical, problem-solving, and communication skills. Ability to lead team members, convert requirements into technical solutions, assign work, coordinate the team, and review the de...
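Since the role calls out Cloud Composer, DAGs, and Airflow alongside BigQuery, a minimal orchestration sketch may help frame the skill set. This is illustrative only, assuming Airflow 2.x with the Google provider installed; the project, dataset, and SQL are placeholders, not details from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Placeholder query: aggregates a hypothetical events table into a daily summary.
SUMMARY_SQL = """
CREATE OR REPLACE TABLE `my-project.analytics.daily_summary` AS
SELECT DATE(event_ts) AS event_date, COUNT(*) AS events
FROM `my-project.raw.events`
GROUP BY event_date
"""

with DAG(
    dag_id="daily_bq_summary",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # Composer / Airflow 2.x style scheduling
    catchup=False,
) as dag:
    build_summary = BigQueryInsertJobOperator(
        task_id="build_summary",
        configuration={"query": {"query": SUMMARY_SQL, "useLegacySql": False}},
    )
```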
Posted 1 week ago
3.0 - 5.0 years
7 - 12 Lacs
bengaluru
Work from Office
Loc: Manyata Tech Park. Exp: 2-5 yrs, AI/ML Engineer. Skills Required: - Experience with data engineering and data/ML pipelines for analytics and AI/ML models - Working with large sets of structured and unstructured data from disparate sources - Foundational knowledge of data science and AI/ML models - SQL, Python, Spark, PySpark - Big Data systems: Hadoop ecosystem (HDFS, Hive, MapReduce) - Analytics databases like Druid; data visualization/exploration tools like Superset - Understanding of cloud platforms (AWS, Azure, or GCP) - Git. Secondary Skills (desired/preferred): - Apache Airflow, Apache Oozie, NiFi - GCP cloud experience, BigQuery, GCP Big Data ecosystem - Trino/Presto - Familiar with forecasting algo...
Posted 1 week ago
7.0 - 10.0 years
14 - 24 Lacs
hyderabad, chennai, bengaluru
Hybrid
Years of Experience: 7 to 10 years. Notice Period: immediate to 15 days only. Mandatory: Scala and Python; Apache Spark (batch and streaming) is a must. Deep knowledge of HDFS internals and migration strategies. Experience with Apache Iceberg (or similar table formats like Delta Lake / Apache Hudi) for schema evolution, ACID transactions, and time travel. Running Spark and/or Flink jobs on Kubernetes (e.g., Spark-on-K8s operator, Flink-on-K8s). Experience with distributed blob storage such as Ceph or AWS S3. Building ingestion, transformation, and enrichment pipelines for large-scale datasets. Infrastructure-as-Code (Terraform, Helm) for provisioning data infrastructure. Ability to work independ...
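The Iceberg requirements above (schema evolution, ACID writes, time travel) are easy to picture with a short PySpark sketch. This is illustrative only and not tied to the posting: it assumes Spark 3.3+ with the Iceberg Spark runtime on the classpath and a local Hadoop catalog, and the table and column names are placeholders.

```python
from pyspark.sql import SparkSession

# Assumes the iceberg-spark-runtime jar is available (e.g. via --packages).
spark = (
    SparkSession.builder.appName("iceberg-sketch")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.local", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.local.type", "hadoop")
    .config("spark.sql.catalog.local.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)

# Create and populate an Iceberg table (ACID writes).
spark.sql("CREATE TABLE IF NOT EXISTS local.db.events (id BIGINT, payload STRING) USING iceberg")
spark.sql("INSERT INTO local.db.events VALUES (1, 'first'), (2, 'second')")

# Schema evolution: add a column without rewriting existing data files.
spark.sql("ALTER TABLE local.db.events ADD COLUMNS (source STRING)")

# Time travel: list snapshots, then read the table as of the earliest one.
snapshots = spark.sql(
    "SELECT snapshot_id FROM local.db.events.snapshots ORDER BY committed_at"
).collect()
first_snapshot = snapshots[0]["snapshot_id"]
spark.sql(f"SELECT * FROM local.db.events VERSION AS OF {first_snapshot}").show()
```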
Posted 1 week ago
6.0 - 9.0 years
27 - 42 Lacs
kochi
Work from Office
Skill: PySpark. Experience: 6 to 9 years. Location: AIA Kochi. Responsibilities: Develop and maintain scalable data pipelines using Python and PySpark. Collaborate with data engineers and data scientists to understand and fulfill data processing needs. Optimize and troubleshoot existing PySpark applications for performance improvements. Write clean, efficient, and well-documented code following best practices. Participate in design and code reviews. Develop and implement ETL processes to extract, transform, and load data. Ensure data integrity and quality throughout the data lifecycle. Stay current with the latest industry trends and technologies in big data and cloud computing.
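As a rough illustration of the kind of pipeline work described here (not code from the posting), the sketch below shows a minimal PySpark batch ETL step; the paths, column names, and filter condition are assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw CSV landed by an upstream process (path is a placeholder).
raw = spark.read.option("header", True).csv("/data/raw/orders/")

# Transform: basic typing, cleansing, and a derived partition column.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet for downstream consumers.
clean.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders/")
```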
Posted 1 week ago
5.0 - 7.0 years
4 - 8 Lacs
hyderabad, telangana, india
On-site
We are looking for a skilled Big Data Engineer with 5 to 7 years of hands-on experience in Big Data technologies. The ideal candidate should have strong expertise in Hadoop, HDFS, Hive, SQL, and Unix, and be able to work on large-scale data processing systems. Roles and Responsibilities: Work on design, development, and optimization of Big Data processing pipelines. Manage and support Hadoop ecosystems including HDFS and Hive. Develop complex SQL queries for data extraction, transformation, and analysis. Perform Unix-based scripting for automation and operational support. Troubleshoot and resolve issues related to data ingestion, processing, and delivery. Collaborate with cross-functional te...
Posted 1 week ago
2.0 - 5.0 years
3 - 12 Lacs
hyderabad, telangana, india
On-site
Job Summary: The Spark Developer is responsible for designing and developing big data applications using Apache Spark and related technologies. This role involves working on large-scale data processing, real-time analytics, and building efficient data pipelines for various business use cases. Key Responsibilities: Design and implement scalable data processing solutions using Apache Spark Develop data pipelines and ETL workflows for batch and streaming data Optimize Spark applications for performance and resource usage Integrate Spark jobs with Hadoop, Hive, Kafka, and other big data technologies Collaborate with data engineers, analysts, and architects to meet data processing requirements Wr...
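To make the batch-and-streaming integration concrete, here is a minimal Structured Streaming sketch reading from Kafka. It is illustrative only, assuming the spark-sql-kafka package is on the classpath; the broker, topic, message schema, and output paths are placeholders.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

# Hypothetical JSON schema for messages on the topic.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read a Kafka topic as a streaming DataFrame (requires spark-sql-kafka).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "payments")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Write the parsed stream to Parquet with checkpointing for fault tolerance.
query = (
    events.writeStream.format("parquet")
    .option("path", "/data/stream/payments/")
    .option("checkpointLocation", "/chk/payments/")
    .start()
)
query.awaitTermination()
```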
Posted 1 week ago
7.0 - 12.0 years
0 Lacs
delhi
On-site
As a Super Quality Data Architect, Data Engineering Manager/Director, your primary role will be to lead and oversee data engineering initiatives within the organization. Your responsibilities will include: - Having 12+ years of experience in Data Engineering roles, with at least 2+ years in a Leadership position. - Possessing 7+ years of experience in hands-on Tech development using Java (Highly preferred) or Python, Node.JS, GoLang. - Demonstrating strong expertise in large data technologies and tools such as HDFS, YARN, Map-Reduce, Hive, Kafka, Spark, Airflow, Presto, etc. - Showing proficiency in High-Level Design (HLD) and Low-Level Design (LLD) to create scalable and maintainable data a...
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
india
Remote
Job Description: Data Engineer (Big Data/Kafka) We are seeking a highly experienced Senior Data Engineer with a deep background in Big Data technologies to join our team. This is a contract role for a major project in the Banking sector. Key Details: Role: Data Engineer Industry: Banking (Financial Services) Work Location: Remote (India) Contract Duration: 6 months initial contract (with a strong likelihood of extension to long-term) Notice Period: 30 days or less Note: This role is for professionals seeking a contract engagement, not for freelancers or independent contractors. Key Responsibilities: Technical Execution: Design, write, and tune complex data processing jobs using Java, MapRedu...
Posted 1 week ago
0.0 years
0 Lacs
pune, maharashtra, india
On-site
GCP/Big Data: Provides technical leadership in the Big Data space (Hadoop stack such as Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies. Must have: operating knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Storage, VMs, Networking, Pub/Sub, Cloud Functions, Composer services). Should be familiar with columnar storage formats, e.g., Parquet, ORC. Visualize and evangelize next-generation infrastructure in the cloud platform/Big Data space (batch, near-real-time, and real-time technologies). Passionate about continuous learning, experimenting, applying, and contributing towards cutting-edge open-source technologies and software paradigms. Developin...
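For context on the BigQuery and columnar-format requirements, here is a small, purely illustrative sketch using the google-cloud-bigquery client (assumed installed and authenticated); the project, dataset, table, and GCS URI are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumes default credentials

# Load columnar Parquet files from GCS into a BigQuery table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/events/*.parquet",
    "my-project.analytics.events",
    job_config=bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.PARQUET),
)
load_job.result()  # wait for the load to finish

# Query the loaded data.
query = """
SELECT DATE(event_ts) AS event_date, COUNT(*) AS events
FROM `my-project.analytics.events`
GROUP BY event_date
ORDER BY event_date
"""
for row in client.query(query).result():
    print(row.event_date, row.events)
```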
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
Role Overview: You will be a Senior Engineer, VP based in Pune, India. Your role will involve managing or performing work across multiple areas of the bank's IT Platform/Infrastructure, including analysis, development, and administration. You may also be responsible for providing functional oversight of engineering delivery for specific departments. Your key responsibilities will include planning and developing engineering solutions to achieve business goals, ensuring reliability and resiliency in solutions, promoting maintainability and reusability, reviewing engineering plans and quality, and participating in industry forums to drive the adoption of innovative technologies in the bank. You...
Posted 2 weeks ago
0.0 years
7 - 11 Lacs
bengaluru
Work from Office
Strong Python programming skills. Strong SQL scripting. Good to have experience with the Snowflake cloud data warehouse. Experience with AWS services such as S3, CloudWatch, and KMS. Good debugging skills required. Nice to have exposure to Airflow orchestration and CI/CD. Good communication and problem-solving skills. Experience working with PySpark and Databricks.
Posted 2 weeks ago
5.0 - 8.0 years
15 - 20 Lacs
bengaluru
Work from Office
Location: Manyata Tech Park. Experience: 5-8 years in data science. JD - Skill Set Required: - Familiar with forecasting algorithms such as time series (ARIMA, Prophet), ML (GLM, GBDT, XGBoost), hierarchical models (top-down, bottom-up), DL (Seq2Seq, Transformer), and ensemble methods. - Hands-on (real-time) experience building end-to-end ML models and pipelines. - SQL, Python, Spark, PySpark. - Big Data systems: Hadoop ecosystem (HDFS, Hive, MapReduce) or cloud. - MLflow, Kubeflow, GCP Vertex AI, Databricks, or equivalents. - Analytics databases like Druid; data visualization/exploration tools like Superset. - CI/CD - Git. Secondary Skills (desired): - Apache Airflow, Apache Oozie, NiFi - GC...
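As a lightweight illustration of the forecasting side of this profile (not material from the posting), the sketch below fits an ARIMA model with statsmodels on a synthetic monthly series; the model order and horizon are arbitrary choices.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly demand series: trend + seasonality + noise.
rng = np.random.default_rng(42)
idx = pd.date_range("2020-01-01", periods=48, freq="MS")
values = (100 + 2 * np.arange(48)
          + 10 * np.sin(np.arange(48) * 2 * np.pi / 12)
          + rng.normal(0, 3, 48))
series = pd.Series(values, index=idx)

# Fit a simple ARIMA(1,1,1); in practice the order would be chosen via AIC/CV.
model = ARIMA(series, order=(1, 1, 1))
fitted = model.fit()

# Forecast the next 12 months.
print(fitted.forecast(steps=12))
```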
Posted 2 weeks ago
8.0 - 11.0 years
13 - 17 Lacs
bengaluru
Work from Office
- Mentor and guide engineering teams in all aspects of the SDLC. - Lead technical design sessions and translate ideas into robust technical architecture. - Develop and maintain scalable, high-performance data pipelines and streaming systems. - Use Python, SQL, Scala, NodeJS, and work with relational databases, NoSQL stores, and Kafka. - Refactor and optimize legacy codebases for improved performance and scalability. - Collaborate with product and engineering teams to deliver innovative solutions in an agile environment. - Support technical support teams with ad-hoc data queries and operational troubleshooting. - Ensure data pipelines and APIs meet enterprise-grade performance and usability s...
Posted 2 weeks ago
3.0 - 5.0 years
7 - 11 Lacs
bengaluru
Work from Office
- Design, develop, and maintain scalable data pipelines and streaming systems. - Work with Python, SQL, Scala, NodeJS, and various data stores (relational, NoSQL, Kafka). - Analyze healthcare and marketing datasets using SQL. - Refactor and optimize existing codebases for performance improvements. - Collaborate closely within an agile team to deliver creative technical solutions. - Support ad-hoc data requests and assist technical support teams as needed. - Ensure high performance of data ingestion, transformation, and API consumption layers. - Follow best practices for coding, testing, CI/CD, and cloud deployments.
Posted 2 weeks ago
10.0 - 13.0 years
22 - 26 Lacs
bengaluru
Work from Office
Key Responsibilities - Engage with clients face-to-face to understand business needs, present architectural solutions, and maintain strong professional relationships. - Design and implement scalable, secure, and efficient data pipelines using a variety of AWS services (e.g., Glue, Lambda, S3, Redshift, EMR, etc.). - Lead and support the preparation and submission of RFPs, ensuring innovative and tailored solutions meet client expectations. - Develop comprehensive architectural strategies and roadmaps for cloud-based data solutions. - Collaborate with cross-functional teams to deliver end-to-end solutions, ensuring alignment with business objectives. - Optimize cloud resources for performance...
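As a small, purely illustrative example of the AWS pipeline glue-work this role describes (the Glue job name and bucket layout are assumptions, not details from the posting), the sketch below shows a Lambda handler that starts a Glue job when a new object lands in S3.

```python
import boto3

glue = boto3.client("glue")

GLUE_JOB_NAME = "curate-orders"  # hypothetical Glue job


def handler(event, context):
    """S3-triggered Lambda: start a Glue job run for each new object."""
    run_ids = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        response = glue.start_job_run(
            JobName=GLUE_JOB_NAME,
            Arguments={"--input_path": f"s3://{bucket}/{key}"},
        )
        run_ids.append(response["JobRunId"])
    return {"started_runs": run_ids}
```

An event-driven trigger like this keeps the ingestion path serverless; heavier transformation and loading into Redshift or EMR would then live inside the Glue job itself.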
Posted 2 weeks ago
3.0 - 5.0 years
7 - 11 Lacs
bengaluru
Work from Office
Python Programming: Strong experience in writing clean, modular, and production-grade Python code. SQL: Proficient in writing complex queries, performance tuning, and working with large datasets. AWS services (basic to advanced): S3, Redshift, Snowflake, Glue, EMR; a working feel for the AWS tech stack. Team Leadership / Teamwork: Ability to mentor, guide, and manage a team of engineers effectively; probe how the candidate has led teams and collaborated in the past, and note communication skills and articulation here as well. Design: Experience understanding system design from a technical and functional perspective; this can be gauged by asking for an overview of any one project. Communication: Speci...
Posted 2 weeks ago
0.0 years
0 Lacs
pune, maharashtra, india
On-site
Join us as a Data Analyst at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As a part of the team, you will deliver the technology stack, using strong analytical and problem-solving skills to understand business requirements and deliver quality solutions. You'll be working on complex technical problems that will involve detailed analytical skills and analysis. This will be done in conjunction with fellow engineers, business analysts, and business stakeholders. To be successful as a Data Analyst you should hav...
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Position Summary... What you'll do... About Team: As part of the Conversational AI team, we are building completely new capabilities to allow our customers to shop by seamlessly interacting with their connected devices using text and spoken language. This team is part of the Emerging tech organization and will build new experiences both in-house and in collaboration with strategic partners. As part of this team, you will get to work on industry leading solutions and be at the forefront of this emerging platform. What You'll Do: Drive architecture, design, development, operation and documentation of large-scale services. Design, build, test and deploy cutting edge solutions at scale, impactin...
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
mumbai, maharashtra, india
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world. Your Role As a senior software engineer with Capgemini, you will have 3+ years of experience in Scala with a strong project track record. Hands-on expertise in Scala and Spark, strong SQL skills on DB2, and proficiency in handling diverse file formats including JSON, Parquet, AVRO, ORC, and XML....
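Although this role is Scala-focused, the file-format breadth it asks for can be pictured with a quick PySpark sketch (illustrative only; paths are placeholders). Note the assumptions: Avro requires the external spark-avro package and XML the spark-xml package on the classpath.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("formats-sketch").getOrCreate()

# Built-in sources (paths are placeholders).
json_df = spark.read.json("/data/in/events.json")
parquet_df = spark.read.parquet("/data/in/events.parquet")
orc_df = spark.read.orc("/data/in/events.orc")

# Avro needs the external spark-avro package on the classpath.
avro_df = spark.read.format("avro").load("/data/in/events.avro")

# XML needs the spark-xml package; rowTag selects the repeating element.
xml_df = spark.read.format("xml").option("rowTag", "event").load("/data/in/events.xml")

for name, df in [("json", json_df), ("parquet", parquet_df), ("orc", orc_df),
                 ("avro", avro_df), ("xml", xml_df)]:
    print(name, df.columns)
```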
Posted 2 weeks ago
7.0 - 10.0 years
10 - 14 Lacs
gurugram, bengaluru
Work from Office
Need to hire GCP-enabled Module Leads and Leads with proficiency in data engineering technologies and languages. These candidates should be able to drive and lead migration from on-premises to GCP for Amex use cases. Roles and Responsibilities: 7-10 years of experience implementing high-end software products. Provides technical leadership in the Big Data space (Hadoop stack such as Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies. Must have: operating knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Storage, VMs, Networking, Pub/Sub, Cloud Functions, Composer services). Should be familiar with columnar formats, e.g...
Posted 2 weeks ago
4.0 - 7.0 years
10 - 14 Lacs
gurugram, bengaluru
Work from Office
Must have: Big Data, GCP.
Posted 2 weeks ago
10.0 - 15.0 years
12 - 20 Lacs
bengaluru
Work from Office
10+ years as a hands-on Solutions Architect and/or Data Engineer designing and implementing data solutions. Team lead and/or mentorship of other engineers. Ability to take end-to-end technical solutions into production and help ensure performance, security, scalability, and robust data integration. Programming expertise in Java, Python, and/or Scala. Core cloud data platforms including Snowflake, AWS, Azure, Databricks, and GCP. SQL and the ability to write, debug, and optimize SQL queries. Client-facing written and verbal communication skills and experience. Create and deliver detailed presentations. Detailed solution documentation (e.g., including POCs and roadmaps, sequence diagrams, class h...
Posted 2 weeks ago