4 Step Function Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have strong experience in PySpark, Python, Unix scripting, SparkSQL, and Hive; be proficient in writing SQL queries and creating views; and possess excellent oral and written communication skills. Prior experience in the insurance domain would be beneficial. A good understanding of the Hadoop ecosystem, including HDFS, MapReduce, Pig, Hive, Oozie, and YARN, is required, and knowledge of AWS services such as Glue, S3, Lambda functions, Step Functions, and EC2 is essential. Experience in data migration from platforms like Hive/S3 to Databricks is a plus. You should be able to prioritize, plan, organize, and manage multiple tasks efficiently while delivering high-quality work.

As a candidate, you should have 6-8 years of technical experience in PySpark and AWS (Glue, EMR, Lambda, Step Functions, S3), with at least 3 years of experience in Big Data/ETL using Python, Spark, and Hive, along with 3+ years of experience in AWS. Your primary key skills should include PySpark, AWS (Glue, EMR, Lambda, Step Functions, S3), and Big Data with Python, Spark, and Hive experience. Exposure to Big Data migration is also important. Secondary key skills that would be beneficial for this role include Informatica BDM/PowerCenter, Databricks, and MongoDB.
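
For context, here is a minimal sketch of the kind of SparkSQL/Hive work this listing describes. The database, table, and column names are hypothetical, not taken from the posting.

```python
# Minimal PySpark sketch: read a Hive table, aggregate with SparkSQL,
# and write the result back. All names below are illustrative.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("claims-summary-etl")  # hypothetical job name
    .enableHiveSupport()            # required to query Hive tables
    .getOrCreate()
)

# Expose a (hypothetical) Hive table to SparkSQL as a view.
claims = spark.table("insurance_db.claims")
claims.createOrReplaceTempView("claims")

summary = spark.sql("""
    SELECT policy_id,
           COUNT(*)       AS claim_count,
           SUM(claim_amt) AS total_claimed
    FROM claims
    WHERE claim_status = 'APPROVED'
    GROUP BY policy_id
""")

# Persist the aggregate as a new Hive table; overwrite keeps reruns idempotent.
summary.write.mode("overwrite").saveAsTable("insurance_db.claims_summary")
```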

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

You should have strong knowledge of SQL and Python; experience with Snowflake is preferred. You should also have knowledge of AWS services such as S3, Lambda, IAM, Step Functions, SNS, SQS, ECS, and DynamoDB, along with expertise in data-movement technologies such as ETL/ELT. Good-to-have skills include knowledge of DevOps and continuous integration/continuous delivery, with tools such as Maven, Jenkins, Stash, Control-M, and Docker. Experience in automation and REST APIs would be beneficial for this role.
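
To make the stack concrete, here is a small sketch of how Lambda, SQS, and DynamoDB typically fit together with boto3. The queue payload shape, table name, and key are hypothetical assumptions, not details from the listing.

```python
# Sketch of a Lambda handler that consumes SQS messages and persists
# them to DynamoDB. Table name and item keys are illustrative.
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("orders")  # hypothetical DynamoDB table


def handler(event, context):
    """Process SQS messages delivered to Lambda and write each to DynamoDB."""
    for record in event["Records"]:           # standard SQS event shape
        payload = json.loads(record["body"])  # message body is a JSON string
        table.put_item(Item={
            "order_id": payload["order_id"],  # assumed partition key
            "status": payload.get("status", "NEW"),
        })
    return {"processed": len(event["Records"])}
```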

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Who we are: We are a digitally native company that helps organizations reinvent themselves and unleash their potential. We are the place where innovation, design, and engineering meet scale. Globant is a 20-year-old, NYSE-listed public organization with more than 33,000 employees working out of 35 countries. www.globant.com

Job location: Pune/Hyderabad/Bangalore. Work mode: Hybrid. Experience: 5 to 10 years.

Must-have skills: 1) AWS (EC2, EMR, EKS); 2) Redshift; 3) Lambda functions; 4) Glue; 5) Python; 6) PySpark; 7) SQL; 8) CloudWatch; 9) a NoSQL database (DynamoDB, MongoDB, or any other).

We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have a strong background in designing, developing, and managing data pipelines, working with cloud technologies, and optimizing data workflows. You will play a key role in supporting our data-driven initiatives and ensuring the seamless integration and analysis of large datasets.

Design scalable data models: develop and maintain conceptual, logical, and physical data models for structured and semi-structured data in AWS environments.
Optimize data pipelines: work closely with data engineers to align data models with AWS-native data pipeline design and ETL best practices.
AWS cloud data services: design and implement data solutions leveraging AWS Redshift, Athena, Glue, S3, Lake Formation, and AWS-native ETL workflows.
Design, develop, and maintain scalable data pipelines and ETL processes using AWS services (Glue, Lambda, Redshift).
Write efficient, reusable, and maintainable Python and PySpark scripts for data processing and transformation; write complex SQL queries and optimize them for performance and scalability.
Monitor, troubleshoot, and improve data pipelines for reliability and performance.
The role centers on ETL automation using Python and PySpark: designing, building, and maintaining efficient data pipelines and ensuring data quality and integrity for various applications. A sketch of a Glue job in this shape follows below.
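
As a rough illustration of the catalog-read, transform, S3-write pipeline this role describes, here is a skeleton AWS Glue PySpark job. The database, table, column, and bucket names are placeholders; a Redshift target would use a JDBC connection instead of the S3 sink shown.

```python
# Skeleton AWS Glue PySpark job: Data Catalog read -> column mapping -> S3 write.
# Runs inside the Glue job environment; all names below are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (hypothetical database and table).
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="events"
)

# Rename and cast columns on the way through.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("event_id", "string", "event_id", "string"),
        ("ts", "string", "event_time", "timestamp"),
    ],
)

# Land curated Parquet in S3 (placeholder bucket path).
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://my-curated-bucket/events/"},
    format="parquet",
)
job.commit()
```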

Posted 2 weeks ago

Apply

4.0 - 8.0 years

20 - 25 Lacs

Bengaluru

Hybrid

About the business unit: The Product team forms the crux of our powerful platforms and helps connect millions of customers worldwide with the brands that matter most to them. This team of innovative thinkers develops and builds products that position Epsilon as a differentiator, fostering an open and balanced marketplace built on respect for individuals, where every brand interaction holds value. Our full-cycle product engineering and data teams chart the future and set new benchmarks for our products by leveraging industry best practices and advanced capabilities in data, machine learning, and artificial intelligence. Driven by a passion for delivering smart end-to-end solutions, this team plays a key role in Epsilon's success story.

The candidate will be a member of the Product Development Team responsible for developing, managing, and implementing internet applications for the product engineering group, predominantly using Angular and .NET.

Why we are looking for you: You have hands-on experience in AWS or Azure. You have hands-on experience in .NET development. Knowledge of Terraform for developing infrastructure as code is good to have, as is knowledge of Angular and Node.js. You enjoy new challenges and are solution-oriented.

What you will enjoy in this role: As part of the Epsilon Product Engineering team, the pace of the work matches the fast-evolving demands of Fortune 500 clients across the globe. As part of an innovative team that's not afraid to take risks, your ideas will come to life in digital marketing products that support more than 50% of automotive dealers in the US. You will work in an open and transparent environment that values innovation and efficiency, with the opportunity to explore various AWS and Azure services in depth and enrich your experience with these fast-growing cloud services.

What you will do: Design and develop applications and components primarily using .NET Core and Angular. Evaluate AWS and Azure services, and implement and manage infrastructure automation using Terraform. Collaborate with cross-functional teams to deliver high-quality software solutions. Improve and optimize deployment workflows to help deliver reliable solutions. Interact with technical leads and architects to discover solutions to challenges faced by Product Engineering teams. Contribute to building an environment focused on continuous improvement of the development and delivery process, with the goal of delivering outstanding software.

Qualifications: BE / B.Tech / MCA (no correspondence course); 5-8 years of experience; strong experience working with .NET Core and REST APIs is a must; working experience with Angular, Node.js, and Terraform is good to have; at least 2+ years of experience working on AWS or Azure, with AWS or Azure certification.

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
