3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within Asset and Wealth Management, you serve as a seasoned member of an agile team that designs and delivers trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for delivering critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies
- Adds to team culture of diversity, opportunity, inclusion, and respect

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years of applied experience
- Experience in core Java, Spring, web development, AWS services, system design, Lambda, SQS, and Step Functions (see the sketch after this posting)
- Hands-on practical experience in system design, application development, testing, and operational stability
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
- Overall knowledge of the Software Development Life Cycle
- Solid understanding of agile methodologies such as CI/CD, application resiliency, and security
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile)

Preferred qualifications, capabilities, and skills
- Familiarity with modern front-end technologies
- Exposure to cloud technologies
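As a rough illustration of the Lambda/SQS/Step Functions integration this posting names, here is a minimal sketch of an SQS-triggered Lambda handler that forwards each message into a Step Functions execution. It is written in Python for consistency with the rest of this page (the role itself centers on Java and Spring), and the state machine ARN and message fields are hypothetical.

```python
import json
import os

import boto3  # AWS SDK for Python; the role itself would use the Java SDK

sfn = boto3.client("stepfunctions")

# Hypothetical state machine ARN, supplied via environment configuration.
STATE_MACHINE_ARN = os.environ["STATE_MACHINE_ARN"]


def handler(event, context):
    """Lambda entry point for an SQS-triggered function.

    Each SQS record's body is forwarded as input to a Step Functions
    execution, so downstream steps run as an orchestrated workflow.
    """
    for record in event["Records"]:  # standard SQS event shape
        payload = json.loads(record["body"])
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps(payload),
        )
    return {"started": len(event["Records"])}
```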
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
Maharashtra
On-site
As a Data Engineer with 7-10 years of experience, you will be responsible for architecting, creating, and maintaining data pipelines and ETL processes in AWS. Your role will involve supporting and optimizing the migration of the current desktop data tool set and Excel analysis pipeline to a transformative, highly scalable cloud-based architecture. You will work in an agile environment within a collaborative, cross-functional product team using Scrum and Kanban methodologies.

Collaboration is key in this role: you will work closely with data science teams and business analysts to refine data requirements for various initiatives and data consumption needs. You will also educate and train colleagues such as data scientists, analysts, and stakeholders in data pipelining and preparation techniques, making it easier for them to integrate and consume data for their use cases.

Your expertise in programming languages such as Python, Spark, and SQL will be essential, along with prior experience in AWS services such as AWS Lambda, Glue, Step Functions, CloudFormation, and the CDK (a CDK sketch follows this posting). Knowledge of building bespoke ETL solutions, data modeling, and T-SQL for managing business data and reporting is also crucial for this role. You should be capable of conducting technical deep-dives into code and architecture, and able to design, build, and manage data pipelines encompassing data transformation, data models, schemas, metadata, and workload management.

Furthermore, your role will involve working with data science teams to refine and optimize data science and machine learning models and algorithms. Effective communication skills are essential to collaborate effectively across departments and ensure compliance and governance during data use. You will be expected to work within and promote a DevOps culture and Continuous Delivery process to enhance efficiency and productivity. This position offers the opportunity to be part of a dynamic team that aims to drive positive change through technology and innovation.

Please note that this role is based in Mumbai, with the flexibility to work remotely from anywhere in India.
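To make the CDK/Step Functions side of this stack concrete, here is a minimal sketch assuming aws-cdk-lib v2 for Python; the construct IDs, the local `lambda/` asset directory, and the single-step workflow are all hypothetical illustrations, not this employer's actual pipeline.

```python
from aws_cdk import Stack, aws_lambda as _lambda, aws_stepfunctions as sfn
from aws_cdk import aws_stepfunctions_tasks as tasks
from constructs import Construct


class PipelineStack(Stack):
    """Provisions a transform Lambda and wraps it in a Step Functions workflow."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Hypothetical transform function; code is loaded from a local asset dir.
        transform_fn = _lambda.Function(
            self, "TransformFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="index.handler",
            code=_lambda.Code.from_asset("lambda"),
        )

        # Single-step workflow invoking the Lambda; a real ETL pipeline would
        # chain further states (Glue jobs, choices, retries) after this task.
        definition = tasks.LambdaInvoke(
            self, "Transform", lambda_function=transform_fn
        )
        sfn.StateMachine(
            self, "EtlStateMachine",
            definition_body=sfn.DefinitionBody.from_chainable(definition),
        )
```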
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, you will be part of a team of talented individuals working with cutting-edge technologies. Our purpose is centered around making real positive changes in an increasingly virtual world, transcending generational gaps and disruptions of the future.

We are currently seeking AWS Data Pipeline Professionals with 3-5 years of experience in the following areas:
- Designing, developing, and implementing cloud solutions on AWS, utilizing a wide range of AWS services such as Glue ETL, Glue Data Catalog, Athena, Redshift, RDS, DynamoDB, Step Functions, EventBridge, Lambda, API Gateway, ECS, and ECR (a Glue job sketch follows this posting).
- Demonstrating expertise in implementing AWS core services such as EC2, RDS, VPC, ELB, EBS, Route 53, S3, DynamoDB, and CloudWatch.
- Leveraging strong Python and PySpark data engineering capabilities to analyze business requirements, translate them into technical solutions, and ensure successful execution.
- Possessing expertise in the AWS Data and Analytics stack, including Glue ETL, Glue Data Catalog, Athena, Redshift, RDS, DynamoDB, Step Functions, EventBridge, Lambda, API Gateway, ECS, and ECR for containerization.
- Developing HLDs, LLDs, test plans, and execution plans for cloud solution implementations, including Work Breakdown Structures (WBS).
- Interacting with customers to understand cloud service requirements, transforming requirements into workable solutions, and building and testing those solutions.
- Managing multiple cloud solution projects, demonstrating technical ownership and accountability.
- Capturing and sharing best-practice knowledge within the AWS solutions architect community.
- Serving as a technical liaison between customers, service engineering teams, and support.
- Possessing a strong understanding of cloud and infrastructure components to deliver end-to-end cloud infrastructure architectures and designs.
- Collaborating effectively with team members globally.
- Demonstrating excellent analytical and problem-solving skills.
- Exhibiting strong communication and presentation skills.
- Working independently and as part of a team.
- Experience working with onshore-offshore teams.

Required Behavioral Competencies:
- Accountability
- Collaboration
- Agility
- Customer Focus
- Communication
- Drives Results

Certifications are considered good to have.

At YASH, you will have the opportunity to create a career path that aligns with your goals while being part of an inclusive team environment. We emphasize career-oriented skilling models and leverage technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our workplace culture is grounded in four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture
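For a sense of the Glue ETL work described above, here is a minimal sketch of a PySpark-based Glue job; the catalog database, table, and S3 bucket names are hypothetical placeholders.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap: resolve the job name passed by the Glue runtime.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (hypothetical database/table names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Example transformation: drop obviously invalid rows, then write Parquet to
# S3 so downstream Athena or Redshift Spectrum queries can pick it up.
orders = source.toDF().filter("order_total > 0")
orders.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")

job.commit()
```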
Posted 2 weeks ago
5.0 - 10.0 years
17 - 25 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
Experience: 5-12 years
Location: Mumbai, Bangalore, Pune, Chennai, Hyderabad, Kolkata, Noida, Kochi, Coimbatore, Mysore, Nagpur, Bhubaneswar, Indore, Warangal

Key skills: AWS, Python programming, Lambda functions, SQS, SNS, Step Functions, DynamoDB, IAM, S3, API Gateway, AWS CDK, PyTest, ECR, ECS; AWS Certified Architect or AWS Developer Associate certification

Job description

Responsibilities
- Collaborate with stakeholders to understand requirements and translate them into technical specifications
- Design, develop, and implement software solutions for SDVI, with a strong emphasis on Python programming (a minimal sketch of the kind of serverless API work involved follows this posting)
- Create and maintain efficient algorithms and data structures for managing video infrastructure resources
- Troubleshoot and debug issues in existing software systems, providing timely resolutions and enhancements
- Participate in code reviews and provide constructive feedback to team members to maintain code quality and adherence to best practices
- Document software designs, implementation details, and troubleshooting procedures for knowledge sharing and future reference

Requirements
- Strong proficiency in the Python programming language, with 5 years of expertise in developing scalable and efficient software solutions
- Expertise in SDVI Rally
- Experience with CI/CD practices and their implementation
- Working experience with Spring Boot and Prometheus
- Practical experience with Amazon Web Services and Grafana
- Experience with containerization technologies such as Docker and Kubernetes
- Solid understanding of computer networks
- Proficiency in developing RESTful APIs and microservices architectures
- Familiarity with cloud computing platforms such as AWS, Azure, or Google Cloud Platform
- Knowledge of video encoding/transcoding technologies and standards such as MPEG-DASH and HLS is a plus
- Solid understanding of software development methodologies, including Agile and Scrum
- Excellent problem-solving skills and strong attention to detail
- Effective communication skills, with the ability to collaborate effectively in a team environment

Skills

Mandatory skills: Microservices, Python, AWS Lambda, AWS RDS, AWS S3, AWS API Gateway, SQS, SNS, AWS Step Functions, Django, Docker, DynamoDB
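As an illustration of the Lambda/API Gateway/DynamoDB combination listed above, here is a minimal sketch of a proxy-integration handler that looks up a record by key; the table name, key, and route are hypothetical.

```python
import json

import boto3

# Hypothetical DynamoDB table storing media asset records.
table = boto3.resource("dynamodb").Table("media-assets")


def handler(event, context):
    """API Gateway (proxy integration) handler for GET /assets/{asset_id}.

    Returns the stored item as JSON, or a 404 if the key is absent.
    """
    asset_id = event["pathParameters"]["asset_id"]
    result = table.get_item(Key={"asset_id": asset_id})

    if "Item" not in result:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        # default=str handles the Decimal values DynamoDB returns for numbers.
        "body": json.dumps(result["Item"], default=str),
    }
```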
Posted 3 weeks ago
3.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
You should have strong experience in PySpark, Python, Unix scripting, Spark SQL, and Hive. You must be proficient in writing SQL queries and creating views, and possess excellent oral and written communication skills. Prior experience in the insurance domain would be beneficial.

A good understanding of the Hadoop ecosystem, including HDFS, MapReduce, Pig, Hive, Oozie, and YARN, is required. Knowledge of AWS services such as Glue, S3, Lambda functions, Step Functions, and EC2 is essential. Experience in data migration from platforms such as Hive/S3 to Databricks is a plus (a migration-style PySpark sketch follows this posting). You should be able to prioritize, plan, organize, and manage multiple tasks efficiently while delivering high-quality work.

As a candidate, you should have 6-8 years of technical experience in PySpark and AWS (Glue, EMR, Lambda, Step Functions, S3), with at least 3 years of experience in Big Data/ETL using Python, Spark, and Hive, and 3+ years of experience in AWS. Your primary key skills should include PySpark; AWS (Glue, EMR, Lambda, Step Functions, S3); and Big Data with Python, Spark, and Hive experience. Exposure to Big Data migration is also important. Secondary skills that would be beneficial for this role include Informatica BDM/PowerCenter, Databricks, and MongoDB.
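A minimal sketch of the Hive-to-S3 migration step mentioned above, assuming a Spark session with Hive support; the database, table, column, and bucket names are hypothetical, and a real migration would iterate over the whole catalog rather than one table.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support lets Spark read managed Hive tables directly.
spark = (
    SparkSession.builder
    .appName("hive-to-s3-migration")
    .enableHiveSupport()
    .getOrCreate()
)

# Hypothetical source table in the insurance domain.
policies = spark.table("insurance_db.policies")

# Light cleanup before landing the data: normalize a column, drop duplicates.
cleaned = (
    policies
    .withColumn("policy_status", F.upper(F.col("policy_status")))
    .dropDuplicates(["policy_id"])
)

# Write partitioned Parquet to S3, a common staging layer ahead of a
# Databricks (Delta Lake) load.
cleaned.write.mode("overwrite").partitionBy("policy_year").parquet(
    "s3://example-bucket/migrated/policies/"
)
```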
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
You should have strong knowledge of SQL and Python. Experience in Snowflake is preferred. Additionally, you should have knowledge of AWS services such as S3, Lambda, IAM, Step Functions, SNS, SQS, ECS, and DynamoDB. Expertise in data movement technologies such as ETL/ELT is important (a Snowflake load sketch follows this posting). Good-to-have skills include knowledge of DevOps and Continuous Integration/Continuous Delivery with tools such as Maven, Jenkins, Stash, Control-M, and Docker. Experience in automation and REST APIs would be beneficial for this role.
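To illustrate the S3-to-Snowflake ELT pattern this stack implies, here is a minimal sketch using the snowflake-connector-python package; the account, warehouse, stage, table names, and the single-VARIANT-column layout are all hypothetical.

```python
import os

import snowflake.connector

# Hypothetical connection parameters; in practice these come from a secrets store.
conn = snowflake.connector.connect(
    account="xy12345",
    user="etl_user",
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # ELT step 1: bulk-load staged S3 files into a landing table. Assumes an
    # external stage @s3_events_stage exists and raw_events has one VARIANT
    # column named v.
    cur.execute(
        "COPY INTO raw_events "
        "FROM @s3_events_stage/events/ "
        "FILE_FORMAT = (TYPE = 'JSON')"
    )
    # ELT step 2: transform inside Snowflake with plain SQL.
    cur.execute(
        "INSERT INTO curated_events "
        "SELECT v:event_id::string, v:ts::timestamp_ntz, v:payload "
        "FROM raw_events"
    )
finally:
    conn.close()
```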
Posted 1 month ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Who We Are:
We are a digitally native company that helps organizations reinvent themselves and unleash their potential. We are the place where innovation, design, and engineering meet scale. Globant is a 20-year-old, NYSE-listed public organization with more than 33,000 employees worldwide, working out of 35 countries. www.globant.com

Job location: Pune/Hyderabad/Bangalore
Work mode: Hybrid
Experience: 5 to 10 years

Must-have skills:
1) AWS (EC2, EMR, EKS)
2) Redshift
3) Lambda functions
4) Glue
5) Python
6) PySpark
7) SQL
8) CloudWatch
9) A NoSQL database (DynamoDB, MongoDB, or similar)

We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have a strong background in designing, developing, and managing data pipelines, working with cloud technologies, and optimizing data workflows. You will play a key role in supporting our data-driven initiatives and ensuring the seamless integration and analysis of large datasets.

- Design scalable data models: develop and maintain conceptual, logical, and physical data models for structured and semi-structured data in AWS environments.
- Optimize data pipelines: work closely with data engineers to align data models with AWS-native data pipeline design and ETL best practices.
- AWS cloud data services: design and implement data solutions leveraging AWS Redshift, Athena, Glue, S3, Lake Formation, and AWS-native ETL workflows.
- Design, develop, and maintain scalable data pipelines and ETL processes using AWS services (Glue, Lambda, Redshift); a minimal load-to-Redshift sketch follows this posting.
- Write efficient, reusable, and maintainable Python and PySpark scripts for data processing and transformation.
- Optimize SQL queries for performance and scalability; expertise in writing complex SQL queries and tuning them is expected.
- Monitor, troubleshoot, and improve data pipelines for reliability and performance.
- Focus on ETL automation using Python and PySpark: design, build, and maintain efficient data pipelines, ensuring data quality and integrity for various applications.
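As referenced in the list above, here is a minimal sketch of an S3-triggered Lambda that bulk-loads a newly landed file into Redshift via the Redshift Data API; the cluster, database, IAM role, and table names are hypothetical.

```python
import boto3

client = boto3.client("redshift-data")


def handler(event, context):
    """S3-triggered Lambda: COPY a newly landed file into Redshift.

    Uses the Redshift Data API, so no persistent database connection or
    driver is needed inside the Lambda.
    """
    record = event["Records"][0]  # standard S3 event shape
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    copy_sql = f"""
        COPY analytics.page_views
        FROM 's3://{bucket}/{key}'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
        FORMAT AS PARQUET
    """
    response = client.execute_statement(
        ClusterIdentifier="analytics-cluster",
        Database="warehouse",
        DbUser="loader",
        Sql=copy_sql,
    )
    # The statement runs asynchronously; log the id so it can be traced
    # in CloudWatch and polled with describe_statement if needed.
    print(f"submitted Redshift COPY, statement id={response['Id']}")
    return response["Id"]
```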
Posted 2 months ago
4.0 - 8.0 years
20 - 25 Lacs
Bengaluru
Hybrid
About the Business Unit:
The Product team forms the crux of our powerful platforms and helps connect millions of customers worldwide with the brands that matter most to them. This team of innovative thinkers develops and builds products that position Epsilon as a differentiator, fostering an open and balanced marketplace built on respect for individuals, where every brand interaction holds value. Our full-cycle product engineering and data teams chart the future and set new benchmarks for our products by leveraging industry best practices and advanced capabilities in data, machine learning, and artificial intelligence. Driven by a passion for delivering smart end-to-end solutions, this team plays a key role in Epsilon's success story.

The candidate will be a member of the Product Development Team, responsible for developing, managing, and implementing internet applications for the product engineering group, predominantly using Angular and .NET.

Why we are looking for you:
- You have hands-on experience in AWS or Azure.
- You have hands-on experience in .NET development.
- Good to have: knowledge of Terraform for developing infrastructure as code.
- Good to have: knowledge of Angular and Node.js.
- You enjoy new challenges and are solution-oriented.

What you will enjoy in this role:
- As part of the Epsilon Product Engineering team, the pace of the work matches the fast-evolving demands of Fortune 500 clients across the globe.
- As part of an innovative team that's not afraid to take risks, your ideas will come to life in digital marketing products that support more than 50% of automotive dealers in the US.
- An open and transparent environment that values innovation and efficiency.
- The opportunity to explore various AWS and Azure services in depth and enrich your experience with these fast-growing cloud services.

What you will do:
- Design and develop applications and components, primarily using .NET Core and Angular.
- Evaluate AWS and Azure services, and implement and manage infrastructure automation using Terraform.
- Collaborate with cross-functional teams to deliver high-quality software solutions.
- Improve and optimize deployment processes and help deliver reliable solutions.
- Interact with technical leads and architects to discover solutions that help solve challenges faced by Product Engineering teams.
- Contribute to building an environment focused on continuous improvement of the development and delivery process, with the goal of delivering outstanding software.

Qualifications:
- BE / B.Tech / MCA (no correspondence course)
- 5-8 years of experience
- Strong experience working with .NET Core and REST APIs is a must.
- Good to have: working experience with Angular, Node.js, and Terraform.
- At least 2+ years of experience working on AWS or Azure, with AWS or Azure certification.
Posted 3 months ago