5.0 - 10.0 years
7 - 11 Lacs
mumbai
Work from Office
Grade Level (for internal use): 10 S&P Global Dow Jones Indices The Role: User Interface Software Engineer - Vue JS The Team: You will be a member of an Agile team that develops and supports an internal cloud-based platform that manages financial market indices, part of an Agile technology organization in a global company comprising DEV/QA/PO teams. The team supports both back-end services in Python and AWS as well as a Vue 3 front-end UI. S&P DJI is the world's leading resource for benchmarks and investable indices. As a developer in this role, you will contribute to building a system that has a broad global impact on the stability of financial markets and on the results of individual investing. The team builds tools for our business users to manage market indices. This project is a new phase of development focused on improving user experience and introducing innovative new capabilities. Responsibilities and Impact: As an application engineer in this role, you will work with cutting-edge UI technology, but you are also comfortable working with REST APIs and SQL databases, and you easily understand business logic implementation. The candidate must be delivery-focused and initiative-taking, with an agile development mindset. You enjoy agile prototyping, collaborating with your team and customers, and delivering a high-quality, high-business-value product. You are a problem solver and an innovator more than someone who just implements requirements. The role is approximately 50% UI engineer, 20% collaboration with full-stack developers working on the backend, and 20% UI designer. As an accomplished software developer with 5+ years of professional experience, you need to understand code abstraction, library creation, code reuse, front-end architecture, and design patterns. An accomplished UI developer on modern frameworks, with strong Vue 3 experience.
Evaluates and utilizes off-the-shelf libraries and frameworks to accelerate development, but is not afraid to build from scratch if required. More of a software engineer than a UI designer, but understands UI/UX well enough to build features independently without a formal design. Enjoys solving complex technical challenges and unique business problems; always curious and wanting to learn. Familiarity with back-end systems and REST API development, preferably using Python. A strong team player: focused on the team's success in completing sprint goals, aligning with the objectives of the larger organization, and understanding the users' needs. Prior team technical leadership skills preferred. Must communicate and collaborate well in an Agile team of 5-8 engineers and QA, balancing individual tasks with larger team goals. Initiative-taking and delivery-focused: able to produce quality results from only high-level direction, and iteratively commits PRs to collaborate with the team. What's in it for you: This is an opportunity to work on a team of highly talented and motivated engineers at a highly respected company. You will work on new development as well as enhancements to existing functionality. What We're Looking For: Basic Qualifications 5+ years of diverse software engineering on a wide range of technologies Primary languages include JavaScript, TypeScript, HTML5, CSS3, SCSS, Python. Front-end engineering and architecture experience in building complex Single Page Applications (SPA) using JavaScript/TypeScript Strong Vue JS (including Vue 3) expertise. React and/or Angular experience is helpful, but this is primarily a Vue 3-based UI (must be able to get started immediately on Vue 3). Understanding of UI/UX design systems and strong knowledge of Tailwind CSS. Big data grid rendering and data visualization experience; AG Grid and AG Charts experience preferred.
Additional Preferred Qualifications: Back-end skills including Python and JavaScript, PostgreSQL databases, Python app servers such as Flask or FastAPI, and deploying on Nginx. Object-oriented programming, design patterns, and working with modular systems. AWS Cloud services development experience including Docker, ECS, S3, and Redis. Experience building enterprise applications that have proprietary concepts (i.e., not a shopping cart or social media site). Strong experience collaborating with backend teams on RESTful web API development. Maintains a high standard of code style, performance, and testability. Strong unit testing, logging, and benchmarking experience. Experience working within a CI/CD pipeline's scan/test/build/deploy processes. Bachelor's degree in Computer Science, Information Systems, or Engineering, or in lieu, a demonstrated equivalence in work experience.
Posted 1 week ago
8.0 - 13.0 years
6 - 9 Lacs
hyderabad
Work from Office
The Team: You will be part of a global technology team comprising Developers, QA, and BA teams, and will be responsible for analysis, design, development, and testing. The Impact: You will be working on one of the core technology platforms responsible for the end-of-day calculation as well as the dissemination of index values. What's In It for You: You will have the opportunity to work on enhancements to the existing index calculation system as well as implement new methodologies as required. Responsibilities: Design and development of Java applications for S&P Dow Jones Indices (SPDJI) web sites and their feeder systems Participate in multiple software development processes including Coding, Testing, Debugging & Documentation Develop software applications based on clear business specifications Work on new initiatives and support existing Index applications Perform Application & System Performance tuning and troubleshoot performance issues Develop web-based applications and build rich front-end user interfaces Build applications with object-oriented concepts and apply design patterns Integrate in-house applications with various vendor software platforms Set up a development environment/sandbox for application development Check in application code changes into the source repository Perform unit testing of application code and fix errors Interface with databases to extract information and build reports Effectively interact with customers, business users, and IT staff Basic Qualifications: Bachelor's degree in Computer Science, Information Systems, or Engineering is required, or in lieu, a demonstrated equivalence in work experience Excellent communication and interpersonal skills are essential, with strong verbal and writing proficiency 8+ years of experience in application development and support Must have experience with AWS cloud (EC2, EMR, Lambda, S3, Glue, etc.)
Strong hands-on experience with Java, J2EE, Java Messaging Service (JMS) & Enterprise JavaBeans (EJB) Strong hands-on experience with advanced SQL, PL/SQL programming Basic networking knowledge / Unix scripting Exposure to addressing vulnerabilities Minimum 2 years of experience in any three or more of the following: Advanced Python / Advanced Scala / Infrastructure, CI/CD, DevOps, Ansible, Fortify, Jenkins / Big data, Microservices / Spark using Scala, Python, Java and Hadoop Distributed File System (HDFS) / QA Automation (Cucumber, Selenium, Karate, etc.) Preferred Qualifications: Experience working with large datasets in Equity, Commodities, Forex, Futures and Options asset classes preferred Experience with Index/Benchmarks or Asset Management or Trading platforms preferred Basic knowledge of User Interface design & development using jQuery, HTML5 & CSS preferred
Posted 1 week ago
9.0 - 14.0 years
9 - 14 Lacs
mumbai
Work from Office
Grade Level (for internal use): 12 The Team: You will be part of a global technology team comprising Developers, QA, and BA teams, and will be responsible for analysis, design, development, and testing. The Impact: You will be working on one of the core technology platforms responsible for the end-of-day calculation as well as the dissemination of index values. What's In It for You: You will have the opportunity to work on enhancements to the existing index calculation system as well as implement new methodologies as required. Responsibilities: Design and development of Java applications for S&P Dow Jones Indices (SPDJI) web sites and their feeder systems Participate in multiple software development processes including Coding, Testing, Debugging & Documentation Develop software applications based on clear business specifications Work on new initiatives and support existing Index applications Perform Application & System Performance tuning and troubleshoot performance issues Develop web-based applications and build rich front-end user interfaces Build applications with object-oriented concepts and apply design patterns Integrate in-house applications with various vendor software platforms Set up a development environment / sandbox for application development Check in application code changes into the source repository Perform unit testing of application code and fix errors Interface with databases to extract information and build reports Effectively interact with customers, business users, and IT staff Basic Qualifications: Bachelor's degree in Computer Science, Information Systems, or Engineering is required, or in lieu, a demonstrated equivalence in work experience Excellent communication and interpersonal skills are essential, with strong verbal and writing proficiency 9+ years of experience in application development and support Must have experience with AWS cloud (EC2, EMR, Lambda, S3, Glue, etc.)
Strong hands-on experience with Java, J2EE, Java Messaging Service (JMS) & Enterprise JavaBeans (EJB) Strong hands-on experience with advanced SQL, PL/SQL programming Basic networking knowledge / Unix scripting Exposure to addressing vulnerabilities Minimum 2 years of experience in any three or more of the following: Advanced Python / Advanced Scala / Infrastructure, CI/CD, DevOps, Ansible, Fortify, Jenkins / Big data, Microservices / QA Automation (Cucumber, Selenium, Karate, etc.) Preferred Qualifications: Experience working with large datasets in Equity, Commodities, Forex, Futures and Options asset classes preferred Experience with Index/Benchmarks or Asset Management or Trading platforms preferred Basic knowledge of User Interface design & development using jQuery, HTML5 & CSS preferred
Posted 1 week ago
8.0 - 13.0 years
30 - 35 Lacs
bengaluru
Work from Office
About The Role Data Engineer - 1 (Experience: 0-2 years) What we offer Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is a central data org for Kotak Bank which manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering, and Data Governance charters. The org sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technologists to build things from scratch and build one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, which is among the most sought-after domains today; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions in a programmatic way; and be futuristic, building systems which can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals: Data Platform This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, managed compute and orchestration frameworks including concepts of serverless data solutions, managing a central data warehouse for extremely high concurrency use cases, building connectors for different sources, building a customer feature repository, building cost optimization solutions like EMR optimizers, performing automations, and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases. Data Governance The team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer / SDE in Data Bachelor's degree in Computer Science, Engineering, or a related field Experience in data engineering Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR Experience with data pipeline tools such as Airflow and Spark Experience with data modeling and data quality best practices Excellent problem-solving and analytical skills Strong communication and teamwork skills Experience in at least one modern scripting or programming language, such as Python, Java, or Scala Strong advanced SQL skills PREFERRED QUALIFICATIONS AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow Prior experience in the Indian Banking segment and/or Fintech is desired.
Experience with Non-relational databases and data stores Building and operating highly available, distributed data processing systems for large datasets Professional software engineering and best practices for the full software development life cycle Designing, developing, and implementing different types of data warehousing layers Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions Building scalable data infrastructure and understanding distributed systems concepts SQL, ETL, and data modelling Ensuring the accuracy and availability of data to customers Proficient in at least one scripting or programming language for handling large volume data processing Strong presentation and communications skills.
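The extract-transform-load responsibilities described above can be illustrated with a minimal, dependency-free sketch; the dataset, column names, and cleaning rules here are hypothetical, not this team's actual pipeline:

```python
# Minimal ETL sketch: extract rows from a CSV-like source, transform
# (enforce types, drop malformed records), and load into an aggregated view.
# All field names and rules are illustrative assumptions.
import csv
import io
from collections import defaultdict

RAW = """branch,product,amount
MUM01,savings,1200.50
BLR02,loan,notanumber
MUM01,loan,800.00
"""

def extract(raw: str) -> list[dict]:
    # Parse the raw text into one dict per record.
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    clean = []
    for row in rows:
        try:
            row["amount"] = float(row["amount"])  # enforce numeric type
        except ValueError:
            continue  # drop malformed records
        clean.append(row)
    return clean

def load(rows: list[dict]) -> dict:
    # "Load" here builds an aggregate per branch; a real pipeline would
    # write to a warehouse table instead.
    totals = defaultdict(float)
    for row in rows:
        totals[row["branch"]] += row["amount"]
    return dict(totals)

if __name__ == "__main__":
    print(load(transform(extract(RAW))))  # {'MUM01': 2000.5}
```

In a production setting each stage would typically be a Spark job or an Airflow task, but the stage boundaries stay the same.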
Posted 1 week ago
7.0 - 10.0 years
8 - 15 Lacs
chennai
Work from Office
Job Description: We are seeking a skilled Karate Automation Framework Engineer with over 7 years of experience to join our team. The ideal candidate will have a strong background in Spring Boot, object-oriented programming, and exposure to AWS cloud services. You should be proficient in using version control systems like Git and build tools such as Ant, Maven, and Gradle. As a team player, you will collaborate with cross-functional teams to ensure the delivery of high-quality software solutions. Key Responsibilities: Design, develop, and maintain automated test scripts using the Karate framework. Create test strategies and plans from business requirements. Develop and design test scripts to ensure compliance with project mandates. Clarify functional issues and escalate risks to stakeholders. Develop standard documents, systems, and procedures. Ensure repeatable and measurable testing methodologies. Manage testing workload to meet deadlines. Integrate automated tests into the CI/CD pipeline. Perform API testing using tools like Apache JMeter, REST-Assured, Postman, and Karate. Implement and maintain BDD test scenarios. Collaborate with development and QA teams for comprehensive test coverage. Troubleshoot and resolve test automation issues. Continuously improve test automation processes and tools. Provide mentorship to junior team members. Required Skills: Proficiency in Spring Boot and object-oriented programming. Exposure to AWS cloud services. Experience with version control systems like Git. Familiarity with build tools such as Ant, Maven, and Gradle. Experience with API testing tools like Apache JMeter, REST-Assured, Postman, and Karate. Hands-on experience in BDD frameworks. Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills. Ability to work effectively in a team environment. Preferred Qualifications: Experience with other test automation frameworks. Knowledge of CI/CD tools and processes. 
Understanding of Agile methodologies.
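As a rough illustration of the given/when/then structure a Karate scenario expresses over a JSON API, here is a plain-Python equivalent against a stubbed handler; the endpoint shape and payload are invented for the example (a real Karate test would be a .feature file calling the live service):

```python
# BDD-style API test sketch in plain Python. The stub stands in for a
# real HTTP service; field names and status codes are hypothetical.
import json

def stub_get_user(user_id: int) -> dict:
    # Hypothetical service response; a real test would issue an HTTP GET.
    return {"status": 200, "body": json.dumps({"id": user_id, "active": True})}

def test_get_user_returns_active_user():
    # Given a known user id
    user_id = 42
    # When the endpoint is called
    response = stub_get_user(user_id)
    # Then the response matches the expected contract
    assert response["status"] == 200
    body = json.loads(response["body"])
    assert body == {"id": 42, "active": True}

if __name__ == "__main__":
    test_get_user_returns_active_user()
    print("ok")
```

The value of the BDD framing is that each clause maps to a business-readable step, which is what makes such suites maintainable inside a CI/CD pipeline.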
Posted 1 week ago
5.0 - 7.0 years
9 - 14 Lacs
bengaluru
Work from Office
Educational Requirements Bachelor of Engineering, BTech, Bachelor of Technology, BCA, BSc, MTech, MCA Service Line Application Development and Maintenance Responsibilities A day in the life of an Infoscion Responsibilities: Application migration to AWS/Azure/GCP cloud Gather user requirements, envisioning system features and functionality Identify bottlenecks and bugs, and recommend system solutions by comparing advantages and disadvantages of custom development Contribute to team meetings, troubleshooting development and production problems across multiple environments and operating platforms Understand Architecture Requirements and ensure effective Design, Development, Validation and Support activities Understand and analyze client requirements, refactor systems for workload migration / modernization to cloud (AWS, Azure, GCP) End-to-end feature development and resolving challenges faced in the implementation Create detailed design artifacts, work on development, perform code reviews, and implement validation and support activities Contribute to thought leadership within the area of technology specialization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Preferred Locations: Bangalore, Chennai, Pune Experience Required: 3 to 5 years of experience: Pure hands-on and expertise on the skill, able to deliver without any support Experience Required: 5 - 9 years of experience: Design knowledge, estimation technique, leading and guiding the team on technical solution Experience Required: 9 - 13 years of experience: Architecture, Solutioning, (Optional) proposal Containerization, microservice development on AWS/Azure/GCP is preferred. In-depth knowledge of design issues and best practices Solid understanding of object-oriented programming Familiar with various design and architectural patterns and software development processes.
Implementing automated testing platforms and unit tests Strong experience in building and developing applications using technologies like .Net Core Knowledge about RESTful APIs and ability to design cloud-ready applications using cloud SDKs Exposure to cloud compute services like VMs, PaaS services, containers, serverless, and storage services on AWS/Azure/GCP Good understanding of application development design patterns Technical and Professional Requirements: Primary Skill: .Net Core Secondary Skills: AWS/Azure/GCP Preferred Skills: .Net Technology->Microsoft Technologies->.NET Frameworks->.NET Core Generic Skills: Technology->Cloud Platform->AWS App Development Technology->Cloud Platform->Azure Devops Technology->Cloud Platform->GCP Devops
Posted 1 week ago
5.0 - 10.0 years
25 - 37 Lacs
bengaluru
Work from Office
Hiring Senior & Staff Java Developers (5-12 yrs) in Bangalore (Hybrid, 3 days WFO). Skills: Java 8+, Spring Boot, Microservices, DSA, Cloud basics. Walk-in drive on 9th Aug 2025; 3 rounds same day, offers rolled out same day. Notice: 15 Days or less. Provident fund
Posted 1 week ago
15.0 - 20.0 years
30 - 35 Lacs
pune, bengaluru
Hybrid
Hinjewadi, Pune / Bangalore Lead the design and implementation of cloud-based solutions for data engineering projects. Orchestrate the integration of diverse applications and data sources across the cloud. Strong leadership and communication skills. Required Candidate profile: Hybrid work, Hinjewadi, Pune / Bangalore.
Posted 1 week ago
4.0 - 9.0 years
14 - 24 Lacs
hyderabad
Work from Office
As a Database Engineer supporting the bank's Analytics platforms, you will be part of a centralized team of database engineers who are responsible for the maintenance and support of Citizens' most critical databases. A Database Engineer will be responsible for: Conceptual knowledge of database practices and procedures such as DDL, DML, and DCL. Knowledge of basic SQL, including SELECT, FROM, WHERE, and ORDER BY. Ability to code SQL joins, subqueries, aggregate functions (AVG, SUM, COUNT), and use data manipulation techniques (UPDATE, DELETE). Understanding of basic data relationships and schemas. Develop basic Entity-Relationship diagrams. Conceptual understanding of cloud computing. Can solve routine problems using existing procedures and standard practices. Can look up error codes and open tickets with vendors. Ability to execute, explain, and identify poorly written queries. Review data structures to ensure they adhere to database design best practices. Develop a comprehensive backup plan. Understanding of the different cloud models (IaaS, PaaS, SaaS), service models, and deployment options (public, private, hybrid). Solves standard problems by analyzing possible solutions using experience, judgment, and precedents. Troubleshoot database issues, such as integrity issues, blocking/deadlocking issues, log shipping issues, connectivity issues, security issues, memory issues, disk space, etc. Understanding of cloud security concepts, including data protection, access control, and compliance. Manages risks that are associated with the use of information technology. Identifies, assesses, and treats risks that might affect the confidentiality, integrity, and availability of the organization's assets. Ability to design and implement a highly performing database using partitioning & indexing that meets or exceeds the business requirements.
Documents a complex software system design as an easily understood diagram, using text and symbols to represent the way data needs to flow. Ability to code complex SQL. Performs effective backup management and periodic database restoration testing. General DB cloud networking skills – VPCs, SGs, KMS keys, private links. Ability to develop stored procedures and use at least one scripting language for reusable code and improved performance. Knows how to import and export data into and out of databases using ETL tools, code, migration tools like DMS, or scripts. Knowledge of DevOps principles and tools, such as CI/CD. Attention to detail and a customer-centric approach. Solves complex problems by taking a new perspective on existing solutions; exercises judgment based on the analysis of multiple sources of information. Ability to optimize queries for performance and resource efficiency. Review database metrics to identify performance issues. Required Qualifications 5+ years of experience with database management/administration (Redshift, Snowflake, or Neo4j) 5+ years of experience working with incident, change, and problem management processes and procedures. Experience maintaining and supporting large-scale critical database systems in the cloud. 3+ years of experience working with AWS cloud-hosted databases An understanding of one programming language, such as Python 3, Java, JavaScript, Ruby, Golang, C, or C++, including at least one front-end framework (Angular/React/Vue). Experience with cloud computing, ETL, and streaming technologies – OpenShift, DataStage, Kafka Experience with agile development methodology Strong SQL performance & tuning skills Excellent communication and client-interfacing skills Strong team collaboration skills and capacity to prioritize tasks efficiently.
Desired Qualifications Experience working in an agile development environment Experience working in the banking industry Experience working in cloud environments such as AWS, Azure or Google Experience with CI/CD pipeline (Jenkins, Liquibase or equivalent) Education and Certifications Bachelor’s degree in computer science or related discipline
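The SQL skills called out above (joins, aggregate functions, GROUP BY) can be sketched with an in-memory SQLite session; the schema and rows are made up for illustration and are not the bank's actual data model:

```python
# Sketch of a JOIN plus aggregate query of the kind described above,
# using an in-memory SQLite database. Schema and data are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, owner TEXT);
CREATE TABLE transactions (
    id INTEGER PRIMARY KEY,
    account_id INTEGER REFERENCES accounts(id),
    amount REAL
);
INSERT INTO accounts VALUES (1, 'alice'), (2, 'bob');
INSERT INTO transactions VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
""")

# Join the two tables and aggregate per owner, largest total first.
query = """
SELECT a.owner, COUNT(t.id) AS n_txn, SUM(t.amount) AS total
FROM accounts a
JOIN transactions t ON t.account_id = a.id
GROUP BY a.owner
ORDER BY total DESC;
"""
rows = conn.execute(query).fetchall()

if __name__ == "__main__":
    for owner, n_txn, total in rows:
        print(owner, n_txn, total)
```

On larger engines like Redshift or Snowflake the same query shape applies; the tuning work described in the listing is mostly about indexing, partitioning, and distribution of the joined tables.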
Posted 1 week ago
6.0 - 11.0 years
15 - 25 Lacs
bengaluru
Hybrid
Hi all, We are hiring for the role of SAP Global COUPA Technical/Functional Lead Experience: 6+ years & 10+ years Location: Bangalore Notice Period: Immediate - 15 Days Shift Timings: 1:30 - 10:30 PM Skills: Mandatory Skills: Coupa, configuration, Procurement, integration testing, SAP, solution design, Ariba, Python, Java, Spark, Kafka, SQL, AWS Job Description: We are looking for a candidate with 10+ years of experience in Data Engineering, with at least 3+ years in a Technical Lead role. Bachelor's degree in computer science, information systems, computer engineering, systems analysis, or a related discipline, or equivalent work experience 10+ years of experience building enterprise SaaS web applications using modern JavaScript frameworks/technologies such as ReactJS and TypeScript, and strong knowledge of JavaScript, CSS, HTML5 Experience with programming languages such as Python and Java; expertise in Python is a must. Hands-on experience building responsive UIs, Single Page Applications, and reusable components, with a keen eye for UI design and usability Proven track record of leading and building high-impact, large-scale web applications with a focus on scalability, high performance, and quality Experience with web-accessibility/WCAG standards, i18n best practices, cross-browser compatibility, and performance Passion for keeping up with the latest trends in the frontend developer community and eagerness to bring outside-in thinking to the products we build Experience with DevOps/CI/CD pipelines Advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases. Expertise in processing and analyzing large data workloads Experience in designing and implementing scalable Data Warehouse solutions to support analytical and reporting needs. Experience with API development and design with REST or GraphQL.
Experience building and optimizing 'big data' data pipelines, architectures, and data sets. Experience with big data tools: Spark, Kafka, etc. Experience with relational SQL and NoSQL databases. Experience with data pipeline and workflow management tools. Experience with AWS cloud services. If you are interested, drop your resume at mojesh.p@acesoftlabs.com or call 9701971793.
Posted 1 week ago
10.0 - 12.0 years
16 - 20 Lacs
hyderabad
Work from Office
Job Description Summary Responsible for architecting solutions using Microservices. Responsible for multiple software projects as a project leader or internal consultant. Job Description Roles and Responsibilities Engage with our utility clients to understand the impacts of distributed energy resources (DERs) on the distribution system by providing our utility clients with solutions to manage and plan for the emergence of more DERs. Engage with our utility clients to understand their business needs and how our solution can align, meet, and exceed those business needs. Explain how effective Requirements Management leads to a complete set of approved requirements that are traceable to both the solution design and QA test cases. Work collaboratively with a strong team of Technical Leads and Solution Specialists to provide input on solution architecture based on client needs and solution capabilities. Define success criteria and author solution diagrams for the project. Deploy GE GridOS DERMS solutions to client environments to support project use cases and DER planning scenarios. Prepare input data such as network model files, load & generation forecasts, future planning scenarios, and generation cost data. Prepare, modify, and configure client and testing input data for solutions by creating scripts and automating data processing systems. Convert and validate client CIM distribution network models using developed tools and scripts, validating power flow results. Lead the custom adapter, connector, API, and microservice design and development to fully integrate the GE GridOS DERMS solution into the client environment. Fluent in full-stack development, backend services, middleware, and the presentation layer, including UI/UX needs, to deliver custom integrated solutions to our clients. Collaborate with our Service Engineering development lead during the development cycle to ensure all custom deliverables meet defined needs and standards and are on time and on budget. 
Respond to all client inquiries, bugs, and product ideas for in-flight projects and file tickets for the Product team. Lead the simulation of project scenarios utilising combinations of solutions, configuration, and input data. Perform use-case and client-specific testing runs and lead factory and site acceptance testing. Debug software bugs, documenting issues for the testing teams. Automate solution processes and complete workflows using existing API documentation. Analyse and assess data and results of project activities and tasks. Hands-on experience (certification a plus) deploying solutions to the AWS cloud (GE's and/or the client's). Report on project outcomes and present findings to project partners and stakeholders. Outline project goals, objectives, requirements, and technical functionality. Design and document solutions to achieve the identified client requirements and use cases. Facilitate requirements-gathering and analysis workshops that capture functional and non-functional requirements from the client and relay them to the internal product and development teams. Document and design client-specific solution requirements, capturing acceptance criteria and necessary features to meet client business needs, mapping them back to product capabilities. Define the user acceptance testing process and test cases, and articulate progress metrics. Experience with enterprise software full-stack development, including backend development and scripting in Java and Python, and DB connector experience. Work with the Project Managers to develop, document, and standardize new internal and external processes, expectations, and deliverables across multiple projects. Create business cases and new product requirements by understanding client business processes and policies, in some cases including regulatory needs. 
Collaborate with other members of the Solutions team to expand our solution consulting and delivery practice, build standards of excellence, and continuously deliver innovative solution offerings for clients. Use internal software demonstrations to develop user acceptance documentation and training modules. Train utility clients and end users on how to use GE GridOS DERMS to achieve their business cases. Identify project delivery risks, recommending potential mitigation strategies, in collaboration with the Project Manager. Support sales activities on occasion, providing technical demos and identifying possible future upsell opportunities during the course of the project. Participate in pre-sales activities, including leading the SOW deliverables and assumptions creation. Required Qualification: Master's or Bachelor's degree in engineering or equivalent. A degree in power systems is a plus. AWS Certification. DevSecOps hands-on experience. Cybersecurity hands-on experience. Familiarity with cloud-based solutions and deployment activities (Azure and/or GCP). Utility integration experience (e.g. SCADA, ADMS, OT, OMS, etc.). Experience with power system analysis software (e.g. OpenDSS, CYME, PowerFactory, Synergi, etc.). Desired Characteristics: 10 to 12 years of strong electricity industry experience related to software solution architecting. You will bring strong IT Enterprise Architecture skills including ITNW, Data Flow, and complex SW application deployments. You understand how to trace a requirement to a design specification and the test plans/cases. You are comfortable automating processes and utilizing or building scripting solutions to support product solutions. You bring software development experience and a strong understanding of the SDLC and of integrating custom solutions into a product, in our case the GE GridOS DERMS product. 
Hands-on Java and Python enterprise application development. You are highly familiar with emerging energy industry trends and their implications for utility clients in DER management, distribution planning, IT, SCADA, and asset management, with a strong background in analysis. Innovation. A genuine interest in new tools and technology. You learn new software quickly without extensive documentation or hand-holding. Client Focus. You enjoy being in front of clients, listening to their needs. You are deeply focused on ensuring their success. You can create powerful user stories detailing the needs of your clients. Growth Mindset. You are deeply curious and love to ask questions. You're a lifelong learner. Communication. Strong written and verbal communication style. Can effectively share complex technical topics with various levels of audience. Problem Solving. You can quickly understand and analyze various approaches and processes and are able to configure solutions to client needs given existing product functionality. You can drill down to the details, obtaining the right level of specificity for your team. You can creatively solve complex problems. Excellence. You get things done within project deadlines, and with a strong focus on quality. Teamwork. You are a natural collaborator and demonstrate a 'we before me' attitude. Note: To comply with US immigration and other legal requirements, it is necessary to specify the minimum number of years' experience required for any role based within the USA. For roles outside of the USA, to ensure compliance with applicable legislation, the JDs should focus on the substantive level of experience required for the role and a minimum number of years should NOT be used. This Job Description is intended to provide a high-level guide to the role. 
However, it is not intended to amend or otherwise restrict/expand the duties required from each individual employee as set out in their respective employment contract and/or as otherwise agreed between an employee and their manager. Additional Information Relocation Assistance Provided: Yes
Posted 1 week ago
6.0 - 9.0 years
20 - 27 Lacs
hyderabad, bengaluru
Hybrid
Job Description Role & Responsibilities: Manage and administer the Informatica IDMC environment, including CDI, CDQ, CDGC, Marketplace, and Metadata Command Center. Perform user creation, folder creation, DB connection configuration, and other day-to-day administration tasks. Monitor and troubleshoot end-to-end data integration workflows including data sources, transformations, operating systems, and app servers. Work with the Informatica Support team on ticket resolution. Conduct active monitoring of platform usage and forecast scaling requirements. Mentor peers and application developers, explaining technical issues to non-technical stakeholders. Ensure effective collaboration with infrastructure teams for a stable operating environment. Must-Have Skills: 4-6 years of hands-on Informatica IDMC administration. Modules: CDGC, CDMP, CDQ, CDI, Metadata Command Center. Proficiency with Linux commands / basic shell scripting. Knowledge of MS SQL Server. Experience in Informatica MDM administration. Understanding of AWS Cloud architecture. Nice-to-Have Skills: Prior experience with Informatica Axon and Informatica EDC. Knowledge of the Informatica security model and IDMC architecture. AD authentication models, LDAP integration. SAML/SSO configuration at Org/Sub-Org level. Familiarity with the IDMC IPU license model.
Posted 1 week ago
8.0 - 13.0 years
20 - 25 Lacs
bengaluru
Hybrid
Work Location - Remote. Lead Time - Immediate joiner. Experience - 8+ years. Skills: Linux administrator – 5 years' experience. AWS Cloud – 5 years' experience in an infrastructure-as-code environment. Linux shell scripting (bash) – 5 years' experience. Python experience (a plus).
Posted 1 week ago
6.0 - 9.0 years
9 - 19 Lacs
hyderabad, bengaluru
Work from Office
Informatica IDMC Administration (CDI, CDQ, CDGC, Marketplace), Informatica Axon, Informatica EDC, Basic Linux. Good-to-have skills: MS SQL Server, Informatica MDM Administration, AWS Cloud. Must-have skills: Informatica IDMC Administration (CDI, CDQ, CDGC, Marketplace), Informatica Axon, Informatica EDC, Basic Linux. Key Responsibilities: Work with infrastructure teams to ensure an effective operating environment. Ability to mentor others, including peers and application developers; ability to clearly explain complex technical issues to business analysts and end users. Ability to monitor and troubleshoot the end-to-end data integration environment comprised of data sources, data workflows and transformations, operating systems, applications, app servers, and networks. Day-to-day administration activities such as user creation, folder creation, and configuring DB connections. Able to raise case tickets and work with the Informatica support team. Active monitoring of actual and forecasted platform usage. Must-have skills: Hands-on experience with Informatica MDM administration. Hands-on experience with the below modules within IDMC (Intelligent Data Management Cloud) administration (minimum 4-6 years): CDGC (Informatica Cloud Data Governance & Catalog), CDMP (Cloud Data Marketplace), CDQ (Cloud Data Quality), CDI (Cloud Data Integration), Metadata Command Center. Linux commands / basic shell scripting. MS SQL Server. Ability to create and configure scan resources of various types, metadata extraction, metadata profiling, business terms association, attribute management, data domains, and rules. Understanding of AWS cloud architecture. Nice-to-have skills: Prior experience with the on-prem Informatica Enterprise Data Catalog (EDC) and/or Axon tools. Understanding of the Informatica software security model and the overall IDMC architecture. AD authentication models and LDAP integration. SAML/SSO configuration at IDMC Org/Sub-Org level. Basic understanding of the IDMC IPU license model.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
Join us as a Data Engineer responsible for supporting the successful delivery of Location Strategy projects to plan, budget, agreed quality, and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. To be successful as a Data Engineer, you should have experience with: - Hands-on experience in PySpark and strong knowledge of Dataframes, RDD, and SparkSQL. - Hands-on experience in developing, testing, and maintaining applications on AWS Cloud. - Strong command of the AWS Data Analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena). - Design and implementation of scalable and efficient data transformation/storage solutions using Snowflake. - Experience in data ingestion to Snowflake for different storage formats such as Parquet, Iceberg, JSON, CSV, etc. - Experience in using DBT (Data Build Tool) with Snowflake for ELT pipeline development. - Experience in writing advanced SQL and PL/SQL programs. - Hands-on experience building reusable components using Snowflake and AWS tools/technology. - Should have worked on at least two major project implementations. - Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage. - Experience in using orchestration tools such as Apache Airflow or Snowflake Tasks is an added advantage. - Knowledge of the Ab Initio ETL tool is a plus. Some other highly valued skills may include: - Ability to engage with stakeholders, elicit requirements/user stories, and translate requirements into ETL components. - Ability to understand the infrastructure setup and provide solutions either individually or working with teams. - Good knowledge of Data Marts and Data Warehousing concepts. - Possess good analytical and interpersonal skills. 
- Implement a Cloud-based Enterprise data warehouse with multiple data platforms alongside Snowflake and NoSQL environments to build a data movement strategy. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, digital and technology, as well as job-specific technical skills. The role is based out of Chennai. Purpose of the role: To build and maintain the systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure. Accountabilities: - Building and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data. - Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. - Development of processing and analysis algorithms fit for the intended data complexity and volumes. - Collaboration with data scientists to build and deploy machine learning models. Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in the assigned area of expertise, with a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energize and inspire, A – Align across the enterprise, D – Develop others. 
OR for an individual contributor, they develop technical expertise in the work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision-making within your own area of expertise. Take ownership of managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility following relevant rules, regulations, and codes of conduct. Maintain and continually build an understanding of how your sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organization's sub-function. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organization. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge, and Drive – the operating manual for how we behave.
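The DataFrame filter-and-aggregate work this Data Engineer role describes can be illustrated without a Spark cluster. Below is a plain-Python sketch of the same pattern; the record fields and desk names are invented for illustration, and in PySpark the equivalent would be a `filter` followed by `groupBy().agg()`:

```python
from collections import defaultdict

# Toy records standing in for a DataFrame; field names are hypothetical.
rows = [
    {"desk": "rates", "notional": 120.0, "valid": True},
    {"desk": "rates", "notional": 80.0, "valid": False},
    {"desk": "fx", "notional": 50.0, "valid": True},
]

def filter_aggregate(records):
    """Drop invalid rows, then sum notional per desk -- the shape of a
    typical SparkSQL GROUP BY after a WHERE clause."""
    totals = defaultdict(float)
    for r in records:
        if r["valid"]:
            totals[r["desk"]] += r["notional"]
    return dict(totals)

print(filter_aggregate(rows))  # {'rates': 120.0, 'fx': 50.0}
```

The same transformation expressed on a real Spark DataFrame distributes across executors, but the logical plan is identical.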
Posted 1 week ago
6.0 - 11.0 years
15 - 30 Lacs
hyderabad, bengaluru
Work from Office
Dear Candidate, Warm Greetings for the day. We are hiring AWS Cloud Engineering professionals for one of our BIG 4 Client. Role: Cloud Engineer AWS Exp: 6 - 12 yrs Location: Bangalore, Hyderabad Mode of Employment: CTH (Contract To Hire) Hybrid Working. Notice Period : We are looking for quick Joiners, max 15 days Notice. Key Responsibilities: Design, develop, and maintain continuous integration and continuous delivery (CI/CD) pipelines using AWS services and DevOps tools. Automate infrastructure provisioning and management using Infrastructure as Code (IaC) tools such as AWS CloudFormation, Terraform, or AWS CDK. Manage source control and collaboration platforms such as AWS CodeCommit, GitHub, or Bitbucket. Implement container orchestration and management using Amazon ECS, EKS, and Docker. Monitor system performance, application health, and troubleshoot infrastructure and deployment issues using AWS CloudWatch, CloudTrail, and other monitoring tools. Collaborate with cross-functional teams to integrate automated testing, security scanning, and deployment workflows. Optimize cloud infrastructure costs and performance. Implement security best practices, including identity and access management (IAM), encryption, and compliance controls within AWS. Perform configuration management using tools like Ansible, Chef, or Puppet. Produce and maintain clear documentation of architectures, workflows, and DevOps processes. Support disaster recovery plans and ensure infrastructure resilience. Mandatory Skills: Strong hands-on experience with AWS services, including but not limited to EC2, S3, IAM, RDS, Lambda, CloudFormation, CodePipeline, CodeBuild, CodeDeploy, CloudWatch, and EKS. Proficiency with CI/CD tools and pipelines, including Jenkins, GitLab CI, AWS CodePipeline, or similar. Experience with Infrastructure as Code (IaC) tools such as AWS CloudFormation, Terraform, or AWS CDK. Good scripting skills in languages such as Python, Bash, or PowerShell. 
Containerization knowledge and experience with Docker and container orchestration platforms like Kubernetes/EKS. Familiarity with version control systems – Git preferred. Knowledge of networking concepts, security best practices, and AWS Identity and Access Management (IAM). Experience with monitoring, logging, and alerting tools native to AWS or open source. Understanding of Agile development practices. Strong analytical and problem-solving skills. Effective communication and collaboration skills.
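Several of the services this posting lists (Lambda, API Gateway, CodePipeline) come together in even the smallest serverless deployment. As a hedged illustration, here is a minimal Lambda-style handler that can be exercised locally with a fake API Gateway proxy event; the `name` payload key is a made-up example, not any real API's contract:

```python
import json

def handler(event, context):
    """Minimal AWS Lambda handler for an API Gateway proxy event:
    parses the JSON body and returns a JSON response."""
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Invoked locally with a fake event -- no AWS account needed:
resp = handler({"body": json.dumps({"name": "ops"})}, None)
print(resp["statusCode"], resp["body"])
```

Keeping the handler a pure function like this makes it unit-testable in a CI/CD pipeline before any deployment stage runs.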
Posted 1 week ago
5.0 - 8.0 years
12 - 22 Lacs
pune
Work from Office
KFORCE Walk-in Hiring Drive Pune. KFORCE India is hosting a 10-day Walk-in Drive at our Pune office from 10th to 21st September 2025, and we are on the lookout for brilliant minds to join our force. Venue: Knowledgeforce IT Services India Pvt. Ltd, https://lnkd.in/gsWQRKxK Dates: 10th to 21st September 2025 (Full-day walk-in) Register here: https://lnkd.in/g3bQc_2Z At KFORCE, we believe in passionate tech professionals who thrive on challenges, push boundaries, and build what's next. This is your chance to explore exciting opportunities, meet our teams, and experience the KFORCE culture first-hand. We're excited to meet you. Will we see you at the drive? #KFORCE #WalkInDrive #TechJobs #AI #FullStack #SDET #GCP #DataCenter #ProductOwner #EngineeringManager #SeniorAI #AutomationTesting #HiringInPune #KnowledgeForce #TechCareers #FutureWithKFORCE Job Description: Back End Engineer. Location: Pune. Work Model: Hybrid/Remote with collaboration across global teams. Role Overview: We are seeking an experienced Back End Engineer to lead the build-out of scalable, distributed systems on AWS. This role requires deep expertise in cloud-native architectures, 12-Factor App methodology, and event-driven design patterns. You will play a critical role in both hands-on engineering and strategic architectural decisions, working closely with product, design, and cross-functional stakeholders to deliver high-impact, resilient, and secure solutions. Key Responsibilities: Technical Leadership & Architecture: Lead the design and implementation of 12-Factor cloud-native applications optimized for scalability and continuous deployment. Architect and build distributed systems using C#/.NET and AWS-native services. Design and orchestrate event-driven architectures leveraging AWS EventBridge, Step Functions, Lambda, SNS, and SQS. Implement microservices architectures with well-defined service boundaries and asynchronous communication. 
Ensure solutions align with the AWS Well-Architected Framework pillars: operational excellence, security, reliability, performance efficiency, cost optimization, and sustainability. Collaborate with Product Managers, UX Designers, and business stakeholders for requirements gathering and timely project delivery. Observability & Performance Excellence Develop telemetry strategies using OpenTelemetry, CloudWatch, and X-Ray. Implement continuous improvement practices based on operational data and business outcomes. Required Qualifications Technical Expertise (Must-Have) 3+ years of software development experience. Hands-on expertise in 12-Factor App methodology . Strong proficiency in AWS services : EventBridge, Step Functions, Lambda, ECS/EKS, DynamoDB, RDS Aurora. Advanced programming skills in C#/.NET, TypeScript, RESTful APIs . Working knowledge of Angular (v10+) , TypeScript, HTML5, and CSS3. Experience with DevSecOps practices including container security. Proficiency with SQL & NoSQL databases . Solid understanding of SDLC, Git, CI/CD, and DevOps practices . Proven experience in microservices and event-driven architectures in production. Leadership & Collaboration (Must-Have) Strong cross-functional collaboration with Product, UX, and business teams . Preferred Qualifications Multi-cloud exposure ( Azure, Google Cloud ). Knowledge of Domain-Driven Design (DDD), CQRS, Event Sourcing . Hands-on with Kubernetes/EKS, Docker, cloud-native deployments .
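The 12-Factor App methodology this role calls for starts with factor III: configuration lives in the environment, never in code. A minimal sketch of that pattern (the variable names and defaults here are illustrative, not any real service's):

```python
import os

def load_config(env=os.environ):
    """12-Factor config: every deploy-specific value is read from
    environment variables, with safe local-development defaults."""
    return {
        "db_url": env.get("DATABASE_URL", "sqlite:///:memory:"),
        "queue_name": env.get("EVENT_QUEUE", "orders-events"),
        "debug": env.get("DEBUG", "false").lower() == "true",
    }

# The same code runs unchanged in dev, staging, and prod -- only the
# environment differs:
cfg = load_config({"DATABASE_URL": "postgres://db/prod", "DEBUG": "true"})
print(cfg)
```

Passing the environment as a parameter also keeps the loader trivially unit-testable, which matters for the continuous-deployment pipelines the posting describes.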
Posted 1 week ago
8.0 - 13.0 years
20 - 35 Lacs
ahmedabad
Remote
AI Architect Armakuni India Role Summary: Armakuni India is seeking a visionary AI Architect with deep technical expertise in AI/ML systems, Generative AI, and cloud-native architecture. This role demands hands-on leadership to define, design, and scale next-generation AI solutions, including LLMs, computer vision systems, and enterprise-grade ML platforms. You will drive innovation across the AI stack, shaping the future of data-driven product development at Armakuni. Key Responsibilities: Architectural Leadership * Define and drive AI/ML architectural vision aligned with business objectives. * Lead the design of scalable, secure, and cost-optimized AI platforms on AWS or similar clouds. * Evaluate and select appropriate models, tools, and infrastructure for enterprise deployment. Team & Technical Mentorship * Guide and mentor engineers, data scientists, and MLOps professionals. * Conduct architecture reviews, enforce coding and deployment standards. * Foster a culture of experimentation, ownership, and knowledge sharing. Generative AI & LLMs * Architect and deliver GenAI solutions leveraging LLMs like GPT-4, Claude, Gemini, LLaMA, etc. * Implement agentic workflows, RAG pipelines, and domain-specific LLM fine-tuning. * Use advanced frameworks like LangChain, LangGraph, LangFuse, Crew AI, LLamaIndex. Machine Learning Engineering * Build robust ML pipelines: preprocessing, model training, evaluation, and drift monitoring. * Apply classical ML, deep learning, and time series forecasting to solve real-world problems. * Deploy and manage models using SageMaker and other MLOps tools. Computer Vision (CV) * Architect CV pipelines for image classification, object detection, segmentation, and more. * Optimize models for cloud and edge deployment using PyTorch/TensorFlow. Cloud & Infrastructure * Lead cloud-native deployments using AWS (SageMaker, Bedrock, Lambda, etc.). * Use containerization tools (Docker, Kubernetes) for scalable infrastructure. 
* Integrate with RESTful APIs, vector databases (Pinecone, FAISS), and caching layers. Core Skills & Technologies * Languages & Libraries: Python, Scikit-learn, XGBoost, PyTorch, TensorFlow * Generative AI Tools: LangChain, LangGraph, LLamaIndex, LangFuse, Crew AI * Databases: PostgreSQL, DynamoDB, Redis, Chroma * MLOps & Deployment: SageMaker, MLflow, FastAPI, Docker, Uvicorn * Vector Search: Pinecone, FAISS, OpenSearch * Cloud Platforms: AWS (preferred), GCP, Azure Preferred Qualifications * 8+ years in AI/ML, with 4+ years in GenAI and LLMs * Proven track record in leading enterprise-grade AI deployments * Open-source contributions or publications in AI/ML * AWS AI/ML certification or equivalent is a strong plus
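The retrieval step of the RAG pipelines and vector search this role covers reduces to nearest-neighbour search over embeddings. A toy sketch with hand-made 3-d vectors standing in for a real embedding model and vector store (Pinecone or FAISS in production; the documents and vectors below are invented):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Tiny corpus with made-up "embeddings"; a real pipeline would embed
# documents with a model and store them in a vector database.
docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.2],
}

def retrieve(query_vec, k=1):
    """Retrieval step of a RAG pipeline: rank docs by similarity to the
    query embedding, return the top k to feed into the LLM prompt."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(retrieve([0.85, 0.15, 0.0]))  # ['refund policy']
```

The retrieved passages would then be concatenated into the LLM prompt, which is the "augmented generation" half of RAG.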
Posted 1 week ago
10.0 - 17.0 years
27 - 35 Lacs
noida
Work from Office
Strong hands-on experience with .NET Core and C#. Design and implement scalable, robust, and secure solutions using .NET Core on AWS Cloud. Collaborate with product managers, developers, and DevOps teams to align architecture with business requirements. Required Candidate profile: Experience with AWS Cloud Services such as EC2, Lambda, S3, RDS, API Gateway, and CloudFormation. Define and document architectural patterns, technical standards, and best practices. DevOps tools and CI/CD pipelines.
Posted 1 week ago
12.0 - 18.0 years
40 - 70 Lacs
pune
Hybrid
SaaS product delivery on Cloud (AWS, GCP, Azure). Lead a team of junior architects to design, code, test, and deliver e2e features. Daily collaboration with US client. Drive efficiency using Copilot | Cursor. Oversee QA strategy, tools, and techniques. Required Candidate profile: Recent full-stack experience: Python, ReactJS, PostgreSQL, RDBMS, NoSQL, and CI/CD tools. Experience in AWS, Azure, or GCP; K8s, EKS. Has led the growth of 1-3 new products to large customer and user bases.
Posted 1 week ago
8.0 - 12.0 years
7 - 9 Lacs
kolkata
Work from Office
Responsibilities: Lead the design, architecture, and development of cloud-based web and mobile applications focused on the education use case. Work hands-on with cloud platforms (AWS, Azure), modern front-end/back-end stacks, and mobile development.
Posted 1 week ago
8.0 - 10.0 years
18 - 22 Lacs
gurugram
Work from Office
Department: GPS. Reports To: Senior Project Manager. Level: Grade 5. About your team: GPS Delivery supports a variety of customers, including Retail, Workplace, and Institutional investors in the wholesale, direct investing, and advised channels. Our team of colleagues are split across many locations worldwide, and we all strive to ensure our customers get the best possible service. About your role: The role would curate the technology engineering backlog to cover key capability gaps and give this the shape of a programme with a clear, prioritised, agreed scope and agenda. The role will leverage/guide the existing technology experts to get this prioritised backlog delivered for the build-out of new technical capabilities. The key outcomes shall be (but are not limited to) tech and engineering readiness for onboarding Cloud services, including application containerisation, and the build-out of new customer experience tooling as desired by digital propositions. The selected candidate shall be nothing short of a role-model technology expert in the web application development space. The role shall be responsible for tech guidelines, best practices, and reference implementations for existing and new tech. The role shall ensure that other tech experts are well guided/mentored and that the work gets delivered to tech strategy and guidelines. Experience and Qualifications: 12-16 years; BE/BTech. Essential Skills: Expert-level experience in creating well-architected applications on AWS Cloud. Multi-year experience developing and designing cloud-native applications, preferably targeted at Amazon Web Services (AWS). Expertise in designing and running applications which run in containers (Docker) and systems which are composed of such application containers. Experience with container management products and with running a full SDLC on container-hosted applications. 
Expertise and experience around microservices-based architectures. The candidate needs to have rich experience around engineering skills, CI/CD, and build/deployment automation tools. Should have rich exposure to web, application, and message-queuing platforms. Should have rich knowledge of design and architecture patterns. Should be an expert craftsman on the test-first model of Test-Driven Development (TDD). Has the ability to envision designs for complex functional and technical problems while incrementally evolving the implementation in simple, progressive steps of delivery. You are adept at pair programming and can comfortably pair with other developers irrespective of maturity/experience levels. You are eager to coach/guide juniors to improve delivery capability. You are at ease with understanding/supporting code written by others and with learning/trying the wider technology sets and problem-solving methods used in a project. Desired Skills/Experience: Contributions to industry in the form of conference presentations, published papers/blogs, contributions to open source, etc. You are highly skilled and efficient with web application development technologies like Java and Python. You are highly skilled and efficient with RIA/single-page application (SPA) development technologies like React or Angular.
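The test-first discipline of TDD that this role emphasises can be shown in miniature: write the failing assertions first, then the smallest implementation that makes them pass. A sketch (the `slugify` function is a made-up example, not from any real codebase):

```python
# Step 1 (red): the tests are written before any implementation exists.
def test_slugify():
    assert slugify("Hello World") == "hello-world"
    assert slugify("  CI CD  ") == "ci-cd"

# Step 2 (green): the minimal implementation that satisfies the tests.
def slugify(text):
    """Lowercase the text and join whitespace-separated words with '-'."""
    return "-".join(text.lower().split())

# Step 3: run the tests; refactor only once they pass.
test_slugify()
print("tests pass")
```

Each new requirement repeats the red-green-refactor loop, which is what keeps the incremental, progressive deliveries the posting describes safe.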
Posted 1 week ago
7.0 - 10.0 years
10 - 16 Lacs
noida, gurugram, delhi / ncr
Work from Office
Role & responsibilities Job Summary: We are looking for a skilled AWS Cloud Engineer to join our IT Infrastructure & Cloud Services team. The ideal candidate will have hands-on experience designing, deploying, and managing scalable, secure, and reliable cloud infrastructure using AWS services. Key Responsibilities: • Design and implement cloud infrastructure solutions on AWS, including EC2, S3, RDS, Lambda, VPC, and IAM. • Automate infrastructure provisioning and management using tools like CloudFormation, Terraform, or AWS CDK. • Monitor, troubleshoot, and optimize cloud environments for performance, cost, and security. • Implement and manage CI/CD pipelines using AWS CodePipeline, CodeBuild, and third-party tools like Jenkins. • Collaborate with DevOps, Security, and Application teams to ensure smooth cloud operations and deployments. • Ensure compliance with cloud governance, security best practices, and data protection standards. • Maintain and improve disaster recovery and high availability strategies. Required Qualifications: • Bachelor's degree in Computer Science, Engineering, or a related field. • 3+ years of hands-on experience with AWS cloud services. • Strong scripting and automation skills (Python, Bash, PowerShell). • Experience with containerization (Docker, ECS, or EKS). • Familiarity with networking concepts: VPC, VPN, Direct Connect, Route 53. • Understanding of identity and access management (IAM), KMS, and security best practices in AWS. • Experience with monitoring tools (CloudWatch, Datadog, etc.). Preferred Qualifications: • AWS Certified Solutions Architect Associate or equivalent certification. • Experience with hybrid cloud environments and AWS Outposts. • Knowledge of DevSecOps principles and tools (e.g., AWS Inspector, GuardDuty). • Experience in cost optimization using AWS Cost Explorer and Trusted Advisor.
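The IAM work this role references comes down to least-privilege policy documents, which are plain JSON and can be generated and inspected programmatically. A sketch following the documented IAM policy grammar; the bucket name is a placeholder:

```python
import json

def s3_read_policy(bucket):
    """Least-privilege IAM policy granting read-only access to a single
    S3 bucket: ListBucket on the bucket, GetObject on its objects."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{bucket}",        # the bucket itself
                f"arn:aws:s3:::{bucket}/*",      # objects in the bucket
            ],
        }],
    }

print(json.dumps(s3_read_policy("app-logs"), indent=2))
```

Generating policies in code like this makes the least-privilege scope reviewable and repeatable, which is the same motivation behind the IaC tools (CloudFormation, Terraform, CDK) listed above.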
Posted 1 week ago