4.0 - 8.0 years
8 - 12 Lacs
hyderabad
Work from Office
Job requisition ID: R010531

AVEVA creates software trusted by over 90% of leading industrial companies.

Job Title: Oracle Database Administrator
Location: Hyderabad
Employment Type: Full time

The job
As part of our global ERM development team, you'll collaborate with a team of skilled software engineers in designing and implementing both on-premise and cloud solutions. You'll act as an expert and trusted advisor who provides guidance on transforming legacy database implementations to the cloud and migrating away from Oracle to PostgreSQL. This varied role will see you work closely with clients, partners, and other internal teams to ensure consulting engagements are successful.

Key responsibilities
- Performance tuning: troubleshoot performance problems and fine-tune databases and indexes in Oracle.
- Support cloud databases hosted in AWS and Azure for our SaaS solutions, as well as our on-premise infrastructure for development.
- Participate in Scrum meetings with other team members, including sprint planning and estimating, backlog refinement, daily Scrum meetings, sprint retrospectives, and sprint reviews.
- Support developers in performance tuning PL/SQL functions, packages, and procedures.
- Take part in the journey to move the product further towards a cloud-native solution.

Ideal experience
- A solid understanding of cloud databases and managed services, preferably in AWS and Azure.
- Experience working with Postgres, and NoSQL databases such as MongoDB, DynamoDB, and Cosmos DB.
- Understanding of data structures and algorithms.
- Good working experience in performance tuning, troubleshooting, and debugging in Oracle.
- Understanding of clean code, SOLID principles, and design patterns.
- Good communication skills, working with a broad range of people, including Product Owners and Testers.
- Extensive experience in managing and tuning Oracle databases.
- Positive approach to problem solving with a can-do attitude.
- Ability to switch between working as an individual contributor and as a team player in cross-cultural, distributed teams.
- Experience with software development methodologies and processes such as Agile, Scrum, and Kanban is desirable.
- Experience in writing unit test cases is an added advantage.
- Experience working with databases in containers such as Docker and Kubernetes is desirable too.

Great skills to have
Organization: The pace at AVEVA can be exciting and fast, so while you will need excellent time management and effective prioritisation, we will do all we can to support a balanced portfolio of work and your wellbeing.
Customer focus: You'll be working directly with our customers. Listening to them, understanding their requests, and then addressing them in a proactive and consultative manner will be part of your day-to-day.
Problem-solving: You'll need to enjoy getting stuck into problems. Troubleshooting and solving challenging problems is a big part of this role.

R&D at AVEVA
Our global team of 2,000+ developers works on an incredibly diverse portfolio of over 75 industrial automation and engineering products, covering everything from data management to 3D design. AI and cloud are at the centre of our strategy, and we have over 150 patents to our name. Our track record of innovation is no fluke; it's the result of a structured and deliberate focus on learning, collaboration, and inclusivity. If you want to build applications that solve big problems, join us. Find out more: aveva.com/en/about/careers/r-and-d-careers/

India benefits include: gratuity, medical and accidental insurance, very attractive leave entitlement, emergency leave days, childcare support, maternity, paternity and adoption leaves, an education assistance program, home office set-up support (for hybrid roles), and well-being support.

It's possible we're hiring for this position in multiple countries, in which case the above benefits apply to the primary location.
Specific benefits vary by country, but our packages are similarly comprehensive. Find out more: aveva.com/en/about/careers/benefits/

Hybrid working
By default, employees are expected to be in their local AVEVA office three days a week, but some positions are fully office-based. Roles supporting particular customers or markets are sometimes remote.

Hiring process
Interested? Great! Get started by submitting your cover letter and CV through our application portal. AVEVA is committed to recruiting and retaining people with disabilities. Please let us know in advance if you need reasonable support during your application process. Find out more: aveva.com/en/about/careers/hiring-process
Posted 2 days ago
12.0 - 15.0 years
10 - 14 Lacs
hyderabad
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Spring Boot, Java Standard Edition, Oracle Procedural Language Extensions to SQL (PL/SQL), Amazon Web Services (AWS)
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are robust and efficient. You will also participate in code reviews and contribute to the continuous improvement of development practices, ensuring that the applications you build are not only functional but also scalable and maintainable.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Mentor junior professionals to foster their growth and development.
Professional & Technical Skills:
- Java, J2EE, and related frameworks such as Spring, ORM, etc.
- Microservices architecture and REST patterns using leading industry-recommended security frameworks.
- Cloud and related technologies such as AWS, Google, Azure.
- UI development experience using Angular/TypeScript/JavaScript technologies.
- Test automation skills using Behavior-Driven Development.
- Data integration (batch, real-time) following Enterprise Integration Patterns.
- Service-Oriented Architecture (event-driven SOA, Web Services, REST, ESB).
- Message-driven architecture (JMS, SOA, Spring).
- Relational databases, NoSQL databases, DynamoDB, and data modeling.
- Database development & tuning (PL/SQL/XQuery).
- Performance (threading, indexing, clustering, caching).
- Document-centric data architecture (XML DB/NoSQL).
- Knowledge of reporting/BI tools such as Tableau would be a plus.
- Full-stack engineering mindset with a passion for excelling in all areas of the software development life cycle, such as analysis, design, development, automated testing, and DevOps.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Spring Boot.
- This position is based at our Hyderabad office.
- 15 years of full time education is required.

Qualification: 15 years full time education
Posted 2 days ago
4.0 - 6.0 years
6 - 14 Lacs
coimbatore
Work from Office
Hiring a Full Stack Engineer (React, React Native, Python, AWS) with 4-6 yrs experience for an on-site role at Sense7AI. Work with offshore clients, flexible IST/EST hours. Strong API and AWS Serverless skills needed. Immediate joiners preferred. Contact: hr@sense7ai.com

Benefits: Health insurance, flexi working, cafeteria, work from home
Posted 2 days ago
8.0 - 12.0 years
17 - 18 Lacs
bengaluru
Work from Office
Must-Haves:
- Overall technology experience of 8+ years
- Minimum experience of 5 years in data modelling and database design
- Minimum experience of 7 years in designing, implementing, and supporting medium to large scale database systems
- Minimum experience of 5 years in designing, developing, and supporting solutions using S3, Redshift, DynamoDB, and any of the managed RDS engines
- Minimum experience of 4 years designing, developing, and tuning solutions using AWS database and storage technologies

Preferred:
- Prior experience with designing, developing, and supporting solutions using database technologies like MySQL, PostgreSQL, and Cassandra is a plus
- Experience with designing, developing, and supporting solutions using MapReduce, Kafka, and streaming technologies is a plus
- Advanced Python programming skills are a plus

Roles & Responsibilities:
- Understand the business domain, core data objects, and data entities; model the relationships between the various entities
- Design the data warehouse, data mart, and transactional databases, including all facets of load parameters
- Build aspects of high performance, security, usability, operability, maintainability, traceability, observability, and evolvability into the system design
- Assess performance-influencing parameters such as normalization, de-normalization, most-executed transactions, record count, data size, and I/O parameters at the database and OS level in the database and table designs
- Maintain a catalog of meta, master, transactional, and reference data
- Tune the transactions and queries, and determine the use of appropriate client libraries and fetch mechanisms (such as queries vs. stored procedures)
- Design the system for resilience, fail-over, and self-healing, and institute rollback plans
- Develop and test database code and other core and helper utilities in Python
- Develop and profile queries, triggers, indices, and stored procedures
- Monitor the health of queries and identify patterns leading to bottlenecks in the system before the customer finds them
- Own the DevOps and release management practices pertaining to the database solutions
- Estimate the cost of AWS services usage and look to continuously optimize the cost
- Design and develop a data REST API layer in Python
Posted 2 days ago
2.0 - 5.0 years
5 - 12 Lacs
chennai
Work from Office
We're looking for a hands-on AWS Engineer to join our growing engineering team. In this role, you'll be working on event-driven architectures using AWS serverless and container services to build scalable, real-time customer interaction platforms using Amazon Chime and Amazon Connect. You'll collaborate closely with product, design, and infrastructure teams to deliver end-to-end solutions with both backend and frontend responsibilities.

Responsibilities:
- Design and implement event-driven services using AWS Lambda, Step Functions, and ECS Fargate.
- Build features on top of Amazon Connect and Amazon Chime.
- Develop backend APIs and services using Node.js or Python.
- Build and maintain frontend applications in React or Angular, including chat or video integration components.
- Manage relational and NoSQL data models using Amazon RDS and DynamoDB.
- Collaborate with other engineers and stakeholders to design and ship features that solve real-world problems.
- Ensure code quality through testing, reviews, and observability best practices.

Required Skills:
- Strong frontend development skills in React or Angular
- Solid experience in event-driven architecture using AWS Lambda, Step Functions, and ECS Fargate (or similar container platforms)
- Hands-on experience with both Amazon RDS (PostgreSQL or MySQL preferred) and Amazon DynamoDB
- Proficiency in at least one programming language, preferably Node.js or Python
- Experience with Amazon Connect, Amazon Chime, or similar services
- Comfortable working with AWS IAM roles, policies, and security best practices

Nice to Have:
- Infrastructure-as-code experience with CDK, CloudFormation, or Terraform
- Exposure to real-time communication protocols (WebRTC)
- Familiarity with monitoring and tracing tools (CloudWatch, X-Ray)
- Experience with CI/CD pipelines and automated testing

Candidates with relevant AWS Developer or AWS Professional certifications will be given preference.
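The event-driven services this role centers on typically reduce to small handlers that react to incoming event records. A minimal sketch of such a handler in Python, in the style of an AWS Lambda function; the event shape and field names below are illustrative assumptions, not part of the posting:

```python
import json

def handle_order_event(event: dict) -> dict:
    """Minimal AWS Lambda-style handler sketch: react to event records.

    The event shape mimics an SQS/EventBridge-style payload; the
    "order.created" type and field names are illustrative assumptions.
    """
    results = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])
        # Route on event type, as an event-driven service would.
        if body.get("type") == "order.created":
            results.append({"orderId": body["orderId"], "status": "ACCEPTED"})
        else:
            results.append({"orderId": body.get("orderId"), "status": "IGNORED"})
    return {"statusCode": 200, "body": json.dumps(results)}

if __name__ == "__main__":
    event = {"Records": [{"body": json.dumps({"type": "order.created", "orderId": "o-1"})}]}
    print(handle_order_event(event))
```

In a real deployment the handler would be wired to a queue or event bus, and Step Functions would orchestrate multi-step flows across several such handlers.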
Posted 2 days ago
3.0 - 5.0 years
12 - 16 Lacs
kochi
Work from Office
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Develop efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies for various use cases built on the platform.
- Develop streaming pipelines.
- Work with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Total 3-5+ years of experience in data management (DW, DL, data platform, lakehouse) and data engineering skills
- Minimum 4+ years of experience in big data technologies with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on cloud data platforms on AWS
- Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB
- Good to excellent SQL skills

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark certified developers.
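The ingest-process-transform pipeline pattern this role describes can be sketched in plain Python; a PySpark job would follow the same three stages with DataFrames instead of generators. The record fields and cleaning rule below are illustrative assumptions:

```python
# Minimal ingest -> transform -> load sketch of a data pipeline.
# In production this would be a Spark/PySpark job; plain Python keeps
# the stages visible. Field names are illustrative assumptions.

def ingest(lines):
    """Parse raw CSV-style lines into records (ingest stage)."""
    for line in lines:
        user_id, amount = line.strip().split(",")
        yield {"user_id": user_id, "amount": float(amount)}

def transform(records):
    """Filter bad rows and derive a field (transform stage)."""
    for rec in records:
        if rec["amount"] >= 0:  # drop corrupt negative amounts
            rec["amount_cents"] = int(rec["amount"] * 100)
            yield rec

def load(records):
    """Aggregate per user, standing in for a write to a warehouse table."""
    totals = {}
    for rec in records:
        totals[rec["user_id"]] = totals.get(rec["user_id"], 0) + rec["amount_cents"]
    return totals

raw = ["u1,10.50", "u2,3.00", "u1,-1.00", "u1,2.25"]
print(load(transform(ingest(raw))))  # {'u1': 1275, 'u2': 300}
```

Because each stage is a generator, records stream through one at a time rather than materializing in memory, which is the same laziness Spark gives you at cluster scale.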
Posted 2 days ago
3.0 - 7.0 years
10 - 14 Lacs
kochi
Work from Office
As a Software Developer, you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: work on end-to-end feature development and solve challenges faced in the implementation.
- Stakeholder collaboration and issue resolution: collaborate with key stakeholders, internal and external, to understand problems and issues with the product and features, and solve the issues as per defined SLAs.
- Continuous learning and technology integration: be eager to learn new technologies and implement them in feature development.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience with technologies like Spring Boot and Java
- Demonstrated technical leadership experience on high-impact customer-facing projects
- Experience building web applications in the Java/J2EE stack, and experience with a UI framework such as React JS
- Working knowledge of a messaging system (Kafka preferred)
- Experience designing and integrating REST APIs using Spring Boot

Preferred technical and professional experience:
- Strong experience in concurrent design and multi-threading, object-oriented programming (OOP), SQL Server / Oracle / MySQL, and working knowledge of Azure or AWS cloud
- Experience building applications in a container-based environment (Docker/Kubernetes) on AWS Cloud
- Basic knowledge of SQL or NoSQL databases (Postgres, MongoDB, DynamoDB preferred), design, and queries
Posted 2 days ago
4.0 - 9.0 years
12 - 16 Lacs
kochi
Work from Office
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Develop efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies for various use cases built on the platform.
- Develop streaming pipelines.
- Work with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Minimum 4+ years of experience in big data technologies with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on cloud data platforms on AWS
- Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB
- Good to excellent SQL skills
- Exposure to streaming solutions and message brokers such as Kafka

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark certified developers.
Posted 2 days ago
5.0 - 10.0 years
14 - 17 Lacs
pune
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating source-to-target pipelines/workflows and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need it.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Must have 5+ years of experience in big data: Hadoop, Spark, Scala, Python, HBase, Hive
- Good to have: AWS S3, Athena, DynamoDB, Lambda, Jenkins, Git
- Developed Python and PySpark programs for data analysis
- Good working experience with Python to develop a custom framework for generating rules (like a rules engine)
- Developed Python code to gather data from HBase and designed solutions implemented using PySpark
- Used Apache Spark DataFrames/RDDs to apply business transformations and utilized Hive context objects to perform read/write operations

Preferred technical and professional experience:
- Understanding of DevOps
- Experience building scalable end-to-end data ingestion and processing solutions
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala
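The "custom framework for generating rules" mentioned above is essentially a small rules engine: each rule pairs a name with a predicate, and every record collects the names of the rules it violates. A minimal sketch in Python; the rule names and record fields are illustrative assumptions:

```python
# Minimal rules-engine sketch: each rule is a predicate plus a label,
# and records collect the labels of every rule they violate.
# Rule names and record fields are illustrative assumptions.

def make_rule(name, predicate):
    """Bundle a human-readable name with a predicate over a record."""
    return {"name": name, "predicate": predicate}

RULES = [
    make_rule("missing_amount", lambda rec: rec.get("amount") is None),
    make_rule("negative_amount",
              lambda rec: rec.get("amount") is not None and rec["amount"] < 0),
    make_rule("blank_user", lambda rec: not rec.get("user_id")),
]

def apply_rules(record, rules=RULES):
    """Return the names of all rules the record violates, in rule order."""
    return [r["name"] for r in rules if r["predicate"](record)]

print(apply_rules({"user_id": "u1", "amount": -5}))  # ['negative_amount']
print(apply_rules({"user_id": "", "amount": None}))  # ['missing_amount', 'blank_user']
```

Keeping rules as data rather than hard-coded `if` chains is what lets such a framework "generate" rules: new checks can be appended, loaded from configuration, or broadcast to PySpark workers without touching the evaluation loop.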
Posted 2 days ago
5.0 - 8.0 years
12 - 16 Lacs
kochi
Work from Office
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Develop efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies for various use cases built on the platform.
- Develop streaming pipelines.
- Work with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Total 5-8+ years of experience in data management (DW, DL, data platform, lakehouse) and data engineering skills
- Minimum 6+ years of experience in big data technologies with extensive data engineering experience in Spark with Python or Scala
- Minimum 5 years of experience on cloud data platforms on AWS
- Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB
- Good to excellent SQL skills

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark certified developers.
Posted 2 days ago
12.0 - 18.0 years
20 - 25 Lacs
pune
Hybrid
What's the role all about?
We are seeking a highly experienced Senior Specialist Architect with deep expertise in enterprise application architecture, AI/ML integration, and AWS cloud services. In this role, you will take complete ownership of designing scalable, secure, and high-performing applications, embedding AI capabilities, and guiding junior architects to deliver innovative, future-ready solutions. You will serve as the technical thought leader for AWS-based architecture and AI-enabled design, ensuring alignment with business objectives, compliance standards, and emerging technology trends.

How will you make an impact?
- Own and drive end-to-end architecture for AI-powered enterprise applications hosted on AWS.
- Design scalable, secure, and maintainable systems using AWS services such as EC2, Lambda, S3, RDS, DynamoDB, API Gateway, SageMaker, Bedrock, Step Functions, and CloudFormation.
- Integrate AI/ML models (LLMs, NLP, predictive analytics) using AWS AI/ML services (SageMaker, Bedrock, Comprehend, Rekognition, Lex).
- Conduct architectural reviews to ensure AWS best practices, Well-Architected Framework compliance, and cost optimization.

AI & Cloud Solution Expertise
- Architect and implement AI-driven features within cloud-native applications.
- Establish MLOps pipelines for model training, deployment, and monitoring on AWS.
- Ensure AI solutions are explainable, ethical, and compliant with enterprise and regulatory guidelines.

Mentorship & Collaboration
- Coach and mentor junior architects in AWS services, AI integration patterns, and architectural design principles.
- Lead cross-functional design sessions and knowledge-sharing workshops.
- Provide technical guidance for resolving complex AWS and AI-related design challenges.

Strategic Planning
- Partner with stakeholders to define technology roadmaps combining AI innovations and AWS capabilities.
- Conduct POCs to validate AI models, AWS services, and architectural patterns.
- Drive cloud migration strategies and modernization initiatives.

Have you got what it takes?
- Experience: 13-15 years in software/application architecture, with at least 3-5 years in AI/ML and AWS cloud design.
- Strong expertise in AWS cloud services, architecture patterns, and Well-Architected Framework principles.
- Proficiency in AI frameworks (TensorFlow, PyTorch, Hugging Face, LangChain) and AWS AI/ML services (SageMaker, Bedrock, Comprehend, Rekognition, Lex).
- Hands-on experience with microservices, serverless computing, event-driven architecture, and API design.
- Skilled in MLOps, data engineering, and model deployment pipelines.
- Excellent mentoring skills with the ability to develop the capabilities of junior architects.
- Bachelor's or Master's degree in Computer Science, Engineering, AI, or a related field.

Key Attributes
- Visionary leader with the ability to blend AI innovation, AWS capabilities, and enterprise needs.
- Strong ownership mindset with accountability for solution success.
- Excellent communicator, capable of influencing technical and business stakeholders.
- Advocate for AWS best practices, AI ethics, and cost-efficient cloud adoption.

What's in it for you?
Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy NICE-FLEX!
At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.
Requisition ID: 8379
Reporting into: Director, Engineering, CX
Role Type: Individual Contributor
Posted 2 days ago
2.0 - 7.0 years
10 - 20 Lacs
noida
Hybrid
Position Summary:
Pentair is currently seeking a Managed Services CloudOps engineer for IoT projects in the Smart Products & IoT Strategic Innovation Centre in India team. This role is responsible for supporting managed services and application/product operations for IoT projects.

Duties & Responsibilities:
- Apply best practices and strategies for production application and infrastructure maintenance (provisioning, alerting, monitoring, etc.).
- Knowledge of the purpose of the various environments: QA, UAT/Staging, Prod.
- Understanding of Git and AWS CodeCommit, including repositories and the hotfix and sequential configuration process.
- Understanding and use of CI/CD pipelines.
- AWS CLI use and implementation.
- Ensure proactive monitoring of applications and AWS infrastructure: real-time monitoring of AWS services, CloudWatch alert configuration, and alert configuration in third-party tools like New Relic, Datadog, and Splunk.
- Awareness of pre- and post-deployment changesets in AWS infrastructure.
- Manage cloud environments in accordance with company security guidelines, including config register management.
- Daily data monitoring of deployed services.
- Apply security best practices to deployed infrastructure, and suggest regular optimization of infrastructure by upscaling and downscaling.
- Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures.
- Lambda log and API log configuration, with a good understanding of CloudWatch Logs Insights.
- Educate teams on the implementation of new cloud-based initiatives, providing associated training as required.
- Employ exceptional problem-solving skills, with the ability to see and solve issues before they affect business productivity.
- Experience with CloudOps processes.
- Participate in all aspects of the software development life cycle for AWS solutions, including planning, requirements, development, testing, and quality assurance.
- Billing management and analysis for various AWS accounts, with alert configuration as per defined thresholds.
Understanding of the AWS billing console; able to analyze daily/monthly costs of on-demand services. Python and Bash scripting is a must, to automate regular tasks such as fetching data from S3/DynamoDB and job deployment.

Qualifications and Experience:
Mandatory:
- Bachelor's degree in Electrical Engineering, Software Engineering, Computer Science, Computer Engineering, or a related engineering discipline.
- 2+ years of experience in cloud operations and monitoring of AWS serverless services.
- 1+ years of experience in the smart/connected products and IoT workflow.
- Hands-on experience troubleshooting mobile or web app issues.
- Experience with the AWS platform, or certified in AWS (SysOps or DevOps).
- Serverless/headless architecture: Lambda, API Gateway, Kinesis, Elasticsearch, ElastiCache, DynamoDB, Athena, AWS IoT, CodeCommit, CloudTrail, CodeBuild.
- Understanding of CloudFormation templates for configuration changes.
- NoSQL databases (DynamoDB preferred).
- Trouble-ticketing tools (Jira Software and Jira Service Desk preferred).
- Good hands-on experience with scripting languages and tools: Python, Bash, Node, Git Bash, CodeCommit.
- Experience with impact analysis for infrastructure configuration changes.

Preferred:
- Hands-on experience with New Relic / Kibana / Splunk and AWS CloudWatch tools.
- Prior experience in operations support for IoT projects (50,000+ live devices) will be an added advantage.
- Experience with the AWS IoT Core platform.
- L2 support experience in addition to CloudOps.

Skills and Abilities Required:
- Willingness to work in a 24x7 shift environment.
- Flexibility to take short-term travel on short notice to facilitate field trials and soft launches of products.
- Excellent troubleshooting and analytical skills.
- Highly customer-focused and always eager to find ways to enhance the customer experience.
- Able to pinpoint business needs and deliver innovative solutions.
- Can-do, positive attitude, always looking to accelerate development.
- Self-driven and committed to high standards of performance, demonstrating personal ownership for getting the job done.
- Innovative and entrepreneurial attitude; stays up to speed on the latest technologies and industry trends; healthy curiosity to evaluate, understand, and utilize new technologies.
- Excellent verbal and written communication skills.
Posted 2 days ago
7.0 - 12.0 years
20 - 35 Lacs
gurugram
Work from Office
Qualification: B.Tech
Timings: 9 am to 6 pm; Mon & Fri (WFH), Tue/Wed/Thu (WFO)

Job Overview:
We are seeking an experienced Java Lead with over 7 years of hands-on experience in Java development, who will take ownership of designing and building scalable logging solutions. The ideal candidate should possess strong knowledge of partitioning, data sharding, and database management (both SQL and NoSQL) and should be well-versed in AWS cloud services. This is a critical role where you will lead a team to build reliable and efficient systems while ensuring high performance and scalability.

Key Responsibilities:
- Lead Java development: architect, design, and implement backend services using Java, ensuring high performance, scalability, and reliability.
- Logging solutions: build and maintain robust logging solutions that can handle large-scale data while ensuring efficient retrieval and storage.
- Database expertise: implement partitioning and data-sharding techniques, and optimize the use of SQL (MySQL, PostgreSQL) and NoSQL databases (MongoDB, DynamoDB). Ensure database performance tuning, query optimization, and data integrity.
- Cloud deployment: utilize AWS cloud services such as EC2, RDS, S3, Lambda, and CloudWatch to design scalable, secure, and high-availability solutions. Manage cloud-based infrastructure and deployments to ensure seamless operations.
- Collaboration & leadership: lead and mentor a team of engineers, providing technical guidance and enforcing best practices in coding, performance optimization, and design. Collaborate with cross-functional teams including product management, DevOps, and QA to ensure seamless integration and deployment of features.
- Performance monitoring: implement solutions for monitoring and ensuring the health of the system in production environments.
- Innovation & optimization: continuously improve system architecture to enhance performance, scalability, and reliability.
Required Skills & Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, or related fields.
- Experience: 7+ years of hands-on experience in Java (J2EE/Spring/Hibernate) development.
- Database skills: strong experience with both SQL (MySQL, PostgreSQL) and NoSQL databases (MongoDB, Cassandra, DynamoDB); proficiency in partitioning and data sharding.
- AWS expertise: deep understanding of AWS cloud services including EC2, S3, RDS, CloudWatch, and Lambda; hands-on experience deploying and managing applications on AWS.
- Logging and monitoring: experience building and managing large-scale logging solutions (e.g., the ELK stack, CloudWatch Logs).
- Leadership: proven track record of leading teams, mentoring junior engineers, and handling large-scale, complex projects.
- Problem-solving: strong analytical and problem-solving skills, with the ability to debug and troubleshoot large, complex systems.
- Soft skills: excellent communication, leadership, and teamwork skills; ability to work in a fast-paced, dynamic environment.

Preferred Qualifications:
- Experience with containerization technologies like Docker and orchestration tools like Kubernetes.
- Familiarity with microservices architecture and event-driven systems.
- Knowledge of CI/CD pipelines and DevOps practices.
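The partitioning and data-sharding knowledge this role calls for boils down to routing each record to a shard deterministically. A minimal hash-sharding sketch in Python (the shard count and keys are illustrative assumptions; a production Java system would apply the same idea):

```python
import hashlib

def shard_for(key: str, num_shards: int = 4) -> int:
    """Map a record key to a shard index deterministically.

    A stable hash (MD5 here, rather than Python's per-process
    randomized hash()) ensures the same key always routes to the
    same shard across processes and restarts.
    """
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards

# Route some log records to shards by user ID (illustrative keys).
keys = ["user-1", "user-2", "user-3"]
placement = {key: shard_for(key) for key in keys}
print(placement)
```

Note that plain modulo sharding reshuffles most keys when the shard count changes; schemes such as consistent hashing are commonly used to limit that movement when resharding a growing logging cluster.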
Posted 2 days ago
3.0 - 4.0 years
8 - 10 Lacs
Mumbai
Work from Office
An AWS DevOps Architect designs and manages the DevOps environment for an organization, ensuring that software development and IT operations are integrated seamlessly.
Responsibilities
DevOps strategy: Develop and implement the DevOps strategy and roadmap
Automation: Automate the provisioning, configuration, and management of infrastructure components
Cloud architecture: Design and manage the cloud and infrastructure architecture
Security: Implement security measures and compliance controls
Collaboration: Foster collaboration between development, operations, and other cross-functional teams
Continuous improvement: Regularly review and analyze DevOps processes and practices
Reporting: Provide regular reports on infrastructure performance, costs, and security to management
Skills and experience
Experience with AWS container services such as ECS and EKS, and with Kubernetes
Knowledge of scripting languages like Python
Experience with DevOps tools and technologies like Jenkins, Terraform, and Ansible
Experience with CI/CD pipelines
Experience with cloud governance standards and best practices
Note: we need strong DevOps tool implementation experts on the AWS platform (Jenkins, Terraform, and other DevOps tools).
Posted 2 days ago
1.0 - 3.0 years
3 - 5 Lacs
Dhule
Work from Office
MANDATORY SKILLS: Python, AWS
PREFERRED SKILLS:
Good to have knowledge of ADAS (automotive)
Should have hands-on AWS experience, or migration experience with AWS knowledge, covering cloud-based applications and pipelines
Python, AWS, DynamoDB, OpenSearch, Terraform (good to have), ReactJS, JavaScript, TypeScript, HTML, CSS, Agile, Git, CI/CD, Docker, Linux (good to have)
Good to have AWS certification
Posted 2 days ago
5.0 - 8.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Backend Developer Responsibilities & Skills
Position Title: Backend Developer
Position Type: Full-time, permanent
Location: Bengaluru, India
Company Description: Privaini is the pioneer of privacy risk management for companies and their entire business networks. Privaini offers a unique "outside-in approach", empowering companies to gain a comprehensive understanding of both internal and external privacy risks. It provides actionable insights using a data-driven, systematic, and automated approach to proactively address reputation and legal risks related to data privacy. Privaini generates an AI-powered privacy profile and privacy score for a company from externally observable privacy, corporate, regulatory, historical event, and security data. Without the need for time-consuming questionnaires or installing any software, Privaini creates standardized privacy views of companies from externally observable information. Then Privaini builds a privacy risk posture for every business partner in the company's business network and continuously monitors each one. Our platform provides actionable insights that privacy & risk teams can readily implement. Be part of an exciting team of researchers, developers, and data scientists focused on the mission of building transparency in data privacy risks for companies and their business networks.
Key Responsibilities
Strong Python, Flask, REST API, and NoSQL skills. Familiarity with Docker is a plus. AWS Developer Associate certification is required; AWS Professional certification is preferred.
Architect, build, and maintain secure, scalable backend services on AWS platforms.
Utilize core AWS services like Lambda, DynamoDB, API Gateway, and serverless technologies.
Design and deliver RESTful APIs using the Python Flask framework.
Leverage NoSQL databases and design efficient data models for large user bases.
Integrate with web service APIs and external systems.
Apply AWS SageMaker for machine learning and analytics (optional but preferred).
Collaborate effectively with diverse teams (business analysts, data scientists, etc.).
Troubleshoot and resolve technical issues within distributed systems.
Employ Agile methodologies (JIRA, Git) and adhere to best practices.
Continuously learn and adapt to new technologies and industry standards.
Qualifications
A bachelor's degree in computer science, information technology, or any relevant discipline is required. A master's degree is preferred.
At least 6 years of development experience, with 5+ years of experience in AWS.
Must have demonstrated skills in planning, designing, developing, architecting, and implementing applications.
Additional Information
At Privaini Software India Private Limited, we value diversity and always treat all employees and job applicants based on merit, qualifications, competence, and talent. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
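The serverless REST pattern this posting describes (Python handlers behind Lambda and API Gateway) can be sketched without any AWS dependency. The event shape below mimics an API Gateway proxy event, and all names are illustrative assumptions:

```python
import json

def lambda_handler(event, context):
    """Minimal API-Gateway-proxy-style handler: greet by query parameter."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Invoke locally with a mock event -- no AWS account needed for unit tests.
resp = lambda_handler({"queryStringParameters": {"name": "dev"}}, None)
assert resp["statusCode"] == 200
assert json.loads(resp["body"]) == {"message": "hello, dev"}
```

Because the handler is a plain function taking a dict, it can be unit-tested directly, which is one practical reason serverless services stay testable without cloud access.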
Posted 2 days ago
6.0 - 11.0 years
9 - 13 Lacs
Chennai
Work from Office
About the Team: We are a motivated team in central R&D at CVS helping to change the game through product digitalization and vehicle intelligence. Our focus is on building solutions for truck, bus, and trailer OEMs, considering both onboard and offboard (SaaS & PaaS) needs and requirements.
Purpose:
Connect the vehicle
(Cyber)secure the vehicle
Master the vehicle architecture
Diagnose the vehicle
Gain intelligence from the vehicle
What you can look forward to as Fullstack Developer:
Design, develop, and deploy scalable applications using AWS serverless (Lambda, API Gateway, DynamoDB, etc.) and container technologies (ECS, EKS, Fargate).
Build and maintain RESTful APIs and microservices architectures in .NET Core (Entity Framework).
Write clean, maintainable code in Node.js, JavaScript, C#, React JS, or React Native.
Work with both SQL and NoSQL databases to design efficient data models.
Apply Object-Oriented Analysis (OOA) and Object-Oriented Design (OOD) principles in software development.
Utilize multi-threading and messaging patterns to build robust distributed systems.
Collaborate using Git and follow Agile methodologies and Lean principles.
Participate in code reviews and architecture discussions, and contribute to continuous improvement.
Your profile as Tech Lead:
Bachelor's or Master's degree in Computer Science or a related field.
Minimum 6+ years of hands-on software development experience.
Strong understanding of AWS cloud hosting technologies and best practices.
Proficiency in at least one of the following: Node.js, JavaScript, C#, React (JS/Native).
Experience with REST APIs, microservices, and cloud-native application development.
Familiarity with design patterns, messaging systems, and distributed architectures.
Strong problem-solving skills and a passion for optimizing business solutions.
Posted 2 days ago
3.0 - 8.0 years
5 - 12 Lacs
Bengaluru
Hybrid
Job Title: AWS Developer
Experience: 3+ Years
Location: Bangalore (Hybrid)
Job Description
We are seeking an AWS Developer with a minimum of 3 years of experience in designing, developing, and maintaining cloud-native applications on AWS. The ideal candidate will have strong skills in serverless architectures, microservices, and containerized applications, along with hands-on experience in AWS core services and DevOps practices.
Key Responsibilities
Design, develop, and maintain cloud-native applications using AWS services such as Lambda, API Gateway, DynamoDB, S3, SQS, SNS, ECS, and EKS.
Implement serverless architectures, microservices, and containerized solutions on AWS.
Work with cloud-native principles and ensure adherence to best practices for scalability and security.
Build and manage CI/CD pipelines for automated deployment and testing.
Collaborate with product owners, designers, and cross-functional teams to gather requirements and deliver solutions.
Create and maintain technical documentation for application design, deployment, and configurations.
Ensure performance, security, and compliance of deployed applications.
Required Skills
Minimum 3 years of hands-on experience with AWS services related to computing, databases, storage, networking, and security.
Strong proficiency in one or more programming languages: .NET, Node.js, Java, Go.
Experience with DevOps tools and CI/CD pipeline management.
Knowledge of serverless computing, microservices architecture, and cloud-native design principles.
AWS certification (AWS Certified Developer) preferred.
Interested candidates or references, please send your CV to abhiram.n@techno-facts.com / 6303953729.
Posted 2 days ago
5.0 - 10.0 years
15 - 25 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
Inviting applications for the role of Lead Consultant-Data Engineer (AWS + Python, Spark, Kafka for ETL)!
Responsibilities
Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka.
Integrate structured and unstructured data from various data sources into data lakes and data warehouses.
Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark with appropriate cloud-based services such as Amazon AWS.
Build data pipelines by building ETL processes (Extract-Transform-Load).
Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs.
Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements.
Participate in design reviews to provide input on functional requirements, product designs, schedules, and/or potential problems.
Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability and improved security.
Perform unit testing on the modified software to ensure that the new functionality works as expected while existing functionality continues to work in the same way.
Coordinate with release management and other supporting teams to deploy changes to the production environment.
Qualifications we seek in you!
Minimum Qualifications
Experience in designing and implementing data pipelines, building data applications, and data migration on AWS.
Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift.
Experience with Databricks will be an added advantage.
Strong experience in Python and SQL.
Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.
Advanced programming skills in Python for data processing and automation.
Hands-on experience with Apache Spark for large-scale data processing.
Experience with Apache Kafka for real-time data streaming and event processing.
Proficiency in SQL for data querying and transformation.
Strong understanding of security principles and best practices for cloud-based environments.
Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment.
Strong communication and collaboration skills to work effectively with cross-functional teams.
Preferred Qualifications/Skills
Master's degree in Computer Science, Electronics, or Electrical Engineering.
AWS Data Engineering & Cloud certifications, Databricks certifications.
Experience with multiple data integration technologies and cloud platforms.
Knowledge of the Change & Incident Management process.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
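The extract-transform-load flow at the heart of this role can be sketched in plain Python. In the job itself this would run on Spark or Glue against S3 or Kafka; the record shape here is invented for illustration:

```python
def extract(source):
    # Extract: stream raw records (a list stands in for S3/Kafka here).
    yield from source

def transform(records):
    # Transform: drop malformed rows and normalize the amount field.
    for rec in records:
        if "amount" not in rec:
            continue
        yield {"id": rec["id"], "amount": float(rec["amount"])}

def load(records):
    # Load: materialize into a target (a list stands in for Redshift/DynamoDB).
    return list(records)

raw = [{"id": 1, "amount": "10.5"}, {"id": 2}, {"id": 3, "amount": "7.25"}]
result = load(transform(extract(raw)))
assert result == [{"id": 1, "amount": 10.5}, {"id": 3, "amount": 7.25}]
```

Because each stage is a generator, records stream through one at a time, loosely analogous to how Spark and Kafka consumers process data incrementally rather than holding whole datasets in memory.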
Posted 2 days ago
4.0 - 9.0 years
6 - 11 Lacs
Bengaluru
Work from Office
Why this job matters
Cloud Native Java Developer - Individually contribute to and drive the transformation of our existing Java microservices deployed on Amazon Elastic Kubernetes Service (EKS) to serverless AWS Lambda functions. The roles and responsibilities are below.
What you'll be doing
Key Responsibilities
Develop and deploy serverless applications using Quarkus/Spring Boot and AWS Lambda
Build RESTful APIs and event-driven microservices using cloud-native patterns
Optimize cold-start performance using GraalVM native images
Integrate with AWS services such as AWS API Gateway, S3, DynamoDB, CloudWatch, and Postgres
Implement and manage Lambda authorizers (custom and token-based) for securing APIs
Design and configure AWS API Gateway for routing, throttling, and securing endpoints
Integrate OAuth 2.0 authentication flows using Azure Active Directory as the identity provider
Decent understanding of resilience patterns
Write unit and integration tests using JUnit, Mockito, and Quarkus testing tools
Collaborate with DevOps teams to automate deployments using AWS SAM, CDK, or Terraform
Monitor and troubleshoot production issues using AWS observability tools
Migration Responsibilities
Analyse existing Spring Boot microservices deployed on Kubernetes to identify candidates for serverless migration
Refactor services to be stateless, event-driven, and optimized for short-lived execution
Replace Kubernetes ingress and service discovery with API Gateway and Lambda triggers
Migrate persistent state and configuration to AWS-native services (e.g., DynamoDB, S3, Secrets Manager)
Redesign CI/CD pipelines to support serverless deployment workflows
Ensure performance, cost-efficiency, and scalability in the new architecture
Document migration strategies, patterns, and best practices for future reference
Technical Proficiency
Strong industry experience of 4+ years with command of Java 8+, with deep understanding of:
Functional interfaces (Function, Predicate, Supplier, Consumer)
Streams API, lambda expressions, and Optional
Proficiency in Java concurrency, including:
Thread management, ExecutorService, CompletableFuture, and parallel streams
Designing thread-safe components and understanding concurrency pitfalls
Understanding of AWS EKS (Elastic Kubernetes Service), Docker containers, and Kubernetes fundamentals:
Experience with resource requests and limits, pod autoscaling, and K8s networking
Familiarity with transitioning workloads from EKS to serverless environments.
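The ExecutorService / CompletableFuture pattern named above has a close analog in Python's concurrent.futures, sketched here in the Python used elsewhere in these listings (the worker function is invented for the example):

```python
from concurrent.futures import ThreadPoolExecutor

def square(n: int) -> int:
    # Stand-in for a unit of work submitted to the pool.
    return n * n

# Submit tasks and collect futures -- the same submit/get shape as
# Java's ExecutorService.submit(...) followed by Future.get().
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(square, n) for n in range(5)]
    results = [f.result() for f in futures]

assert results == [0, 1, 4, 9, 16]
```

The `with` block joins the pool on exit, so by the time `results` is read every future has completed, mirroring an ExecutorService shutdown-and-await.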
Posted 3 days ago
5.0 - 10.0 years
5 - 9 Lacs
Coimbatore
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: AWS Architecture
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: Should be a graduate and AWS certified
Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. You will play a crucial role in ensuring that the applications are designed to meet the needs of the organization and its stakeholders. Your typical day will involve collaborating with various teams, analyzing requirements, and designing innovative solutions to address business challenges.
Roles & Responsibilities:
- Expected to be an SME, collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Collaborate with stakeholders to gather requirements and understand business processes.
- Design and develop applications that meet the business process and application requirements.
- Ensure the applications are scalable, secure, and efficient.
- Conduct code reviews and provide guidance to the development team.
- Stay updated with the latest industry trends and technologies.
- Assist in troubleshooting and resolving application issues.
- Document application designs, processes, and procedures.
- Train and mentor junior team members to enhance their skills and knowledge.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in AWS Architecture.
- Good To Have Skills: Experience with cloud platforms such as Azure or Google Cloud.
- Strong understanding of cloud computing concepts and architecture.
- Experience in designing and implementing scalable and secure cloud solutions.
- Knowledge of AWS services such as EC2, S3, Lambda, RDS, and DynamoDB.
- Familiarity with infrastructure as code tools like CloudFormation or Terraform.
- Experience in designing and implementing CI/CD pipelines.
- Excellent problem-solving and analytical skills.
Additional Information:
- The candidate should have a minimum of 5 years of experience in AWS Architecture.
- This position is based at our Coimbatore office.
- A graduate degree is required and AWS certification is preferred.
Qualification: Should be a graduate and AWS certified
Posted 3 days ago
2.0 - 7.0 years
13 - 18 Lacs
Pune
Work from Office
The Customer, Sales & Service Practice | Cloud
Job Title: Amazon Connect + Level 11 (Analyst) + Entity (S&C GN)
Management Level: Level 11 - Analyst
Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai
Must have skills: AWS contact center, Amazon Connect flows, AWS Lambda and Lex bots, Amazon Connect Contact Center
Good to have skills: AWS Lambda and Lex bots, Amazon Connect
Experience: Minimum 2 year(s) of experience is required
Educational Qualification: Engineering Degree or MBA from a Tier 1 or Tier 2 institute
Join our team of Customer Sales & Service consultants who solve customer-facing challenges at clients spanning sales, service, and marketing to accelerate business change.
Practice: Customer Sales & Service Sales | Areas of Work: Cloud AWS Cloud Contact Center Transformation, Analysis and Implementation | Level: Analyst | Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai | Years of Exp: 2-5 years
Explore an Exciting Career at Accenture
Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build, and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest?
Then, this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice.
The Practice: A Brief Sketch
The Customer Sales & Service Consulting practice is aligned to the Capability Network practice of Accenture and works with clients across their marketing, sales, and services functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce, and Next-Generation Customer Care.
These services help our clients become living businesses by optimizing their marketing, sales, and customer service strategy, thereby driving cost reduction, revenue enhancement, and customer satisfaction, and impacting front-end business metrics in a positive manner. You will work closely with our clients as consulting professionals who design, build, and implement initiatives that can help enhance business performance. As part of these, you will drive the following:
Work on creating business cases for journey to cloud, cloud strategy, and cloud contact center vendor assessment activities
Work on creating the cloud transformation approach for contact center transformations
Work along with Solution Architects on architecting cloud contact center technology with the AWS platform
Work on enabling cloud contact center technology platforms for global clients, specifically on Amazon Connect
Work on innovative assets, proofs of concept, and sales demos for the AWS cloud contact center
Support AWS offering leads in responding to RFIs and RFPs
Bring your best skills forward to excel at the role:
Good understanding of the contact center technology landscape
An understanding of the AWS Cloud platform and services, with solution architect skills
Deep expertise in AWS contact-center-relevant services
Sound experience in developing Amazon Connect flows, AWS Lambda, and Lex bots
Deep functional and technical understanding of APIs and related integration experience
Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow, and bot platforms
Ability to understand customer challenges and requirements, and to address them in a differentiated manner
Ability to help the team implement the solution, and to sell and deliver cloud contact center solutions to clients
Excellent communication skills
Ability to develop requirements based on leadership input
Ability to work effectively in a remote, virtual, global environment
Ability to take on new challenges and to be a passionate learner
Read about us: Blogs
What's in it for you?
An opportunity to work on transformative projects with key G2000 clients
Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies
Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional
Personalized training modules to develop your strategy & consulting acumen to grow your skills, industry knowledge, and capabilities
Opportunity to thrive in a culture that is committed to accelerating equality for all
Engage in boundaryless collaboration across the entire organization
About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology, and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 569,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at www.accenture.com
About Accenture Strategy & Consulting: Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models.
Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, as well as the workforce of the future helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers, and communities. This is our unique differentiator. To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Capability Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world. For more information visit https://www.accenture.com/us-en/Careers/capability-network
Accenture Capability Network | Accenture in One Word: come and be a part of our team.
Qualification
Your experience counts!
Bachelor's degree in a related field or equivalent experience; Post-Graduation in Business Management would be an added value.
Minimum 2 years of experience in delivering software-as-a-service or platform-as-a-service projects related to cloud CC service providers such as the Amazon Connect contact center cloud solution
Hands-on experience working on the design, development, and deployment of contact center solutions at scale
Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend, and Transcribe
Working knowledge of one of the programming/scripting languages such as Node.js, Python, or Java
Posted 3 days ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: ASP.NET MVC
Good to have skills: Amazon Web Services (AWS)
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will be pivotal in driving innovation and efficiency within the application development lifecycle, fostering a collaborative environment that encourages creativity and problem-solving.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in ASP.NET MVC.
- Good To Have Skills: Experience with Amazon Web Services (AWS).
- Strong understanding of web application architecture and design patterns.
- Experience with front-end technologies such as HTML, CSS, and JavaScript.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and optimize application performance.
Must:
- AWS: Lambda, DynamoDB, CloudWatch, ...
- GitHub
Must/Nice:
- NodeJS
Nice:
- React
- Azure DevOps
- NewRelic knowledge
- Fintech knowledge
Additional Information:
- The candidate should have minimum 5 years of experience in ASP.NET MVC.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 3 days ago
8.0 - 12.0 years
25 - 37 Lacs
Chennai
Work from Office
Technical Skills
Experience building data transformation pipelines using DBT and SSIS
Moderate programming experience with Python
Moderate experience with AWS Glue
Strong experience with SQL and ability to write efficient code and manage it through Git repositories
Nice-to-have skills
Experience working with SSIS
Experience working in the wealth management industry
Experience in agile development methodologies
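A data-transformation "model" of the kind DBT manages is essentially a SELECT materialized as a view or table over source data. A minimal stdlib sketch, with table and column names invented for the example:

```python
import sqlite3

# Source table (a stand-in for a warehouse table).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [("A", 100.0), ("A", 50.0), ("B", 25.0)],
)

# The "model": a named transformation defined purely in SQL.
conn.execute(
    """
    CREATE VIEW account_totals AS
    SELECT account, SUM(amount) AS total
    FROM trades
    GROUP BY account
    """
)

rows = conn.execute(
    "SELECT account, total FROM account_totals ORDER BY account"
).fetchall()
assert rows == [("A", 150.0), ("B", 25.0)]
```

DBT adds dependency ordering, testing, and versioning (hence the Git emphasis above) around exactly this kind of SQL-defined transformation.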
Posted 3 days ago
4.0 - 6.0 years
12 - 18 Lacs
Gurugram
Work from Office
About the Role: We are looking for a skilled and experienced Backend Developer with 5-6 years of hands-on experience to join our growing technology team. The ideal candidate will be responsible for designing, developing, and maintaining robust backend systems, with a strong focus on scalable microservices and cloud-native applications.
Key Responsibilities:
Design, develop, and deploy backend services and APIs (REST & GraphQL) with a focus on performance, scalability, and reliability.
Build and maintain serverless applications using AWS Chalice / FastAPI / Flask and other AWS services.
Strong experience with AWS services (Lambda, API Gateway, S3, DynamoDB, etc.).
Collaborate with cross-functional teams including front-end developers, product managers, and DevOps engineers.
Write clean, maintainable, and well-tested code in Python.
Hands-on experience with Git commands.
Familiarity with database systems (SQL and NoSQL).
Monitor application performance and troubleshoot production issues.
Participate in code reviews and ensure adherence to best practices and coding standards.
Technical Skills (Good to Have):
Develop microservices architecture and contribute to continuous integration and delivery (CI/CD) pipelines.
Exposure to containerization tools like Docker and orchestration tools like Kubernetes.
Knowledge of security best practices in backend development.
Experience with monitoring and logging tools (e.g., CloudWatch, ELK stack).
Qualifications
4-6 years of work experience in a relevant field.
B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred.
Posted 3 days ago