
1090 S3 Jobs - Page 23

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 10.0 years

0 Lacs

chennai, tamil nadu

On-site

If you are seeking a career at a dynamic company that prioritizes its people and promotes a culture of growth and autonomy, ACV is the ideal place for you. With competitive compensation packages and ample learning and development opportunities, ACV provides the resources you need to advance in your career. Every day, we strive to elevate our standards by investing in our workforce and technology to ensure the success of our customers. We seek individuals who share our passion, bring innovative ideas to the table, and thrive in a collaborative environment.

ACV is a pioneering technology company that has transformed the process of buying and selling cars online for dealers. We are reshaping the automotive industry by leveraging innovation, user-designed solutions, and data-driven applications. Our goal is to establish the most reliable and efficient digital marketplace for sourcing, selling, and managing used vehicles, offering transparency and insights that were previously unimaginable. As disruptors in the industry, we invite you to join our journey of innovation. Our network of brands includes ACV Auctions, ACV Transportation, ClearCar, MAX Digital, ACV Capital, True360, and Data Services.

ACV Auctions in Chennai, India is seeking talented individuals to join our team as we expand our platform. We are offering a variety of exciting opportunities across different roles in corporate, operations, and product and technology. Our global product and technology organization encompasses product management, engineering, data science, machine learning, DevOps, and program leadership. We are united by a strong focus on customer centricity, a persistent approach to solving complex problems, and a shared passion for innovation. If you are eager to grow, lead, and contribute to something greater than yourself, we welcome you to be part of this journey. At ACV, we prioritize the health, physical, financial, social, and emotional wellness of our team members, and we offer leading benefits and wellness programs to support this commitment.

ACV Auctions is currently seeking a Senior Engineer to join our NextGen team and contribute to our SaaS product, MAX Digital. This role will primarily involve backend development and maintenance of NodeJS features and services hosted in AWS, utilizing MongoDB and GraphQL. The ideal candidate will have a solid background and a proven track record in architecting and designing enterprise applications using NodeJS, a commitment to producing high-quality code, and a collaborative approach to development.
Key Responsibilities:
- Architect and design systems from scratch
- Develop features in NodeJS for our NextGen SaaS platform
- Maintain and expand the existing codebase, adhering to established patterns and understanding the code's execution flow
- Lead new feature and refactoring projects
- Assist the team in addressing customer and stakeholder reported issues
- Maintain a strong engineering focus on the current versions of our products used by customers

Requirements:
- Bachelor's Degree in computer science or a related technical discipline, or equivalent
- 6+ years of experience in architecting and designing systems
- Experience collaborating with a geographically dispersed team across multiple time zones
- Excellent communication skills
- Proficiency in MongoDB, GraphQL, SQL, and integrating 3rd-party APIs
- Familiarity with cloud services, particularly AWS RDS, S3, SQS, SNS, Lambdas, Docker/Containers
- Experience working in an Agile environment and applying Agile best practices and ceremonies

Our Values: Trust & Transparency | People First | Positive Experiences | Calm Persistence | Never Settling

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

We are seeking a Senior Software Engineer with over 7 years of experience in Node JS and React JS development, including a minimum of 5 years of expertise in Node JS, JavaScript, CSS3, HTML5, and React JS. Your responsibilities will include designing and developing front-end and back-end services for different business processes. Experience in Python development, Docker, Kubernetes, Helm Charts, GitHub, AWS services (S3, EC2), and databases such as MongoDB/DynamoDB, Redis, and Elasticsearch (Kibana) is preferred, as is proficiency in unit testing and test frameworks like Nodeunit and Mocha.

Your role will involve participating in platform requirements development, contributing to design reviews, and engaging in platform sprint activities. You will be responsible for developing assigned stories, creating unit test cases, participating in peer code reviews, and following Agile Scrum methodology. Excellent communication skills, both in articulating technical challenges and solutions and in collaborating with internal and external resources, are essential. A Bachelor of Engineering degree in a computer-related field is required.

As a Senior Software Engineer, you will play a crucial part in the design and development of various business processes. Your technical expertise in Node JS, React JS, and related technologies will be instrumental in creating efficient front-end and back-end services, and your proficiency in unit testing, version control, and AWS services will ensure the reliability and scalability of the applications you build. Your communication skills and ability to work effectively in a team will be key to successfully executing your responsibilities.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

You will take a lead role as a Cloud App Developer at Wipro Limited, a leading technology services and consulting company that specializes in creating innovative solutions for complex digital transformation needs. With a global presence spanning over 65 countries and a workforce of more than 230,000 employees and partners, Wipro is committed to helping its clients, colleagues, and communities thrive in a dynamic world.

As a Lead Cloud App Developer, you will need expertise in Terraform, AWS, and DevOps, and should hold certifications such as AWS Certified Solutions Architect - Associate and AWS Certified DevOps Engineer - Professional. Drawing on more than 6 years of IT experience, you will set up and maintain ECS solutions and design AWS solutions using services such as VPC, EC2, WAF, ECS, ALB, IAM, and KMS.

You are also expected to have experience with AWS services such as SNS, SQS, EventBridge, RDS, Aurora DB, Postgres DB, DynamoDB, Redis, AWS Glue jobs, and AWS Lambda; CI/CD using Azure DevOps; GitHub for source code management; and building cloud-native applications. Your responsibilities will include working with container technologies like Docker, configuring logging and monitoring solutions such as CloudWatch and OpenSearch, and managing system configurations using Terraform and Terragrunt.

In addition to technical skills, you should bring strong communication and collaboration abilities, be a team player, have excellent analytical and problem-solving skills, and understand Agile methodologies. The role also involves training others in procedural and technical topics, recommending process and architecture improvements, and troubleshooting distributed systems.

Join us at Wipro to be part of our journey to reinvent our business and industry. We are looking for individuals who are inspired by reinvention and committed to evolving themselves, their careers, and their skills. Realize your ambitions at Wipro, a purpose-driven business that empowers you to shape your reinvention. Applications from individuals with disabilities are warmly welcomed.

Experience Required: 5-8 Years

To learn more about Wipro Limited, visit www.wipro.com.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

haryana

On-site

The ideal candidate for this position in Gurugram should have at least 3 years of experience. Your responsibilities will include designing and implementing new features and functionality, establishing and guiding the website's architecture, ensuring high performance and availability, and managing all technical aspects of React and Node JS. You will also help formulate an effective, responsive design and turn it into a working theme and plugin.

To excel in this role, you should have a good understanding of front-end and back-end technologies (React and Node JS), including HTML5, CSS3, JavaScript, and jQuery. You should have experience building user interfaces for websites and/or web applications, designing and developing responsive websites, and be comfortable with debugging tools such as Firebug and Chrome Inspector. You should also be able to understand CSS changes and their ramifications to ensure consistent styling across platforms and browsers, convert comprehensive layouts and wireframes into working HTML pages, and know how to interact with RESTful APIs and formats (JSON, XML).

The required tech stack for this position:
- Frontend: React JS, Fluent UI, Tailwind CSS
- Backend: TypeScript, Node JS
- Common: GraphQL, NX Workspace
- Services: AWS (EC2, Lambda, RDS PostgreSQL, S3)

If you have the necessary skills and qualifications and are looking to work in a dynamic environment where you can contribute to the development of cutting-edge websites, we encourage you to apply for this exciting opportunity.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Data/AWS Engineer at Waters Global Research, you will be part of a dynamic team focused on researching and developing self-diagnosing, self-healing instruments to enhance the user experience of our customers. By leveraging cutting-edge technologies and innovative solutions, you will play a crucial role in advancing our analytical chemistry instruments, which have a direct impact on fields such as laboratory testing, drug discovery, and food safety.

Your primary responsibility will be to develop data pipelines for specialty instrument data and Gen AI processes, train machine learning models for error diagnosis, and automate manual processes to optimize instrument procedures. You will work on projects aimed at interpreting raw data results, cleaning anomalous data, and deploying models in AWS to collect and analyze results effectively.

Key Responsibilities:
- Build data pipelines in AWS using services like S3, Lambda, IoT Core, and EC2.
- Create and maintain dashboards to monitor data health and performance.
- Containerize models and deploy them in AWS for efficient data processing.
- Develop Python data pipelines to handle data frames and matrices, ensuring smooth data ingestion, transformation, and storage.
- Collaborate with machine learning engineers to evaluate data and models, and present findings to stakeholders.
- Mentor team members and review their code to ensure best coding practices and adherence to standards.

Required Qualifications:
- Bachelor's degree in computer science or a related field with 5-8 years of relevant work experience.
- Proficiency in AWS services such as S3, EC2, Lambda, and IAM.
- Experience with containerization and deployment of code in AWS.
- Strong programming skills in Python for OOP and/or functional programming.
- Familiarity with Git, Bash, and the command prompt.
- Ability to drive new capabilities, solutions, and data best practices from technical documentation.
- Excellent communication skills to convey results effectively to non-data scientists.

Desired Qualifications:
- Experience with C#, C++, and .NET is considered a plus.

What We Offer:
- Hybrid role with competitive compensation and great benefits.
- Continuous professional development opportunities.
- An inclusive environment that encourages contributions from all team members.
- Reasonable adjustments to the interview process based on individual needs.

Join Waters Corporation, a global leader in specialty measurement, and be part of a team that drives innovation in chromatography, mass spectrometry, and thermal analysis. With a focus on creating business advantages for industries including life sciences, materials, and food sciences, we aim to transform healthcare delivery, environmental management, food safety, and water quality. At Waters, we empower our employees to unlock their full potential, learn, grow, and make a tangible impact on human health and well-being. We value collaboration, problem-solving, and innovation to address the challenges of today and tomorrow.
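To make the pipeline work described above concrete, here is a minimal illustrative sketch (not from the listing) of an S3-to-DataFrame ingestion step using boto3 and pandas; the bucket, key, and cleaning rule are hypothetical placeholders:

```python
"""Sketch of an S3 ingestion step for raw instrument data."""
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

def load_instrument_data(bucket: str, key: str) -> pd.DataFrame:
    """Fetch a CSV of raw instrument telemetry from S3 into a DataFrame."""
    obj = s3.get_object(Bucket=bucket, Key=key)
    df = pd.read_csv(io.BytesIO(obj["Body"].read()))
    # Drop anomalous rows before any downstream model training.
    return df.dropna()

if __name__ == "__main__":
    frame = load_instrument_data("example-instrument-data", "raw/run-001.csv")
    print(frame.describe())
```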

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

karnataka

On-site

As a Senior Staff Software Engineer in Data Lakehouse Engineering, you will play a crucial role in designing and implementing the Data Lakehouse platform, supporting both Data Engineering and Data Lakehouse applications. Your responsibilities will include overseeing the productionalization of data engineering pipelines: end-to-end data pipelines, model development, deployment, monitoring, refresh, and more. You will also drive technology development and architecture to ensure the platforms, systems, tools, models, and services meet the technical standards for security, quality, reliability, usability, scalability, performance, efficiency, and operability demanded by the evolving needs of Wex and its customers. Balancing near-term and long-term requirements in collaboration with other teams across the organization is essential.

Your technical ownership will extend to Wex's Data Lakehouse data architecture and service technology implementations, with an emphasis on architecture, technical direction, engineering best practices, and quality/compliance. Close collaboration with the Platform Engineering and Data Lakehouse Engineering teams will be a key aspect of the role. The vision behind Wex's Data Lakehouse is a unified, scalable, and intelligent data infrastructure that enables the organization to leverage its data effectively, with goals such as data democratization, agility and scalability, and advanced insights and innovation through Data & AI technology.

We are seeking a highly motivated and experienced software engineer to help build out the Data Lakehouse platform for Wex. Reporting to the Sr. Manager of Data Lakehouse Engineering in Bangalore, the ideal candidate will possess deep technical expertise in building and scaling data lakehouse environments, coupled with strong leadership and communication skills to align efforts across the organization.

Your impact will be significant: you will lead and drive the development of the technology and platform for the company's Data Lakehouse requirements, ensuring the functional richness, reliability, performance, and flexibility of the platform. You will design the architecture, lead the implementation of the Data Lakehouse system and services, and challenge the status quo to drive technical solutions that effectively serve the broad risk area of Wex. Collaboration with engineering teams, information security teams, and external partners will be essential to ensure the security, privacy, and integration of the platform. You will also create, prioritize, manage, and execute roadmaps and project plans, and report on the status of development, quality, operations, and system performance.

The role involves driving the technical vision and strategy of the Data Lakehouse to meet business needs, setting high standards for your team, providing technical guidance and mentorship, and fostering an environment of continuous learning and innovation. Upholding strong engineering principles and ensuring a culture of transparency and inclusion will be integral to your leadership.

To be successful in this role, you should bring at least 10 years of software design and development experience at large scale and strong software development skills in your chosen programming language. Experience with data lakehouse formats, Spark programming, cloud architecture tools and services, CI/CD automation, and agile development practices will be advantageous, as will excellent analytical skills, mentorship capabilities, and strong written and verbal communication skills.

In terms of personal characteristics, you should demonstrate a collaborative, mission-driven style, high standards of integrity and corporate stewardship, and the ability to operate in a fast-paced entrepreneurial environment. Leading with empathy, fostering a culture of trust and transparency, and communicating effectively in various settings will be key to your success, as will talent development and scouting abilities, intellectual curiosity, learning agility, and the capacity to drive change through influence and stakeholder management across a complex business environment.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

kochi, kerala

On-site

As a Senior Java Developer, you will be required to communicate technical concepts effectively in English, both verbally and in writing. With over 6 years of commercial Java experience, you should be able to write efficient, testable, and maintainable Java code following best practices and patterns for the implementation, build, and deployment of Java services. Your expertise should extend to the Java ecosystem and related technologies, including Spring Boot, the Spring frameworks, Hibernate, and Maven.

Proficiency in Test-Driven Development (TDD) and exposure to Behavior-Driven Development (BDD) are essential, as is familiarity with version control tools like Git, project management tools such as JIRA and Confluence, and continuous integration tools like Jenkins. You should have a solid background in building RESTful services within microservices architectures and working in cloud-based environments, preferably AWS. Knowledge of both NoSQL and relational databases, especially PostgreSQL, is crucial. Experience in developing services using event- or stream-based systems like SQS, Kafka, or Pulsar, and knowledge of CQRS principles, is desirable.

A strong foundation in Computer Science fundamentals and software patterns is necessary for this role. Additionally, experience with AWS services like Lambda, SQS, S3, and Rekognition Face Liveness, as well as familiarity with Camunda BPMN, would be advantageous.

This is a senior-level position offering a competitive salary ranging from 25 to 40 LPA. If you meet these qualifications and are eager to contribute your skills to a dynamic team, we encourage you to apply for this Senior Java Developer role.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

We are seeking an experienced Databricks on AWS and PySpark engineer to join our team. Your role will involve designing, building, and maintaining large-scale data pipelines and architectures using Databricks on AWS, and optimizing data processing workflows with PySpark. Collaborating with data scientists and analysts to develop data models, and ensuring data quality, security, and compliance with industry standards, will also be key responsibilities.

Your main tasks will include troubleshooting data pipeline issues, optimizing performance, and staying current with industry trends and emerging data engineering technologies. You should have at least 3 years of experience in data engineering with a focus on Databricks on AWS and PySpark, strong expertise in PySpark and Databricks for data processing, modeling, and warehousing, and hands-on experience with AWS services such as S3, Glue, and IAM. Proficiency in data engineering principles, data governance, and data security is essential, along with experience managing data processing workflows and pipelines.

Strong problem-solving skills, attention to detail, and effective communication and collaboration abilities are the key soft skills for this role, as is the ability to work in a fast-paced, dynamic environment while adapting to changing requirements and priorities.
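For illustration only, a minimal PySpark sketch of the kind of pipeline step this role describes: read raw JSON, apply a basic quality filter, and write partitioned Parquet. All paths and column names are assumptions, not from the listing:

```python
"""Sketch of a PySpark batch transformation with a simple quality gate."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-etl-sketch").getOrCreate()

# Read raw events landed in the (hypothetical) raw zone of the lake.
raw = spark.read.json("s3://example-raw-bucket/events/")

cleaned = (
    raw.filter(F.col("event_ts").isNotNull())           # drop records missing a timestamp
       .withColumn("event_date", F.to_date("event_ts"))  # derive a partition key
)

# Write curated, date-partitioned Parquet for downstream consumers.
cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-curated-bucket/events/"
)
```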

Posted 1 month ago

Apply

8.0 - 12.0 years

12 - 22 Lacs

Hyderabad

Remote

Tech stack:
- Databases: MongoDB, S3, Postgres
- Strong experience with data pipelines and mapping
- React, Node, Python
- AWS, Lambda

About the Job

Summary:
We are seeking a detail-oriented and proactive Data Analyst to lead our file and data operations, with a primary focus on managing data intake from our clients and ensuring data integrity throughout the pipeline. This role is vital to our operational success and will work cross-functionally to support data ingestion, transformation, validation, and secure delivery. The ideal candidate must have hands-on experience with healthcare datasets, especially medical claims data, and be proficient in managing ETL processes and data operations at scale.

Responsibilities:

File Intake & Management
- Serve as the primary point of contact for receiving files from clients, ensuring all incoming data is tracked, validated, and securely stored.
- Monitor and automate data file ingestion using tools such as AWS S3, AWS Glue, or equivalent technologies.
- Troubleshoot and resolve issues related to missing or malformed files and ensure timely communication with internal and external stakeholders.

Data Operations & ETL
- Develop, manage, and optimize ETL pipelines for processing large volumes of structured and unstructured healthcare data.
- Perform data quality checks, validation routines, and anomaly detection across datasets.
- Ensure consistency and integrity of healthcare data (e.g., EHR, medical claims, ICD/CPT/LOINC codes) during transformations and downstream consumption.

Data Analysis & Reporting
- Collaborate with data science and analytics teams to deliver operational insights and performance metrics.
- Build dashboards and visualizations using Power BI or Tableau to monitor data flow, error rates, and SLA compliance.
- Generate summary reports and audit trails to ensure HIPAA-compliant data handling practices.

Process Optimization
- Identify opportunities for automation and efficiency in file handling and ETL processes.
- Document procedures, workflows, and data dictionaries to standardize operations.

Required Qualifications:
- Bachelor's or Master's degree in Health Informatics, Data Analytics, Computer Science, or a related field.
- 5+ years of experience in a data operations or analyst role with a strong focus on healthcare data.
- Demonstrated expertise in working with medical claims data, EHR systems, and healthcare coding standards (e.g., ICD, CPT, LOINC, SNOMED, RxNorm).
- Strong programming and scripting skills in Python and SQL for data manipulation and automation.
- Hands-on experience with AWS, Redshift, RDS, S3, and data visualization tools such as Power BI or Tableau.
- Familiarity with HIPAA compliance and best practices in handling protected health information (PHI).
- Excellent problem-solving skills, attention to detail, and communication abilities.
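As a hedged illustration of the file-intake duty described above, the sketch below lists newly arrived client files in S3 and flags empty or wrongly typed ones; the bucket, prefix, and expected extension are placeholder assumptions:

```python
"""Sketch of a basic S3 file-intake validation pass."""
import boto3

s3 = boto3.client("s3")

def validate_intake(bucket: str, prefix: str, expected_ext: str = ".csv") -> list[str]:
    """Return keys that fail basic intake checks (empty file or wrong type)."""
    problems = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            if obj["Size"] == 0 or not obj["Key"].endswith(expected_ext):
                problems.append(obj["Key"])
    return problems

if __name__ == "__main__":
    bad = validate_intake("example-client-intake", "claims/2024-06/")
    print(f"{len(bad)} file(s) need follow-up:", bad)
```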

Posted 1 month ago

Apply

6.0 - 8.0 years

15 - 27 Lacs

Bengaluru

Work from Office

Job Summary:
We are seeking a Senior Data Engineer to join our growing data team, where you will help build and scale the data infrastructure powering analytics, machine learning, and product innovation. As a Senior Data Engineer, you will be responsible for designing, building, and optimizing robust, scalable, and secure data pipelines and platforms. You will work closely with data scientists, software engineers, and product teams to deliver clean, reliable data for critical business and clinical applications.

Key Responsibilities:
- Design, implement, and optimize complex data pipelines using advanced SQL, ETL tools, and integration technologies.
- Collaborate with cross-functional teams to implement optimal data solutions for advanced analytics and data science initiatives.
- Spearhead process improvements, including automation, data delivery optimization, and infrastructure redesign for scalability.
- Evaluate and recommend emerging data technologies to build comprehensive data integration strategies.
- Lead technical discovery processes, defining complex requirements and mapping out detailed scenarios.
- Develop and maintain data governance policies and procedures.

What You'll Need to Be Successful (Required Skills):
- 5-7 years of experience in data engineering or related roles.
- Advanced proficiency in multiple programming languages (e.g., Python, Java, Scala) and expert-level SQL knowledge.
- Extensive experience with big data technologies (Hadoop ecosystem, Spark, Kafka) and cloud-based environments (Azure, AWS, or GCP).
- Proven experience in designing and implementing large-scale data warehousing solutions.
- Deep understanding of data modeling techniques and enterprise-grade ETL tools.
- Demonstrated ability to solve complex analytical problems.

Education/Certifications:
- Bachelor's degree in computer science, Information Management, or a related field.

Preferred Skills:
- Experience in the healthcare industry, including clinical, financial, and operational data.
- Knowledge of machine learning and AI technologies and their data requirements.
- Familiarity with data visualization tools and real-time data processing.
- Understanding of data privacy regulations and experience implementing compliant solutions.

Note: We work 5 days from the office, India regular shift. Netsmart India has set up our new Global Capability Centre (GCC) at Godrej Centre, Byatarayanapura (Hebbal area): https://maps.app.goo.gl/RviymAeGSvKZESSo6
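For context, a small illustrative sketch of a rule-based data-quality gate of the sort such pipelines include before loading a batch to the warehouse; the column names and rules are hypothetical, not from the listing:

```python
"""Sketch of a pre-load data-quality gate for a batch pipeline."""
import pandas as pd

RULES = {
    "patient_id is never null": lambda df: df["patient_id"].notna().all(),
    "charge amounts are non-negative": lambda df: (df["charge_amount"] >= 0).all(),
    "no duplicate claim ids": lambda df: df["claim_id"].is_unique,
}

def quality_gate(df: pd.DataFrame) -> None:
    """Raise if any rule fails, so orchestration can halt the load."""
    failures = [name for name, rule in RULES.items() if not rule(df)]
    if failures:
        raise ValueError(f"Quality gate failed: {failures}")

# Tiny example batch that passes every rule.
quality_gate(pd.DataFrame({
    "patient_id": [1, 2],
    "charge_amount": [100.0, 0.0],
    "claim_id": ["a", "b"],
}))
```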

Posted 1 month ago

Apply

3.0 - 5.0 years

12 - 18 Lacs

Pune

Work from Office

Job Overview:
We are seeking a skilled Java Developer with a strong DevOps mindset to join our team. This role offers a balanced blend of backend development and hands-on deployment responsibilities. The ideal candidate will be proficient in Core Java, Spring, and Hibernate, with solid experience in AWS infrastructure, CI/CD pipelines, and Linux systems.

Key Responsibilities:

Development (40%)
- Design, develop, and enhance Java-based backend applications using Spring and Hibernate.
- Write clean, maintainable, and well-documented code.
- Build and manage database interactions using Oracle/SQL.
- Collaborate with business and QA teams to translate requirements into scalable solutions.

Deployment & DevOps (60%)
- Manage application deployment and infrastructure on AWS (EC2, RDS, S3).
- Develop and maintain CI/CD pipelines using GitLab/Git and Jenkins.
- Automate deployment tasks using tools like Ansible and Docker (good to have).
- Monitor system health, troubleshoot issues, and implement fixes in a timely manner.
- Ensure high availability, scalability, and security of applications in production environments.

Mandatory Skills:
- Core Java, Spring Framework, Hibernate
- Strong experience with Oracle/SQL databases
- Hands-on experience with Linux environments
- Working knowledge of AWS services: EC2, RDS, S3
- Proficiency with Git/GitLab version control systems
- Experience in setting up and maintaining Jenkins pipelines

Good to Have:
- Experience with Ansible and Docker
- Exposure to Agile/Scrum development practices
- Familiarity with containerization and infrastructure as code (IaC)

Preferred Attributes:
- Ability to shift seamlessly between development and deployment responsibilities
- Strong analytical and troubleshooting skills
- Effective communicator and a proactive team player

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

kochi, kerala

On-site

You should have experience designing and building serverless data lake solutions using a layered component architecture, covering the ingestion, storage, processing, security & governance, data cataloguing & search, and consumption layers. Proficiency in AWS serverless technologies such as Lake Formation, Glue, Glue Python, Glue Workflows, Step Functions, S3, Redshift, QuickSight, Athena, AWS Lambda, and Kinesis is required, and hands-on experience with Glue is essential.

You must be skilled in designing, building, orchestrating, and deploying multi-step data processing pipelines using Python and Java. Experience in managing source data access security, configuring authentication and authorization, and enforcing data policies and standards is also necessary, along with familiarity with AWS environment setup and configuration.

A minimum of 6 years of relevant experience, with at least 3 years building solutions using AWS, is mandatory. The ability to work under pressure and a commitment to meeting customer expectations are essential traits for this role. If you meet these requirements and are ready to take on this challenging opportunity, please reach out to hr@Stanratech.com.
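As an illustration of the consumption layer mentioned above, here is a minimal boto3 sketch that submits an Athena query against a catalogued table and polls until it completes; the database, table, and results bucket are assumed names:

```python
"""Sketch of submitting and polling an Athena query via boto3."""
import time

import boto3

athena = boto3.client("athena")

def run_query(sql: str, database: str, output: str) -> str:
    """Start a query and return its terminal state."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]
    while True:
        status = athena.get_query_execution(QueryExecutionId=qid)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return state
        time.sleep(2)  # poll until Athena finishes

print(run_query(
    "SELECT COUNT(*) FROM example_events",
    database="example_lake",
    output="s3://example-athena-results/",
))
```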

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

indore, madhya pradesh

On-site

At InfoBeans, we believe in making other people's lives better through our work and everyday interactions. We are currently looking for a Java Fullstack with Angular professional to join our team in Indore/Pune. With a focus on web application development and over 10 years of experience, you will play a crucial role in developing microservices using Spring/AWS technologies and deploying them on the AWS platform. Your responsibilities will include supporting Java/Angular enterprise applications with multi-region setups, performing unit and system testing of application code, and executing implementation activities. You will be involved in designing, building, and testing Java EE and Angular full stack applications.

In this role, you will work in an open workspace with smart and pragmatic team members, with ever-growing opportunities for professional and personal growth in a learning culture that encourages teamwork, collaboration, and diversity. Excellence, compassion, openness, and ownership are highly valued and rewarded in our environment.

We expect you to have in-depth knowledge of popular Java frameworks such as Spring Boot and Spring, experience with Object-Oriented Design (OOD), and proficiency in Spring, Spring Boot, relational databases, MySQL, and ORM technologies (JPA2, Hibernate). Experience working in Agile (Scrum/Lean) with a DevSecOps focus is essential, along with familiarity with AWS, Kubernetes, and Docker containers, and with AWS component usage, configuration, and deployment, including Elasticsearch, EC2, S3, SNS, SQS, API Gateway, and Kinesis. An AWS certification would be advantageous, and any knowledge of health and related technologies will be beneficial.

If you are looking for a challenging yet rewarding opportunity to contribute to cutting-edge projects in a supportive and dynamic environment, we encourage you to apply for this position.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

The role involves providing production support for trading applications and requires the candidate to be comfortable working in rotational shifts (7 AM - 4 PM / 11 AM - 8 PM / 1 PM - 10 PM). The applications have transitioned from on-premises to the AWS cloud, necessitating strong experience with AWS services such as EC2, S3, and Kubernetes. Monitoring overnight batch jobs is also a key responsibility.

Key Requirements:
- Proficiency in AWS services like EC2, S3, Kubernetes, CloudWatch, etc.
- Familiarity with monitoring tools like Datadog, Grafana, and Prometheus.

Good to have:
- Basic understanding of SQL.
- Experience with Control-M/Autosys schedulers.
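A hedged sketch of the batch-monitoring duty described above: query CloudWatch for a failed-jobs metric over the overnight window. The namespace and metric name are hypothetical, not from the listing:

```python
"""Sketch of an overnight batch-failure check against CloudWatch."""
from datetime import datetime, timedelta, timezone

import boto3

cw = boto3.client("cloudwatch")

def overnight_failures(namespace: str = "ExampleBatch") -> float:
    """Sum the FailedJobs metric over the last 12 hours."""
    now = datetime.now(timezone.utc)
    stats = cw.get_metric_statistics(
        Namespace=namespace,
        MetricName="FailedJobs",
        StartTime=now - timedelta(hours=12),
        EndTime=now,
        Period=3600,          # one datapoint per hour
        Statistics=["Sum"],
    )
    return sum(point["Sum"] for point in stats["Datapoints"])

if __name__ == "__main__":
    print("Failed overnight jobs:", overnight_failures())
```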

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

navi mumbai, maharashtra

On-site

Seekify Global is looking for an experienced and driven Data Catalog Engineer to join the Data Engineering team. The ideal candidate has a strong background in designing and implementing metadata and data catalog solutions, specifically in AWS-centric data lake and data warehouse environments. As a Data Catalog Engineer, you will play a crucial role in improving data discoverability, governance, and lineage across the organization's data assets.

Your key responsibilities will include leading the end-to-end implementation of a data cataloging solution within AWS, establishing and managing metadata frameworks for diverse data assets, integrating the data catalog with AWS-based storage solutions, collaborating with project teams to define metadata standards and processes, developing automation scripts for metadata management, working closely with other data professionals to ensure data accuracy, and implementing access controls that comply with data privacy standards.

The ideal candidate will have at least 7-8 years of experience in data engineering or metadata management roles, with proven expertise in implementing data catalog solutions within AWS environments. Strong knowledge of AWS services such as Glue, S3, Athena, Redshift, EMR, the Glue Data Catalog, and Lake Formation is essential. Proficiency in Python, SQL, and automation scripting for metadata pipelines is required, along with familiarity with data governance and compliance standards. Experience with BI tools and third-party catalog tools is a plus.

Preferred qualifications include AWS certifications, experience with data catalog tools such as Alation, Collibra, or Informatica EDC, exposure to data quality frameworks and stewardship practices, and knowledge of data migration processes. This is a full-time position that requires in-person work.
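For illustration, a minimal boto3 sketch of one metadata-completeness check a data catalog engineer might automate: walking the Glue Data Catalog for tables that lack descriptions. The database name is a placeholder assumption:

```python
"""Sketch of a Glue Data Catalog metadata-completeness check."""
import boto3

glue = boto3.client("glue")

def tables_missing_descriptions(database: str) -> list[str]:
    """Return table names in the given database with no description set."""
    missing = []
    paginator = glue.get_paginator("get_tables")
    for page in paginator.paginate(DatabaseName=database):
        for table in page["TableList"]:
            if not table.get("Description"):
                missing.append(table["Name"])
    return missing

if __name__ == "__main__":
    print(tables_missing_descriptions("example_lake_db"))
```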

Posted 1 month ago

Apply

5.0 - 13.0 years

0 Lacs

pune, maharashtra

On-site

We are seeking talented and experienced individuals to join our engineering team in the roles of Staff Development Engineer and Senior Software Development Engineer (SDE 3). As a member of our team, you will take ownership of complex projects, designing and constructing high-performance, scalable systems. In the SDE 3 role, you will play a crucial part in ensuring that the solutions we develop are not only robust but also efficient. This is a hands-on position that requires you to lead projects from concept to deployment, ensuring the delivery of top-notch, production-ready code. Given the fast-paced environment, strong problem-solving skills and a dedication to crafting exceptional software are indispensable.

Your responsibilities will include:
- Developing high-quality, secure, and scalable enterprise-grade backend components in alignment with technical requirements specifications and design artifacts, within the expected time and budget.
- Demonstrating a proficient understanding of the choice of technology and its application, supported by thorough research.
- Identifying, troubleshooting, and ensuring the timely resolution of software defects.
- Participating in functional specification, design, and code reviews.
- Adhering to established practices for the development and upkeep of application code.
- Taking an active role in reducing the technical debt across our various codebases.

We are looking for candidates with the following qualifications:
- Proficiency in Python programming and frameworks such as Flask/FastAPI.
- Prior experience building REST API-based microservices (see the sketch below).
- Excellent knowledge and hands-on experience with RDBMS (e.g., MySQL, PostgreSQL), message brokers, caching, and queueing systems.
- Experience with NoSQL databases is preferred.
- An aptitude for research & development, exploring new topics and use cases.
- Hands-on experience with AWS services like EC2, SQS, Fargate, Lambda, and S3.
- Knowledge of Docker for application containerization.
- Cybersecurity knowledge is considered advantageous.
- A strong technical background with the ability to swiftly adapt to emerging technologies.
- Desired experience: 5-13 years in software engineering for Staff or SDE 3 roles.

Working Conditions: This role necessitates full-time office-based work; remote work arrangements are not available.

Company Culture: At Fortinet, we uphold a culture of innovation, collaboration, and continuous learning. We are dedicated to fostering an inclusive environment where every employee is valued and respected, and we encourage applications from individuals of all backgrounds and identities. Our competitive Total Rewards package is designed to assist you in managing your overall health and financial well-being, and we also offer flexible work arrangements and a supportive work environment. If you are looking for a challenging, fulfilling, and rewarding career journey, we invite you to consider joining us and contributing solutions that have a meaningful and enduring impact on our 660,000+ global customers.
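As a sketch of the REST microservice work the listing names (Python with FastAPI), here is a minimal service with an in-memory stand-in for the database layer; the Asset resource and its fields are hypothetical:

```python
"""Minimal FastAPI microservice sketch with an in-memory store."""
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="asset-service-sketch")

class Asset(BaseModel):
    id: int
    hostname: str

_DB: dict[int, Asset] = {}  # stand-in for the RDBMS layer the listing mentions

@app.post("/assets", status_code=201)
def create_asset(asset: Asset) -> Asset:
    _DB[asset.id] = asset
    return asset

@app.get("/assets/{asset_id}")
def read_asset(asset_id: int) -> Asset:
    if asset_id not in _DB:
        raise HTTPException(status_code=404, detail="asset not found")
    return _DB[asset_id]
```

It would be served with an ASGI server, for example `uvicorn module:app`.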

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

Join us as a Cloud Data Engineer at Barclays, where you'll spearhead the evolution of the digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize digital offerings, ensuring unparalleled customer experiences. You may be assessed on key critical skills relevant for success in the role, such as risk and control, change and transformation, business acumen, strategic thinking, and digital technology, as well as job-specific skill sets.

To be successful as a Cloud Data Engineer, you should have:
- Experience with AWS Cloud technology for data processing and a good understanding of AWS architecture.
- Experience with compute services such as EC2, Lambda, and Auto Scaling, and with networking via VPC.
- Experience with storage and container services such as ECS, S3, DynamoDB, and RDS.
- Experience with management & governance services: KMS, IAM, CloudFormation, CloudWatch, and CloudTrail.
- Experience with analytics services such as Glue, Athena, Crawler, Lake Formation, and Redshift.
- Experience with solution delivery for data processing components in larger end-to-end projects.

Desirable skill sets/good to have:
- AWS Certified professional.
- Experience in data processing on Databricks and Unity Catalog.
- Ability to drive projects technically, with right-first-time deliveries within schedule and budget.
- Ability to collaborate across teams to deliver complex systems and components and to manage stakeholders' expectations well.
- Understanding of different project methodologies, project lifecycles, major phases, dependencies and milestones within a project, and the required documentation needs.
- Experience with planning, estimating, organizing, and working on multiple projects.

This role will be based out of Pune.

Purpose of the role: To build and maintain systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure.

Accountabilities:
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that manage appropriate data volumes and velocity and adhere to required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations:
- Will have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Take responsibility for the end results of a team's operational processing and activities.
- Escalate breaches of policies/procedures appropriately.
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function.
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organization's sub-function.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex/sensitive information.
- Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organization.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship (our moral compass, helping us do what we believe is right) and the Barclays Mindset of Empower, Challenge, and Drive (the operating manual for how we behave).

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

You have an opportunity to impact your career and embark on an adventure where you can push the limits of what's possible. As a Manager of Software Engineering - Cloud at JPMorgan Chase, you will lead a team of cloud engineers to develop and implement scalable, reliable, and secure cloud-based solutions. Your role will be pivotal in shaping the cloud strategy and architecture, ensuring alignment with business goals and technical requirements. Your leadership will drive innovation and operational excellence in cloud technologies, fostering a collaborative environment to achieve project objectives.

You will be responsible for leading and mentoring a team of cloud engineers, fostering a culture of innovation and continuous improvement. Collaboration with technical teams and business stakeholders to propose and implement cloud solutions that meet current and future needs will be a key aspect of your role. You will define and drive the technical target state of cloud products, ensuring alignment with strategic goals, and participate in architecture governance bodies to ensure compliance with best practices and standards.

Your expertise will be crucial in evaluating and providing feedback on new cloud technologies and recommending solutions for the future state architecture. You will oversee the design, development, and deployment of cloud-based solutions on AWS, utilizing services such as EC2, S3, Lambda, and RDS. Integrating DevOps practices, including Infrastructure as Code (IaC) with tools like Terraform and AWS CloudFormation and configuration management with Ansible or Chef, will be part of your responsibilities, as will establishing and maintaining Continuous Integration/Continuous Deployment (CI/CD) pipelines using Jenkins, GitLab CI, or AWS CodePipeline. You will identify opportunities to automate the remediation of recurring issues to improve the operational stability of cloud applications and systems, and lead evaluation sessions with external vendors, startups, and internal teams to assess architectural designs and technical credentials.

**Required Qualifications, Capabilities, and Skills:**
- Formal training or certification in cloud engineering concepts with 5+ years of applied experience.
- Proven experience in leading cloud engineering teams and delivering cloud solutions.
- Advanced proficiency in one or more programming languages.
- Expertise in automation and continuous delivery methods.
- Proficiency in all aspects of the Software Development Life Cycle, with a focus on cloud technologies.
- Advanced understanding of agile methodologies such as CI/CD, application resiliency, and security.
- Demonstrated proficiency in cloud applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.).
- Practical cloud-native experience, particularly with AWS services and architecture, including VPC, IAM, and CloudWatch.

**Preferred Qualifications, Capabilities, and Skills:**
- In-depth knowledge of the financial services industry and its IT systems.
- Advanced knowledge of cloud software, applications, and architecture disciplines.
- Ability to evaluate current and emerging cloud technologies to recommend the best solutions for the future state architecture.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

As a DataOps Engineer, you will play a crucial role within our data engineering team, operating at the intersection of software engineering, DevOps, and data analytics. Your primary responsibility will be creating and managing secure, scalable, and production-ready data pipelines and infrastructure that support advanced analytics, machine learning, and real-time decision-making for our clients.

Your key duties will include designing, developing, and overseeing robust, scalable, and efficient ETL/ELT pipelines using Python and contemporary DataOps methodologies; incorporating data quality checks, pipeline monitoring, and error-handling mechanisms; and building data solutions with cloud-native services on AWS such as S3, ECS, Lambda, and CloudWatch. You will containerize applications using Docker and orchestrate them via Kubernetes for scalable deployments, and work with infrastructure-as-code tools and CI/CD pipelines to automate deployments effectively. You will also design and optimize data models using PostgreSQL, Redis, and PGVector, ensuring high-performance storage and retrieval while supporting feature stores and vector-based storage for AI/ML applications.

Beyond your technical responsibilities, you will drive Agile ceremonies such as daily stand-ups, sprint planning, and retrospectives to ensure successful sprint delivery; review pull requests (PRs) and conduct code reviews while upholding security and performance standards; and collaborate with product owners, analysts, and architects to refine user stories and technical requirements.

To excel in this role, you need at least 10 years of experience in data engineering, DevOps, or software engineering roles with a focus on data products. Proficiency in Python, Docker, Kubernetes, and AWS (specifically S3 and ECS) is essential. Strong knowledge of relational and NoSQL databases such as PostgreSQL and Redis, along with experience with PGVector, will be advantageous. A deep understanding of CI/CD pipelines, GitHub workflows, and modern source control practices is crucial, as is experience working in Agile/Scrum environments with excellent collaboration and communication skills. A passion for developing clean, well-documented, and scalable code in a collaborative setting, and familiarity with DataOps principles (automation, testing, monitoring, and deployment of data pipelines), round out the profile.
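A hedged sketch of the PGVector duty mentioned above, assuming a reachable Postgres instance with the pgvector extension available; the DSN, table, and vectors are placeholders:

```python
"""Sketch of storing and querying embeddings in Postgres with pgvector."""
import psycopg2

conn = psycopg2.connect("dbname=example user=example")  # placeholder DSN
with conn, conn.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS vector")  # pgvector must be installed
    cur.execute(
        "CREATE TABLE IF NOT EXISTS embeddings (id serial PRIMARY KEY, vec vector(3))"
    )
    # pgvector accepts a bracketed literal cast to the vector type.
    cur.execute("INSERT INTO embeddings (vec) VALUES (%s::vector)", ("[0.1,0.2,0.3]",))
    # `<->` is pgvector's Euclidean-distance operator; order by it for nearest neighbour.
    cur.execute(
        "SELECT id FROM embeddings ORDER BY vec <-> %s::vector LIMIT 1",
        ("[0.1,0.2,0.25]",),
    )
    print("nearest row id:", cur.fetchone()[0])
```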

Posted 1 month ago

Apply

1.0 - 8.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Lead Developer, you will report to the Head of Technology, based in London. Your core responsibilities will include leading the development of new features and platform improvements; managing, mentoring, and motivating team members; liaising with UK-based stakeholders to keep the team aligned with business and technical objectives; planning, coordinating, and delivering technical projects to the agreed schedule; championing and enforcing technical standards; ensuring the Software Development Life Cycle (SDLC) is followed within the team; and assisting with hiring, onboarding, and developing the team.

In this role, you will work with an exciting and modern tech stack built for scale, reliability, and productivity. To succeed, you should have solid experience with tools such as Python (SQLAlchemy, Flask, NumPy, Pandas), MySQL, AWS (ECS, S3, Lambda, RDS), RabbitMQ, Docker, Linux, GitLab, and generative AI tools.

The required qualifications and experience include a university degree in a STEM subject from a reputable institution; at least 8 years of professional software development experience, with at least 2 years in a lead/management role; proven experience liaising with remote stakeholders; familiarity with the tech stack or equivalent technologies; and a basic understanding of financial markets and derivative products. You should possess excellent teamwork skills, professional fluency in English (both written and spoken), excellent interpersonal and communication skills to collaborate effectively across global teams and time zones, a strong understanding of distributed software systems, an analytical and inquisitive mindset, and a desire to take on responsibility and make a difference.

The company offers a competitive compensation package, including a salary based on experience and role fit, an annual/performance bonus, health insurance, life insurance, meal benefits, learning & development opportunities relevant to your role and career growth, an enhanced leave policy, and a transport budget for roles requiring a commute outside business hours. This is a full-time position that requires in-person work.
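For illustration, a minimal Flask + SQLAlchemy sketch in the spirit of the stack listed above; the Trade model is hypothetical, and SQLite stands in for the MySQL database the listing names:

```python
"""Sketch of a small Flask endpoint backed by a SQLAlchemy model."""
from flask import Flask, jsonify
from sqlalchemy import Column, Float, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Trade(Base):
    __tablename__ = "trades"
    id = Column(Integer, primary_key=True)
    symbol = Column(String(16), nullable=False)
    notional = Column(Float, nullable=False)

engine = create_engine("sqlite:///example.db")  # swap for a MySQL DSN in practice
Base.metadata.create_all(engine)

app = Flask(__name__)

@app.get("/trades/<int:trade_id>")
def get_trade(trade_id: int):
    with Session(engine) as session:
        trade = session.get(Trade, trade_id)
        if trade is None:
            return jsonify(error="not found"), 404
        return jsonify(id=trade.id, symbol=trade.symbol, notional=trade.notional)
```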

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

You should have experience working with BigID or Collibra, along with knowledge of data classification and data products, and an understanding of data loss and personal information security. Exposure to platforms such as Snowflake, S3, Redshift, SharePoint, and Box is required, as is knowledge of connecting to various source systems. A deep understanding and practical knowledge of IDEs such as Eclipse or PyCharm, or of a workflow designer, is essential. Experience with one or more of Java, JavaScript, Groovy, or Python is preferred. Hands-on experience with CI/CD processes and tooling such as GitHub is necessary, as is working experience in DevOps teams based on Kubernetes tools. Proficiency in database concepts and a basic understanding of data classification, lineage, and storage would be advantageous. Excellent written and spoken English, interpersonal skills, and a collaborative approach to delivery are essential.

Desirable Skills and Experience:
- A total of 8 to 12 years of overall IT experience.
- A technical degree to support your experience.
- Deep technical expertise.
- Demonstrated understanding of the required technology and problem-solving skills.
- Analytical, focused, and capable of working independently with minimal supervision.
- Good collaborator management and a team player.
- Exposure to platforms like Talend Data Catalog, BigID, or Snowflake is beneficial.
- Basic knowledge of AWS is a plus.
- Knowledge of and experience with integration technologies such as MuleSoft and SnapLogic.
- Proficiency in Jira, including the ability to quickly write JQL queries and save them for reference.
- Proficiency in creating documentation in Confluence.
- Experience with Agile practices, preferably having been part of an Agile team for several years.

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

noida, uttar pradesh

On-site

You are an experienced .NET Developer with at least 7 years of experience in designing, coding, deploying, and developing web-based applications using the latest .NET technologies, including .NET Core, ASP.NET, C#, MVC, LINQ, and Web APIs. You will build, design, and architect .NET web-based applications, ensuring that the code you write improves product quality. You have hands-on experience with the latest versions of .NET technologies and AWS architecture, including working with the MVC architecture, JavaScript, and jQuery, and creating RESTful APIs/Web APIs.

Your role involves writing industry-standard code for microservices-based products and utilizing .NET Core 3.0 or above, microservices, .NET Core Entity Framework, Gulp/Webpack, NodeJS, Python (optional), unit/integration testing, and message queuing tools such as Kafka or RabbitMQ. You also have experience with MS SQL databases, including writing queries, stored procedures, tables, views, triggers, functions, and SQL Jobs. Familiarity with SQS, SNS, S3, EC2, API Gateway, and AWS Aurora is a must.

As an individual contributor, you are willing to contribute your views and opinions, questioning and probing when necessary to ensure thorough and robust solutions. Strong analytical and problem-solving skills are essential for this role, along with excellent interpersonal communication skills. Experience working on large, complex products or on very large, scalable applications or websites with huge data sizes is considered a plus.

This position is based in Noida, Sector 68, with a 5-day work week and fixed time off on Saturdays and Sundays. Two working modes are available: one opening for work from home with onsite visits, and another for 100% onsite work from day one. The shift timing is the morning shift, from 4:30 am to 12:30 pm. The ideal candidate should be able to join immediately or within 15-30 days. The interview process consists of 2 rounds: a core technical round, followed by a departmental HR round with a mix of technical and non-technical questions. Interviews will be conducted virtually.

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

kolkata, west bengal

On-site

As a DevOps Engineer with AWS certification, you will be responsible for implementing, maintaining, monitoring, and supporting the IT infrastructure. Your role will involve developing custom scripts to support Continuous Integration & Deployment processes and integrating various tools for automation based on the target architecture. You will create packaging and deployment documentation and scripts for production builds, and assist agile development teams with builds and releases.

Your key responsibilities will include implementing release automation solutions and branching & merging strategies, and providing guidance to the team on build & deployment automation issues. You will design and implement release orchestration solutions for medium and large projects, ensuring efficient and effective deployment processes.

To be successful in this role, you must have a minimum of 2 years of experience with AWS, a background in Linux/Unix administration, and proficiency with a variety of open-source tools. Hands-on experience with AWS services like RDS, EC2, ELB, EBS, S3, SQS, CodeDeploy, and CloudWatch is essential. Strong skills in managing SQL and MySQL databases, as well as experience with web servers like Apache, Nginx, Lighttpd, and Tomcat, will be valued. You should be proficient in Docker & Kubernetes deployment scripts, Git, Jenkins, and cluster setup, and have experience in environment setup, connectivity, support levels, system security compliance, and data security. Expertise in application and infrastructure planning, testing, and development, along with centralized configuration management, log management, and dashboards, is also required to ensure smooth operations.

If you are a proactive and skilled DevOps Engineer with a passion for automation and infrastructure management and meet the above requirements, we invite you to join our team in Kolkata for this full-time on-site position.
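As a small illustration of the deployment scripting this role involves, here is a sketch that shells out to the Docker CLI to build and push an image; the registry and image names are placeholders, not from the listing:

```python
"""Sketch of a build-and-push helper around the Docker CLI."""
import subprocess

def build_and_push(image: str, tag: str, registry: str = "registry.example.com") -> None:
    """Build the image from the current directory and push it to the registry."""
    ref = f"{registry}/{image}:{tag}"
    subprocess.run(["docker", "build", "-t", ref, "."], check=True)  # fail fast on errors
    subprocess.run(["docker", "push", ref], check=True)

if __name__ == "__main__":
    build_and_push("web-app", "v1.2.3")
```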

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

surat, gujarat

On-site

At devx, we specialize in helping some of India's most innovative brands unlock growth opportunities through AI-powered and cloud-native solutions developed in collaboration with AWS. As a rapidly growing consultancy, we are dedicated to addressing real-world business challenges with cutting-edge technology.

We are currently seeking a proactive and customer-focused AWS Solutions Architect to join our team. In this position, you will collaborate directly with clients to craft scalable, secure, and cost-effective cloud architectures that tackle significant business obstacles. Your role will bridge the gap between business requirements and technical implementation, establishing you as a trusted advisor to our clients.

Key Responsibilities:
- Engage with clients to understand their business goals and transform them into cloud-based architectural solutions.
- Design, deploy, and document AWS architectures, emphasizing scalability, security, and performance.
- Develop solution blueprints and work closely with engineering teams to ensure successful execution.
- Conduct workshops, presentations, and in-depth technical discussions with client teams.
- Stay informed about the latest AWS offerings and best practices, and integrate them into solution designs.
- Collaborate with sales, product, and engineering teams to provide comprehensive solutions.

We are looking for individuals with the following qualifications:
- A minimum of 2 years of experience designing and implementing solutions on AWS.
- Proficiency in fundamental AWS services such as EC2, S3, Lambda, RDS, API Gateway, IAM, and VPC.
- Proficiency in fundamental AI/ML and data services such as Bedrock, SageMaker, Glue, Athena, and Kinesis.
- Proficiency in fundamental DevOps services such as ECS, EKS, CI/CD pipelines, Fargate, and Lambda.
- Excellent written and verbal communication skills in English.
- Comfort in client-facing capacities, with the ability to lead technical dialogues and establish credibility with stakeholders.
- The ability to balance technical detail with business context, effectively communicating value to decision-makers.

Location: Surat, Gujarat

Note: This is an on-site role in Surat, Gujarat. Please apply only if you are willing to relocate.
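For illustration, one small building block that often appears in architectures like those described above: issuing a time-limited presigned S3 URL so a client can fetch an object without broad IAM access. The bucket and key are placeholder assumptions:

```python
"""Sketch of generating a short-lived presigned S3 download URL."""
import boto3

s3 = boto3.client("s3")

url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "example-client-assets", "Key": "reports/q2.pdf"},
    ExpiresIn=900,  # link valid for 15 minutes
)
print(url)
```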

Posted 1 month ago

Apply