
11 AWS Knowledge Jobs

Set Up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Who are we? Smarsh empowers its customers to manage risk and unleash intelligence in their digital communications. Our growing community of over 6,500 organizations in regulated industries counts on Smarsh every day to help them spot compliance, legal, or reputational risks in 80+ communication channels before those risks become regulatory fines or headlines. Relentless innovation has fueled our journey to consistent leadership recognition from analysts like Gartner and Forrester, and our sustained, aggressive growth has landed Smarsh on the annual Inc. 5000 list of fastest-growing American companies since 2008. Enterprise Archive is a cloud-based platform that stores and handles (archive/search/discovery) petabytes of data. It uses cutting-edge cloud-scale technologies (such as Elasticsearch, MongoDB, Storm, Kafka, and Hazelcast) to solve very complex storage problems at scale.

Location: Bangalore

Roles and Responsibilities: Take ownership of assigned features or projects, including design, implementation, testing, and delivery. Develop scalable, high-quality, and reusable code for distributed and enterprise-grade systems. Collaborate with product management and senior engineers to analyze requirements and contribute to solution design. Proactively identify technical challenges and propose improvements to enhance system performance, reliability, and scalability. Participate in technical discussions within the team and contribute to cross-team initiatives. Support delivered features through debugging, root cause analysis, and production fixes. Mentor and guide junior engineers on best practices, coding standards, and system design. Work closely with peers to reduce technical debt and ensure long-term maintainability of the system. Contribute to continuous improvement by driving automation, adopting DevOps practices, and applying CI/CD principles.

Desired Skills & Experience: Bachelor's/Master's degree in Computer Science or a related field with a strong academic record. 3-5 years of professional experience in software development, preferably on large-scale distributed systems. Strong problem-solving ability with a solid understanding of data structures, algorithms, and design patterns.

Mandatory Skills: Distributed systems, Java 17, Spring Boot, MongoDB, Elasticsearch, PostgreSQL (or any RDBMS), Kafka, microservices architecture, AWS knowledge.

Nice-to-Have Skills: Apache Storm; Angular (or other UI frameworks); Python or other scripting languages; DevOps concepts and CI/CD (Jenkins/Concourse); Kubernetes (K8s); exposure to GenAI/agentic workflows is a strong plus.

About Our Culture: Smarsh hires lifelong learners with a passion for innovating with purpose, humility, and humor. Collaboration is at the heart of everything we do. We work closely with the most popular communications platforms and the world's leading cloud infrastructure platforms. We use the latest in AI/ML technology to help our customers break new ground at scale. We are a global organization that values diversity, and we believe that providing opportunities for everyone to be their authentic self is key to our success. Smarsh leadership, culture, and commitment to developing our people have all garnered Comparably.com Best Places to Work Awards. Come join us and find out what the best work of your career looks like.

Posted 1 day ago

Apply

0.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Description:

Key Responsibilities: Work within Agile development teams, contributing to sprints and ceremonies. Apply data engineering best practices to ensure scalable, maintainable codebases. Develop solutions using Python and SQL. Understand and manipulate business data. Collaborate with cross-functional teams to deliver data solutions that meet business needs. Communicate effectively to gather requirements and present solutions. Follow best practices for data processing and coding standards.

Skills: Proficient in Python for data manipulation and automation. Must have hands-on experience with orchestration and automation tools such as Snowflake Tasks and Streams. Strong experience with SQL development (knowledge of MS SQL is a plus). Excellent written and oral communication skills. Deep understanding of business data, especially as it relates to marketing and audience targeting. Experience with Agile methodologies and CI/CD processes. Experience with MS SQL. Familiarity with SAS. Good to have: B2B and AWS knowledge.

Location: DGS India - Bengaluru - Manyata N1 Block
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
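For context on the Snowflake Tasks and Streams orchestration this posting asks for, here is a minimal, illustrative sketch driven from Python via the snowflake-connector-python package; the account, warehouse, table, and schedule values are hypothetical and not taken from the posting.

```python
# Illustrative sketch only: object names, credentials, and schedule are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="********",
    warehouse="ETL_WH",
    database="MARKETING",
    schema="RAW",
)
cur = conn.cursor()

# Capture changes on a source table with a stream.
cur.execute("CREATE OR REPLACE STREAM audience_stream ON TABLE raw_audience")

# A task that periodically merges new rows from the stream into a curated table.
cur.execute("""
    CREATE OR REPLACE TASK load_audience_task
      WAREHOUSE = ETL_WH
      SCHEDULE = '15 MINUTE'
    AS
      INSERT INTO curated_audience
      SELECT id, segment, updated_at
      FROM audience_stream
      WHERE METADATA$ACTION = 'INSERT'
""")

# Tasks are created suspended; resume the task to start the schedule.
cur.execute("ALTER TASK load_audience_task RESUME")

cur.close()
conn.close()
```

The same pattern can be chained (task trees) so downstream transformations only run when the stream actually has new data.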

Posted 3 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The BI Solution Architect plays a crucial role in overseeing the technical architecture, data model, and performance of data solutions. Your responsibilities include influencing change, enforcing data best practices and governance, assessing scope, defining timelines and roadmap, and analyzing data architecture and integration solutions for improvement opportunities. Additionally, you will promote a development environment that is agile, fast-paced, and iterative. Your contribution at Logitech should embody behaviors such as being open, staying hungry and humble, collaborating, challenging, deciding, and taking action.

You are expected to provide knowledge and exposure to CDP platforms, particularly Salesforce CDP, and demonstrate deep domain expertise in Sales & Marketing and the related technology stack and data infrastructure. Your role involves providing expertise and leadership in making technical decisions, mentoring junior team members, and focusing on delivering an architecture that adds business value. Effective communication and collaboration with members of the business, technical, and leadership teams are essential. You will also be responsible for ensuring data quality, defining data rules, and empowering end-users for self-service analytics. Working closely with business members to understand their requirements is a key aspect of this role.

Key qualifications for this position include at least one full-lifecycle Customer Data Platform (CDP) implementation and 5-7 years of experience in data integration platform architecture, configuration, and best practices. Exposure to relational databases, ETL tools, and reporting platforms, hands-on SQL knowledge, data modeling experience, an understanding of ERP systems, and excellent communication skills are required. Additionally, experience designing and implementing BI solutions, configuring data quality rules, managing history across dimensions, and a solid understanding of data warehouse design practices are crucial.

Preferable skills and behaviors include AWS knowledge, Python proficiency, and a Bachelor of Engineering in Computer Science or equivalent. Logitech offers an environment that encourages individual initiative and impact, while also providing a global platform for your actions to make a significant difference.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Python Developer with expertise in AWS, you will be responsible for developing cloud-based applications, constructing data pipelines, and seamlessly integrating with various AWS services. Your role will involve close collaboration with DevOps, Data Engineering, and Product teams to conceptualize and deploy solutions that are not only scalable and resilient but also optimized for efficiency within an AWS cloud environment.

Your primary responsibilities will include designing, developing, and maintaining applications and services using Python within the cloud infrastructure. You will leverage a range of AWS services such as EC2, S3, Lambda, RDS, DynamoDB, and API Gateway to build robust and scalable solutions. Additionally, you will be tasked with developing and managing data pipelines, integrating data from diverse sources into AWS-based storage solutions, and designing RESTful APIs for seamless application communication and data exchange.

Monitoring and optimizing cloud resources for cost efficiency, performance, and security will be crucial aspects of your role. You will also be expected to automate workflows and deployment processes using tools like AWS Lambda, CloudFormation, and other automation technologies. Implementing security best practices, such as IAM roles and encryption, to safeguard data and ensure compliance within the cloud environment will be paramount.

Collaboration will play a key role in your day-to-day activities as you work closely with DevOps, Cloud Engineers, and fellow developers to ensure the smooth deployment and integration of applications. Continuous improvement of development processes and deployment practices will be encouraged, fostering an environment of innovation and efficiency.

In terms of primary skills, you should possess strong expertise in Python programming, including proficiency in libraries like Pandas, NumPy, Boto3 (AWS SDK for Python), and frameworks such as Flask or Django. Hands-on experience with AWS services like S3, EC2, Lambda, RDS, DynamoDB, CloudFormation, and API Gateway will be essential. Your experience in designing, deploying, and maintaining cloud-based applications using AWS infrastructure, as well as developing RESTful APIs and managing data exchanges, will be highly valued.

Furthermore, familiarity with automation tools and scripts, version control using Git, and building CI/CD pipelines for cloud-based applications will be advantageous secondary skills that can enhance your effectiveness in this role.
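To illustrate the kind of Boto3-based integration this role describes, below is a minimal, hedged sketch of a helper that stores an object in S3 and invokes a Lambda function; the bucket, key, and function names are hypothetical, and credentials are assumed to come from the environment or an IAM role.

```python
# Illustrative sketch only: bucket, key, and function names are hypothetical.
import json

import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")


def store_and_process(local_path: str, bucket: str, key: str, function_name: str) -> dict:
    """Upload a file to S3, then ask a Lambda function to process it."""
    # Upload the local file to the target bucket/key.
    s3.upload_file(Filename=local_path, Bucket=bucket, Key=key)

    # Invoke the processing Lambda synchronously with the object's location.
    response = lambda_client.invoke(
        FunctionName=function_name,
        InvocationType="RequestResponse",
        Payload=json.dumps({"bucket": bucket, "key": key}),
    )
    return json.loads(response["Payload"].read())


if __name__ == "__main__":
    result = store_and_process(
        "report.csv", "example-data-bucket", "incoming/report.csv", "process-report"
    )
    print(result)
```

In a real pipeline the synchronous invoke would often be replaced by an S3 event notification or an asynchronous ("Event") invocation, but the sketch shows the basic client calls involved.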

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a highly skilled Linux Subject Matter Expert (SME), you will be responsible for managing and optimizing Linux-based systems such as Ubuntu, SUSE, and Red Hat, along with a proficient command of AWS services. Your role will involve designing, implementing, and supporting secure and scalable Linux environments, both on-premises and in the cloud.

You will be expected to manage and maintain Linux systems across different distributions, including Ubuntu, SUSE, and Red Hat Enterprise Linux (RHEL). Additionally, deploying, configuring, and monitoring AWS cloud infrastructure while adhering to best practices for security, scalability, and performance will be crucial aspects of your responsibilities. Automation of routine system administration tasks using scripting languages like Bash, Python, etc., will be essential. Ensuring high availability of Linux servers through the implementation of clustering, failover, and backup solutions is another key responsibility. Monitoring system performance, identifying bottlenecks, and optimizing Linux environments will also be part of your role. Managing user authentication, permissions, and security policies on Linux servers and designing and deploying Infrastructure as Code (IaC) solutions using tools like Terraform, AWS CloudFormation, or similar will be vital tasks. Collaborating with development and DevOps teams to support CI/CD pipelines and containerized applications is also expected.

Your expertise in operating systems, particularly in Linux distributions like Ubuntu, SUSE, and Red Hat Enterprise Linux (RHEL), along with hands-on experience in AWS services such as EC2, S3, RDS, VPC, CloudWatch, IAM, and Lambda, will be required. Advanced scripting skills in Bash, Python, or Perl for automation and task management are essential. A strong understanding of networking protocols, load balancers, firewalls, and VPNs, along with expertise in tools like Ansible, Puppet, or Chef for configuration management, is necessary. Experience with monitoring solutions like Prometheus, Nagios, or AWS CloudWatch, and in-depth knowledge of system hardening, patch management, and encryption best practices are crucial. Practical experience with Terraform, AWS CloudFormation, or other Infrastructure as Code tools will also be beneficial.

Furthermore, preparing and maintaining detailed technical documentation, runbooks, and SOPs will be part of your routine tasks in this role.
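As a loose illustration of the scripted automation this posting mentions, a routine health check could be written in Python using only the standard library; the thresholds, mount points, and service names below are hypothetical examples, not requirements from the posting.

```python
#!/usr/bin/env python3
# Illustrative sketch only: thresholds, mount points, and service names are hypothetical.
import shutil
import subprocess

DISK_THRESHOLD_PCT = 85
MOUNT_POINTS = ["/", "/var"]
SERVICES = ["sshd", "crond"]


def check_disks() -> list[str]:
    """Report mount points whose usage exceeds the threshold."""
    warnings = []
    for mount in MOUNT_POINTS:
        usage = shutil.disk_usage(mount)
        used_pct = usage.used / usage.total * 100
        if used_pct > DISK_THRESHOLD_PCT:
            warnings.append(f"{mount} is {used_pct:.1f}% full")
    return warnings


def check_services() -> list[str]:
    """Report services that systemd does not consider active."""
    warnings = []
    for service in SERVICES:
        result = subprocess.run(
            ["systemctl", "is-active", "--quiet", service], check=False
        )
        if result.returncode != 0:
            warnings.append(f"{service} is not active")
    return warnings


if __name__ == "__main__":
    for warning in check_disks() + check_services():
        print(f"WARNING: {warning}")
```

In practice such a script would typically run from cron or a systemd timer and push its findings into the monitoring stack (CloudWatch, Prometheus, Nagios) rather than just printing them.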

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The ideal candidate should have hands-on experience in Collibra Workflows, asset model creation, cataloguing, and assessment creation. Additionally, exposure to AI platforms such as OpenAI and Bedrock, and to integration platforms like SnapLogic and MuleSoft, is required. A deep understanding and practical knowledge of IDEs such as Eclipse/PyCharm or any workflow designer is essential. Experience with one or more of the following languages is preferred: Java, JavaScript, Groovy, Python. Moreover, a deep understanding of and hands-on experience with CI/CD processes and tooling (e.g., GitHub) is necessary.

Candidates should have experience working in DevOps teams based on Kubernetes tools and converting a business workflow into an automated set of actions. Proven knowledge of scripting and a willingness to learn new languages is expected. Excellent communication skills in written and spoken English, interpersonal skills, and a collaborative approach to delivery are crucial. An enthusiasm for great documentation, including high-level designs, low-level designs, coding standards, and knowledge base articles, is highly appreciated.

Desirable qualifications include an Engineering Degree in IT/Computer Science with a minimum of 10 years of experience. Knowledge and experience of the Collibra Data Governance platform, and exposure to AI models, AI governance, data policies, and governance, are advantageous. Basic AWS knowledge is a plus. Familiarity with integration technologies like MuleSoft and SnapLogic is beneficial. Excellent Jira skills, including the ability to rapidly generate JQL on the fly and save JQL queries/filters/views for publishing to fellow engineers and senior stakeholders, are desired. Candidates should have experience creating documentation in Confluence and with Agile practices, preferably having been part of an Agile team for several years.

Joining Virtusa means becoming part of a team that values teamwork, quality of life, and professional and personal development. With a global team of 27,000 people, Virtusa aims to provide exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential, and a dynamic environment await you at Virtusa, where collaboration and excellence are nurtured.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Splunk Developer with AWS Knowledge, you will be joining a prestigious MNC company on a contract basis in Chennai. With over 5 years of experience, you will be responsible for leveraging your expertise in Datadog, AWS, and Splunk to ensure the smooth and reliable operation of critical applications and infrastructure. Your role will involve proactively monitoring, troubleshooting, and resolving issues to maintain optimal performance and availability.

Your responsibilities will include implementing, maintaining, and enhancing Datadog monitoring dashboards and alerts, analyzing monitoring data to identify performance bottlenecks, and resolving alerts to minimize service disruptions. You will also provide timely support for application-related issues, collaborate with development teams to address performance issues, and assist in deployment processes for a seamless transition to production.

In addition, you will monitor and manage AWS resources to ensure optimal performance and cost efficiency, troubleshoot AWS-related issues, and participate in cloud infrastructure projects. You will configure and maintain Splunk indexes, searches, dashboards, and alerts, analyze log data to troubleshoot system and application issues, and implement log management best practices.

Furthermore, you will be involved in incident response activities, document incident response procedures, and collaborate with various teams to communicate technical information effectively. Staying updated on the latest technologies and best practices in monitoring, application support, and cloud computing will be essential to excel in this role.

To qualify for this position, you must have strong experience in Datadog monitoring and alerting, hands-on experience with AWS cloud services such as EC2, and familiarity with Splunk log management and analysis. Your ability to work collaboratively, troubleshoot effectively, and communicate technical details to diverse audiences will be crucial in ensuring the efficient operations of the company's applications and infrastructure.
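As a rough illustration of the Datadog-side work this posting describes, here is a minimal sketch that submits a custom gauge metric through Datadog's v1 metrics API using the requests library; the metric name, tags, and the DD_API_KEY environment variable are assumptions made for the example, not details from the posting.

```python
# Illustrative sketch only: metric name, tags, and API key handling are assumptions.
import os
import time

import requests

DD_API_URL = "https://api.datadoghq.com/api/v1/series"


def send_gauge(metric: str, value: float, tags: list[str]) -> None:
    """Submit a single gauge data point to Datadog."""
    payload = {
        "series": [
            {
                "metric": metric,
                "points": [[int(time.time()), value]],
                "type": "gauge",
                "tags": tags,
            }
        ]
    }
    resp = requests.post(
        DD_API_URL,
        json=payload,
        headers={"DD-API-KEY": os.environ["DD_API_KEY"]},
    )
    resp.raise_for_status()


if __name__ == "__main__":
    # e.g. report current queue depth for an application being monitored
    send_gauge("app.queue_depth", 42.0, ["env:prod", "service:orders"])
```

Custom metrics like this can then back the dashboards and monitors the role maintains; in production the Datadog Agent/DogStatsD is the more common submission path than direct API calls.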

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Faridabad, Haryana

On-site

As a Frontend Developer at FantasyFi, you will play a crucial role in enhancing and optimizing our web and mobile applications to ensure cutting-edge performance and an exceptional user experience. FantasyFi is a rapidly growing fantasy sports platform based in Faridabad, Haryana, India, that combines daily fantasy games and prediction-based tournaments. We are seeking a highly skilled and passionate individual to lead the frontend development efforts, shaping the user experience of our applications.

Your responsibilities will include architecting, developing, and implementing new features and modules using ReactJS, translating UI/UX designs into high-quality code, and driving the adoption of best practices for frontend architecture and design. You will also be responsible for identifying and resolving performance bottlenecks, implementing optimization techniques, and enhancing SEO best practices to improve visibility.

Collaboration is key in this role, as you will work closely with the Backend Development team to integrate APIs efficiently and securely, collaborate with UI/UX designers to refine designs, and utilize AWS CloudFront and S3 for optimized asset caching and content delivery. Maintaining code quality and conducting thorough testing and debugging are essential aspects of this position, along with participating in code reviews and ensuring adherence to coding standards.

To be successful in this role, you should have at least 5 years of frontend development experience, expertise in ReactJS, proficiency in JavaScript, HTML5, and CSS3, and advanced API integration skills. You should also possess exceptional performance optimization skills, a strong understanding of web security principles, basic AWS knowledge, experience with version control systems like Git, excellent problem-solving abilities, and strong communication skills.

Joining FantasyFi will give you the opportunity to lead the user experience development of a rapidly growing fantasy sports platform, work on a high-performing product, be part of an innovative and user-centric team, and directly impact the platform's success and user engagement. If you are passionate about frontend development and eager to contribute to a dynamic and ambitious team, we would love to have you on board.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Faridabad, Haryana

On-site

As a Frontend Developer at FantasyFi, you will play a key role in developing, enhancing, and optimizing our web and mobile applications to ensure cutting-edge performance and an exceptional user experience. FantasyFi is a fast-growing fantasy sports platform based in Faridabad, Haryana, India, that combines daily fantasy games and prediction-based tournaments. We are seeking a highly skilled and passionate individual to lead the frontend development efforts and shape the user experience of our applications.

Your responsibilities will include architecting, developing, and implementing new features and modules for the FantasyFi applications using ReactJS. You will be translating UI/UX designs into high-quality, reusable code and driving the adoption of best practices for frontend architecture and design. Additionally, you will proactively identify and resolve performance bottlenecks, optimize user experience, and ensure cross-browser compatibility.

Collaboration is key in this role, as you will work closely with the Backend Development team to define and integrate APIs efficiently. You will also collaborate with UI/UX designers to refine designs and ensure functional interfaces. Maintaining code quality and conducting thorough testing and debugging are crucial aspects of this position.

To excel in this role, you should have at least 5 years of frontend development experience and expertise in ReactJS, JavaScript, HTML5, and CSS3. Advanced API integration skills, experience with performance optimization techniques, and a strong understanding of web security principles are also required. Familiarity with AWS services, version control systems like Git, and excellent problem-solving abilities will be beneficial.

Joining FantasyFi offers you the opportunity to work on a high-performing product, be part of an innovative and user-centric team, and directly impact the platform's success and user engagement. If you are passionate about frontend development and eager to contribute to a rapidly growing fantasy sports platform, we welcome you to apply for this exciting opportunity.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a Splunk Developer with AWS knowledge who can start immediately. This is a contract position with an MNC company based in Chennai. You should have at least 5 years of experience.

As a Datadog Monitoring & Application Support Engineer, you will be responsible for ensuring the smooth and reliable operation of critical applications and infrastructure. Leveraging your expertise in Datadog, AWS, and Splunk, you will proactively monitor, troubleshoot, and resolve issues to ensure optimal performance and availability.

Your responsibilities will include implementing, maintaining, and enhancing Datadog monitoring dashboards and alerts, analyzing monitoring data for performance bottlenecks, providing support for application-related issues, monitoring and managing AWS resources, configuring and maintaining Splunk indexes, participating in incident response activities, and effectively communicating technical information to both technical and non-technical audiences.

To qualify for this role, you must have strong experience with Datadog monitoring and alerting, experience with AWS cloud services such as EC2, and experience with Splunk log management and analysis. It is essential to stay current on the latest technologies and best practices in monitoring, application support, and cloud computing.

Posted 1 month ago

Apply

12.0 - 18.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Role & Responsibilities:
- Minimum 15+ years of experience in enterprise Java development
- Proven experience as a Java Architect or in a similar role
- Experienced in delivering advanced solutions for multi-tier, distributed web applications, with AWS knowledge and experience
- Strong foundation in Computer Science fundamentals such as data structures and algorithms
- Strong experience designing and implementing scalable, maintainable, high-performance Java-based applications (software solutions)
- Strong experience with the architecture and design of e-commerce applications running on Java/Spring
- Experience in building low-latency service APIs and data aggregation pipelines
- Hands-on programming experience with Java, J2EE, and related technologies: Spring Framework, Spring Data, Spring Boot, RESTful services, JUnit/TestNG
- Solid understanding of technologies such as Web Services, REST APIs, XML, JSON, HTTP, SSL, TCP/IP, caching solutions, and application performance tuning
- Experience in developing applications that utilize relational databases (e.g., Oracle) and other less structured data stores (NoSQL)
- Experience in cloud technologies (AWS/GCP) and building DevOps pipelines
- Handle quality metrics (review QA/UAT bugs and plan preventive measures), code reviews, and team mentoring

Must have: Java enterprise design patterns, Java 8 and above, Spring Boot, Oracle, NoSQL, REST APIs, JPA, microservices, Hibernate, AWS knowledge and experience

Preferred candidate profile:
- Stellar technical leadership, owning end-to-end delivery including stakeholder management, scope management, and enabling timely, quality delivery
- Excellent troubleshooting and problem-solving skills; capable of thinking critically and creatively when faced with technical challenges
- Good communication skills, effectively conveying complex technical concepts to both technical and non-technical stakeholders
- Self-driven and able to manage competing priorities and adjust to changing requirements in a fast-paced environment
- Excellent technical project management skills in stakeholder management and scope control, ensuring timely closures and exceeding customer expectations
- Strong interpersonal skills, with the ability to motivate, collaborate with, and inspire cross-functional teams and encourage a positive, productive work environment

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
