
1090 S3 Jobs - Page 30

Set up a Job Alert
JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 6.0 years

10 - 20 Lacs

Chennai

Work from Office

Role: DevOps Engineer (4+ Years of Experience)

We are looking for a DevOps Engineer with 4+ years of experience to join our dynamic team. The ideal candidate will have hands-on experience with AWS services, Docker, Kubernetes, and Jenkins, along with a strong understanding of CI/CD pipelines and infrastructure automation. Completion of a relevant course is mandatory, and certifications in related fields are a plus.

Key Responsibilities:
- Design, implement, and manage scalable and reliable cloud infrastructure using AWS services.
- Develop and maintain CI/CD pipelines using Jenkins to support continuous integration and deployment.
- Containerize applications using Docker and orchestrate them with Kubernetes.
- Monitor, troubleshoot, and optimize system performance to ensure high availability and scalability.
- Collaborate with development and operations teams to improve deployment workflows and infrastructure automation.
- Implement security best practices for cloud and container environments.
- Maintain and update documentation for infrastructure, processes, and configurations.

Requirements:
- Experience: 4+ years in DevOps or related roles.
- Hands-on experience with AWS services (e.g., EC2, S3, RDS, CloudFormation, Lambda).
- Strong understanding and practical knowledge of Docker and containerization.
- Experience with Kubernetes for container orchestration.
- Proficiency in using Jenkins for CI/CD pipeline creation and management.
- Familiarity with Infrastructure as Code (IaC) tools like Terraform or CloudFormation.
- Basic scripting knowledge (e.g., Bash, Python, or PowerShell).
- Familiarity with version control systems like Git.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration abilities.

Preferred Qualifications:
- Relevant certifications in AWS or Kubernetes.
- Understanding of monitoring tools like Prometheus, Grafana, or CloudWatch.
- Experience in setting up logging systems (e.g., ELK Stack).

Interested candidates can share their CV at madhumithak@sightspectrum.in.
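For illustration only (not part of the posting): a minimal sketch of the kind of infrastructure-automation scripting such a role typically involves, written in Python with boto3. The region, tag values, and bucket name below are hypothetical.

```python
# Hypothetical example: tag untagged EC2 instances and verify S3 default encryption.
# Assumes AWS credentials are already configured (environment variables or an IAM role).
import boto3
from botocore.exceptions import ClientError

REGION = "ap-south-1"  # assumed region, for illustration only

def tag_untagged_instances(owner="devops-team"):
    ec2 = boto3.client("ec2", region_name=REGION)
    for reservation in ec2.describe_instances()["Reservations"]:
        for instance in reservation["Instances"]:
            tags = {t["Key"] for t in instance.get("Tags", [])}
            if "Owner" not in tags:
                ec2.create_tags(
                    Resources=[instance["InstanceId"]],
                    Tags=[{"Key": "Owner", "Value": owner}],
                )
                print(f"Tagged {instance['InstanceId']} with Owner={owner}")

def check_bucket_encryption(bucket="example-artifacts-bucket"):  # hypothetical bucket
    s3 = boto3.client("s3", region_name=REGION)
    try:
        s3.get_bucket_encryption(Bucket=bucket)
        print(f"{bucket}: default encryption enabled")
    except ClientError:
        print(f"{bucket}: no default encryption configured")

if __name__ == "__main__":
    tag_untagged_instances()
    check_bucket_encryption()
```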

Posted 2 months ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Remote

Skillset: PostgreSQL, Amazon Redshift, MongoDB, Apache Cassandra, AWS, ETL, Shell Scripting, Automation, Microsoft Azure

We are looking for futuristic, motivated go-getters with the following skills for an exciting role.

Job Description:
- Monitor and maintain the performance, reliability, and availability of multiple database systems.
- Optimize complex SQL queries, stored procedures, and ETL scripts for better performance and scalability.
- Troubleshoot and resolve issues related to database performance, integrity, backups, and replication.
- Design, implement, and manage scalable data pipelines across structured and unstructured sources.
- Develop automation scripts for routine maintenance tasks using Python, Bash, or similar tools.
- Perform regular database health checks, set up alerting mechanisms, and respond to incidents proactively.
- Analyze performance bottlenecks and resolve slow queries and deadlocks.
- Work in DevOps/Agile environments, integrating with CI/CD pipelines for database operations.
- Collaborate with engineering, analytics, and infrastructure teams to integrate database solutions with applications and BI tools.
- Research and implement emerging technologies and best practices in database administration.
- Participate in capacity planning, security audits, and software upgrades for data infrastructure.
- Maintain comprehensive documentation covering database schemas, metadata, standards, and procedures.
- Ensure compliance with data privacy regulations and implement robust disaster recovery and backup strategies.

Desired Skills:
- Database Systems: Hands-on experience with SQL-based databases (PostgreSQL, MySQL), Amazon Redshift, MongoDB, and Apache Cassandra.
- Scripting & Automation: Proficiency in Python, Shell, or similar to automate database operations.
- Cloud Platforms: Working knowledge of AWS (RDS, Redshift, EC2, S3, IAM, Lambda) and Azure SQL/Azure Cosmos DB.
- Big Data & Distributed Systems: Familiarity with Apache Spark for distributed data processing.
- Performance Tuning: Deep experience in performance analysis, indexing strategies, and query optimization.
- Security & Compliance: Experience with database encryption, auditing, access control, and GDPR/PII policies.
- Familiarity with Linux and Windows server administration is a plus.

Education & Experience:
- BE, B.Tech, MCA, or M.Tech from Tier 2/3 colleges, or Science graduates.
- 5-8 years of work experience.
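Purely as an illustration of the routine-maintenance scripting this posting mentions (not part of the job description): a minimal Python sketch that flags long-running PostgreSQL queries. It assumes psycopg2 is installed; the connection string and the 5-minute threshold are hypothetical.

```python
# Hypothetical health-check sketch: list PostgreSQL queries running longer than 5 minutes.
import psycopg2

def find_long_running_queries(dsn="dbname=appdb user=dba host=localhost"):  # placeholder DSN
    query = """
        SELECT pid, now() - query_start AS runtime, state, left(query, 80)
        FROM pg_stat_activity
        WHERE state <> 'idle'
          AND now() - query_start > interval '5 minutes'
        ORDER BY runtime DESC;
    """
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(query)
            for pid, runtime, state, sql in cur.fetchall():
                print(f"pid={pid} runtime={runtime} state={state} query={sql!r}")

if __name__ == "__main__":
    find_long_running_queries()
```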

Posted 2 months ago

Apply

4.0 - 6.0 years

12 - 18 Lacs

Chennai, Bengaluru

Work from Office

Key Skills: Python, SQL, PySpark, Databricks, AWS, Data Pipelines, Data Integration, Airflow, Delta Lake, Redshift, S3, Data Security, Cloud Platforms, Life Sciences.

Roles & Responsibilities:
- Develop and maintain robust, scalable data pipelines for ingesting, transforming, and optimizing large datasets from diverse sources.
- Integrate multi-source data into performant, query-optimized formats such as Delta Lake, Redshift, and S3.
- Tune data processing jobs and storage layers to ensure cost efficiency and high throughput.
- Automate data workflows using orchestration tools like Airflow and Databricks APIs for ingestion, transformation, and reporting.
- Implement data validation and quality checks to ensure reliable and accurate data.
- Manage and optimize AWS and Databricks infrastructure to support scalable data operations.
- Lead cloud platform migrations and upgrades, transitioning legacy systems to modern, cloud-native solutions.
- Enforce security best practices, including IAM and data encryption, and ensure compliance with regulatory standards.
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to deliver data solutions.

Experience Requirement:
- 4-6 years of hands-on experience in data engineering with expertise in Python, SQL, PySpark, Databricks, and AWS.
- Strong background in designing and building data pipelines and optimizing data storage and processing.
- Proficiency in using cloud services such as AWS (S3, Redshift, Lambda) for building scalable data solutions.
- Hands-on experience with containerized environments and orchestration tools like Airflow for automating data workflows.
- Expertise in data migration strategies and transitioning legacy data systems to modern cloud platforms.
- Experience with performance tuning, cost optimization, and lifecycle management of cloud data solutions.
- Familiarity with regulatory compliance (GDPR, HIPAA) and security practices (IAM, encryption).
- Experience in the Life Sciences or Pharma domain is highly preferred, with an understanding of industry-specific data requirements.
- Strong problem-solving abilities with a focus on delivering high-quality data solutions that meet business needs.

Education: Any Graduation.
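For illustration only: a minimal PySpark sketch of the kind of ingest-and-write-to-Delta step this role describes. The input path, output path, and partition column are hypothetical, and the snippet assumes a Spark runtime with the Delta Lake libraries available (e.g., a Databricks cluster).

```python
# Hypothetical pipeline step: read raw CSV from S3, clean it, and write a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-raw-bucket/orders/")  # hypothetical input path
)

cleaned = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .filter(F.col("amount").cast("double") > 0)
)

(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("s3://example-curated-bucket/delta/orders")  # hypothetical output path
)
```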

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be responsible for developing and modifying programs using Python, AWS Glue/Redshift, and PySpark. Your role will involve writing effective, scalable code and identifying areas for program modification. You must also have a strong understanding of AWS cloud technologies such as CloudWatch, Lambda, DynamoDB, API Gateway, and S3. Experience in creating APIs from scratch and integrating with third-party APIs is also required. This is a full-time position based in Hyderabad/Chennai/Bangalore, and the ideal candidate should have a maximum notice period of 15 days.
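As a purely illustrative sketch of the Lambda + API Gateway integration work mentioned above (not part of the posting): a minimal Python handler that reads an item from a DynamoDB table. The table name and key name are assumptions.

```python
# Hypothetical AWS Lambda handler for an API Gateway (proxy integration) GET endpoint.
# Looks up an order in a DynamoDB table named "orders" (name assumed for illustration).
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("orders")

def lambda_handler(event, context):
    order_id = (event.get("pathParameters") or {}).get("order_id")
    if not order_id:
        return {"statusCode": 400, "body": json.dumps({"error": "order_id is required"})}

    item = table.get_item(Key={"order_id": order_id}).get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(item, default=str),  # default=str handles DynamoDB Decimals
    }
```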

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

The ideal candidate for this position should have advanced proficiency in Python, with a solid understanding of classes and inheritance. The candidate should also be well-versed in EMR, Athena, Redshift, AWS Glue, IAM roles, CloudFormation (CFT is optional), Apache Airflow, Git, SQL, PySpark, OpenMetadata, and Data Lakehouse concepts. Experience with metadata management is highly desirable, particularly with AWS services such as S3.

The candidate should possess the following key skills:
- Creation of ETL pipelines
- Deploying code on EMR
- Querying in Athena
- Creating Airflow DAGs for scheduling ETL pipelines
- Knowledge of AWS Lambda and the ability to create Lambda functions

This is an individual contributor role; the candidate is expected to manage client communication autonomously and proactively resolve technical issues without external assistance.
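To illustrate the "Airflow DAGs for scheduling ETL pipelines" skill listed above (illustration only, not from the posting): a minimal DAG sketch with hypothetical task names and schedule, assuming Apache Airflow 2.x with its standard Python operator.

```python
# Hypothetical Airflow DAG: a daily two-step ETL (extract then load) with placeholder logic.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    print("extracting source data (placeholder)")

def load(**context):
    print("loading into the warehouse (placeholder)")

with DAG(
    dag_id="example_daily_etl",       # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["example"],
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```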

Posted 2 months ago

Apply

3.0 - 8.0 years

0 Lacs

Karnataka

On-site

We are looking for a skilled R Shiny programmer to create interactive reports that transform clinical trial data into actionable clinical insights. As an R Shiny programmer, your role will involve designing, developing, deploying, and optimizing user-friendly web applications for analyzing and visualizing clinical data.

Your responsibilities will include designing, developing, testing, and deploying interactive R Shiny web applications. You will collaborate with data scientists, bioinformatics programmers, analysts, and stakeholders to understand application requirements and translate them into intuitive R Shiny applications. Additionally, you will be responsible for translating complex data analysis and visualization tasks into clear and user-friendly interfaces, writing clean and efficient R code, conducting code reviews, and validating R programming. Moreover, you will integrate R Shiny applications with AWS services like AWS Redshift, implement unit tests to ensure quality and performance, benchmark and optimize application performance, and address any inconsistencies in data, analytical, or reporting problems that may arise. Other duties may be assigned as needed.

The ideal candidate should possess a Bachelor's degree in Computer Science, Data Science, or a related field, along with 3 to 8 years of relevant experience. Proven expertise in building R Shiny applications, strong proficiency in R programming (including data manipulation, statistical analysis, and data visualization), experience using SQL, and an understanding of user interface (UI) and user experience (UX) principles are essential.

Experience with gathering requirements, using RStudio, version control software, managing programming code, and working with POSIT Workbench, Connect, and/or Package Manager is preferred. Candidates should have the ability to manage multiple tasks, work independently and in a team environment, and effectively communicate technical concepts in written and oral formats, plus experience with R Markdown, continuous integration/continuous delivery (CI/CD) pipelines, and AWS cloud computing services such as Redshift, EC2, S3, and CloudWatch.

The required education for this position is a BE/MTech/MCA degree in a computer-related field. A satisfactory background check is mandatory for this role.

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Job Title: Azure Presales Engineer

About the Role:
As a Cloud Presales Engineer specializing in Azure, you will play a critical role in our sales process, working closely with sales and technical teams to provide expert guidance and solutions for our clients. Leveraging your in-depth knowledge of Azure services, you will understand customer needs, design tailored cloud solutions, and drive the adoption of our cloud offerings. This position requires strong technical acumen, excellent communication skills, and a passion for cloud technologies.

Key Responsibilities:

Solution Design and Architecture:
- Understand customer requirements and design effective cloud solutions using Azure services.
- Create architecture diagrams and detailed proposals tailored to customer needs.
- Collaborate with sales teams to define the scope of technical solutions and present them to customers.

Technical Expertise and Consultation:
- Act as a subject matter expert on AWS and Azure services, including EC2, S3, Lambda, RDS, VPC, IAM, CloudFormation, Azure Virtual Machines, Blob Storage, Functions, SQL Database, Virtual Network, Azure Active Directory, and ARM Templates.
- Provide technical support during the sales process, including product demonstrations, proofs of concept (POCs), and answering customer queries.
- Advise customers on best practices for cloud adoption, migration, and optimization.

Customer Engagement:
- Build and maintain strong relationships with customers, understanding their business challenges and technical needs.
- Conduct customer workshops, webinars, and training sessions to educate customers on Azure solutions and services.
- Gather customer feedback and insights to help shape product and service offerings.

Sales Support:
- Partner with sales teams to develop sales strategies and drive cloud adoption.
- Prepare and deliver compelling presentations, demonstrations, and product pitches to customers.
- Assist in the preparation of RFPs, RFQs, and other customer documentation.

Continuous Learning and Development:
- Stay up to date with the latest AWS and Azure services, technologies, and industry trends.
- Achieve and maintain relevant AWS and Azure certifications to demonstrate expertise.
- Share knowledge and best practices with internal teams to enhance overall capabilities.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience in a presales or technical consulting role, with a focus on cloud solutions.
- In-depth knowledge of AWS and Azure services, with hands-on experience in designing and implementing cloud-based architectures.
- Azure certifications (e.g., Microsoft Certified: Azure Solutions Architect Expert) are highly preferred.
- Strong understanding of cloud computing concepts, including IaaS, PaaS, SaaS, and hybrid cloud models.
- Excellent presentation, communication, and interpersonal skills.
- Ability to work independently and collaboratively in a fast-paced, dynamic environment.

Preferred Qualifications:
- Experience with other cloud platforms (e.g., Google Cloud) is a plus.
- Familiarity with DevOps practices, CI/CD pipelines, and infrastructure as code (IaC) using Terraform, CloudFormation, and ARM Templates.
- Experience with cloud security, compliance, and governance best practices.
- Background in software development, scripting, or system administration.

Join us to be part of an innovative team, shaping cloud solutions and driving digital transformation for our clients! (ref: hirist.tech)

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

As an Engineering Leader at Crop.photo, you will play a crucial role in shaping the future of brand consistency through AI technology. With a focus on building a high-performing engineering team, you will not only lead by example through hands-on coding but also provide guidance to ensure the success of our projects.

Your responsibilities will encompass a wide range of tasks, from architecting and developing our AWS-based microservices infrastructure to collaborating with product management on technical decision-making. You will be at the forefront of backend development using Java, Node.js, and Python within the AWS ecosystem, while also contributing to frontend development using React and TypeScript when necessary. Your expertise will be essential in designing and implementing scalable AI/ML pipeline architectures, establishing engineering best practices, and mentoring junior engineers to foster a culture of engineering excellence. Additionally, you will be responsible for system reliability, performance optimization, and cost management, ensuring that our platform delivers high-quality solutions for our marketing professionals.

To excel in this role, you must have a minimum of 8 years of software engineering experience, including at least 3 years leading engineering teams. Your technical skills should cover a wide range of AWS services, backend development, frontend development, and system design and architecture, as well as leadership and communication. Your ability to drive architectural decisions, identify technical debt, and lead initiatives to address it will be key to the success of our projects.

Working at Crop.photo will provide you with the opportunity to take true technical ownership of a rapidly growing AI platform, shape architecture from an early stage, work with cutting-edge AI/ML technologies, and have a direct impact on product direction and engineering culture. Your success in this role will be measured by the implementation of scalable, maintainable architecture, reduction in system latency and processing costs, successful delivery of key technical initiatives, team growth and engineering velocity improvements, and system reliability and uptime metrics.

If you are passionate about building scalable systems, have a proven track record of technical leadership, and thrive in an early-stage environment where you can make a significant impact on both technology and team culture, we encourage you to apply for this exciting opportunity at Crop.photo.

Posted 2 months ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You will be responsible for analyzing and debugging problems at the network, storage, and virtualization layers of scale-out distributed storage solutions in various cloud environments. Your role will involve developing a knowledge base to expedite the troubleshooting of customer issues and providing feedback on existing tools while identifying and creating new tools required for customer problem triaging. Additionally, you will research, diagnose, troubleshoot, and resolve customer issues, collaborating with engineering teams as necessary. Working with technical writing resources to document issue resolutions accurately and assisting in defining the support process from issue identification to resolution will also be part of your responsibilities.

Ideally, you should hold a BS/MS in Computer Science or equivalent and possess a minimum of 7+ years of experience in storage solutions. Your background should include testing and debugging storage systems, particularly distributed systems, along with a solid understanding of NFS protocols (v3, v4.1, pNFS). Familiarity with other storage protocols like SMB and S3, as well as virtualization technologies such as VMware, Hyper-V, and containers, will be beneficial. Knowledge of network solutions for clouds, including network virtualization technologies, is also desirable. Leadership skills, the ability to take ownership of customer issues, commitment, focus, and exceptional customer service and communication skills are essential. You should be proficient in researching, diagnosing, troubleshooting, and providing solutions to customer problems. Experience in developing scripts using Python or other scripting languages would be advantageous.

In terms of personal characteristics, you should have a keen eye for distinguishing between perfection and adequacy, be prepared to tackle challenging tasks independently, be a team player, demonstrate good judgment, and be willing to question assumptions. You should be comfortable working in a fast-paced environment alongside a global team.

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

Kochi, Kerala

On-site

You will be responsible for planning, implementing, and growing the AWS cloud infrastructure. Your role will involve building, releasing, and managing the configuration of all production systems. It will be essential to manage a continuous integration and deployment methodology for server-based technologies. Collaboration with architecture and engineering teams to design and implement scalable software services will also be part of your responsibilities. Ensuring system security through the use of best-in-class cloud security solutions will be crucial. Staying up to date with new technology options and vendor products is important, and you will be expected to evaluate which ones would be suitable for the company. Implementing continuous integration/continuous delivery (CI/CD) pipelines when needed will also fall under your purview.

You will have the opportunity to recommend process and architecture improvements, troubleshoot the system, and resolve problems across all platform and application domains. Overseeing pre-production acceptance testing to maintain the high quality of the company's services and products will be part of your duties.

Experience with Terraform, Ansible, Git, and CloudFormation will be beneficial for this role. A solid background in Linux/Unix and Windows server system administration is required, along with experience configuring AWS CloudWatch for monitoring, creating and modifying scripts, and hands-on experience with MySQL. You should have experience designing and building web environments on AWS, including working with services like EC2, ELB, RDS, and S3.

This is a full-time position with benefits such as Provident Fund and a yearly bonus. The work schedule is during the day shift, and the preferred experience level for AWS is 3 years. The work location is in person.
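Illustration only (not part of the posting): a minimal boto3 sketch of the CloudWatch configuration work described above, creating a CPU alarm for an EC2 instance. The instance ID, threshold, and SNS topic ARN are hypothetical placeholders.

```python
# Hypothetical sketch: create a CloudWatch alarm on EC2 CPU utilization.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")  # assumed region

cloudwatch.put_metric_alarm(
    AlarmName="example-web-server-high-cpu",          # hypothetical alarm name
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
    Statistic="Average",
    Period=300,                # evaluate 5-minute averages
    EvaluationPeriods=3,       # alarm after 15 minutes above the threshold
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:ap-south-1:123456789012:ops-alerts"],  # placeholder topic
)
print("Alarm created or updated")
```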

Posted 2 months ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

We are looking for individuals who are risk-takers, collaborators, inspired, and inspirational. We seek those who are courageous enough to work on the cutting edge and develop solutions that will enhance and enrich the lives of people globally. If you aspire to make a difference that wows the world, we are eager to have a conversation with you. If you believe this role aligns with your ambitions and skill set, we invite you to begin the application process. Explore our other available positions as well; our numerous opportunities can pave the way for endless possibilities.

With 4 to 8 years of experience, the ideal candidate should possess the following primary skills:
- Proficiency in server-side Java and the AWS serverless framework; hands-on experience with a serverless framework is a must.
- Design knowledge and experience with cloud-based web applications, and familiarity with software design representation tools such as astah and Visio.
- Strong experience in AWS, including but not limited to EC2 volumes, EC2 security groups, EC2 AMIs, Lambda, S3, AWS Backup, CloudWatch, CloudFormation, CloudTrail, IAM, Secrets Manager, Step Functions, Cost Explorer, KMS, and VPC/subnets.
- Ability to understand business requirements concerning UI/UX.
- Work experience on development, staging, and production servers.
- Proficiency in testing and verification; knowledge of SSL certificates and encryption.
- Familiarity with Docker containerization.

In addition to technical skills, soft skills are also crucial, including:
- Excellent interpersonal, oral, and written communication skills.
- Strong analytical and problem-solving abilities.
- Ability to comprehend and analyze customer requirements and expectations.
- Experience interacting with customers.
- Previous work with international, cross-cultural teams is a plus.

Secondary Skills:
- Scripting using Python.
- Knowledge of identity management is advantageous.
- Understanding of UI/UX and ReactJS/TypeScript/Bootstrap.
- Proficiency in business use cases concerning UI/UX.
- Troubleshooting issues related to integration on the cloud (front end/back end/system/service APIs).

Posted 2 months ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

This is a full-time on-site role for a PHP Laravel Developer based in Chennai. In this position, you will play a key role in developing and maintaining web applications using the Laravel framework. Your responsibilities will include coding, debugging, testing, and deploying new features. Additionally, you will collaborate with cross-functional teams to create efficient and scalable solutions.

To excel in this role, you must possess strong proficiency in PHP and hands-on experience with the Laravel framework. Familiarity with frontend technologies like HTML, CSS, and JavaScript is essential, and knowledge of database management systems, particularly MySQL, is required. Understanding RESTful APIs, integrating third-party services, and using version control systems like Git are also important aspects of this position.

Candidates should have practical experience in schema design, query optimization, REST APIs, and AWS services such as EC2, S3, RDS, Lambda, and Redis. Proficiency in designing scalable and secure web applications, expertise in automated testing frameworks, and a solid grasp of web security practices are crucial for success in this role. The ideal candidate will be able to prioritize tasks effectively and work both independently and as part of a team. Strong problem-solving and troubleshooting skills are essential, as are clear communication and collaboration. A Bachelor's degree in computer science or a related field, or equivalent experience, is required.

Requirements:
- Strong proficiency in PHP with the Laravel framework
- Experience with HTML, CSS, and JavaScript
- Knowledge of MySQL and RESTful APIs
- Familiarity with Git and version control systems
- Hands-on experience with schema design, query optimization, and REST APIs
- Profound knowledge of AWS services
- Demonstrated experience in designing scalable and secure web applications
- Expertise in automated testing frameworks
- Strong understanding of web security practices
- Ability to prioritize tasks and work independently or as part of a team
- Excellent problem-solving and troubleshooting skills
- Good communication and collaboration skills
- Bachelor's degree or equivalent experience in computer science or a related field

Experience: 4+ years
Location: Chennai/Madurai
Interested candidates can share their CV at anushya.a@extendotech.com / 6374472538

Job Type: Full-time
Benefits: Health insurance, Provident Fund
Location Type: In-person
Schedule: Morning shift
Work Location: In person

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

Tezo is a new-generation Digital & AI solutions provider with a history of creating remarkable outcomes for our customers. We bring exceptional experiences using cutting-edge analytics, data proficiency, technology, and digital excellence.

Job Overview:
The AWS Architect with Data Engineering skills will be responsible for designing, implementing, and managing scalable, robust, and secure cloud infrastructure and data solutions on AWS. This role requires a deep understanding of AWS services and data engineering best practices, and the ability to translate business requirements into effective technical solutions.

Key Responsibilities:

Architecture Design:
- Design and architect scalable, reliable, and secure AWS cloud infrastructure.
- Develop and maintain architecture diagrams, documentation, and standards.

Data Engineering:
- Design and implement ETL pipelines using AWS services such as Glue, Lambda, and Step Functions.
- Build and manage data lakes and data warehouses using AWS services like S3, Redshift, and Athena.
- Ensure data quality, data governance, and data security across all data platforms.

AWS Services Management:
- Utilize a wide range of AWS services (EC2, S3, RDS, Lambda, DynamoDB, etc.) to support various workloads and applications.
- Implement and manage CI/CD pipelines using AWS CodePipeline, CodeBuild, and CodeDeploy.
- Monitor and optimize the performance, cost, and security of AWS resources.

Collaboration and Communication:
- Work closely with cross-functional teams, including software developers, data scientists, and business stakeholders.
- Provide technical guidance and mentorship to team members on best practices in AWS and data engineering.

Security and Compliance:
- Ensure that all cloud solutions follow security best practices and comply with industry standards and regulations.
- Implement and manage IAM policies, roles, and access controls.

Innovation and Improvement:
- Stay up to date with the latest AWS services, features, and best practices.
- Continuously evaluate and improve existing systems, processes, and architectures.

(ref: hirist.tech)
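As an illustrative aside (not part of the posting): a minimal boto3 sketch of querying an S3-backed data lake through Athena, one of the services named above. The database, table, and results location are hypothetical.

```python
# Hypothetical sketch: run an Athena query against a data-lake table and print the rows.
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")  # assumed region

def run_query(sql, database="analytics_db", output="s3://example-athena-results/"):
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]

    # Poll until the query finishes (simplified; production code would add a timeout).
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Query ended in state {state}")

    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])

if __name__ == "__main__":
    run_query("SELECT event_type, count(*) FROM events GROUP BY event_type LIMIT 10")
```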

Posted 2 months ago

Apply

4.0 - 8.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

This is a remote position.

Job Details:
- Job Title: Sr. QA Analyst (Automation Engineer)
- Office Location: Office No. 403-405, Time Square, CG Road, Ellisbridge, Ahmedabad, Gujarat 380006
- Duration & Type of Employment: Full-time, permanent; 4+ years of experience required
- Work Style: Hybrid (3 in-office days a week)

Requirements - Tech Stack:
- Selenium WebDriver (browser automation)
- Playwright (modern web automation)
- TestNG / JUnit (WebDriver)
- Appium testing
- Mocha / Jest
- API testing: Postman, Playwright
- Mocking: Playwright network mocks, WireMock
- Programming languages: JS, TS, Python
- Mentoring: guide junior QAs on automation best practices
- Automation framework design: design and maintain WebDriver/Playwright automation frameworks
- CI/CD integration: ensure automated tests are part of the CI/CD pipeline (GitHub Actions)
- Test reporting and continuous improvement: oversee test reporting, monitor results, and improve automation efficiency

Testing will be done on the following tech stack:
- AWS Serverless: Lambda with Node.js
- API Gateway (REST/JSON)
- DynamoDB
- S3
- API integration
- React.js (website) / React Native (app)

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field, or equivalent experience.
- Proven experience as a QA Analyst, Software Tester, Developer, or in a similar role.
- Strong understanding of software testing concepts, methodologies, and best practices.
- Proficiency in test case design, test execution, and defect tracking.
- Experience with manual testing of web and mobile applications across different platforms and devices.
- Experience with JavaScript testing frameworks like Jest and Vitest.
- Knowledge of automated testing tools and setting up frameworks.
- Solid knowledge of defect tracking systems and experience working with bug tracking tools.
- Strong analytical and problem-solving skills, with the ability to think critically and troubleshoot issues.
- Excellent attention to detail and the ability to meticulously follow test plans and procedures.
- Effective communication and collaboration skills to work with cross-functional teams and stakeholders.
- Knowledge of Agile methodologies and experience working in Agile development environments.
- Familiarity with continuous integration/continuous deployment (CI/CD) pipelines and tools.
- Ability to adapt to changing priorities and work under tight deadlines.
- Knowledge of the software development lifecycle (SDLC) and software engineering principles.

Responsibilities:
- Collaborate with cross-functional teams to understand project requirements and define test strategies and plans.
- Develop, document, and maintain detailed test cases and test scripts based on project requirements and functional specifications.
- Execute manual and automated tests to verify software functionality, performance, usability, and security.
- Identify, document, and track software defects using a bug tracking system and work closely with the development team to ensure timely resolution.
- Participate in the review of product requirements, design documents, and specifications to provide input on testability and quality aspects.
- Perform exploratory testing and provide feedback on user experience and potential usability issues.
- Conduct regression testing to ensure that software changes and updates do not introduce new defects.
- Collaborate with software developers to reproduce and debug reported issues, and provide clear and concise steps to reproduce.
- Continuously improve the QA process by identifying inefficiencies, proposing solutions, and implementing best practices.
- Stay up to date with industry trends and advancements in software testing methodologies and tools.
- Communicate test progress, test results, and other relevant information to project stakeholders.

Bonus skills:
- Developer experience.
- ISTQB or similar certification in software testing.
- Experience with performance testing and load testing tools (e.g., JMeter, LoadRunner).
- Knowledge of test automation frameworks and scripting languages (e.g., Java, Python, JavaScript).
- Familiarity with API testing and tools like Postman or SoapUI.
- Experience with database testing and the SQL query language.
- Understanding of security testing concepts and tools (e.g., OWASP ZAP, Burp Suite).
- Experience with test management tools (e.g., TestRail, Zephyr).
- Knowledge of usability testing and user experience (UX) principles.
- Start-up experience.

Benefits:
- Hybrid working culture
- Amazing perks and medical benefits
- 5-day work week
- Mentorship programs and certification courses
- Flexible work arrangements
- Free drinks fridge and snacks
- Competitive salary and recognition
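Illustration only, not part of the posting: a minimal Playwright test sketch in Python (one of the languages the posting lists), combining browser automation with a network mock. The URL, API route, and selectors are hypothetical, and the sketch assumes pytest with the pytest-playwright plugin installed.

```python
# Hypothetical Playwright test: mock an API response and assert the page renders it.
# Requires: pip install pytest pytest-playwright && playwright install
from playwright.sync_api import Page, Route

def test_orders_list_renders_mocked_data(page: Page):
    # Intercept the backend call and return a canned payload (network mock).
    def fulfill_orders(route: Route):
        route.fulfill(
            status=200,
            content_type="application/json",
            body='[{"order_id": "A-1", "amount": 42}]',
        )

    page.route("**/api/orders", fulfill_orders)       # hypothetical API route

    page.goto("https://example.test/orders")          # hypothetical app URL
    page.wait_for_selector("text=A-1")                # order id from the mocked payload
    assert page.locator(".order-row").count() == 1    # hypothetical row selector
```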

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Senior / Lead Full Stack or Frontend Engineer, you will be responsible for delivering performant, scalable, and high-quality cloud-based software, encompassing both frontend and backend components. Your key duties will include conducting code reviews, mentoring team members to align with product requirements, and collaborating with the Senior Architect to make design and technology decisions for the product development roadmap.

We are seeking candidates with a strong educational background from institutions such as IIT, NIT, BITS, or other Tier 1 colleges, or individuals from non-premium institutes with experience in product companies. The ideal candidate should possess a comprehensive understanding of developing cloud-based software, including backend APIs and frontend frameworks like React and Angular. Proficiency in scalable design patterns and message-based systems such as Kafka, RabbitMQ, Redis, MongoDB, ORMs, and SQL, along with experience in AWS services like S3, IAM, and Lambda, is essential. You should have expert-level coding skills in Node.js, TypeScript, Scala, React, and Angular, with a focus on user-centric, mobile-first designs on the frontend. Experience with hybrid frontends such as React Native and Electron will be considered a plus.

Join us in Bangalore for a full-time, permanent position (currently remote, with relocation required post-pandemic) and contribute to building cutting-edge cloud software solutions that drive our product forward.

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

Genpact is a global professional services and solutions firm focused on delivering outcomes that shape the future. With over 125,000 employees in more than 30 countries, we are driven by curiosity, agility, and the desire to create lasting value for our clients. Our purpose is the relentless pursuit of a world that works better for people, serving and transforming leading enterprises, including Fortune Global 500 companies, through deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are currently seeking applications for the position of Lead Consultant - Databricks Developer (AWS). As a Databricks Developer in this role, you will be responsible for solving cutting-edge real-world problems to meet both functional and non-functional requirements.

Responsibilities:
- Stay updated on new and emerging technologies and explore their potential applications for service offerings and products.
- Collaborate with architects and lead engineers to design solutions that meet functional and non-functional requirements.
- Demonstrate knowledge of relevant industry trends and standards.
- Showcase strong analytical and technical problem-solving skills.
- Possess excellent coding skills, particularly in Python or Scala, with a preference for Python.

Minimum qualifications:
- Bachelor's degree in CS, CE, CIS, IS, MIS, or an engineering discipline, or equivalent work experience.
- Stay informed about new technologies and their potential applications.
- Collaborate with architects and lead engineers to develop solutions.
- Demonstrate knowledge of industry trends and standards.
- Exhibit strong analytical and technical problem-solving skills.
- Proficient in Python or Scala coding.
- Experience in the Data Engineering domain.
- Completed at least 2 end-to-end projects in Databricks.

Additional qualifications:
- Familiarity with Delta Lake, dbConnect, db API 2.0, and Databricks workflows orchestration.
- Understanding of the Databricks Lakehouse concept and its implementation in enterprise environments.
- Ability to create complex data pipelines.
- Strong knowledge of data structures and algorithms.
- Proficiency in SQL and Spark SQL.
- Experience in performance optimization to enhance efficiency and reduce costs.
- Worked on both batch and streaming data pipelines.
- Extensive knowledge of the Spark and Hive data processing frameworks.
- Experience with cloud platforms (Azure, AWS, GCP) and common services like ADLS/S3, ADF/Lambda, Cosmos DB/DynamoDB, ASB/SQS, and cloud databases.
- Skilled in writing unit and integration test cases.
- Excellent communication skills and experience working in teams of 5 or more.
- Positive attitude towards learning new skills and upskilling.
- Knowledge of Unity Catalog and basic governance.
- Understanding of Databricks SQL endpoints.
- Experience in CI/CD to build pipelines for Databricks jobs.
- Exposure to migration projects for building unified data platforms.
- Familiarity with DBT, Docker, and Kubernetes.

This is a full-time position based in Gurugram, India. The job was posted on August 5, 2024, and the unposting date is October 4, 2024.
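For illustration only: a small sketch of the kind of Delta Lake pipeline step a Databricks developer role like this involves, an upsert (MERGE) from a staging DataFrame into a target table. The table names, paths, and join key are hypothetical, and the sketch assumes the Delta Lake libraries are available (as on a Databricks cluster).

```python
# Hypothetical Delta Lake upsert: merge new or changed customer rows into a target table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customer-upsert").getOrCreate()

# Staging data for this batch (in practice read from a source system or landing zone).
updates = spark.read.format("delta").load("/mnt/staging/customers")  # hypothetical path

target = DeltaTable.forName(spark, "analytics.customers")  # hypothetical target table

(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")  # assumed join key
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```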

Posted 2 months ago

Apply

10.0 - 17.0 years

0 Lacs

Hyderabad, Telangana

On-site

We have an exciting opportunity for an ETL Data Architect with an AI/ML-driven SaaS product company in Hyderabad. As an ETL Data Architect, you will play a crucial role in designing and implementing a robust data access layer that provides consistent data access over a heterogeneous storage layer. You will also be responsible for developing and enforcing data governance policies to ensure data security, quality, and compliance across all systems.

In this role, you will lead the architecture and design of data solutions that leverage a modern tech stack and AWS cloud services. Collaboration with product managers, tech leads, and cross-functional teams will be essential to align data strategy with business objectives. Additionally, you will oversee data performance optimization, scalability, and reliability of data systems while guiding and mentoring team members on data architecture, design, and problem-solving.

The ideal candidate should have at least 10 years of experience in data-related roles, with a minimum of 5 years in a senior leadership position overseeing data architecture and infrastructure. A deep background in designing and implementing enterprise-level data infrastructure, preferably in a SaaS environment, is required. Extensive knowledge of data architecture principles, data governance frameworks, security protocols, and performance optimization techniques is essential. Hands-on experience with AWS services such as RDS, Redshift, S3, Glue, and DocumentDB, as well as other technologies like MongoDB and Snowflake, is highly desirable. Familiarity with big data technologies (e.g., Hadoop, Spark) and modern data warehousing solutions is a plus. Proficiency in at least one programming language (e.g., Node.js, Java, Golang, Python) is a must.

Excellent communication skills are crucial in this role, with the ability to translate complex technical concepts for non-technical stakeholders. Proven leadership experience, including team management and cross-functional collaboration, is also required. A Bachelor's degree in Computer Science, Information Systems, or a related field is necessary, with a Master's degree preferred.

Preferred qualifications include experience with Generative AI and Large Language Models (LLMs) and their applications in data solutions, as well as familiarity with financial back-office operations and the FinTech domain. Staying updated on emerging trends in data technology, particularly AI/ML applications for finance, is expected.

Industry: IT Services and IT Consulting

Posted 2 months ago

Apply

7.0 - 12.0 years

25 - 35 Lacs

Hyderabad, Chennai

Hybrid

Hiring an AWS DevOps Engineer; immediate joiners preferred.

Preferred candidate profile:
- AWS - EKS: cluster setup, scaling, node groups, IAM roles, ingress
- AWS - EC2: AMI usage, instance lifecycle, auto-scaling
- AWS - EBS: volume usage, snapshots, reattachment
- AWS - S3: lifecycle policies, usage with backups/CI-CD
- AWS - RDS: provisioning, backups, monitoring, failover
- AWS - SNS: notifications, topic-subscription configuration
- CI/CD - Jenkins: job configs, pipelines, shared libraries
- CI/CD - GitHub: repo structure, branch policy, PR reviews
- CI/CD - Terraform: state management, modules, remote backend
- CI/CD - Argo Rollouts: canary and blue-green strategies, Istio integration
- K8s - Helm charts: custom charts, chart repo usage, secrets templating
- K8s - Istio: gateway, mTLS, observability setup
- Monitoring - Datadog: dashboards, log ingestion, alerts, APM
- Automation - Scripts: provisioning, scaling, log rotation, backups
- Automation - CRON jobs: scheduled tasks for automation
- Linux Admin - User access: access controls, patching
- Linux Admin - Performance monitoring: system metrics, troubleshooting
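Illustration only (not part of the posting): a minimal boto3 sketch of the backup-automation scripting listed above, snapshotting EBS volumes that carry a given tag. The tag key, tag value, and region are hypothetical.

```python
# Hypothetical backup script: snapshot every EBS volume tagged Backup=daily.
from datetime import datetime, timezone
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")  # assumed region

def snapshot_tagged_volumes(tag_key="Backup", tag_value="daily"):
    volumes = ec2.describe_volumes(
        Filters=[{"Name": f"tag:{tag_key}", "Values": [tag_value]}]
    )["Volumes"]

    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    for vol in volumes:
        snap = ec2.create_snapshot(
            VolumeId=vol["VolumeId"],
            Description=f"Automated {tag_value} backup {stamp}",
        )
        print(f"Created snapshot {snap['SnapshotId']} for volume {vol['VolumeId']}")

if __name__ == "__main__":
    snapshot_tagged_volumes()
```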

Posted 2 months ago

Apply

7.0 - 12.0 years

10 - 15 Lacs

Bengaluru

Hybrid

Hiring an AWS Data Engineer for a 6-month hybrid contractual role based in Bellandur, Bengaluru. The ideal candidate will have 7+ years of experience in data engineering, with strong expertise in AWS services (S3, EC2, RDS, Lambda, EKS), PostgreSQL, Redis, Apache Iceberg, and Graph/Vector Databases. Proficiency in Python or Golang is essential. Responsibilities include designing and optimizing data pipelines on AWS, managing structured and in-memory data, implementing advanced analytics with vector/graph databases, and collaborating with cross-functional teams. Prior experience with CI/CD and containerization (Docker/Kubernetes) is a plus.

Posted 2 months ago

Apply

5.0 - 8.0 years

8 - 10 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

We are seeking a skilled Application Support Engineer to manage and support multiple invoice processing and development projects. The ideal candidate should have experience in production support, issue resolution, process automation, and working with tools like Jiffy and Python scripting.

Responsibilities:
- Provide end-to-end support, including running scheduled tasks based on received files, issue resolution and root cause analysis, and handling new change requests and enhancements.
- Support and development in DEV environments for document and invoice processing (VOLVO & DXC projects).
- Ongoing support and development for the Gatwick application.
- Collaborate with cross-functional teams to ensure smooth delivery of features and fixes.
- Utilize the Jiffy automation tool for process automation and monitoring.
- Write and maintain Python scripts to automate repetitive support tasks.
- Document processes and provide clear communication on resolutions and updates.

Key skills: Experience in unit testing, distributed systems, and API development is essential.

Requirements:
- Minimum 8 years of hands-on development experience in a similar role
- Bachelor's degree in Information Technology, Computer Science, Software Engineering, or a related field
- Very good level of English communication (spoken and written)
- Ability to work Indian Standard Time (IST) hours, with flexibility to adjust start times during onboarding to overlap with Australian business hours for the first week
- Ability to navigate ambiguity and deliver in a complex environment
- Strong collaborative and leadership skills

Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai

Posted 2 months ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Pune, Gurugram

Hybrid

Python, PySpark, SQL/queries, AWS Elastic MapReduce (EMR), Amazon Managed Workflows for Apache Airflow (MWAA), AWS CDK, CloudFormation, Lambda, Step Functions, Athena, Redshift, Glue Catalog, S3, CI/CD: GitHub Actions

Posted 2 months ago

Apply

1.0 - 2.0 years

3 - 7 Lacs

Mohali

Work from Office

Manage AWS (EC2, S3, RDS), CI/CD pipelines, Bunny.net CDN. Ensure performance, uptime, and cost optimization. Set up alerts, logs, and backups for reliability. Analyze AWS usage for efficient cloud cost control.

Posted 2 months ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Job Title: .NET Cloud (AWS) Developer

Job Description:
We are looking for a skilled and passionate .NET Developer to join our growing engineering team at our Hyderabad office. The ideal candidate will have strong experience in C#, .NET Core, and AWS services, with a focus on cloud-native architecture and scalable applications.

Key Responsibilities:
- Design, develop, test, and deploy scalable web applications using C# and .NET Core.
- Build and manage microservices and APIs hosted on AWS using Lambda functions, Elastic Kubernetes Service (EKS), CloudFront, S3 buckets, and Aurora/RDS databases (good to have).
- Collaborate with DevOps and cloud teams to ensure cloud best practices are followed.
- Work with version control and CI/CD pipelines using GitHub.
- Utilize GitHub Copilot to boost productivity and coding efficiency.
- Write clean, maintainable, and testable code, following coding standards.
- Participate in code reviews and agile ceremonies (daily stand-ups, retrospectives).
- Engage in prompt engineering to enhance AI-assisted development (good to have).
- Optionally contribute to front-end development using Angular (good to have).

Required Skills:
- Proficiency in C# and .NET Core.
- Strong understanding and practical experience with AWS cloud services.
- Hands-on experience with serverless architecture (Lambda) and container orchestration (EKS).
- Familiarity with CloudFront, S3, and Aurora/RDS.
- Good knowledge of GitHub workflows, CI/CD, and GitHub Copilot.
- Strong debugging, troubleshooting, and analytical skills.
- Familiarity with Agile methodologies.

Preferred/Good-to-Have Skills:
- Prompt engineering for AI-assisted coding.
- Angular experience for full-stack exposure.
- Exposure to cloud cost optimization and performance tuning.

Educational Qualification:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Why Join Us?
- Be part of a technology-driven organization with global clientele.
- Opportunity to work with modern cloud-native technologies.
- Collaborative culture focused on learning, innovation, and career growth.
- State-of-the-art infrastructure and a vibrant work environment in Hyderabad.

Note: This drive is by invitation only, not a walk-in. If interested, please apply.

Posted 2 months ago

Apply

5.0 - 10.0 years

16 - 30 Lacs

Hyderabad

Work from Office

Dear Candidate,

Greetings of the day!

We have an urgent requirement for a .NET Full Stack Developer with one of our esteemed clients. Please find the details below.

Position Details:
- Skillset: .NET Core, AWS services (S3, EC2, Lambda, SQS, and CloudWatch)
- Experience: 5 to 10 years
- Job Location: Hyderabad
- Work from Office: 5 days a week
- Notice Period: Immediate to 30 days

Walk-in Interview Details:
- Interview Mode: Face to face
- Interview Dates: 18th & 19th July 2025
- Interview Location: Hyderabad

Required Skills & Qualifications:
- Full stack development: proven experience using .NET technologies (e.g., .NET 4.6.1, .NET Core 3, ASP.NET Web API 2).
- Cloud technologies: experience with AWS services like S3, Lambda, CloudWatch, and EC2.

If you are interested and available for the walk-in interview, please send your updated resume to Sneha.k@precisiontechcorp.com. We look forward to hearing from you soon!

Thanks & Regards,
Sneha K
Sneha.k@precisiontechcorp.com

Posted 2 months ago

Apply

8.0 - 13.0 years

7 - 17 Lacs

Bengaluru

Hybrid

MuleSoft development; RESTful API development and integration design patterns; JSON, XML, and web service technologies; DataWeave; API-led connectivity; MuleSoft security policies; CI/CD pipelines; databases (SQL/NoSQL); AWS services such as S3 and SQS; Salesforce

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
