
81 AWS Kinesis Jobs - Page 4

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

6 - 11 years

4 - 8 Lacs

Kolkata

Work from Office

Set 1: Must have knowledge of Azure Data Lake, Azure Functions, Azure Databricks, Azure Data Factory, and PostgreSQL. Working knowledge of Azure DevOps and Git flow would be an added advantage.
OR
Set 2: Must have working knowledge of AWS Kinesis, AWS EMR, AWS Glue, AWS RDS, AWS Athena, and AWS Redshift (see the illustrative sketch below this listing).

- Should have demonstrable knowledge and expertise in working with time-series data.
- Working knowledge of delivering data engineering / data science projects in Industry 4.0 is an added advantage.
- Should have knowledge of Palantir.
- Strong problem-solving skills with an emphasis on sustainable and reusable development.
- Experience using statistical computing languages to manipulate data and draw insights from large data sets: Python/PySpark, Pandas, NumPy, seaborn/matplotlib; knowledge of Streamlit.io is a plus.
- Familiarity with Scala, GoLang, or Java would be an added advantage.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational databases such as Microsoft SQL Server, MySQL, PostgreSQL, and Oracle, and NoSQL databases such as Hadoop, Cassandra, and MongoDB.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience building and optimizing big data pipelines, architectures, and data sets.
- Strong analytic skills related to working with unstructured data sets.

Primary Skills:
- Provide innovative solutions to the data engineering problems faced in the project and solve them with technically superior code and skills.
- Where possible, document the process of choosing a technology or integration pattern, and help create a knowledge management artefact that can be reused for other similar areas.
- Create and apply best practices in delivering the project with clean code.
- Work innovatively and proactively in fulfilling the project's needs.

Additional Information:
- Reporting to: Director - Intelligent Insights and Data Strategy
- Travel: Must be willing to be deployed at client locations anywhere in the world for long and short terms, and be flexible to travel for shorter durations within India and abroad.
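
Purely as illustration of the Set 2 stack and the time-series requirement above, here is a minimal sketch of reading a batch of records from an AWS Kinesis stream with boto3 and resampling them with pandas. The stream name, region, and payload fields ("ts", "value") are hypothetical, not taken from the posting.

```python
# Illustrative sketch only: pull a batch of records from a Kinesis stream
# and treat them as time-series data in pandas. Stream name, region, and
# the JSON payload shape ("ts"/"value") are assumptions for this example.
import json

import boto3
import pandas as pd

kinesis = boto3.client("kinesis", region_name="ap-south-1")

# Start from the oldest available record of the first shard.
shard_id = kinesis.describe_stream(StreamName="sensor-events")[
    "StreamDescription"
]["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName="sensor-events",
    ShardId=shard_id,
    ShardIteratorType="TRIM_HORIZON",
)["ShardIterator"]

records = kinesis.get_records(ShardIterator=iterator, Limit=500)["Records"]
rows = [json.loads(r["Data"]) for r in records]

# Hourly mean over the time-series payloads.
df = pd.DataFrame(rows)
df["ts"] = pd.to_datetime(df["ts"])
print(df.set_index("ts")["value"].resample("1H").mean().head())
```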

Posted 3 months ago

Apply

4 - 6 years

7 - 9 Lacs

Hyderabad

Work from Office

What you will do:
In this vital role you will be responsible for designing, developing, and maintaining software solutions for research scientists. The role also involves automating operations, monitoring system health, and responding to incidents to minimize downtime. You will join a multi-functional team of scientists and software professionals that enables technology and data capabilities to evaluate drug candidates and assess their abilities to affect the biology of drug targets. This team implements scientific software platforms that enable the capture, analysis, storage, and reporting of in vitro assays and in vivo / pre-clinical studies, as well as those that manage compound inventories / biological sample banks. The ideal candidate possesses experience in the pharmaceutical or biotech industry, strong technical skills, and full-stack software engineering experience (spanning SQL, back-end, and front-end web technologies, plus automated testing).

Roles & Responsibilities:
- Take ownership of complex software projects from conception to deployment
- Manage software delivery scope, risk, and timeline
- Possess strong rapid prototyping skills and quickly translate concepts into working code
- Contribute to both front-end and back-end development using cloud technology
- Develop innovative solutions using generative AI technologies
- Conduct code reviews to ensure code quality and consistency with standard methodologies
- Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations
- Identify and resolve technical challenges effectively
- Stay updated with the latest trends and advancements
- Work closely with the product team, business team (including scientists), and other collaborators
- Design, develop, and implement applications and modules, including custom reports, interfaces, and enhancements
- Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications
- Develop and implement unit tests, integration tests, and other testing strategies to ensure the quality of the software
- Identify and resolve software bugs and performance issues
- Work closely with multi-functional teams, including product management, design, and QA, to deliver high-quality software on time
- Maintain detailed documentation of software designs, code, and development processes
- Customize modules to meet specific business requirements
- Work on integrating with other systems and platforms to ensure seamless data flow and functionality
- Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently

What we expect of you:
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree with 4 - 6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field, OR
- Bachelor's degree with 6 - 8 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field, OR
- Diploma with 10 - 12 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field

Must-Have Skills:
- Proficient in a general-purpose high-level language (e.g., Python, Java, C#.NET)
- Proficient in a JavaScript UI framework (e.g., React, ExtJS)
- Proficient in SQL (e.g., Oracle, Postgres, Databricks)
- Experience with event-based architecture (e.g., MuleSoft, AWS EventBridge, AWS Kinesis, Kafka); see the sketch after this listing

Preferred Qualifications:
- 3+ years of experience implementing and supporting biopharma scientific software platforms
- Strong understanding of software development methodologies, mainly Agile and Scrum
- Hands-on experience with full-stack software development
- Strong understanding of cloud platforms (e.g., AWS) and containerization technologies (e.g., Docker, Kubernetes)
- Working experience with DevOps practices and CI/CD pipelines
- Experience with big data technologies (e.g., Spark, Databricks)
- Experience with API integration, serverless, and microservices architecture (e.g., MuleSoft, AWS Kafka)
- Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk)
- Experience with infrastructure-as-code (IaC) tools (Terraform, CloudFormation)
- Experience with version control systems like Git
- Experience with automated testing tools and frameworks
- Experience with Benchling

Professional Certifications:
- AWS Certified Cloud Practitioner preferred

Soft Skills:
- Excellent problem-solving, analytical, and troubleshooting skills
- Strong communication and interpersonal skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to learn quickly and work independently
- Team-oriented, with a focus on achieving team goals
- Ability to manage multiple priorities successfully
- Strong presentation and public speaking skills
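
As a purely illustrative aside on the event-based architecture skill listed above, the sketch below publishes a domain event to an Amazon EventBridge bus with boto3. The bus name, event source, detail type, and payload are hypothetical and not part of the posting.

```python
# Illustrative sketch only: emit an application event onto an EventBridge
# bus. Bus name, source, detail type, and payload are assumptions.
import json

import boto3

events = boto3.client("events", region_name="us-east-1")

response = events.put_events(
    Entries=[
        {
            "EventBusName": "research-platform-bus",
            "Source": "assay.capture",
            "DetailType": "AssayResultRecorded",
            "Detail": json.dumps({"assay_id": "A-123", "status": "complete"}),
        }
    ]
)

# A non-zero FailedEntryCount means some entries were rejected and should be retried.
print(response["FailedEntryCount"])
```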

Posted 4 months ago

Apply

0 - 3 years

3 - 6 Lacs

Hyderabad

Work from Office

Let's do this. Let's change the world. In this vital role you will join a multi-functional team of scientists and software professionals that enables technology and data capabilities to evaluate drug candidates and assess their abilities to affect the biology of drug targets. This team implements scientific software platforms that enable the capture, analysis, storage, and reporting of in vitro assays and in vivo / pre-clinical studies, as well as those that manage compound inventories / biological sample banks. The ideal candidate possesses experience in the pharmaceutical or biotech industry, strong technical skills, and full-stack software engineering experience (spanning SQL, back-end, and front-end web technologies, plus automated testing).

Roles & Responsibilities:
- Design, develop, and implement applications and modules, including custom reports, interfaces, and enhancements
- Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications
- Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software
- Identify and resolve software bugs and performance issues
- Work closely with cross-functional teams, including product management, design, and QA, to deliver high-quality software on time
- Maintain documentation of software designs, code, and development processes
- Customize modules to meet specific business requirements
- Work on integrating with other systems and platforms to ensure seamless data flow and functionality
- Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently
- Contribute to both front-end and back-end development using cloud technology
- Develop innovative solutions using generative AI technologies
- Identify and resolve technical challenges effectively
- Work closely with the product team, business team (including scientists), and other stakeholders

What we expect of you:
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Bachelor's degree and 0 to 3 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field, OR
- Diploma and 4 to 7 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field

Preferred Qualifications:
- Experience in implementing and supporting biopharma scientific software platforms

Functional Skills:

Must-Have Skills:
- Proficient in a general-purpose high-level language (e.g., Python, Java, C#.NET)
- Proficient in a JavaScript UI framework (e.g., React, ExtJS)
- Proficient in SQL (e.g., Oracle, Postgres, Databricks)
- Experience with event-based architecture (e.g., MuleSoft, AWS EventBridge, AWS Kinesis, Kafka)

Good-to-Have Skills:
- Strong understanding of software development methodologies, mainly Agile and Scrum
- Hands-on experience with full-stack software development
- Strong understanding of cloud platforms (e.g., AWS) and containerization technologies (e.g., Docker, Kubernetes)
- Working experience with DevOps practices and CI/CD pipelines
- Experience with big data technologies (e.g., Spark, Databricks)
- Experience with API integration, serverless, and microservices architecture (e.g., MuleSoft, AWS Kafka)
- Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk)
- Experience with infrastructure-as-code (IaC) tools (Terraform, CloudFormation)
- Experience with version control systems like Git
- Experience with automated testing tools and frameworks
- Experience with Benchling

Professional Certifications:
- AWS Certified Cloud Practitioner preferred

Soft Skills:
- Excellent problem-solving, analytical, and troubleshooting skills
- Strong communication and interpersonal skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to learn quickly and work independently
- Team-oriented, with a focus on achieving team goals
- Ability to manage multiple priorities successfully
- Strong presentation and public speaking skills

Posted 4 months ago

Apply

15 - 20 years

13 - 18 Lacs

Coimbatore

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: AWS Analytics
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary:
As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly across the organization, contributing to the overall efficiency and effectiveness of data management practices.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Develop and maintain documentation related to data architecture and design.

Professional & Technical Skills:
- Must-have skills: Proficiency in AWS Analytics.
- Strong understanding of data modeling techniques and best practices.
- Experience with data integration tools and methodologies.
- Familiarity with cloud data storage solutions and architectures.
- Ability to analyze and optimize data workflows for performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in AWS Analytics.
- This position is based in Pune.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted Date not available

Apply

5 - 9 years

16 - 31 Lacs

Noida, Pune, Gurugram

Work from Office

Job Description:
- Design, implement, and maintain data pipelines for processing large datasets, ensuring data availability, quality, and efficiency for machine learning model training and inference.
- Collaborate with data scientists to streamline the deployment of machine learning models, ensuring scalability, performance, and reliability in production environments.
- Develop and optimize ETL (Extract, Transform, Load) processes, ensuring data flows from various sources into structured data storage systems (see the sketch after this listing).
- Ensure effective model monitoring, versioning, and logging to track performance and metrics in a production setting.
- Ensure data security, integrity, and compliance with data governance policies.
- Perform troubleshooting and root cause analysis on production-level machine learning systems.

Skills: Glue, PySpark, AWS services, strong in SQL. Nice to have: Redshift, knowledge of SAS datasets.
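
A minimal sketch, under stated assumptions, of the kind of PySpark ETL step this role describes: extract raw CSV from S3, transform it, and load partitioned Parquet back to S3. The bucket paths and column names are hypothetical, and a real AWS Glue job would wrap comparable logic in Glue's job and context APIs rather than plain PySpark.

```python
# Illustrative sketch only: a small extract-transform-load step in plain
# PySpark. Paths and columns are hypothetical; running against s3:// paths
# assumes an environment (such as AWS Glue or EMR) with S3 connectors set up.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV landed by an upstream source.
raw = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

# Transform: type the amount column, drop incomplete rows, aggregate per day.
daily = (
    raw.withColumn("amount", F.col("amount").cast("double"))
    .dropna(subset=["order_date", "amount"])
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count(F.lit(1)).alias("orders"),
    )
)

# Load: partitioned Parquet for downstream engines such as Athena or Redshift Spectrum.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders_daily/"
)
```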

Posted Date not available

Apply

5 - 6 years

2 - 6 Lacs

Hyderabad

Work from Office

Assurance & Quality (AQ) Engineer
Department: Quality Assurance / Engineering
Location: Remote

Role Purpose: To ensure the highest quality standards are met for live streaming and video surveillance solutions, especially CCTV systems. The AQ Engineer will be responsible for validating compliance with STQC (Standardization Testing and Quality Certification) norms and will perform thorough testing and verification of live video streaming functionality, encoding/decoding quality, latency, resilience, and compatibility.

Key Responsibilities:
- Conduct functional, performance, and security testing on live streaming and CCTV video systems.
- Ensure systems comply with STQC guidelines, including for government-grade video surveillance.
- Validate end-to-end video streaming workflows: capture, encoding, transmission, and playback.
- Test latency, frame rate consistency, resolution standards, and night vision capabilities (a small scripting example follows this listing).
- Ensure compatibility across video codecs (H.264, H.265) and streaming protocols (RTSP, RTMP, ONVIF).
- Develop and maintain automated test scripts for regression and stress testing of video systems.
- Collaborate with development, network, and security teams to resolve quality issues.
- Conduct compliance and certification testing to ensure product readiness for deployment.
- Generate quality reports, issue logs, and root cause analysis documentation.

Required Skills & Experience:
- Bachelor's degree in Electronics, Computer Science, or a related discipline.
- STQC certification or experience working with STQC standards is mandatory.
- 3-6 years of experience in QA for video surveillance, CCTV, or live streaming environments.
- Strong understanding of video streaming protocols and CCTV camera technologies.
- Familiarity with tools like Wireshark, VLC, ONVIF Device Manager, and IP video testers.
- Experience with CCTV integration platforms, video analytics, and storage systems (NVR/DVR).
- Good scripting skills (Python, Shell) for test automation preferred.
- Excellent problem-solving skills and attention to detail.

Preferred Qualifications:
- Knowledge of cybersecurity practices for video surveillance.
- Experience with cloud-based video management systems (AWS Kinesis Video Streams, Azure Media Services).
- Exposure to AI-based video analytics and real-time video alert systems.

Key Performance Indicators (KPIs):
- Number of defects identified pre-production
- Compliance rate with STQC guidelines
- Stream quality consistency (uptime, resolution, frame drop rate)
- Test automation coverage and execution speed
- Turnaround time for issue resolution
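
As a small, purely illustrative example of the Python scripting skill mentioned above, the sketch below samples an RTSP feed with OpenCV to measure frame-rate consistency. The camera URL is hypothetical, and OpenCV needs an FFmpeg-capable build for RTSP input.

```python
# Illustrative sketch only: measure the delivered frame rate of an RTSP
# stream over a short window. The URL is a placeholder, not a real camera.
import time

import cv2

RTSP_URL = "rtsp://203.0.113.10:554/stream1"  # hypothetical camera endpoint

cap = cv2.VideoCapture(RTSP_URL)
if not cap.isOpened():
    raise SystemExit("Could not open RTSP stream")

frames, start = 0, time.time()
while time.time() - start < 10:  # sample for roughly ten seconds
    ok, _frame = cap.read()
    if not ok:
        break
    frames += 1
cap.release()

measured_fps = frames / max(time.time() - start, 1e-6)
print(f"Measured FPS over the sample window: {measured_fps:.1f}")
```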

Posted Date not available

Apply