4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
You will be responsible for building and integrating systems using Node.js and AWS. You should be fluent in working with REST APIs and SOAP web services and able to orchestrate workflows in Node.js. Good knowledge of server and security setups (AWS EC2/S3) as well as DNS setups (AWS Route 53) is important. A solid understanding of and experience with Enterprise Integration Patterns, B2B, and EAI integration will be crucial for this role. You should have at least 4 years of experience with integration methods and technologies, including web services, SOAP, JSON, REST, APIs, XML, and orchestration tools. Knowledge of MuleSoft, TIBCO, or other integration tools will be advantageous. A sound understanding of ERP concepts is required, along with good written and oral communication skills and the ability to handle integrations independently. If you find yourself suitable and interested in this position, please send your updated profile to hr@altostratussolutions.com.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
The position of Architect Cloud & API Platform in the Insurance Domain is open for applications in Bangalore. The selected candidate will lead the architecture and development of a scalable, cloud-native platform for insurance agents, focusing on modernizing service delivery and enhancing business performance, particularly through the design of PruForce, an agent productivity solution.
Key Responsibilities:
- Design and implement a secure, scalable, cloud-native platform using Kubernetes, Docker, and Spring Boot (Kotlin/Java).
- Lead API architecture and schema design, utilizing OpenAPI and JSON Schema.
- Develop event-driven integrations with technologies like Kafka, RabbitMQ, Pub/Sub, or Artemis.
- Collaborate on Salesforce FSC and core system integrations.
- Document and update solution architectures, roadmaps, and best practices.
- Work closely with cross-functional teams to ensure high availability and performance in cloud deployments.
Required Qualifications:
- Minimum 8 years of experience as a technical architect or senior engineer.
- Proficiency in Spring Boot (Java/Kotlin), API-first design, and OpenAPI specifications.
- Strong background in cloud-native architecture, microservices, and container orchestration.
- Expertise in relational and NoSQL databases, data modeling, and optimization.
- Hands-on experience with event-driven architecture and pub/sub messaging technologies like Kafka, RabbitMQ, and Pub/Sub.
- Excellent communication skills and a strong sense of solution ownership.
Preferred Qualifications:
- Familiarity with DevOps pipelines, CI/CD practices, and cloud storage solutions such as AWS S3, GCP, and Azure.
- Experience with Agile methodologies (Scrum/Kanban) and the C4 model for architectural documentation.
- Knowledge of the Insurance and Financial Services domain.
If you are ready to shape the future of insurance technology, please submit your resume to hr@vikhyatainfotech.net.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
You should have experience with data transfer tools and methods, including the ability to create and process change requests for new and existing clients. This includes expertise in Globalscape, NDM (including Secure+), AWS S3/AWS CLI, GCP, Azure Blob, and WinSCP/PuTTY. Additionally, you should be proficient in certificate management and application license management. Experience with server patching, maintenance, vulnerability remediation, and server monitoring is essential, as is familiarity with system diagnostic tools and maintenance reports such as Rapid7 and Brinqa. You should also possess expertise in file server management, Active Directory, and DNS management. Extensive knowledge of AWS services is a must, including VPC, EC2, EMR, S3, Fargate, load balancers, EFS, EBS, and AWS WorkSpaces. You should be able to install and configure software according to organizational guidelines and plans, including system configuration and default user settings. Managing server access requests, system accounts, password management, instance/EBS snapshots, server decommissions, and change control processes will be part of your responsibilities. You should also have experience in setting up and configuring user tools such as DBeaver, Excel macro functionality, and VEDIT, and troubleshooting any related issues. This position was posted by Hymavati Sarojini from Softility.
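As a rough illustration of the S3 transfer automation this role describes, here is a minimal Python sketch using boto3; the bucket, key, and file names are hypothetical, and a real transfer job would add retries, checksums, and logging.

```python
import os
import boto3  # AWS SDK for Python

def upload_and_verify(local_path: str, bucket: str, key: str) -> bool:
    """Upload a client file to S3 and confirm the stored size matches the source."""
    s3 = boto3.client("s3")
    s3.upload_file(local_path, bucket, key)
    head = s3.head_object(Bucket=bucket, Key=key)  # raises if the object is missing
    return head["ContentLength"] == os.path.getsize(local_path)

if __name__ == "__main__":
    ok = upload_and_verify("report.csv", "client-intake-bucket", "incoming/report.csv")
    print("transfer verified" if ok else "size mismatch")
```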
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
Pune, Maharashtra
On-site
You should have over 10 years of experience for a lead position, with a strong background in software design, development, and architecture using the Microsoft stack. You should have proven leadership skills in managing cross-functional technical teams and delivering enterprise-grade solutions. Your expertise should include hands-on experience with C# (.NET/.NET Core), SQL Server, and Entity Framework, along with good design skills in data structures, algorithms, design patterns, and OOP concepts. As a lead, you will be responsible for conducting code reviews and providing quality feedback to team members. Excellent communication, strategic thinking, and problem-solving abilities are essential for this role. Experience with cloud-native AWS services such as Lambda, SQS, RDS/Aurora, S3, Lex, and Polly is a plus. In this role, you will define and implement technical strategies to achieve the India Development Center's goals. You will align technological initiatives with business objectives to deliver scalable and innovative solutions. Additionally, you will build, lead, and mentor a high-performing technical team while fostering a collaborative environment that promotes knowledge sharing and continuous learning. You will oversee the design, architecture, and implementation of complex software solutions across multiple practices. Timely delivery of high-quality products and services, with a focus on performance, scalability, and security, will be a key responsibility. Proficiency in frameworks like ASP.NET, .NET Core, and Blazor for front-end development, as well as a deep understanding of C#, Entity Framework, Web API, and other Microsoft technologies for back-end development, is required. Experience with SQL Server for database management, optimization, and integration is necessary. Expertise in major AWS services, as well as hands-on experience in automating builds, testing, and deployments for seamless software delivery pipelines, will be beneficial in this role. In return, we offer you the opportunity to lead transformative technical initiatives in a global market. You will work in a supportive environment that encourages innovation, growth, and career development. We provide a competitive compensation and benefits package to our team members.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Punjab
On-site
As a Node.js Team Lead at Apptunix, you will be responsible for utilizing your deep experience with Node.js to develop cutting-edge software solutions. Your role will involve understanding SQL and NoSQL database systems, particularly MongoDB, and implementing them effectively in projects. You will be expected to have a solid grasp of MVC architecture and stateless APIs, along with building RESTful APIs to ensure seamless communication between front-end and back-end systems. Your expertise in scaling and security considerations will be crucial in ensuring the robustness of the applications you work on. Integration of user-facing elements with server-side logic, using technologies such as Express.js, MongoDB, AWS S3, and ES6, will be a key part of your day-to-day tasks. You will be responsible for writing reusable, testable, and efficient code, as well as designing and implementing low-latency, high-availability, high-performance applications. In addition, you will play a pivotal role in implementing security measures and data protection protocols to safeguard the integrity of the applications. Your experience in integrating data storage solutions and structuring databases will be essential in delivering high-quality solutions that meet the needs of our clients. If you are passionate about leading a team of experts and driving innovation in the field of Node.js development, we encourage you to apply for this exciting opportunity at Apptunix. Join us in our mission to empower startups and enterprise businesses with technology solutions that drive incremental growth.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Punjab
On-site
We are seeking an experienced Node.js developer with a strong background in leading and guiding team members while also taking ownership of the technical direction. Although the position does not entail a formal Team Lead role, we highly value candidates who exhibit initiative and have prior experience mentoring and supervising team members. As a Node.js developer, your responsibilities will include providing technical leadership by mentoring junior developers, offering technical assistance, and conducting code reviews. You will be responsible for overseeing development workflows and project timelines to ensure the team consistently delivers high-quality work within the set schedule. Your expertise in Node.js is crucial, as you will be expected to leverage your deep understanding of the technology to develop scalable applications. It will be your responsibility to uphold code quality and implement best practices across the team. Additionally, a solid grasp of both SQL and NoSQL databases, such as MongoDB, is essential for optimizing database performance, integrating data storage solutions, and designing efficient database structures. Experience with microservices architecture, including tools like Redis, RabbitMQ, and Kafka, is highly beneficial. You will lead initiatives focused on building scalable and resilient systems, requiring a strong command of API development, particularly RESTful and stateless APIs. Your role will involve designing and implementing APIs that seamlessly integrate with front-end applications. Ensuring application security, scalability, and performance is paramount. You will guide the team in adhering to best practices for low-latency and high-availability designs. Collaborating with front-end developers to merge user-facing elements with server-side logic is essential, as is engaging with cross-functional teams to facilitate smooth collaboration among all stakeholders. Promoting coding best practices, such as writing reusable, testable, and efficient code, is key. You will advocate for clean code principles and contribute to the ongoing enhancement of development processes and practices. Additionally, your experience in managing team dynamics, fostering productive collaboration, and cultivating a supportive work environment will be invaluable. Requirements for this role include extensive hands-on experience with Node.js and frameworks like Express.js, a strong understanding of MongoDB and expertise in both NoSQL and SQL databases, knowledge of microservices technologies such as Redis, RabbitMQ, and Kafka, familiarity with cloud platforms like AWS S3, and proficiency in designing secure, high-availability applications with low-latency performance. If you possess a solid track record of building RESTful APIs and integrating front-end technologies with back-end services, along with the ability to design, implement, and maintain secure applications, we encourage you to apply for this role.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Full Stack Data Engineer Lead Analyst at Evernorth, you will be a key player in the Data & Analytics Engineering organization of Cigna, a leading health services company. Your role will involve delivering business needs by understanding requirements and deploying software into production. To excel in this position, you should be well-versed in critical technologies, eager to learn, and committed to adding value to the business. Ownership, a thirst for knowledge, and an open mindset are essential attributes for a successful Full Stack Engineer. In addition to delivery responsibilities, you will be expected to embrace an automation-first and continuous-improvement mindset. You will drive the adoption of CI/CD tools and support the enhancement of toolsets and processes. Your ability to articulate clear business objectives aligned with technical specifications and to work in an iterative, agile manner will be crucial. Taking ownership and being accountable, writing referenceable and modular code, and ensuring data quality are key behaviors expected from you.
Key Characteristics:
- Independently design and architect solutions
- Demonstrate ownership and accountability
- Write referenceable and modular code
- Possess fluency in specific areas and proficiency in multiple areas
- Exhibit a passion for continuous learning
- Maintain a quality mindset to ensure data quality and business impact assessment
Required Skills:
- Experience in developing data integration and ingestion strategies, including the Snowflake cloud data warehouse, AWS S3 buckets, and loading nested JSON-formatted data (see the sketch after this listing)
- Strong understanding of Snowflake cloud database architecture
- Proficiency in big data technologies like Databricks, Hadoop, HiveQL, and Spark (Scala/Python), and cloud technologies such as AWS (S3, Glue, Terraform, Lambda, Aurora, Redshift, EMR)
- Experience working on analytical models and enabling their deployment and production via data and analytical pipelines
- Expertise in query tuning and performance improvement
- Previous exposure to an onsite/offshore setup or model
Required Experience & Education:
- 8+ years of professional industry experience
- Bachelor's degree (or equivalent)
- 5+ years of Python scripting experience
- 5+ years of data management and SQL expertise in Teradata & Snowflake
- 3+ years of Agile team experience, preferably with Scrum
Desired Experience:
- Familiarity with version management tools, with Git being preferred
- Exposure to BDD and TDD development methodologies
- Experience in an agile CI/CD environment; Jenkins experience is preferred
- Knowledge of health care information domains is advantageous
Location & Hours of Work:
- (Specify whether the position is remote, hybrid, in-office, and where the role is located as well as the required hours of work)
Evernorth is committed to being an Equal Opportunity Employer, actively promoting and supporting diversity, equity, and inclusion efforts throughout the organization. Staff are encouraged to participate in these initiatives to enhance internal practices and external collaborations with diverse client populations.
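For illustration of the nested-JSON loading requirement flagged above, here is a hypothetical Python sketch using the Snowflake connector to copy JSON from an S3-backed external stage into a VARIANT column and flatten it; all account, stage, table, and field names are invented.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical connection parameters and object names.
conn = snowflake.connector.connect(
    user="ETL_USER", password="...", account="my_account",
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# raw_events is assumed to have a single VARIANT column named payload; COPY
# loads each JSON document from the S3-backed external stage as one row.
cur.execute("""
    COPY INTO raw_events
    FROM @s3_landing_stage/events/
    FILE_FORMAT = (TYPE = 'JSON')
""")

# Flatten one level of nesting into a queryable view.
cur.execute("""
    CREATE OR REPLACE VIEW clean_events AS
    SELECT payload:eventId::string   AS event_id,
           payload:member.id::string AS member_id,
           f.value:code::string      AS diagnosis_code
    FROM raw_events,
         LATERAL FLATTEN(input => payload:diagnoses) f
""")
```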
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Applied Materials, Inc. is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. Our expertise in modifying materials at atomic levels and on an industrial scale enables customers to transform possibilities into reality. At Applied Materials, our innovations make possible the technology shaping the future. Applied's AIx Products group is searching for front-end developers to join our team. AIx (Actionable Insight Accelerator) is an ML/AI data analytics platform that enables development and deployment of new chip technologies. AIx allows engineers to innovate and optimize semiconductor processes in real time and control thousands of variables to improve semiconductor performance, power, area-cost, and time to market (PPACt).
The ideal candidate should possess the following essential skill set:
- Experience in AWS DevOps.
- Experience deploying applications in Kubernetes with YAML using kubectl and Helm.
- Experience setting up network ingress and load balancing in AWS and Kubernetes.
- Experience configuring AWS S3 object storage.
- 2 to 6 years of experience.
- Location: Chennai
Qualifications:
Education: High School Diploma/GED
Years of Experience: 4 - 7 Years
Time Type: Full time
Employee Type: Assignee / Regular
Relocation Eligible: No
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Haryana
On-site
You should have 8-10 years of operational knowledge in microservices and the .NET full stack, with experience in C# or Python development as well as Docker. Additionally, experience with PostgreSQL or Oracle is required. Knowledge of AWS services such as S3 is necessary, and familiarity with AWS Kinesis and AWS Redshift is preferred. A strong desire to learn new technologies and skills is highly valued. Experience with unit testing and the Test-Driven Development (TDD) methodology is considered an asset. You should possess strong team spirit, analytical skills, and the ability to synthesize information. A passion for software craftsmanship, a culture of excellence, and writing clean code is important. Fluency in English is required due to the multicultural and international nature of the team. In this role, you will have the opportunity to develop your technical skills in C# .NET and/or Python, Oracle, PostgreSQL, AWS, ELK (Elasticsearch, Logstash, Kibana), Git, GitHub, TeamCity, Docker, and Ansible.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
Agivant is seeking a talented and passionate Senior Data Engineer to join our growing data team. In this role, you will play a key part in building and scaling our data infrastructure, enabling data-driven decision-making across the organization. You will be responsible for designing, developing, and maintaining efficient and reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes.
Responsibilities:
- Design, develop, and maintain robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness.
- Work with stakeholders to understand data requirements and translate them into efficient data models and pipelines.
- Build and optimize data pipelines using a variety of technologies, including Elasticsearch, AWS S3, Snowflake, and NFS.
- Develop and maintain data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs.
- Implement data quality checks and monitoring to ensure data integrity and identify potential issues.
- Collaborate with data scientists and analysts to ensure data accessibility and usability for various analytical purposes.
- Stay current with industry best practices, CI/CD/DevSecFinOps, Scrum, and emerging technologies in data engineering.
- Contribute to the development and enhancement of our data warehouse architecture.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience as a Data Engineer with a strong focus on ELT/ETL processes.
- At least 3 years of experience with Snowflake data warehousing technologies.
- At least 3 years of experience creating and maintaining Airflow ETL pipelines.
- Minimum 3 years of professional experience with Python for data manipulation and automation.
- Working experience with Elasticsearch and its application in data pipelines.
- Proficiency in SQL and experience with data modeling techniques.
- Strong understanding of cloud-based data storage solutions such as AWS S3.
- Experience working with NFS and other file storage systems.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
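As a sketch of the kind of Airflow ELT pipeline this listing asks for, the following hypothetical DAG wires an extract task to a load task; the DAG id, schedule, and task bodies are placeholders, and the `schedule` argument assumes Airflow 2.4+ (earlier versions use `schedule_interval`).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_s3():
    """Pull data from the source system and land it in the raw S3 zone (placeholder)."""
    pass

def load_to_snowflake():
    """Load the landed files into Snowflake staging tables (placeholder)."""
    pass

with DAG(
    dag_id="elt_daily_sales",      # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
    extract >> load  # load runs only after a successful extract
```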
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Haryana
On-site
You should have 8-10 years of operational knowledge in microservices and the .NET full stack, C# or Python development, along with experience in Docker. Additionally, experience with PostgreSQL or Oracle is required. Knowledge of AWS services such as S3 is a must, and familiarity with AWS Kinesis and AWS Redshift is desirable. A genuine interest in mastering new technologies is essential for this role. Experience with unit testing and the Test-Driven Development (TDD) methodology will be considered an asset. Strong team spirit, analytical skills, and the ability to synthesize information are key qualities we are looking for. A passion for software craftsmanship, a culture of excellence, and writing clean code is highly valued. Being fluent in English is important, as you will be working in a multicultural and international team. In this role, you will have the opportunity to develop your technical skills in the following areas: C# .NET and/or Python programming, Oracle and PostgreSQL databases, AWS services, the ELK (Elasticsearch, Logstash, Kibana) stack, version control with Git and GitHub, continuous integration with TeamCity, containerization with Docker, and automation using Ansible.
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
Delhi
On-site
As a CBRE Software Senior Engineer, you will work under broad direction to supervise, develop, maintain, and enhance client systems. This role is part of the Software Engineering job function and requires successfully executing and monitoring system improvements to increase efficiency.
Responsibilities:
- Develop, maintain, enhance, and test client systems of moderate to high complexity.
- Execute the full software development life cycle (SDLC) to build high-quality, innovative, and performant software.
- Conduct thorough code reviews to ensure high-quality code.
- Estimate technical effort for agile sprint stories.
- Implement performance-optimized solutions and improve the performance of existing systems.
- Serve as the primary technical point of contact on client engagements.
- Investigate and resolve complex data system and software issues in the production environment.
- Design and implement strategic partner integrations.
- Participate in the specification and design of new features at client or business request.
- Evaluate new platforms, tools, and technologies.
- Coach others to develop in-depth knowledge and expertise in most or all areas within the function.
- Provide informal assistance such as technical guidance, code review, and training to coworkers.
- Apply advanced knowledge to seek and develop new, better methods for accomplishing individual and department objectives.
- Showcase expertise in your job discipline and in-depth knowledge of other job disciplines within the organization function.
- Lead by example and model behaviors consistent with CBRE RISE values.
- Anticipate potential objections and persuade others, often at senior levels and of divergent interests, to adopt a different point of view.
- Impact the achievement of customer, operational, project, or service objectives across multidiscipline teams.
- Contribute to new products, processes, standards, and/or operational plans in support of achieving functional goals.
- Communicate difficult and complex ideas with the ability to influence.
Qualifications:
- Bachelor's degree preferred, with 7-9 years of relevant experience. In lieu of a degree, a combination of experience and education will be considered.
- Knowledge of Java, Spring Boot, Vue.js, unit testing, AWS services (ECS, Fargate, Lambda, RDS, S3, Step Functions), Bootstrap/CSS/CSS3, Docker, DynamoDB, JavaScript/jQuery, microservices, SNS, and SQS.
- Optional knowledge of .NET, Python, Angular, SQL Server, AppDynamics, and New Relic.
- Innovative mentality to develop methods that go beyond existing solutions.
- Ability to solve unique problems using standard and innovative solutions with a broad impact on the business.
- Expert organizational skills with an advanced inquisitive mindset.
Required Skills:
- Angular
- AWS API Gateway
- AWS CloudFormation
- AWS Lambda
- AWS RDS
- AWS S3
- AWS Step Functions
- Bootstrap/CSS/CSS3
- Docker
- DynamoDB
- Java
- JavaScript/jQuery
- Microservices
- SNS
- Spring Boot
- SQS
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
The ideal candidate for this role should have strong skills in AWS EMR, EC2, AWS S3, CloudFormation templates, batch data, and AWS CodePipeline services. Experience with EKS would be an added advantage. As this is a hands-on role, the candidate is expected to have good administrative knowledge of AWS EMR, EC2, AWS S3, CloudFormation templates, and batch data. Responsibilities include managing and deploying EMR clusters, with a solid understanding of AWS accounts and IAM. The candidate should also have experience with administrative tasks for both EMR persistent and transient clusters. A good understanding of AWS CloudFormation, cluster setup, and AWS networking is essential. Hands-on experience with infrastructure-as-code deployment tools like Terraform is highly desirable. Additionally, experience in AWS health monitoring and optimization is required. Knowledge of Hadoop and Big Data will be considered an added advantage for this position.
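To illustrate the transient-cluster administration mentioned above, here is a hedged boto3 sketch that launches a self-terminating EMR cluster with a single Spark step; the cluster name, instance types, script path, and IAM role names are hypothetical.

```python
import boto3

emr = boto3.client("emr", region_name="us-east-1")

# Launch a transient cluster that runs one Spark step and then terminates itself.
response = emr.run_job_flow(
    Name="nightly-batch-cluster",
    ReleaseLabel="emr-6.15.0",
    Applications=[{"Name": "Spark"}],
    Instances={
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        # Transient behavior: shut the cluster down once all steps finish.
        "KeepJobFlowAliveWhenNoSteps": False,
    },
    Steps=[{
        "Name": "spark-etl",
        "ActionOnFailure": "TERMINATE_CLUSTER",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "s3://my-bucket/jobs/etl.py"],
        },
    }],
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("launched cluster:", response["JobFlowId"])
```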
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
You are a Senior Cloud Application Developer (AWS to Azure Migration) with 8+ years of experience. Your role involves hands-on development of applications for both the AWS and Azure platforms. You should have a strong understanding of Azure services for application development and deployment, including Azure IaaS and PaaS services. Your responsibilities include AWS to Azure cloud migration, covering service mapping and SDK/API conversion, along with code refactoring and application remediation for cloud compatibility. You should have a minimum of 5 years of experience in application development using Java, Python, Node.js, or .NET. Additionally, you must possess a solid understanding of CI/CD pipelines, deployment automation, and Azure DevOps. Experience with containerized applications, AKS, Kubernetes, and Helm charts is also necessary. Your role will involve application troubleshooting, support, and testing in cloud environments. Experience with the following tech stack is highly preferred (see the SDK conversion sketch after this listing):
- Spring Boot REST API, Node.js REST API
- Apigee config, Spring Server Config
- Confluent Kafka, AWS S3 Sync Connector
- Azure Blob Storage, Azure Files, Azure Functions
- Aurora PostgreSQL to Azure DB migration
- EKS to AKS migration, S3 to Azure Blob Storage
- AWS to Azure SDK conversion
Location options for this role include Hyderabad, Bangalore, or Pune. You should have a notice period of 10-15 days.
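As a minimal example of the AWS-to-Azure SDK conversion called out in the tech stack, the sketch below reads an object with boto3 and rewrites it with the azure-storage-blob client; the names and connection string are placeholders, and a real migration would stream large objects rather than buffering them in memory.

```python
import boto3
from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

def copy_s3_object_to_blob(bucket: str, key: str, conn_str: str, container: str) -> None:
    """Read one object from S3 and write it to Azure Blob Storage under the same key."""
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

    blob_service = BlobServiceClient.from_connection_string(conn_str)
    blob = blob_service.get_blob_client(container=container, blob=key)
    blob.upload_blob(body, overwrite=True)

# Hypothetical bucket, key, and container names.
copy_s3_object_to_blob("legacy-bucket", "exports/data.parquet",
                       "<azure-connection-string>", "migrated-data")
```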
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
As a Senior Machine Learning Engineer Contractor specializing in AWS ML Pipelines, your primary responsibility will be to design, develop, and deploy advanced ML pipelines within an AWS environment. You will work on cutting-edge solutions that automate entity matching for master data management, implement fraud detection systems, handle transaction matching, and integrate GenAI capabilities. The ideal candidate for this role should possess extensive hands-on experience in AWS services like SageMaker, Bedrock, Lambda, Step Functions, and S3. Moreover, you should have a strong command over CI/CD practices to ensure a robust and scalable solution. Your key responsibilities will include designing and developing end-to-end ML pipelines focusing on entity matching, fraud detection, and transaction matching. You will be integrating generative AI solutions using AWS Bedrock to enhance data processing and decision-making. Collaboration with cross-functional teams to refine business requirements and develop data-driven solutions tailored to master data management needs will also be a crucial aspect of your role. In terms of AWS ecosystem expertise, you will be required to utilize SageMaker for model training, deployment, and continuous improvement. Additionally, leveraging Lambda and Step Functions to orchestrate serverless workflows for data ingestion, preprocessing, and real-time processing will be part of your daily tasks. Managing data storage, retrieval, and scalability concerns using AWS S3 will also be within your purview. Furthermore, you will need to develop and integrate automated CI/CD pipelines to streamline model testing, deployment, and version control. Ensuring rapid iteration and robust deployment practices to maintain high availability and performance of ML solutions will be essential. Data security and compliance will be a critical aspect of your role. You will need to implement security best practices to safeguard sensitive data, ensuring compliance with organizational and regulatory requirements. Incorporating monitoring and alerting mechanisms to maintain the integrity and performance of deployed ML models will be part of your responsibilities. Collaboration and documentation will also play a significant role in your day-to-day activities. Working closely with business stakeholders, data engineers, and data scientists to ensure solutions align with evolving business needs will be crucial. You will also need to document all technical designs, workflows, and deployment processes to support ongoing maintenance and future enhancements. Providing regular progress updates and adapting to changing priorities or business requirements in a dynamic environment are expected. To qualify for this role, you should have at least 5+ years of professional experience in developing and deploying ML models and pipelines. Proven expertise in AWS services including SageMaker, Bedrock, Lambda, Step Functions, and S3 is necessary. Strong proficiency in Python and/or PySpark, demonstrated experience with CI/CD tools and methodologies, and practical experience in building solutions for entity matching, fraud detection, and transaction matching within a master data management context are also required. Familiarity with generative AI models and their application within data processing workflows will be an added advantage. Strong analytical and problem-solving skills are essential for this role. 
You should be able to transform complex business requirements into scalable technical solutions and possess strong data analysis capabilities, with a track record of developing models that provide actionable insights. Excellent verbal and written communication skills, the ability to work independently as a contractor while effectively collaborating with remote teams, and a proven record of quickly adapting to new technologies and agile work environments are also preferred qualities for this position. A Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field is a plus. Experience with additional AWS services such as Kinesis, Firehose, and SQS; prior experience in a consulting or contracting role demonstrating the ability to manage deliverables under tight deadlines; and experience within industries where data security and compliance are critical will be advantageous.
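For a flavor of the AWS Bedrock integration this role centers on, here is a hedged Python sketch invoking an Anthropic model through the bedrock-runtime client; the model ID, prompt, and region are illustrative only, and the request body format differs across model families.

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Hypothetical entity-matching prompt; real pipelines would template in record fields.
body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [{"role": "user",
                  "content": "Do these two customer records refer to the same entity? ..."}],
})

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # illustrative model ID
    body=body,
)
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```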
Posted 1 month ago
6.0 - 8.0 years
11 - 13 Lacs
Hyderabad, Gurugram, Bengaluru
Work from Office
iSource Services is hiring for one of their clients for the position of Java Developer.
About the role: We are looking for a skilled Java Developer with strong expertise in Spring Boot, Microservices, and AWS to join our growing team. The ideal candidate must have a proven track record of delivering scalable backend solutions and a minimum of 4 years of hands-on experience with AWS services.
Key Responsibilities:
- Develop and maintain high-performance Java applications using Spring Boot and a Microservices architecture
- Integrate with AWS services including Lambda, DynamoDB, SQS, SNS, S3, ECS, and EC2
- Work with event-driven architecture using Kafka
- Collaborate with cross-functional teams to define, design, and ship new features
- Ensure the performance, quality, and responsiveness of applications
Required Skills:
- Strong proficiency in Java (8+), Spring Boot, and Microservices
- Minimum 4 years of hands-on experience with AWS (Lambda, DynamoDB, SQS, SNS, S3, ECS, EC2)
- Experience with Kafka for real-time data streaming
- Solid understanding of system design, data structures, and algorithms
- Excellent problem-solving and communication skills.
Posted 1 month ago
3.0 - 8.0 years
0 Lacs
kochi, kerala
On-site
As a Tech Lead Full Stack at Qubryx, a US-based product consulting and development company, you will play a crucial role in leading the development, implementation, and maintenance of software solutions and applications for both client and company web-based products. This is a full-time remote role with the opportunity to work on designing and developing user interfaces, testing, and debugging code. While the role is primarily located in Kochi, there is flexibility for remote work as well. To be considered for this position, you should have at least 8 years of experience in full-cycle software development projects and a minimum of 3 years of experience as a Tech Lead. You should have a proven track record of designing and developing software applications from scratch. Proficiency in JavaScript, TypeScript, and Node.js, and strong skills in NoSQL (MongoDB) design and querying, are essential for this role. Additionally, you should possess strong SQL skills and experience working with SQL Server and Postgres. Experience with AWS Lambda, S3, RDS, and API Gateway, as well as familiarity with front-end UI frameworks such as React and React Native, is highly desirable. Knowledge of Scrum methodologies, sprint planning, project planning, estimation, and product feature management is crucial for success in this role. You should have experience managing teams of developers, providing technical guidance, and fostering a collaborative team environment. Preferred qualifications include AWS certifications; experience with Docker, containers, Kubernetes, and microservices; and proficiency in Python with past Java or .NET experience. Experience with serverless coding on AWS Lambda or Azure Functions, Azure DevOps, and working in Scrum teams is advantageous. A deep understanding of DevOps and SRE principles, along with experience implementing DevOps best practices, is also preferred. As a Tech Lead Full Stack, you should be a self-starter with excellent problem-solving skills and strong verbal and written communication abilities. You should be comfortable working independently as well as collaborating closely with other team members, both offshore and onsite. You should have the ability to code new features, troubleshoot problems, and identify areas for improvement. If you are a highly motivated individual with a passion for software development and a willingness to learn and grow with the team, we encourage you to apply for this exciting opportunity. Join us at Qubryx and be part of a dynamic team that values innovation, collaboration, and continuous improvement. Benefits include competitive compensation and the opportunity to work on cutting-edge projects with a talented team. To qualify for this role, you should have a bachelor's degree and a minimum of 8 years of experience in relevant technologies.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Full-Stack Developer with 5+ years of experience in the MERN stack, you will be responsible for backend development using Node.js, Express.js, and AWS Lambda. Your strong hands-on experience with MongoDB, AWS Neptune, Redis, and other databases will be essential for the successful execution of projects. Additionally, your expertise in front-end development using React.js, HTML, CSS, and JavaScript (ES6+) will play a crucial role in delivering high-quality user interfaces. Your familiarity with AWS services such as Lambda, API Gateway, S3, CloudFront, IAM, and DynamoDB will be advantageous for integrating and deploying applications effectively. Experience with DevOps tools like GitHub Actions, Jenkins, and AWS CodePipeline will be required to streamline the development process. Proficiency in Git-based workflows and hands-on experience with Agile methodologies and tools like JIRA will be necessary for collaborative and efficient project management. On the technical side, you should possess expertise in React.js with Redux, Context API, or Recoil, along with HTML5, CSS3, JavaScript (ES6+), and TypeScript. Knowledge of Material UI, Tailwind CSS, Bootstrap, and performance optimization techniques will be crucial for creating responsive and visually appealing web applications. Your proficiency in Node.js and Express.js, AWS Lambda, RESTful APIs and GraphQL, and authentication and authorization mechanisms like JWT, OAuth, and AWS Cognito will be key for building robust server-side applications. Moreover, your familiarity with microservices, event-driven architecture, MongoDB and Mongoose, AWS Neptune, Redis, and AWS S3 for object storage will be essential for developing scalable and efficient applications. An understanding of cloud and DevOps concepts such as AWS services, Infrastructure as Code (IaC), CI/CD pipelines, and monitoring and logging tools will be necessary for deploying and maintaining applications in a cloud environment. Your soft skills, including strong problem-solving abilities, excellent communication skills, attention to detail, and the ability to mentor junior developers, will be crucial for collaborating with cross-functional teams and providing technical guidance. Your adaptability to learn and work with new technologies in a fast-paced environment will be essential for staying updated and delivering innovative solutions effectively.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Engineer, you will be responsible for developing and maintaining a metadata-driven generic ETL framework to automate ETL code. Your primary tasks will include designing, building, and optimizing ETL/ELT pipelines using Databricks (PySpark/SQL) on AWS. You will be required to ingest data from a variety of structured and unstructured sources such as APIs, RDBMS, flat files, and streaming services. In this role, you will also develop and maintain robust data pipelines for both batch and streaming data utilizing Delta Lake and Spark Structured Streaming. Implementing data quality checks, validations, and logging mechanisms will be essential to ensure data accuracy and reliability. You will work on optimizing pipeline performance, cost, and reliability, and collaborate closely with data analysts, BI teams, and business stakeholders to deliver high-quality datasets. Additionally, you will support data modeling efforts, including star and snowflake schemas and denormalization approaches, and assist in data warehousing initiatives. Your responsibilities will also involve working with orchestration tools like Databricks Workflows to schedule and monitor pipelines effectively. To excel in this role, you should have hands-on experience in ETL/data engineering roles and possess strong expertise in Databricks (PySpark, SQL, Delta Lake). Experience with Spark optimization, partitioning, caching, and handling large-scale datasets is crucial. Proficiency in SQL and scripting in Python or Scala is required, along with a solid understanding of data lakehouse/medallion architectures and modern data platforms. Knowledge of cloud storage systems like AWS S3, familiarity with DevOps practices (Git, CI/CD, Terraform, etc.), and strong debugging, troubleshooting, and performance-tuning skills are also essential for this position. Following best practices for version control, CI/CD, and collaborative development will be a key part of your responsibilities. If you are passionate about data engineering, enjoy working with cutting-edge technologies, and thrive in a collaborative environment, this role offers an exciting opportunity to contribute to the success of data-driven initiatives within the organization.
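As one possible shape for the Databricks ingestion pipelines described above, here is a hypothetical PySpark sketch that reads JSON from a landing zone, applies a simple quality gate, and appends to a partitioned Delta table; the paths and column names are invented.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

# Batch ingest from an S3 landing zone with a basic data-quality gate.
raw = spark.read.json("s3://landing/orders/")
valid = raw.filter(F.col("order_id").isNotNull())
invalid_count = raw.count() - valid.count()
if invalid_count > 0:
    print(f"quarantined {invalid_count} rows missing order_id")

# Append to a Delta table partitioned by ingest date for cheap time-based pruning.
(valid.withColumn("ingest_date", F.current_date())
      .write.format("delta")
      .mode("append")
      .partitionBy("ingest_date")
      .save("s3://lake/bronze/orders"))
```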
Posted 1 month ago
6.0 - 8.0 years
17 - 18 Lacs
Hyderabad
Work from Office
Position: Software Developer (Angular, React, Vue.js, Python, Django, Ruby, Lambda, SQS, S3)
Qualifications:
- 5+ years of software development and 3+ years of Python development experience
- 1+ years of Ruby experience preferred
- 3+ years of experience with web frameworks (preferred: Rails or Rack, Django)
- 1+ years of Angular, React, or Vue.js
- Demonstrated experience with AWS services (preferred: Lambda, SQS, S3)
- Experience working in a software-product-driven environment
- Demonstrable knowledge of front-end technologies such as JavaScript, HTML5, CSS3
- Working knowledge of relational databases (e.g., MySQL, Postgres)
- BS/MS degree in Computer Science or equivalent experience
- Knowledge of version control, such as Git
- Familiarity with Docker (containerized environments)
- Knowledge of testing libraries (ideally rspec, pytest, jest)
- Experience with linters (e.g., RuboCop, Flake8, ESLint)
Posted 1 month ago
3.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
You should have strong experience in PySpark, Python, Unix scripting, Spark SQL, and Hive. You must be proficient in writing SQL queries and creating views, and possess excellent oral and written communication skills. Prior experience in the insurance domain would be beneficial. A good understanding of the Hadoop ecosystem, including HDFS, MapReduce, Pig, Hive, Oozie, and YARN, is required. Knowledge of AWS services such as Glue, AWS S3, Lambda functions, Step Functions, and EC2 is essential. Experience with data migration from platforms like Hive/S3 to Databricks is a plus. You should be able to prioritize, plan, organize, and manage multiple tasks efficiently while delivering high-quality work. As a candidate, you should have 6-8 years of technical experience in PySpark and AWS (Glue, EMR, Lambda, Step Functions, S3), with at least 3 years of experience in Big Data/ETL using Python, Spark, and Hive, along with 3+ years of experience in AWS. Your primary key skills should include PySpark, AWS (Glue, EMR, Lambda, Step Functions, S3), and Big Data with Python, Spark, and Hive experience. Exposure to Big Data migration is also important. Secondary skills that would be beneficial for this role include Informatica BDM/PowerCenter, Databricks, and MongoDB.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
You are a solid hands-on engineer in the video algorithm domain with expertise in developing video compression algorithms for cloud and mobile applications. Your role involves developing video software algorithms using various codecs, such as H.264, for applications including mobile video sharing, cloud-based video encoding, and optimizing video delivery in the broadcast and surveillance domains. As a developer in this role, you will be part of a core video team dedicated to enhancing user experience and reducing video delivery costs. You should have a solid understanding of video compression fundamentals and practical experience with codecs like H.264, H.265, AV1, and VVC. Knowledge of media codec frameworks on the Android and iOS platforms is essential, along with strong programming skills in C/C++ on Linux. Experience in the video streaming domain and familiarity with protocols such as HTTP, RTP, RTSP, and WebRTC are necessary. Additionally, you should have a thorough understanding of HLS, MPEG-DASH, MP4, fMP4, and MOV file formats. Desirable experience includes working with operating systems like Linux, iOS, and Android; media frameworks such as the Android MediaCodec framework and iOS VideoToolbox; and source control tools like Git. Proficiency in open-source media frameworks like FFmpeg and GStreamer; video filter, scaling, denoising, and blending algorithms; and machine learning techniques for video compression is highly valued. An understanding of OS internals such as I/O, networking, and multi-threading is also important. Your specific responsibilities will include developing video compression SDKs for mobile devices, addressing challenges related to video processing, developing new video algorithms using the latest codecs, and improving video content quality and efficiency. You will collaborate with cross-functional teams locally and globally, maintain and extend software components for customer deployments, and work in a fast-paced development environment following the SDLC. To excel in this role, you must be well-organized, willing to take on development challenges, and eager to learn new video technologies. You should have at least 8 years of experience in video compression, knowledge of media frameworks for iOS and Android, and familiarity with tools like GStreamer and FFmpeg. Experience with codecs like H.265 and VP9, building SDKs, AWS S3, Agile methodologies, and video stream analysis tools is beneficial. A Master's degree in Computer Science or Engineering is preferred. If you meet these requirements and are ready to contribute to a dynamic engineering environment focused on advancing video technology, please send your CV to careers@crunchmediaworks.com.
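Among the FFmpeg skills this role lists, a basic H.264 transcode can be driven from Python as in the hedged sketch below; the file names and CRF/preset choices are illustrative only.

```python
import subprocess

def transcode_h264(src: str, dst: str, crf: int = 23) -> None:
    """Re-encode a clip to H.264 with libx264 at a given constant rate factor."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src,
         "-c:v", "libx264", "-crf", str(crf), "-preset", "medium",
         "-c:a", "copy",  # leave the audio track untouched
         dst],
        check=True,  # raise if ffmpeg exits non-zero
    )

transcode_h264("input.mov", "output.mp4")
```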
Posted 1 month ago
8.0 - 12.0 years
12 - 22 Lacs
Hyderabad
Remote
Tech stack:
- Databases: MongoDB, S3, Postgres
- Strong experience with data pipelines and mapping
- React, Node, Python
- AWS, Lambda
About the job
Summary: We are seeking a detail-oriented and proactive Data Analyst to lead our file and data operations, with a primary focus on managing data intake from our clients and ensuring data integrity throughout the pipeline. This role is vital to our operational success and will work cross-functionally to support data ingestion, transformation, validation, and secure delivery. The ideal candidate must have hands-on experience with healthcare datasets, especially medical claims data, and be proficient in managing ETL processes and data operations at scale.
Responsibilities:
File Intake & Management
- Serve as the primary point of contact for receiving files from clients, ensuring all incoming data is tracked, validated, and securely stored.
- Monitor and automate data file ingestion using tools such as AWS S3, AWS Glue, or equivalent technologies.
- Troubleshoot and resolve issues related to missing or malformed files and ensure timely communication with internal and external stakeholders.
Data Operations & ETL
- Develop, manage, and optimize ETL pipelines for processing large volumes of structured and unstructured healthcare data.
- Perform data quality checks, validation routines, and anomaly detection across datasets.
- Ensure consistency and integrity of healthcare data (e.g., EHR, medical claims, ICD/CPT/LOINC codes) during transformations and downstream consumption.
Data Analysis & Reporting
- Collaborate with data science and analytics teams to deliver operational insights and performance metrics.
- Build dashboards and visualizations using Power BI or Tableau to monitor data flow, error rates, and SLA compliance.
- Generate summary reports and audit trails to ensure HIPAA-compliant data handling practices.
Process Optimization
- Identify opportunities for automation and efficiency in file handling and ETL processes.
- Document procedures, workflows, and data dictionaries to standardize operations.
Required Qualifications:
- Bachelor's or Master's degree in Health Informatics, Data Analytics, Computer Science, or a related field.
- 5+ years of experience in a data operations or analyst role with a strong focus on healthcare data.
- Demonstrated expertise in working with medical claims data, EHR systems, and healthcare coding standards (e.g., ICD, CPT, LOINC, SNOMED, RxNorm).
- Strong programming and scripting skills in Python and SQL for data manipulation and automation.
- Hands-on experience with AWS, Redshift, RDS, S3, and data visualization tools such as Power BI or Tableau.
- Familiarity with HIPAA compliance and best practices in handling protected health information (PHI).
- Excellent problem-solving skills, attention to detail, and communication abilities.
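As an example of the file-intake monitoring described under Responsibilities, here is a hypothetical boto3 sketch that lists newly landed claim files and flags empty objects; the bucket and prefix are invented, and production intake would also verify checksums and schemas.

```python
import boto3

s3 = boto3.client("s3")

def list_new_claim_files(bucket: str, prefix: str) -> list[str]:
    """Return keys of files landed under the intake prefix, flagging empty objects."""
    keys = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            if obj["Size"] == 0:
                print(f"flagging empty file for follow-up: {obj['Key']}")
                continue
            keys.append(obj["Key"])
    return keys

# Hypothetical bucket/prefix for a client's daily claims drop.
for key in list_new_claim_files("client-intake", "claims/2024-06-01/"):
    print("ready for ETL:", key)
```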
Posted 1 month ago
9.0 - 12.0 years
14 - 24 Lacs
Gurugram
Remote
We are looking for an experienced Senior Data Engineer to lead the development of scalable AWS-native data lake pipelines with a strong focus on time series forecasting and upsert-ready architectures. This role requires end-to-end ownership of the data lifecycle, from ingestion to partitioning, versioning, and BI delivery. The ideal candidate must be highly proficient in AWS data services, PySpark, and versioned storage formats like Apache Hudi/Iceberg, and must understand the nuances of data quality and observability in large-scale analytics systems.
Role & responsibilities:
- Design and implement data lake zoning (Raw → Clean → Modeled) using Amazon S3, AWS Glue, and Athena.
- Ingest structured and unstructured datasets including POS, USDA, Circana, and internal sales data.
- Build versioned and upsert-friendly ETL pipelines using Apache Hudi or Iceberg (see the sketch after this listing).
- Create forecast-ready datasets with lagged, rolling, and trend features for revenue and occupancy modelling.
- Optimize Athena datasets with partitioning, CTAS queries, and metadata tagging.
- Implement S3 lifecycle policies, intelligent file partitioning, and audit logging.
- Build reusable transformation logic using dbt-core or PySpark to support KPIs and time series outputs.
- Integrate robust data quality checks using custom logs, AWS CloudWatch, or other DQ tooling.
- Design and manage a forecast feature registry with metrics versioning and traceability.
- Collaborate with BI and business teams to finalize schema design and deliverables for dashboard consumption.
Preferred candidate profile:
- 9-12 years of experience in data engineering.
- Deep hands-on experience with AWS Glue, Athena, S3, Step Functions, and the Glue Data Catalog.
- Strong command of PySpark, dbt-core, CTAS query optimization, and partition strategies.
- Working knowledge of Apache Hudi, Iceberg, or Delta Lake for versioned ingestion.
- Experience with S3 metadata tagging and scalable data lake design patterns.
- Expertise in feature engineering and forecasting dataset preparation (lags, trends, windows).
- Proficiency in Git-based workflows (Bitbucket), CI/CD, and deployment automation.
- Strong understanding of time series KPIs, such as revenue forecasts, occupancy trends, and demand volatility.
- Data observability best practices, including field-level logging, anomaly alerts, and classification tagging.
- Experience with statistical forecasting frameworks such as Prophet, GluonTS, or related libraries.
- Familiarity with Superset or Streamlit for QA visualization and UAT reporting.
- Understanding of macroeconomic datasets (USDA, Circana) and third-party data ingestion.
- Independent, critical thinker with the ability to design for scale and evolving business logic.
- Strong communication and collaboration with BI, QA, and business stakeholders.
- High attention to detail in ensuring data accuracy, quality, and documentation.
- Comfortable interpreting business-level KPIs and transforming them into technical pipelines.
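To illustrate the upsert-friendly Hudi pipelines referenced in the role above, here is a hedged PySpark sketch of a copy-on-write upsert; the table name, key fields, and S3 paths are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hudi-upsert").getOrCreate()

# Hypothetical batch of late-arriving POS corrections from the clean zone.
updates = spark.read.parquet("s3://clean/pos_sales/2024-06-01/")

# Upsert keyed on sale_id so corrections overwrite prior versions instead of
# duplicating rows; the precombine field picks the latest record per key.
hudi_options = {
    "hoodie.table.name": "pos_sales",
    "hoodie.datasource.write.recordkey.field": "sale_id",
    "hoodie.datasource.write.precombine.field": "updated_at",
    "hoodie.datasource.write.partitionpath.field": "sale_date",
    "hoodie.datasource.write.operation": "upsert",
}

(updates.write.format("hudi")
        .options(**hudi_options)
        .mode("append")
        .save("s3://lake/modeled/pos_sales"))
```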
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
The Data Scientist - Clinical Data Extraction & AI Integration role on our healthcare technology team requires an experienced individual with 3-6 years of experience, focused on medical document processing and data extraction systems. You will have the opportunity to work with advanced AI technologies to create solutions that enhance the extraction of crucial information from clinical documents, thereby improving healthcare data workflows and patient care outcomes. Your key responsibilities will include designing and implementing statistical models for medical data quality assessment, and developing predictive algorithms for encounter classification and validation. You will also be responsible for building machine learning pipelines for document pattern recognition, creating data-driven insights from clinical document structures, and implementing feature engineering for medical terminology extraction. Furthermore, you will apply natural language processing (NLP) techniques to clinical text, develop statistical validation frameworks for extracted medical data, and build anomaly detection systems for medical document processing. You will create predictive models for discharge date estimation and encounter duration, and implement clustering algorithms for provider and encounter classification. For AI and LLM integration, you will be expected to integrate and optimize Large Language Models via AWS Bedrock and API services, design and refine AI prompts for clinical content extraction with high accuracy, and implement fallback logic and error handling for AI-powered extraction systems. You will develop pattern-matching algorithms for medical terminology and create validation layers for AI-extracted medical information. Expertise in the healthcare domain is crucial: you will work closely with medical document structures, implement healthcare-specific validation rules, handle medical terminology extraction, and conduct clinical context analysis. Ensuring HIPAA compliance and adhering to data security best practices will also be part of your responsibilities. Proficiency in programming languages such as Python 3.8+, R, SQL, and JSON, along with familiarity with data science tools like pandas, numpy, scipy, scikit-learn, spaCy, and NLTK, is required. Experience with ML frameworks including TensorFlow, PyTorch, and Hugging Face transformers, and with visualization tools like matplotlib, seaborn, plotly, Tableau, and Power BI, is desirable. Knowledge of AI platforms such as AWS Bedrock, Anthropic Claude, and the OpenAI APIs, and experience with cloud services like AWS (SageMaker, S3, Lambda, Bedrock), will be advantageous. Familiarity with research tools like Jupyter notebooks, Git, Docker, and MLflow is also beneficial for this role.
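As a toy example of the medical terminology pattern matching described here, the sketch below pulls candidate ICD-10-style codes from free text with a deliberately simplified regex; a production validator would match extracted codes against the full ICD-10-CM code set rather than a shape-only pattern.

```python
import re

# Simplified ICD-10 shape: a letter, two digits, and an optional decimal
# extension. Real ICD-10-CM codes also allow alphanumeric extensions
# (e.g., S72.001A), so this pattern is illustrative, not exhaustive.
ICD10_PATTERN = re.compile(r"\b[A-Z]\d{2}(?:\.\d{1,4})?\b")

def extract_icd10_codes(clinical_text: str) -> list[str]:
    """Pull candidate ICD-10 codes from free-text clinical notes."""
    return ICD10_PATTERN.findall(clinical_text)

note = "Discharge diagnoses: E11.9 type 2 diabetes; I10 essential hypertension."
print(extract_icd10_codes(note))  # ['E11.9', 'I10']
```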
Posted 1 month ago