At Unify Technologies, we're hiring Scala Developers who are passionate about functional programming and ready to make a real impact. 🔹 Role: Scala Developer 🔹 Experience: 3+ Years 🔹 Location: Hyderabad 🔹 Tech Stack: Scala, Play Framework, Akka, Lagom, Slick. Requirements: Very strong Scala development (coding) skills, with any combination of Java/Python/Spark/Big Data. 3+ years of experience in Core Java/Scala with a good understanding of multithreading. Strong Computer Science fundamentals. Exposure to Python/Perl and Unix/K-shell scripting. Code management tools such as Git/Perforce. Experience with large batch-oriented systems and DB2/Sybase or any RDBMS. Prior experience with financial products, particularly OTC derivatives. Exposure to counterparty risk, margining, collateral, or confirmation systems.
JOB DESCRIPTION: Job Title: Automation Test Engineer (Java, Selenium, BDD/Cucumber, API Rest Assured) Location: Chennai (WFO, 5 days) Job Type: Full-time Experience: 3+ Years Job Overview: We are seeking a highly skilled Automation Test Engineer with expertise in Java, Selenium, BDD/Cucumber, API Rest Assured, Postman, SDLC, STLC, and Agile methodologies. The ideal candidate will have a strong foundation in API testing, an innovative problem-solving mindset, and the ability to work efficiently in a fast-paced environment. Key Responsibilities: Develop and maintain automated test scripts using Java, Selenium, BDD/Cucumber, and API Rest Assured. Design and execute automated test scenarios, test cases, and test data preparation. Work with stakeholders to establish testable acceptance criteria for software applications. Analyze testing goals and align them with the software development life cycle (SDLC) and software testing life cycle (STLC). Identify defects, track them, and ensure timely resolution. Collaborate with developers and business teams to customize automation frameworks per client business needs. Ensure maximum test automation across domains and technologies while being adaptable to learning new tools. Provide client-facing support and effectively communicate findings and solutions. Required Skills: Strong expertise in API Rest Assured testing. Experience working with BDD frameworks like Cucumber. Hands-on experience with automation tools such as Selenium and API validation using Postman. Knowledge of SDLC, STLC, and Agile methodologies. Excellent problem-solving skills and a proactive approach to challenges. Strong verbal and written communication skills, with the ability to articulate testing strategies clearly. Ability to work in banking domain environments is a plus. Preferred Qualifications: Experience in banking domain functions and processes. Prior client-facing experience in delivering automation testing solutions.
Ability to identify testing goals, create test scenarios, design test cases, and execute tests efficiently.
Employment Type: Full-Time Experience: 4+ Years NP: Immediate to 45 days Work Location: Bangalore, India (WFO) Job Description: Linux kernel expertise (infotainment domain understanding preferable). Practical knowledge and hands-on experience with the Linux kernel, drivers, and embedded Linux systems. Excellent understanding of Linux internals, real-time Linux, RT porting, and the Android Linux kernel. Hands-on knowledge of Linux driver porting (UFS, filesystem, I2C, SPI, Ethernet, UART, display). Experience in analyzing and fixing kernel crashes and in open-source development. Knowledge of hypervisor systems (good to have).
Role: Sr SDET - Software Development Engineer in Test Position: Senior QA - Manual Testing + Automation Testing Project: Maps/Ads Platforms - Product Experience: 5-8 Years and 8-12 Years Key Skills: Strong experience in Manual Testing and Postman, along with automation experience in Python/Java. Participate in troubleshooting and analysis of production/data issues. Incorporate the automated tests into the continuous integration process. Design sanity, smoke, and regression suites through manual and automated tests. Develop automated tests using Java/Python. Collaborate with product managers and developers to understand requirements and use cases and transform them into tests. Skills: 4 years of GIS experience; 1 year of hands-on geo-editing experience; strong analytical skills; experience with ArcGIS/QGIS; detailed understanding of at least one geo-editing tool; good English communication skills, specifically reading comprehension. Others: BE/BTech or equivalent technical degree. Proven automation experience in client-facing roles. Location: Hyderabad (Work from Client Office) Kindly reply to this mail with the following details if you are interested in applying for this position. Total Years Exp: Postman Exp: Manual Testing Exp: GIS Experience: Automation Testing Experience: Java/Python/JavaScript Experience: Notice Period (if serving, mention LWD): Current Location: Are you ready to relocate to Hyderabad: Current CTC: Expected CTC: Current Company (last company if not working):
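The sanity/smoke/regression suite design this posting asks for can be sketched as a small tagging pattern. All names below are illustrative, not from the posting; a real Python project would use `@pytest.mark` markers and run a suite with `pytest -m smoke`:

```python
# Illustrative sketch: register tests under suite tags, then run one suite.
SUITES = {}

def suite(*tags):
    """Decorator: register a test function under one or more suite tags."""
    def wrap(fn):
        for tag in tags:
            SUITES.setdefault(tag, []).append(fn)
        return fn
    return wrap

@suite("sanity", "smoke")
def test_login_page_loads():
    return "ok"  # a real test would drive Selenium and assert on the page

@suite("regression")
def test_bulk_export():
    return "ok"

def run(tag):
    """Run every test registered under `tag`; return the pass count."""
    return sum(1 for fn in SUITES.get(tag, []) if fn() == "ok")
```

The same shape scales to CI: the smoke suite gates every commit, while the full regression suite runs nightly.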
Role & responsibilities: Hi All, I hope you are doing well. I am reaching out from Unify Technologies, Hyderabad; kindly share your updated CV. We have a job opportunity at our company; please go through the job details and let us know if you are interested. Our Company: Unify Technologies Our Website: http://unifytech.com/ LinkedIn: https://www.linkedin.com/company/9206998 Offices in: Hyderabad, Bangalore, Pune, Gurgaon (India), and Seattle (USA) Industry/Domain: Product Engineering - Big Data, Healthcare, Cyber Security, Mobile A few words about Unify Technologies: Unify is a Digital Engineering company. We help our clients by providing Digital Engineering Services to develop high-quality products. We have extensive experience in software product engineering and a successful track record of delivering on aggressive delivery plans without compromising on quality in our Cloud, Mobile, and Data practices. Employment Type: Full-Time Position: SDET Engineer Experience, Key Skills, and Job Location Details: SDET: 5-9 Years, Automation Testing + Python + PyTest Work Location: Hyderabad (Hybrid) Mandatory Skillset: Strong automation testing using Selenium and Python with PyTest. Kindly reply to this mail with the following details if you are interested in applying for this position. Total Experience: Key Skills: Automation Testing Experience: Python Experience: Current Company: Ready to relocate to Hyderabad: Current CTC: Expected CTC: Notice Period (if serving, mention LWD): Reason for job change: Confirm that you have gone through all company and job details and are interested in pursuing this role (Y/N): Preferred candidate profile: Automation Testing + Python + PyTest
Employment Type: Full time Experience: 9-12 Years Joining time: Immediate to 20 days Work Mode: Hybrid Work Location: Kharadi, Pune Primary & Mandatory Skills: Repetitive Manufacturing in SAP PP, demand planning, scheduling (lead time & basic date scheduling), MRP planning - MTS (strategies 10, 11 & 40) & CBP planning, master data (BOM, routing, work center, production version), shop floor reporting for repetitive manufacturing, RICEF object development. Good-to-have skills: Production planning, purchase planning, 1 greenfield project, 1 roll-out project. Shift time: 10 am to 7 pm, with onsite travel based on project requirements. JD: Qualifications for SAP PP Consultant (Must-Have): 9-12+ years of experience developing IT solutions and configuring SAP PP to support business process requirements. Bachelor's degree in Engineering, Science, or a related field plus 10 years of relevant experience in SAP (domain experience is an added advantage). Certified in the SAP Production Planning & Shop Floor Control module. Experience implementing the PP-REM/Discrete module in at least 2 greenfield implementation projects & 2 roll-out projects (full cycle, from initiation to closure). Expertise in master data maintenance (material master - MRP views, MRP group, BOMs, routing, work centers & production versions). Knowledge & hands-on experience with planning strategies (MTS/MTO), MRP regenerative planning, forward/backward scheduling of orders, and the concepts and basics of material requirements planning. Expertise in configuring order types & the entire plan-to-produce cycle for discrete manufacturing. Expertise in managing run schedule quantities, batch management & custom scheduling. Experience in writing business blueprint documents. Above-average knowledge of the integration of PP with other SAP modules like MM, QM, PM, and SD.
Highly motivated, self-directed individual contributor, able to effectively lead efforts including delivery of larger, complex projects. Experience performing work in a virtual environment. Must be able to travel to team co-locations for collaboration on processes and approaches.
Role & responsibilities: Hi All, I hope you are doing well. I am reaching out from Unify Technologies, Hyderabad; kindly share your updated CV. We have a job opportunity at our company; please go through the job details and let us know if you are interested. Our Company: Unify Technologies Our Website: http://unifytech.com/ LinkedIn: https://www.linkedin.com/company/9206998 Offices in: Hyderabad, Bangalore, Pune, Gurgaon (India), and Seattle (USA) Industry/Domain: Product Engineering - Big Data, Healthcare, Cyber Security, Mobile A few words about Unify Technologies: Unify is a Digital Engineering company. We help our clients by providing Digital Engineering Services to develop high-quality products. We have extensive experience in software product engineering and a successful track record of delivering on aggressive delivery plans without compromising on quality in our Cloud, Mobile, and Data practices. Employment Type: Full-Time Position: SDET Engineer Experience, Key Skills, and Job Location Details: SDET: 5-9 Years, Automation Testing + Python + PyTest + API Automation Work Location: Hyderabad (Hybrid) Mandatory Skillset: Strong automation testing using Selenium and Python with PyTest. Kindly reply to this mail with the following details if you are interested in applying for this position. Total Experience: PyTest Experience: Automation Testing Experience: Python Experience: API Automation (REST API, Rest Assured, Requests library) Experience: Current Company: Ready to relocate to Hyderabad: Current CTC: Expected CTC: Notice Period (if serving, mention LWD): Reason for job change: Confirm that you have gone through all company and job details and are interested in pursuing this role (Y/N): Preferred candidate profile: Automation Testing + Python + PyTest
We are Hiring | Technical Lead Backend Engineering Location: Hyderabad | Domain: Healthcare | Experience: 8+ Years F2F Interview | Core Java Backend Expertise Required At UNIFY, we’re transforming healthcare through innovative tech solutions — and we’re looking for a Technical Lead to drive backend excellence. If you’ve grown from a hands-on Java developer into a reliable tech lead who can guide teams and architect scalable systems, we’d love to talk. What we’re looking for: Expertise in Core Java, Spring Boot, Microservices Strong problem-solving and design skills Experience in leading technical teams and mentoring developers Healthcare domain knowledge is a big plus This is a great opportunity to step up, take ownership, and build meaningful solutions in the healthcare space.
Employment Type: Full-Time Role: Senior Software Development Engineer - Senior SDE - AI Engineer Project: Healthcare Domain Experience: 6-9 Years Key Skills: Machine Learning, Artificial Intelligence, Python, LLM & RAG Architecture, Agent Development Tools Number of Openings: 01 Joining time: Immediate to 30 days Job Location: Hyderabad/Bangalore/Remote Education: A Master's/Bachelor's degree in Computer Science, Statistics, Engineering, or a related technical discipline is preferred. Detailed Job Description: We are hiring a Senior Data Engineer - AI Systems for our team developing an advanced AI solution for a US-based project. This role needs someone with strong experience engineering and building AI-based systems. The work will involve deploying AI models, integrating with various databases, and implementing autonomous agent features powered by LLMs and integrations with external tools. This is a high-impact engineering role, ideal for someone who has previously built AI-enabled systems in production and thrives in a collaborative, fast-paced environment. You will be part of a small team of developers in India, working closely with a Senior AI Architect in the US as well as with client technical personnel. The role requires some US time-zone overlap for communication and collaboration. Key Qualifications: About 6+ years in software design and development, with at least 3 years working on AI-based systems & solutions. Very strong in Python development, with hands-on experience in both backend systems and ML tooling, using standard industry frameworks and libraries like Flask, FastAPI, PyTorch, scikit-learn, Pandas, etc. Strong understanding of LLM architectures, with hands-on experience developing scalable LLM-based solutions that integrate foundation models (such as GPT, Claude, LLaMA, etc.) into workflows. Strong understanding of prompt engineering, tool calling, and the constraints of working with memory-based architectures.
Experience implementing RAG architectures, and familiarity with agent integration protocols such as the Model Context Protocol (MCP) or similar. Proven experience designing and deploying LLM-based agent systems using frameworks like LangChain, LangGraph, or similar. Experience with LangGraph in developing multi-step, stateful workflows is a plus. Experience developing & deploying to the cloud (AWS/GCP/Azure) using infrastructure components like Docker, Kubernetes, etc. Strong understanding of the software engineering lifecycle, with best practices for code quality, repeatability, performance, and reliability. Familiarity with AI-assisted development tools (e.g., Cursor, GitHub Copilot, Claude Code) and experience defining human-in-the-loop review processes for their safe use is required. Prior experience developing AI-powered search or knowledge graph solutions is a plus. Ability to work in a rapidly iterative, experiment-driven environment with a tightly knit, cross-functional team. Roles & Responsibilities: Work collaboratively with senior members of the team to design solution workflows per the requirements, and deploy, fine-tune, and integrate foundation models into these workflows. Develop integrations into various source or output systems using APIs, protocols like MCP, and other available interfaces. Collaborate with the data engineers on the team to integrate the LLM solution and workflows with the data platform and infrastructure. Drive the usage of AI coding assistants and agentic coding tools in the team, while establishing appropriate development & coding processes and standards for tool usage. Review the code developed for the AI applications, including services built on LLMs and intelligent agent frameworks, to ensure robustness and reliability. Drive technical decisions, mentor team members, and contribute to establishing engineering best practices and standards for performance, code quality, and reliability.
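The RAG architecture work this posting describes centers on one retrieval step: score stored document chunks against the query and feed the best matches to the LLM as context. Below is a toy, dependency-free sketch of that step; the documents and helper names are invented for illustration, and production systems would use model-based embeddings and a vector database rather than bag-of-words overlap:

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding'; real RAG uses a neural encoder."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=2):
    """Return the top-k chunks most similar to the query."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

def build_prompt(query, chunks):
    """Prepend retrieved context to the question before calling the LLM."""
    context = "\n".join(retrieve(query, chunks))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Claims are adjudicated within 30 days of submission.",
    "The cafeteria opens at 8 am.",
    "Denied claims may be appealed within 60 days.",
]
```

Swapping `embed` for a real encoder and `retrieve` for a vector-store query preserves exactly this structure.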
Engineering Delivery Manager (India) Location: India (Hybrid - Unify locations, preferably Hyderabad) Exp: 10+ Years Role Overview: We are looking for a hands-on Engineering Delivery Manager to lead technology engagements with a dual focus on delivery excellence and strategic account growth. This role demands strong technical capabilities, leadership in scaling teams, ownership of project execution, participation in presales, and an ability to farm accounts by identifying and expanding opportunities within existing client relationships. Key Responsibilities Technical Delivery (Hands-On): Own end-to-end project delivery; act as the technical and operational manager. Participate in architecture and design reviews, guide code quality, and resolve blockers. Lead Agile teams, manage sprints, and enforce best practices in DevOps, testing, and deployment. Ensure adherence to SLAs, delivery timelines, and quality benchmarks. Team Building & Scaling: Build, onboard, and mentor high-performing engineering teams across locations. Define hiring plans aligned with pipeline growth and project load. Cultivate a culture of ownership, collaboration, and continuous improvement. Oversee training, development, and performance management for team members. Presales & Solutioning: Collaborate with sales and solutioning teams to contribute to RFPs, RFIs, estimations, and proposals. Participate in client calls, technical workshops, and proposal presentations. Define delivery models, timelines, team structure, and risk mitigation plans during presales cycles. Account Farming / Growth: Build strong, trust-based relationships with client stakeholders. Understand evolving client needs and proactively identify new opportunities (e.g., additional modules, support, new teams). Coordinate with sales/account managers to upsell new services, capabilities, or teams. Drive organic growth within existing accounts by consistently delivering value and staying aligned with the client's roadmap.
Governance & Communication: Conduct regular delivery governance meetings, QBRs, and status reviews with clients. Report project health, risks, and escalations to internal leadership and stakeholders. Ensure transparency, accountability, and communication flow between all parties. Required Qualifications Bachelor's or Master's degree in Engineering, Computer Science, or a related field. 10-12 years of experience in engineering/project delivery, including 5-7 years in client-facing leadership roles. Strong hands-on technical background (modern tech stacks preferred). Proven experience in team scaling, performance management, and engineering mentorship. Prior experience in presales and account growth within services, product, or consulting organizations. Excellent communication, stakeholder management, and negotiation skills. Preferred Skills Experience in scaling teams across geographies or in offshore/nearshore models. Familiarity with delivery tools: JIRA, Git, Jenkins, Confluence, Azure DevOps. (ref:hirist.tech)
You should have a Master's/Bachelor's degree or equivalent in Computer Science or related fields. Your expertise should include React Native, JavaScript, and Java.
About The Role We are looking for a Senior Engineer - AI & Backend Systems for our team developing an advanced AI solution for a US-based client. This role needs someone with strong experience designing and building scalable, high-performance backend systems, along with exposure to developing AI-based solutions. You will work on building robust API integrations and integrating large language model (LLM) capabilities into business workflows, collaborating with a global team including a Senior AI Architect in the US. This is a high-impact engineering role, ideal for someone who has previously built AI-enabled systems in production and thrives in a collaborative, fast-paced environment. You will be part of a small team of developers in India, working closely with a Senior AI Architect in the US as well as with client technical personnel. The role requires some US time-zone overlap for communication and collaboration. Required Experience & Skills 6+ years in backend software design and development, with at least 2 years working on AI-based systems & solutions. Very strong in Python development, with proven experience designing and building scalable, high-performance backend systems using Python, and deep expertise in FastAPI, Flask, and similar Python frameworks. Strong hands-on experience in developing and maintaining connectors to third-party APIs, ensuring secure, reliable, and efficient data exchange. Experience developing & deploying to the cloud (AWS/GCP/Azure) using infrastructure components like Docker, Kubernetes, etc. Expertise working with AI-assisted development tools (e.g., Cursor, GitHub Copilot, Claude Code, etc.) and using human-in-the-loop review processes to ensure code safety and reliability. Strong grounding in the software engineering lifecycle, with knowledge of clean architecture, testing strategies, version control, CI/CD, and cloud-native backend engineering best practices.
Experience working with AI frameworks or libraries (Transformers, LangChain, OpenAI APIs, etc.) and hands-on experience developing AI-powered systems is a strong plus. Exposure to integrating Retrieval-Augmented Generation (RAG) systems and experience implementing agent workflows/protocols (e.g., MCP) is a plus. Ability to work in a rapidly iterative, experiment-driven environment with a tightly knit, cross-functional team. Key Responsibilities Work collaboratively with senior members of the team to design integrations per the requirements and deploy them into the solution workflows. Develop integrations into various source and output systems using APIs, protocols like MCP, and other available interfaces. Collaborate with the data engineers & AI engineers on the team to integrate the core LLM solution and the data platform with the API infrastructure. Drive the usage of AI coding assistants and agentic coding tools in the team, while establishing appropriate development & coding processes and standards for tool usage. Participate in reviewing the code developed for the applications, including APIs and services built on LLMs, to ensure robustness and reliability. Contribute to establishing engineering best practices and standards for performance, code quality, and reliability. (ref:hirist.tech)
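The "secure, reliable" third-party connector requirement above usually comes down to patterns like retry with exponential backoff around each outbound call. A generic sketch under invented names (not the client's actual integration code); the flaky endpoint below simulates transient network failures:

```python
import time

def with_retries(call, attempts=3, base_delay=0.01, sleep=time.sleep):
    """Invoke `call()`; on failure, wait base_delay * 2**i and retry."""
    for i in range(attempts):
        try:
            return call()
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts: surface the error to the caller
            sleep(base_delay * (2 ** i))

class FlakyEndpoint:
    """Simulated third-party API: fails twice, then succeeds."""
    def __init__(self):
        self.calls = 0

    def fetch(self):
        self.calls += 1
        if self.calls < 3:
            raise ConnectionError("transient failure")
        return {"status": "ok"}
```

Injecting `sleep` keeps the policy testable; a production connector would also cap total elapsed time, add jitter, and retry only on retryable errors.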
About The Role We are looking for a Senior Engineer - Knowledge Graphs & NLP for our team developing an advanced AI solution for a US-based client. This role needs an engineer with expertise in developing systems using knowledge graphs and NLP techniques like Named Entity Recognition (NER) and Entity Linking. The work will involve architecting and building knowledge-driven solutions leveraging cutting-edge graph technology and databases and entity-centric data enrichment. This is a high-impact engineering role, ideal for someone who has previously built AI-enabled systems in production and thrives in a collaborative, fast-paced environment. You will be part of a small team of developers in India, working closely with a Senior AI Architect in the US as well as with client technical personnel. The role requires some US time-zone overlap for communication and collaboration. Experience & Skills: 6+ years of experience in software development, with 3+ years focused on NLP, knowledge graphs, and semantic technologies. Strong experience in NLP and hands-on experience building and deploying NER and Entity Linking models to production. Experience working with graph databases and graph query languages like Cypher or SPARQL. Proficient in Python development, with hands-on experience using relevant frameworks and libraries such as spaCy, PyKEEN, PyTorch, etc. Experience developing & deploying to the cloud (AWS/GCP/Azure) using infrastructure components like Docker, Kubernetes, etc. Strong understanding of the software engineering lifecycle, with best practices for code quality, repeatability, performance, and reliability. Familiarity with AI-assisted development tools (e.g., Cursor, GitHub Copilot, Claude Code) and experience with human-in-the-loop review processes.
Ability to work in a rapidly iterative, experiment-driven environment with a tightly knit, cross-functional team. Key Responsibilities Work collaboratively with team and client personnel to architect and build scalable knowledge graphs integrating diverse, large-scale data sources. Develop ontologies and semantic data models for structured domain representation. Build language pipelines for semantic parsing, relation extraction, context modeling, etc. Collaborate with the data and AI engineers on the team to integrate the graph elements into the overall solution and workflows. (ref:hirist.tech)
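The NER + Entity Linking pipeline this role describes has two stages: spot entity mentions in text, then resolve each mention to a node in the knowledge graph. The toy sketch below illustrates the shape of that pipeline; the gazetteer, entity IDs (E1, E2), and graph edges are all made up for the example, and production systems use trained models (e.g., spaCy) and a real graph database:

```python
# Toy dictionary-based NER + entity linking against a tiny in-memory graph.
GAZETTEER = {"aspirin": "E1", "ibuprofen": "E2"}  # surface form -> entity id

GRAPH = {  # entity id -> list of (property, value) edges
    "E1": [("instance_of", "drug"), ("treats", "pain")],
    "E2": [("instance_of", "drug"), ("treats", "inflammation")],
}

def recognize(text):
    """Return (surface_form, entity_id) pairs found in the text."""
    hits = []
    for raw in text.lower().split():
        token = raw.strip(".,;:!?")  # drop trailing punctuation
        if token in GAZETTEER:
            hits.append((token, GAZETTEER[token]))
    return hits

def neighbors(entity_id, prop):
    """Follow `prop` edges from an entity node in the graph."""
    return [v for p, v in GRAPH.get(entity_id, []) if p == prop]
```

Replacing `recognize` with a trained NER model and `GRAPH` with Cypher/SPARQL queries against a graph store keeps the same two-stage structure.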
Location: Gurugram Experience Level: 3-7 years Employment Type: Full-time Key Responsibilities Design and develop integration solutions using Apache Camel, Java, and Spring Boot. Implement Enterprise Integration Patterns (EIP) for seamless system communication. Develop and maintain RESTful APIs, SOAP services, and message routing. Collaborate with cross-functional teams to gather requirements and deliver scalable solutions. Optimize performance and troubleshoot integration issues across distributed systems. Ensure code quality through unit testing, code reviews, and CI/CD practices. Required Skills Strong proficiency in Java 8+. Hands-on experience with Apache Camel and its components (e.g., JMS, FTP, HTTP, File). Familiarity with the Spring Framework, Spring Boot, and Maven/Gradle. Experience with message brokers like ActiveMQ, Kafka, or RabbitMQ. Understanding of microservices architecture and containerization (Docker/Kubernetes). Knowledge of logging frameworks (SLF4J, Log4j) and monitoring tools. Preferred Qualifications Exposure to cloud platforms (AWS, Azure, GCP). Experience with Camel K or Camel Quarkus. Familiarity with DevOps tools like Jenkins, Git, and Ansible. Certification in Java or integration technologies is a plus. Educational Background Bachelor's or Master's degree in Computer Science, Engineering, or a related field. (ref:hirist.tech)
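One of the Enterprise Integration Patterns named above, the Content-Based Router, inspects each message and forwards it to a destination chosen from its content. Camel expresses this in its Java DSL (roughly `from("jms:orders").choice().when(...).to(...)`); the pure-Python sketch below shows only the pattern itself, with invented queue names, not Camel's actual API:

```python
# Content-Based Router sketch: first matching predicate wins,
# otherwise the message lands in a default (dead-letter) queue.
def route(message, routes, default):
    """Append `message` to the first queue whose predicate matches."""
    for predicate, queue in routes:
        if predicate(message):
            queue.append(message)
            return queue
    default.append(message)
    return default

domestic, international, dead_letter = [], [], []

RULES = [
    (lambda m: m.get("country") == "IN", domestic),
    (lambda m: "country" in m, international),
]
```

Ordering the rules from most to least specific mirrors Camel's `choice().when(...).otherwise(...)` semantics.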
You should have a minimum of 5 years of experience in AI/ML with at least 2+ years in NLP, LLMs, and Generative AI. Your expertise should include ML architecture design, end-to-end model development, and deployment in production systems. Proficiency in Python is essential, along with deep experience in ML libraries and frameworks like TensorFlow, PyTorch, Hugging Face, and LangChain. A sound knowledge of transformer models, embeddings, tokenization, and vector databases such as FAISS and Pinecone is required. Experience with cloud-native AI solutions on AWS, Azure, or GCP is preferred. Familiarity with MLOps, model versioning, containerization using Docker, and orchestration tools like Kubeflow and MLflow is a plus. Your responsibilities will include architecting and implementing end-to-end machine learning and Generative AI solutions for real-world applications. You will design, fine-tune, and deploy models using transformers, embeddings, tokenization, and LLMs for tasks like summarization, classification, question answering, and content generation. Developing and maintaining high-quality, production-grade ML code in Python using libraries such as TensorFlow, PyTorch, Hugging Face, and LangChain is crucial. Furthermore, you will be responsible for building and optimizing retrieval-augmented generation (RAG) pipelines by integrating LLMs with structured and unstructured data. Working with vector databases like FAISS and Pinecone to manage semantic search and context retrieval efficiently will be part of your role. Utilizing cloud-native AI services for model training, deployment, and scaling on platforms like AWS, GCP, and Azure is expected. Implementing MLOps best practices, including model versioning, containerization using Docker, orchestration with tools like Kubeflow and MLflow, and following CI/CD procedures are also key responsibilities. 
Strong problem-solving skills, architectural thinking, and the ability to lead complex AI initiatives, along with excellent communication, stakeholder management, and technical leadership capabilities, are essential for this role.
You are a candidate with strong experience in Spark Core/Streaming, Spark RDD API, and Spark Dataframes API using programming languages such as Scala, Java, or Python. You have a background in Machine Learning or AI and possess a deep understanding of Data Structures, Algorithms, Data Transformation, Data Ingestion, and Optimization mechanisms. Additionally, you are well-versed in Big Data technologies including Hadoop, MapReduce, Kafka, and Cassandra. Your primary responsibility will be to design and develop scalable and high-performance data processing applications using Apache Spark (Core, Streaming) with Scala. You will work with Spark RDD and DataFrame APIs to create robust data pipelines and apply machine learning or AI techniques to solve real-world data challenges. Data ingestion from various structured and unstructured sources, optimizing Spark jobs for performance and resource utilization, and collaborating with cross-functional teams are integral parts of your role. You should have at least 4 years of experience in Big Data technologies, particularly in Apache Spark (Core/Streaming). Proficiency in Scala is preferred, along with knowledge of Java or Python. A strong understanding of Spark RDD API, DataFrame API, and practical experience with Machine Learning or AI frameworks and use cases is essential. Your expertise should also include data structures, algorithms, and performance optimization techniques, as well as hands-on experience with data ingestion, transformation, and processing pipelines. Given the scale of data handled in this role, you must enjoy tackling complex problems, conducting independent research, and collaborating with various teams to enhance the overall product experience. Your ability to ensure high code quality through unit testing and code reviews, along with designing and implementing data transformation, data ingestion, and ETL pipelines, will be key to your success in this position.
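The Spark RDD work described above is built from transformations such as `flatMap`, `map`, and `reduceByKey`. To make that logic visible without a cluster, here is a pure-Python equivalent of the classic word-count pipeline (in PySpark this would be roughly `rdd.flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(add)`; the sample lines are invented):

```python
from collections import defaultdict
from functools import reduce

lines = ["spark streaming", "spark core", "kafka streaming"]

# map phase: one (word, 1) pair per word,
# analogous to flatMap(str.split).map(lambda w: (w, 1))
pairs = [(w, 1) for line in lines for w in line.split()]

def reduce_by_key(pairs, fn):
    """Group pairs by key, then fold each group's values with fn
    (the shuffle + reduce that Spark's reduceByKey performs per partition)."""
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return {k: reduce(fn, vs) for k, vs in groups.items()}

counts = reduce_by_key(pairs, lambda a, b: a + b)
```

The real Spark version distributes `pairs` across partitions and combines partial sums before the shuffle, but the per-key fold is the same operation.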
You will be joining as a Software Development Engineer in a Full-Time capacity within the Maps and Advertising Platforms Product project team based in Hyderabad, India (Hybrid Mode). As a part of this team, you will be designated as a Developer, Senior Developer, or Lead Developer, depending on your experience level, ranging from 3 to 9 years in an SDE role. Your primary responsibility will involve architecting, designing, and implementing scalable and efficient software solutions using Java throughout the software development lifecycle. You will be expected to leverage your expertise in data structures and algorithms to optimize existing systems and tackle complex problems effectively. Ensuring code quality by writing clean, maintainable, and testable code will be crucial, along with active participation in code reviews and adherence to software development best practices. Functional Programming principles, particularly in Scala or similar paradigms, will be incorporated as part of your role. Collaborating closely with cross-functional teams comprising product managers, designers, and other engineers is essential to deliver high-quality software products. Furthermore, you will play a key role in providing technical leadership and mentorship to junior developers, aiding in their career growth while upholding best practices in software development. To excel in this role, you must possess proven expertise in data structures and algorithms, with a strong background in software development spanning 3 to 9 years, preferably in backend or systems engineering roles. Proficiency in Scala or Java programming languages, coupled with experience in at least one Scala framework like Play or Akka, and API development expertise with SQL/NoSQL databases are essential requirements. Your educational background should ideally reflect a Master's or Bachelor's degree in Computer Science, Statistics, Engineering, or a related technical discipline, with a focus on Scala + Play/Akka.
The role necessitates immediate availability or joining within 30 days of offer acceptance. If you are someone with a passion for software development, possess strong logical and problem-solving skills, and are eager to contribute to a dynamic team environment, we welcome your application to be a part of our innovative project team.
We're Hiring: Java Developer with Apache Camel | Immediate Joiners
At Unify Technologies, we believe in empowering tech talent to build high-impact solutions. We're looking for passionate developers with 3+ years of Java development experience, ready to take the next step in their career.

Location: Gurugram
Experience: 3+ Years
Skills: Java, Apache Camel

Key Responsibilities:
Design and develop robust Java applications and microservices.
Implement and maintain integration flows using Apache Camel.
Build and manage real-time data pipelines using Apache Kafka.
Collaborate with cross-functional teams to deliver scalable and reliable backend systems.
Ensure code quality through unit testing, integration testing, and code reviews.
Optimize application performance and scalability.

Required Skills:
Strong programming skills in Java (8 or above).
Hands-on experience with Apache Camel for routing and mediation.
Solid knowledge of Apache Kafka for real-time messaging and streaming.
Experience with REST APIs, Spring Boot, and microservices architecture.
Familiarity with CI/CD tools, Git, and Agile methodologies.
Excellent problem-solving and communication skills.

Qualifications:
Bachelor's degree in Computer Science or a related field.
Confluent Certified Developer for Apache Kafka (preferred).
Experience with cloud platforms (AWS, Azure, or GCP) is a plus.
As a skilled individual in the field of machine learning, you will be responsible for developing and optimizing ML pipelines. Your expertise in deep learning, computer vision, PyTorch, and Python will be essential in designing and implementing efficient algorithms and models. Your role will involve working on various projects that require advanced machine learning techniques to solve complex problems. You will collaborate with a team of data scientists and engineers to build and deploy cutting-edge solutions. Your contribution will have a significant impact on the development of innovative products and services.
Company Description: Unify Technologies is a leading digital engineering company committed to helping global clients build high-quality software products quickly and affordably. With extensive experience in software product engineering practices, Unify Technologies delivers custom software solutions and collaborates with industry leaders across multiple geographies to drive innovation and efficiency.

Role Description: This is a full-time, on-site role for a Senior Data Engineer, located in Hyderabad. The Senior Data Engineer will be responsible for designing, developing, and maintaining data pipelines and databases. The role includes data modeling, ETL processes, and ensuring the data architecture meets industry standards. The Senior Data Engineer will collaborate with cross-functional teams to support data analytics and business intelligence initiatives.

Key Qualifications:
5+ years of experience in a data engineering role.
3+ years of experience with Snowflake, including performance tuning and data modeling.
Experience with relational databases and managing structured data at scale.
Strong SQL skills, with the ability to write, optimize, analyze, and debug complex queries.
Experience with data orchestration and ETL tools, specifically Airflow and DBT, in a production environment.
Proficiency in Python for Airflow DAGs and for general data manipulation and automation tasks.
Ability to diagnose and resolve data quality issues and performance bottlenecks.
Knowledge of data warehouse, data lake, and data pipeline performance tuning and troubleshooting.
Roles & Responsibilities:
Design, implement, and maintain robust data pipelines for data ingress and orchestration workflows.
Develop and enforce standards for data warehousing, data ingress/egress, and production data set quality.
Collaborate with engineering, product, and analytics teams to define and deliver on data requirements.
Monitor, troubleshoot, and resolve performance issues and bottlenecks across data pipelines, data lakes, and warehouses.
Other data-related tasks and duties, as assigned.
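The Senior Data Engineer posting above asks for Python proficiency for data manipulation and the ability to diagnose data quality issues. As a flavour of the kind of check such a role involves, here is a minimal, standard-library-only sketch; the column names and rules are illustrative assumptions, not taken from the posting, and a real pipeline would run checks like these inside an Airflow task or a DBT test rather than over an in-memory CSV.

```python
import csv
import io

def find_quality_issues(rows, required_fields):
    """Return (row_index, field, problem) tuples for two basic data-quality
    checks: duplicate primary keys (the first required field) and missing
    required values."""
    issues = []
    seen_keys = set()
    for i, row in enumerate(rows):
        key = row.get(required_fields[0])
        if key in seen_keys:
            issues.append((i, required_fields[0], "duplicate key"))
        seen_keys.add(key)
        for field in required_fields:
            if not row.get(field):  # catches None and empty string
                issues.append((i, field, "missing value"))
    return issues

# Tiny demo using an in-memory CSV in place of a real pipeline source:
# row 1 reuses id "1", and row 2 has an empty amount.
raw = "id,amount\n1,10\n1,20\n2,\n"
rows = list(csv.DictReader(io.StringIO(raw)))
issues = find_quality_issues(rows, ["id", "amount"])
# issues -> [(1, "id", "duplicate key"), (2, "amount", "missing value")]
```

In production, flagged rows would typically be quarantined to a side table and surfaced through pipeline monitoring rather than silently dropped.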