Zymr is experiencing rapid growth and seeks an accomplished Director of Sales, based in India, to spearhead North American sales initiatives for our software development, cloud, and AI/ML services. This is a quota-carrying, strategic individual-contributor position focused on the acquisition and expansion of mid-market and enterprise accounts within the United States and Canada. The successful candidate will collaborate closely with our marketing, pre-sales, and delivery teams to cultivate a robust pipeline and close multi-year, high-value service engagements, selling the full range of Zymr services, including Product and Platform Engineering, DevOps, Data Analytics, AI/ML, Agentic AI, and Cloud-Native Development Services across industries.

Job Title: Director of Sales
Required Experience: 10-15 years
Job Location: Pune/Ahmedabad

Responsibilities:
- Achieve and surpass personal sales quotas for new revenue generated from North America.
- Own lead generation, prospecting, and the identification, qualification, and creation of new opportunities.
- Lead comprehensive sales cycles, encompassing prospecting, discovery, solutioning, proposal development, and closing.
- Drive outbound outreach efforts utilizing email, LinkedIn, and warm leads.
- Engage with CXO-level stakeholders, including CTOs, Heads of Engineering, and Product Leaders.
- Develop tailored proposals with the support of pre-sales and delivery teams.
- Maintain accurate and current pipeline data, sales forecasts, and CRM hygiene (Zoho CRM).
- Participate in virtual client meetings, accommodating US time zones (EST/PST flexibility is required).
- Collaborate with the internal marketing team to execute campaigns and events.

Qualifications:
- 10-15 years of sales experience within the IT/software services sector.
- Demonstrated history of successfully closing deals with US-based companies (valued at $50K–$500K+).
- Experience in outbound prospecting and consultative enterprise selling.
- Large-scale sales skills with a record of driving long-term, sticky deal closures, particularly in outsourcing, rebadging, and transformation.
- Exceptional verbal and written communication skills, suitable for interactions with US clients.
- Proficiency in utilizing CRM, LinkedIn Sales Navigator, and email outreach tools (e.g., Zoho, Apollo).
- Self-motivated, accountable, and driven to secure high-value accounts.
- Good to have: industry-specific experience with ISVs, digital platforms, and enterprises across domains such as Retail, Banking and Financial Services, Fintech, Security, Healthcare, RPA, and Health & Fitness.

Why Join Us?
- Opportunity to join a high-growth software services firm distinguished by a strong delivery ethos.
- Competitive base salary coupled with an incentive structure.
- Positive culture and flexible work hours.
Join our vibrant team at Zymr as a Senior DevOps CI/CD Engineer and become a driving force behind continuous integration and deployment automation. We're a dynamic group dedicated to building a high-quality product while maintaining exceptional speed and efficiency. This is a fantastic opportunity to be part of our rapidly growing team.

Job Title: Sr. DevOps Engineer
Location: Ahmedabad/Pune
Experience: 5+ Years
Educational Qualification: UG: BS/MS in Computer Science, or other engineering/technical degree

Responsibilities:
Deployments to Development, Staging, and Production: take charge of managing deployments to each environment with ease.
- Skillfully utilize GitHub tooling to identify and resolve root causes of merge conflicts and version mismatches.
- Deploy hotfixes promptly by leveraging deployment automation and scripts.
- Provide guidance and approval for Ruby on Rails scripting performed by junior engineers, ensuring smooth code deployment across development environments.
- Review and approve CI/CD scripting pull requests from engineers, offering feedback to enhance code quality.

Environment health: ensure the smooth operation of each environment on a daily basis, promptly addressing any issues that arise.
- Leverage Datadog monitoring to maintain 99.999% uptime for each development environment.
- Plan Bash and Ruby scripting to automate health checks and enable auto-healing mechanisms in the event of errors.
- Implement effective auto-scaling strategies to handle higher-than-usual traffic; evaluate historical loads and implement autoscaling mechanisms that provide additional resources and computing power, optimizing workload performance.
- Collaborate with DevOps to plan capacity and monitoring using Datadog.
- Analyze developer workflows in close collaboration with team leads and attend squad standup meetings, providing suggestions for improvement.

Tooling and automation:
- Harness the power of Ruby and Bash to create tools that enhance engineers' development workflow.
- Script infrastructure using Terraform to facilitate the creation of infrastructure.
- Leverage CI/CD to add security scanning to code pipelines.
- Develop Bash and Ruby scripts to automate code deployment while incorporating robust security checks for vulnerabilities.
- Enhance our CI/CD pipeline by building canary stages with CircleCI, GitHub Actions, YAML, and Bash scripting.
- Integrate stress-testing mechanisms using Ruby on Rails, Python, and Bash scripting into the pipeline's stages.
- Look for ways to reduce engineering toil and replace manual processes with automation!

Required: Terraform.
Nice to have: GitHub and AWS tooling (the pipeline itself runs outside of AWS), Rails (other scripting languages okay).
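The health-check and auto-healing responsibilities above can be sketched in a few lines. The posting names Bash and Ruby; this illustrative sketch uses Python, and all service names and callables are hypothetical stand-ins for real probes (e.g. Datadog checks) and restart scripts:

```python
# Sketch of an automated health check with auto-healing. Illustrative only:
# probe() and restart() are hypothetical hooks a real deployment would wire
# up to monitoring checks and deployment scripts.

def auto_heal(services, probe, restart, max_restarts=1):
    """Probe each service; restart unhealthy ones up to max_restarts times.

    probe(name) -> bool reports health; restart(name) performs the healing
    action. Returns a per-service report of final health and restart count.
    """
    report = {}
    for name in services:
        attempts = 0
        while not probe(name) and attempts < max_restarts:
            restart(name)          # the auto-healing action
            attempts += 1
        report[name] = {"healthy": probe(name), "restarts": attempts}
    return report

# Example with stubbed probes: "api" starts unhealthy and recovers after
# one restart, while "web" is healthy throughout.
state = {"api": False, "web": True}
healed = auto_heal(
    ["api", "web"],
    probe=lambda n: state[n],
    restart=lambda n: state.__setitem__(n, True),
)
```

In practice a loop like this would run on a schedule (cron, a sidecar, or a Datadog monitor webhook) rather than inline.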
We are looking for a Sr. QA Automation Engineer with experience in Robot Framework, Python, RESTful APIs, and Selenium. Working experience in Cloud, Network Security, and WiFi is a plus.

Job Title: Sr. QA Automation Engineer
Educational Qualification: Engineering degree in Computer Science, B.Sc/M.Sc in Computer Science, or MCA; must have a strong academic background.
Required Experience: 4-7 years
Job Location: Ahmedabad

Requirements:
- Strong system administration experience in Linux.
- Experience in testing system products such as Networking, Network Security, and WiFi is preferable.
- Experience in SaaS-based applications and Elasticsearch is a must.
- Experience in public and private clouds: AWS, Azure, OpenStack, KVM, Hyper-V, VMware.
- Understanding of network security and remote access products is a plus.
- Must be hands-on in developing automation frameworks with Robot Framework, Selenium, and RESTful APIs.
- Python programming is a must.
- Creating test plans and test cases for feature testing.
- Hands-on testing in scripting: basic shell scripts, PowerShell.
- Good communication skills.
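The Robot Framework work mentioned above typically involves writing custom keyword libraries, which are plain Python classes whose public methods become keywords. A minimal sketch (the class, endpoint logic, and data are hypothetical examples, not a real product API):

```python
# Minimal Robot Framework keyword-library sketch. Robot Framework exposes the
# public methods of a plain Python class as keywords; a .robot suite would
# load this with:  Library  ApiKeywords.py
# The response-handling logic here is illustrative only.

class ApiKeywords:
    def __init__(self):
        self.last_response = None

    def store_response(self, status_code, body):
        # In a real suite this would wrap an HTTP call (e.g. via the
        # requests library) against the system under test.
        self.last_response = {"status": int(status_code), "body": body}

    def response_status_should_be(self, expected):
        # Robot Framework treats a raised AssertionError as a test failure.
        if self.last_response["status"] != int(expected):
            raise AssertionError(
                f"Expected {expected}, got {self.last_response['status']}"
            )

lib = ApiKeywords()
lib.store_response(200, "ok")
lib.response_status_should_be(200)
```

From a `.robot` file the keywords would read as `Store Response` and `Response Status Should Be`.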
We are looking for a data engineering professional with strong experience in designing and implementing end-to-end ETL solutions using Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure Databricks, and SSIS. The candidate should be proficient in SQL, REST API integration, and automation through CI/CD pipelines using Azure DevOps and Git. They should also have a solid understanding of maintaining and optimizing data pipelines, warehouses, and reporting within the Microsoft SQL stack.

Job Title: Sr. Data Engineer
Location: Ahmedabad/Pune
Experience: 5+ Years
Educational Qualification: UG: BS/MS in Computer Science or other engineering/technical degree

Roles and responsibilities:
- Design and implement end-to-end data solutions using Microsoft Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure Databricks, and SSIS.
- Develop complex transformation logic using SQL Server, SSIS, and ADF, and develop ETL jobs/pipelines to execute those mappings concurrently.
- Maintain and enhance existing ETL pipelines, warehouses, and reporting leveraging the traditional MS SQL stack.
- Apply REST API principles and create ADF pipelines to handle HTTP requests for APIs.
- Follow best practices for development and deployment of SSIS packages, SQL jobs, and ADF pipelines.
- Implement and manage source control practices using Git within Azure DevOps to ensure code integrity and facilitate collaboration.
- Participate in the development and maintenance of CI/CD pipelines for automated testing and deployment of BI solutions.

Preferred skills (not required):
- Understanding of the Azure environment and developing Azure Logic Apps and Azure Function Apps.
- Understanding of code deployment, Git, CI/CD, and deployment of developed ETL code (SSIS, ADF).
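ADF pipelines that handle HTTP requests, as described above, are ultimately authored as JSON definitions. As a rough sketch, the JSON for a pipeline with a single Web activity can be built as a Python dict; the pipeline name, activity name, and URL below are hypothetical placeholders:

```python
import json

# Sketch of the JSON definition an ADF pipeline with a Web activity might
# contain, built as a Python dict. All names/URLs are hypothetical; real
# definitions are authored in ADF Studio or deployed via ARM templates.
pipeline = {
    "name": "pl_ingest_sales",               # hypothetical pipeline name
    "properties": {
        "activities": [
            {
                "name": "CallSourceApi",      # hypothetical activity name
                "type": "WebActivity",        # ADF's HTTP-call activity type
                "typeProperties": {
                    "url": "https://example.com/api/sales",  # placeholder
                    "method": "GET",
                },
            }
        ]
    },
}

definition = json.dumps(pipeline, indent=2)
```

Generating definitions this way is one approach to keeping pipelines under Git source control alongside CI/CD deployment scripts.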
As a Python Engineer with 2-4 years of experience, you will be responsible for building, deploying, and scaling Python applications along with AI/ML solutions. Your strong programming skills will be put to use in developing intelligent solutions and collaborating closely with clients and software engineers to implement machine learning models. You should be an expert in Python, with advanced knowledge of Flask/FastAPI and server programming to implement complex business logic, and a solid understanding of the fundamental design principles behind scalable applications. Independently designing, developing, and deploying machine learning models and AI algorithms tailored to business requirements will be a key aspect of your role.

Job Title: Python Engineer
Job Location: Ahmedabad/Pune
Required Qualification: UG degree in BE/BTech, or PG degree in ME/MTech/MCA/MSc-IT/Data Science, AI, Machine Learning, or a related field

Responsibilities:
- Solve complex technical challenges through innovative AI/ML solutions.
- Build and maintain integrations (e.g., APIs) for machine learning models.
- Conduct data preprocessing and feature engineering, and optimize datasets for model training and inference.
- Monitor and continuously improve model performance in production environments, focusing on scalability and efficiency.
- Manage model deployment, monitoring, and scaling using tools like Docker, Kubernetes, and cloud services.
- Develop integration strategies for smooth communication between APIs and troubleshoot integration issues.
- Create and maintain comprehensive documentation for AI/ML projects.
- Stay updated on emerging trends and technologies in AI/ML.

Key Skills Required:
- Proficiency in Python, R, or similar languages commonly used in ML/AI development
- Hands-on experience with TensorFlow, PyTorch, scikit-learn, or similar ML libraries
- Strong knowledge of data preprocessing, data cleaning, and feature engineering
- Familiarity with model deployment using Docker, Kubernetes, or cloud platforms
- Understanding of statistical methods, probability, and data-driven decision-making processes
- Proficiency in querying databases for ML projects
- Experience with ML lifecycle management tools like MLflow, Kubeflow
- Familiarity with NLP frameworks for language-based AI solutions
- Exposure to computer vision techniques
- Experience with managed ML services like AWS SageMaker, Azure Machine Learning, or Google Cloud AI Platform
- Familiarity with agile workflows and DevOps or CI/CD pipelines

Good to Have Skills:
- Exposure to big data processing tools like Spark, Hadoop
- Experience with agile development methodologies
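The data preprocessing and feature engineering named above often starts with feature standardization (the z-score scaling that scikit-learn's `StandardScaler` performs). A minimal pure-Python sketch, for illustration only:

```python
# Minimal feature-standardization sketch: scale one numeric feature column
# to zero mean and unit variance. Pure Python for illustration; production
# code would use scikit-learn's StandardScaler.

def fit_scaler(column):
    """Return (mean, std) learned from a training column."""
    mean = sum(column) / len(column)
    variance = sum((x - mean) ** 2 for x in column) / len(column)
    return mean, variance ** 0.5

def transform(column, mean, std):
    """Apply the learned scaling: (x - mean) / std for each value."""
    return [(x - mean) / std for x in column]

ages = [20.0, 30.0, 40.0]            # toy training data
mean, std = fit_scaler(ages)
scaled = transform(ages, mean, std)  # zero-mean, unit-variance values
```

The fit/transform split mirrors the ML-lifecycle point in the posting: statistics are learned on training data, then reused unchanged at inference time.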
We are seeking a talented Senior Developer to join our team. The ideal candidate should have a solid understanding of backend development principles and be able to contribute to development in Java and Node.js.

Job Title: Sr. Software Engineer (Java + NodeJS)
Required Experience: 7-10 years
Job Location: Ahmedabad/Pune
Required Educational Qualification: Bachelor's degree in Computer Science, Software Engineering, or a related field.

Requirements & Responsibilities:
- At least 7 years of experience as a software engineer, specializing in backend development.
- Strong knowledge of Java and NodeJS with TypeScript.
- Experience with RESTful APIs and JSON data structures.
- Good understanding of at least one industry-leading SQL database.
- Ability to write unit tests in any one of the industry-standard frameworks.
- Basic knowledge of AWS/GCP services.
- Experience with security principles and practices in backend development.
- The ability to learn quickly and be open to taking up new challenges.
- A reliable nature with a helpful "can-do" attitude.
- Good written English skills and communication capabilities.
- Eagerness to deliver high-quality service to clients.

Good to Have Skills:
- Knowledge of CI/CD processes.
- Familiarity with deploying applications on cloud technologies.
- Exposure to React, Vue.js, Gatsby, and GraphQL is a plus.
We are seeking a highly motivated SaaS Procurement Specialist to join our research-focused team. The primary responsibility for this role is to manage the end-to-end procurement of advanced licenses for predominantly US-based SaaS applications, enabling our technical and development teams to access robust APIs for extracting and processing research data. This role requires excellent communication skills, experience in software/vendor negotiations, and the ability to coordinate with sales teams across US time zones.

Job Title: SaaS Procurement Specialist (Research & Development Focus)
Location: Pune, Ahmedabad
Experience: 2-5 Years
Educational Qualification: Bachelor's degree in business, IT, or a related field.

Key Responsibilities:
SaaS Vendor Identification & Evaluation:
- Research and shortlist SaaS applications aligned with our research objectives.
- Evaluate vendor products based on API capabilities, licensing models, compliance, and data security.
Stakeholder Coordination:
- Work closely with internal research and development teams to gather requirements and technical needs.
- Facilitate product demos and coordinate technical evaluations with vendors.
Negotiations & Procurement:
- Initiate and lead communications with sales representatives of US-based SaaS vendors, primarily via email and Zoom calls.
- Negotiate pricing, contract terms, and procurement of advanced licenses to ensure API access and optimal value.
Contract & Compliance Management:
- Draft, review, and process purchase agreements, ensuring compliance with organizational and regulatory standards.
- Liaise with legal and finance teams for approvals and documentation.
Relationship Management:
- Develop and maintain long-term relationships with SaaS vendors and internal stakeholders.
- Act as the primary point of contact for vendor-related queries and escalations.

Requirements:
- Bachelor's degree in business, IT, or a related field.
- 2+ years of experience in software/SaaS procurement, technical sales, or vendor management.
- Excellent verbal and written communication skills in English.
- Prior experience dealing with US-based vendors and comfort working with overlapping US time zones (typically evenings India time).
- Proven experience negotiating contracts and managing procurement processes for technical or research organizations.
- Understanding of SaaS licensing models, API access, and data privacy/compliance considerations.

Preferred Skills:
- Familiarity with technical API integrations and developer needs.
- Prior experience in a research-driven or data science organization.
- Strong organizational and multitasking abilities.
- Proficiency with procurement or contract management tools.

Work Hours: Core working hours aligned with Indian Standard Time (IST) with expected overlap with US business hours (typically 6pm–11pm IST for Zoom meetings with US sales teams).

If you are proactive, tech-savvy, and comfortable negotiating with international vendors, we encourage you to apply for this dynamic role!
The Platform Core Development team provides the foundational software stack for Personal Cloud, our flagship white-label cloud content platform. This system supports backup, sync, and content sharing for millions of global mobile operator subscribers. The team architected, and continues to evolve, our platform to efficiently manage petabytes of data, handle billions of API calls, and support terabytes of daily content uploads.

Job Title: Sr. Java Developer
Location: Ahmedabad/Pune
Experience: 8+ Years
Educational Qualification: UG: BS/MS in Computer Science, or other engineering/technical degree

Key Responsibilities:
- Take technical ownership across the full software development lifecycle: architecture, design, documentation, coding, testing, release, and ongoing support.
- Act as a technical mentor, guiding less experienced engineers and fostering a high-performance, collaborative environment.
- Collaborate closely within and across scrum teams (daily standups, team refinements, sprint/PI planning), proactively driving process improvements.
- Lead efforts to maintain and enhance existing product components, as well as architect and deliver highly scalable new features.
- Champion code quality, robust security standards, and service reliability at scale.
- Lead technical innovation for improving platform components, CI/CD, deployment, and team workflows.

What We're Looking For:
- 8–10 years of experience designing, building, and maintaining enterprise Java solutions (Java 8+; Spring Boot, JEE, advanced functional patterns).
- Proven experience in architecting and delivering high-availability, mission-critical systems that are scalable, performant, secure, and maintainable.
- Deep expertise with core Java frameworks: Spring/Spring Boot, Hibernate/JPA, Dependency Injection, and JDBC.
- Strong command of build and dev tools: Gradle/Maven, IntelliJ/Eclipse, Git, Bamboo/Jenkins.
- Expertise in comprehensive Java testing and profiling (JUnit, Cucumber, JMeter) and driving automation initiatives.
- RDBMS mastery (MySQL, Postgres) and hands-on REST API design/implementation.
- Proficiency in deploying, monitoring, and troubleshooting production workloads on Linux and major cloud providers.
- Substantial experience in agile software development and leading scrum teams.
- Excellent communication, documentation, technical negotiation, and stakeholder presentation skills.
- Ownership of the delivery of robust CI pipelines, automated test frameworks, and efficient release processes.
- Commitment to simplicity and pragmatic engineering: avoids overengineering and leads by example.
- Demonstrated impact on improving team performance, productivity, and technical skill sets.
- Superior troubleshooting, incident response, and critical problem-solving experience.

Preferred Experience:
- Expert-level distributed systems development: microservices architecture, containerization (Docker, Kubernetes, Helm), and CI/CD at scale.
- Advanced API security (OAuth, OpenID) and experience with secure design/reviews.
- Deep knowledge of NoSQL databases (MongoDB, Cassandra, Redis), caching, and eventual consistency patterns.
- Large-scale messaging systems: JMS, RabbitMQ, and Kafka.
- Exposure to data analytics platforms: Hive, Hadoop, Spark, etc.
- Experience mentoring teams on cloud-native best practices and modern DevOps.
About Us
We are a fast-growing IT company seeking smart, driven, and passionate young IT professionals to join our team. If you're eager to learn, solve real-world problems, and build innovative tech solutions, this is your launchpad into the IT industry.

Job Title: Software Engineer
Locations: Ahmedabad & Pune
Open Positions: 10
Employment Type: Full-Time

Eligibility Criteria:
Education: BE/BTech (IT/CS), ME/MTech (IT/CS), MSc (IT)
Experience: 0–2 years

Key Skills & Requirements:
- Strong problem-solving abilities and a learner's mindset
- Sound understanding of Object-Oriented Programming (OOP)
- Good grasp of software engineering principles
- Familiarity with data structures and algorithms
- Proficiency in SQL and NoSQL databases
- Enthusiasm to explore AI fundamentals, including basic knowledge of concepts like supervised learning, classification, and model training
- Ability to work collaboratively and think creatively

Technical Skills:
- Programming languages (at least one): C# / Python / Java / JavaScript
- Databases: SQL (MySQL, PostgreSQL, SQL Server); NoSQL (MongoDB, CosmosDB)
- DevOps & CI/CD: basic concept of CI/CD pipelines; good to have: exposure to Docker, Kubernetes
- Web development: knowledge of RESTful APIs and microservices architecture; familiarity with frontend frameworks (React.js, Angular, or Vue.js)
- Software testing: basic knowledge of manual/automation testing; exposure to API testing tools like Postman

Good to Have Skills:
- Cloud platforms: basic familiarity with Azure, AWS, or GCP
- AI/ML & data engineering: exposure to ML tools/frameworks like TensorFlow or PyTorch; understanding of AI workflows (data collection, training, inference); hands-on with ETL pipelines, basic data preprocessing, or Big Data technologies; experience with simple projects using OpenAI, Firebase Studio, Cursor, or similar tools
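The supervised learning, classification, and model training concepts mentioned above can be illustrated with a toy nearest-centroid classifier, which shows the two basic steps (fit on labeled data, then predict). Purely illustrative, in pure Python; real projects would use scikit-learn or similar:

```python
# Toy supervised-learning sketch: a nearest-centroid classifier showing the
# fit (training) and predict (inference) steps. Illustrative only.

def fit(points, labels):
    """Compute one centroid (mean 2-D point) per class label."""
    sums, counts = {}, {}
    for (x, y), label in zip(points, labels):
        sx, sy = sums.get(label, (0.0, 0.0))
        sums[label] = (sx + x, sy + y)
        counts[label] = counts.get(label, 0) + 1
    return {l: (sx / counts[l], sy / counts[l]) for l, (sx, sy) in sums.items()}

def predict(centroids, point):
    """Classify a point by its closest centroid (squared Euclidean distance)."""
    px, py = point
    return min(
        centroids,
        key=lambda l: (centroids[l][0] - px) ** 2 + (centroids[l][1] - py) ** 2,
    )

# Tiny labeled training set (hypothetical data), then the trained "model".
train_x = [(0.0, 0.0), (0.0, 1.0), (5.0, 5.0), (6.0, 5.0)]
train_y = ["low", "low", "high", "high"]
model = fit(train_x, train_y)
```

The same fit-then-predict shape carries over to the TensorFlow/PyTorch workflows listed under good-to-have skills.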
As an AWS DevOps Engineer at our company, you will play a crucial role in the design, development, and deployment of robust applications utilizing AWS cloud technologies.

Responsibilities:
- Hands-on work with various AWS services, serverless and microservice architecture, and build management and continuous integration tools and processes.
- Apply knowledge of CloudFormation and Terraform with module-based coding, and work with AWS services such as IAM, VPC, EC2, RDS databases, CloudFront, ECR, ECS, EKS, Lambda, CodeBuild, CodeCommit, CloudTrail, and CloudWatch.
- Create all AWS resources/services/filters via code.
- Monitor platform capacity and health indicators.
- Configure build and CI/CD management tools like Jenkins and GitHub Actions, and build artifacts using Maven, Gradle, npm, etc.
- Manage cloud-based infrastructure (AWS) hands-on, with proficiency in Docker and experience with container management tools like AWS ECS, EKS, Kubernetes, etc.
- Work with Git and GitHub.

Requirements:
- Proven experience as a DevOps Engineer with a focus on AWS.
- Strong expertise in designing and implementing CI/CD pipelines.
- Knowledge of security best practices in the AWS environment.
- Strong problem-solving skills, excellent communication, and teamwork skills.
- Scripting experience in Bash or Python.

Advantageous:
- AWS certifications.
- Understanding of AI/ML concepts.
- Knowledge of logging and monitoring tools (ELK and Grafana).
- Familiarity with SonarQube.

If you are someone with a learning mindset, strong analytical and problem-solving skills, and hands-on experience with logging and troubleshooting deployment pipelines, we encourage you to apply for this position and be a valuable part of our growing team.
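"Creating all AWS resources via code," as the posting above puts it, usually means authoring CloudFormation or Terraform definitions. As a rough sketch, a minimal CloudFormation template can be built as a Python dict and serialized to JSON; the bucket and logical names here are hypothetical:

```python
import json

# Sketch of resources-as-code: a minimal CloudFormation template built as a
# Python dict. Logical names and the description are hypothetical; a real
# stack would be deployed via the AWS CLI or a CI/CD stage (e.g. CodeBuild).
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Artifact bucket for CI builds (illustrative only)",
    "Resources": {
        "ArtifactBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {
                # Versioning lets a pipeline roll back to earlier artifacts.
                "VersioningConfiguration": {"Status": "Enabled"},
            },
        }
    },
    "Outputs": {
        # Ref on an S3 bucket resolves to the bucket name at deploy time.
        "BucketName": {"Value": {"Ref": "ArtifactBucket"}},
    },
}

body = json.dumps(template, indent=2)
```

The same resource would be a short `resource "aws_s3_bucket"` block in Terraform; module-based coding then factors such blocks into reusable modules.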
We are looking for a Lead Platform Engineer to build functional and efficient server-side applications. The responsibilities include participating in all phases of the Agile software development lifecycle and coaching junior developers. Your ultimate goal is to create high-quality, cloud-native products that meet customer needs.

Job Title: Lead Platform Engineer
Required Experience: 8+ Years
Required Qualification: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Job Location: Pune/Ahmedabad

Requirements:
- Cloud experience: minimum 5 years in building platforms using Azure; must have experience with Azure services.
- System architecture: excellent track record in system architecture and design.
- Programming skills: strong expertise in Python programming.
- Serverless technologies: experience in developing applications using serverless technologies.
- Database skills: strong expertise in both relational and NoSQL databases.
- Leadership: at least 3 years of experience in leading teams.

Additional Skills:
- Experience in leading SRE or DevOps teams for a long-term platform project.
- Exposure to performance optimization, improving platform efficiency/scalability.
We are looking for a Full Stack Developer to build full-stack web applications using JavaScript technologies.
Job Title: Senior Full Stack Developer
Required Experience: 3-5 Years
Required Qualification: BE/BTech (IT/CS/Electronics), ME/MTech (IT/CS), or equivalent technical degree
Job Location: Pune/Ahmedabad
Key Skills and Requirements:
- Strong analytical and problem-solving skills with the ability to architect scalable solutions
- Proven experience across the full-stack JavaScript development lifecycle
- Solid understanding of software design patterns and architectural principles
- Experience with Agile/Scrum development methodologies
- Excellent debugging and performance optimisation skills
- Ability to mentor junior developers and conduct code reviews
- Strong communication skills and a collaborative team player
Technical Skills:
Frontend Development:
- JavaScript/TypeScript: Advanced proficiency with ES10+ features
- Frontend Frameworks: Experience with React.js, Next.js, or Vue.js
- State Management: Redux, Zustand, or Context API
- CSS/Styling: Tailwind CSS, Material-UI, or Styled Components
- Build Tools: Webpack, Vite, or similar bundlers
- Real-time Applications: Socket.io, WebRTC, or Server-Sent Events
Backend Development:
- Node.js: Experience with Express.js, Koa.js, or Fastify, including the ability to design services from scratch
- API Development: RESTful APIs and GraphQL implementation
- Serverless and Low-Code/No-Code experience
- Authentication: JWT, OAuth, session management
- Microservices: Understanding of microservices architecture and implementation
- Databases: Any SQL or NoSQL experience (PostgreSQL, MySQL, MongoDB, Redis), including complex query optimisation and caching
- ORM/ODM: Experience with Sequelize, Prisma, or Mongoose
General:
- Version Control: Advanced Git workflows and branching strategies
- Basic Cloud Knowledge: Experience with common AWS, Azure, or GCP services
- Unit & Integration Testing: Jest, Mocha, Vitest, Supertest, Cypress, or Playwright
Responsibilities:
- Develop full-stack web applications using JavaScript technologies
- Design and implement scalable backend APIs and database schemas
- Create responsive and interactive frontend interfaces
- Collaborate with product managers, designers, and other developers
- Participate in code reviews and maintain coding standards
- Troubleshoot and debug complex technical issues
- Mentor junior developers and contribute to team knowledge sharing
- Stay updated with the latest JavaScript ecosystem trends and best practices
Good to Have:
- Cloud Platforms: Advanced AWS/Azure certifications
- Container Orchestration: Kubernetes, Docker Swarm
- Message Queues: RabbitMQ, Apache Kafka, or Redis Pub/Sub
- Monitoring: New Relic, DataDog, or Elastic Stack
- Performance: CDN configuration, caching strategies
- AI/ML Integration: Experience with OpenAI API, LangChain, or similar
- Mobile Development: React Native or cross-platform experience
- Progressive Web Apps (PWAs): Service workers, offline functionality
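The authentication requirement above (JWT, OAuth, session management) can be illustrated with a minimal HS256 JWT sketch. This role is JavaScript-centric, where a maintained library such as jsonwebtoken would normally be used; the sketch below uses only the Python standard library to show the underlying mechanics (URL-safe base64 segments plus an HMAC-SHA256 signature) and omits claims validation such as expiry.

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # JWT segments use URL-safe base64 with padding stripped
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: str) -> str:
    """Produce an HS256 token: header.payload.signature."""
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = f"{_b64url(json.dumps(header).encode())}.{_b64url(json.dumps(payload).encode())}"
    sig = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{_b64url(sig)}"

def verify_jwt(token: str, secret: str) -> bool:
    """Check only the signature; real verification also checks exp/iss/aud."""
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    # constant-time comparison avoids timing side channels
    return hmac.compare_digest(_b64url(expected), sig)
```

In production, token creation and verification should always go through a vetted library rather than hand-rolled code like this.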
We are seeking a creative and user-focused UX Designer to design seamless and engaging user experiences for web and mobile platforms. The ideal candidate will have 2–4 years of hands-on experience, a strong portfolio, and comfort collaborating across product, design, and development teams.
Job Title: UX Designer
Required Experience: 2-5 Years
Required Qualification: Bachelor's degree in Design, HCI, Psychology, or a related field preferred.
Job Location: Pune/Ahmedabad
Responsibilities:
- Conduct user research, including interviews, surveys, and usability testing.
- Analyze user behavior and feedback to identify pain points and improvement opportunities.
- Design wireframes, user flows, prototypes, and high-fidelity mockups for digital products.
- Collaborate closely with UI designers, developers, and product managers to implement user-friendly solutions.
- Apply UX best practices to improve accessibility, performance, and engagement.
- Iterate on designs based on data insights and user feedback.
- Create and maintain documentation, personas, and design guidelines.
- Ensure consistency with established design systems and brand standards.
Qualifications & Skills:
- 2–4 years of experience in UX Design or similar roles.
- Bachelor's degree in Design, HCI, Psychology, or a related field preferred.
- Demonstrated proficiency with modern design tools (Figma, Sketch, Adobe XD, InVision).
- Strong understanding of user-centered design principles and usability testing.
- Ability to create responsive, mobile-first designs.
- Familiarity with HTML, CSS, or front-end development is an advantage.
- Excellent communication and collaboration skills; ability to present and defend design decisions.
- Strong attention to detail and a passion for improving user experience.
We are looking for a skilled data engineering professional with over 5 years of experience to design and implement end-to-end ETL solutions using Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure Databricks, and SSIS. Strong proficiency in SQL, REST API integration, and automation through CI/CD pipelines using Azure DevOps and Git is crucial for this role, along with a solid understanding of maintaining and optimizing data pipelines, warehouses, and reporting within the Microsoft SQL stack.
Job Location: Ahmedabad/Pune
Key Responsibilities:
- Design and implement comprehensive data solutions using Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure Databricks, and SSIS.
- Develop complex transformation logic using SQL Server, SSIS, and ADF, and create ETL jobs/pipelines to execute those mappings concurrently.
- Maintain and enhance existing ETL pipelines, warehouses, and reporting leveraging the traditional MS SQL stack.
- Create ADF pipelines to handle HTTP requests for APIs, and implement best practices for the development and deployment of SSIS packages, SQL jobs, and ADF pipelines.
- Implement and manage source control practices using Git within Azure DevOps to ensure code integrity and facilitate collaboration.
- Participate in the development and maintenance of CI/CD pipelines for automated testing and deployment of BI solutions.
Qualifications Required:
- BS/MS in Computer Science or other engineering/technical degree.
Preferred Skills:
- Understanding of the Azure environment and experience developing Azure Logic Apps and Azure Function Apps.
- Knowledge of code deployment, Git, CI/CD, and deployment of developed ETL code (SSIS, ADF).
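The transformation and pipeline work described above is done declaratively in ADF and SSIS; purely as an illustration of the underlying pattern, here is a watermark-based incremental load sketched in Python against SQLite. The table and column names (stg_orders, dw_orders, updated_at) are hypothetical.

```python
import sqlite3

def incremental_load(conn: sqlite3.Connection, last_watermark: str) -> str:
    """Copy only rows newer than the last watermark from a staging table
    into the warehouse table, returning the new watermark.

    This mirrors the high-watermark pattern ADF's copy activity uses for
    incremental loads; names here are illustrative only.
    """
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM stg_orders WHERE updated_at > ?",
        (last_watermark,),
    ).fetchall()
    # upsert so re-runs of the same window stay idempotent
    conn.executemany(
        "INSERT OR REPLACE INTO dw_orders (id, amount, updated_at) VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()
    # advance the watermark only as far as the data actually seen
    return max((r[2] for r in rows), default=last_watermark)
```

The two properties worth noting are idempotent writes (safe re-runs after a failure) and a watermark derived from the loaded data rather than the wall clock, so late-arriving rows are not silently skipped.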
We are looking for a seasoned and technically proficient Sr. QA Engineer with deep expertise in API, cloud-based, and ETL testing. The ideal candidate will bring hands-on experience with Azure services and demonstrate advanced skills in managing comprehensive test cases within Azure DevOps to uphold high-quality standards in software delivery.
Job Title: Sr. QA Engineer
Required Experience: 4-8 Years
Job Location: Ahmedabad, Pune
Required Educational Qualification: Bachelor's degree in Computer Science, Software Engineering, or a related field.
Requirements:
- API Testing: Perform in-depth API testing using Postman, including the configuration of environments and verification of authentication and authorization protocols (e.g., OAuth, JWT). Strong understanding of REST, SOAP, and GraphQL APIs, with knowledge of best practices for pagination and real-world API testing challenges.
- Performance Testing: Conduct robust performance and load testing with JMeter to evaluate system responsiveness under varying workloads.
- Test Cases: Develop, manage, and execute detailed test cases within Azure DevOps for efficient project tracking and defect management. Capable of prioritizing test cases, assessing testing risks, and maintaining comprehensive traceability.
- ETL Testing: Lead ETL testing efforts to ensure data accuracy and transformation integrity within data pipelines, applying best practices for data-driven applications.
- Cloud Expertise: Demonstrated hands-on experience with core Azure services, including API Management (APIM), Application Insights, Service Bus, Azure Functions, Logic Apps, Function Apps, and Azure Dashboards.
- Cross-functional Collaboration: Work closely with DevOps and development teams to meet quality standards and achieve comprehensive test coverage.
- Database Proficiency: Write SQL and NoSQL queries to support backend data validation and testing requirements.
Good to Have:
- ERP System Experience: Familiarity with ERP systems and their testing requirements.
- Security Testing: Knowledge of OWASP principles for implementing secure testing practices.
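One of the pagination pitfalls mentioned in the API testing requirement is a client that loops forever or drops the final partial page. A minimal sketch of the traversal logic under test is shown below; the fetch_page callable and its {"items", "next"} response shape are assumptions standing in for a real HTTP call made via Postman or the requests library.

```python
def fetch_all_pages(fetch_page, page_size: int = 2, max_pages: int = 100):
    """Collect every item from a paginated API.

    fetch_page(page, size) is assumed to return {"items": [...], "next": n}
    where "next" is the next page number or None; this mirrors one common
    REST pagination shape, not any specific API.
    """
    items, page, pages_seen = [], 1, 0
    while page is not None and pages_seen < max_pages:  # guard against cycles
        resp = fetch_page(page, page_size)
        items.extend(resp["items"])
        page = resp.get("next")
        pages_seen += 1
    return items
```

A test suite would drive this with a stub server and assert three things: all items are returned exactly once, the last partial page is included, and a malformed "next" value cannot cause an infinite loop.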
We are seeking an experienced Sr. Java Developer with expertise in the Java Spring and Spring Boot frameworks, REST APIs, and cloud services (AWS/Azure/GCP). The ideal candidate will have 6+ years of hands-on experience in developing scalable and robust applications.
Job Title: Sr. Java Developer
Job Location: Ahmedabad/Pune
Required Experience: 6+ Years
Educational Qualification: UG: BS/MS in Computer Science, or other engineering/technical degree
Key Responsibilities:
- Own the complete software development life cycle, including requirement analysis, design, development, deployment, and support.
- Develop software products for Agentic AI Security.
- Write clean, testable, readable, scalable, and maintainable Java code.
- Design, develop, and implement highly scalable software features and infrastructure on our security platform, ready for cloud-native deployment, from inception to completion.
- Participate actively and contribute to design and development discussions.
- Develop a solid understanding of advanced cloud computing and cloud security concepts, and be able to explain them to others.
- Work cross-functionally with Product Management, SRE, Software, and Quality Engineering teams to deliver new security-as-a-service offerings to the market in a timely fashion with excellent quality.
- Clearly communicate goals and desired outcomes to internal project teams.
- Work closely with customer support teams to improve end-customer outcomes.
Required Skills:
- Strong programming skills in Java, with experience in building distributed systems.
- 6+ years of experience in software engineering, with a focus on cloud-native application development, at large organizations or innovative startups.
- 3+ years of experience and a deep understanding of building connectors for low-code/no-code and Agentic AI platforms such as Microsoft Copilot Studio, Microsoft Power Platform, Salesforce Agentforce, Zapier, CrewAI, Marketo, etc.
- 5+ years of experience building connectors for SaaS applications such as Microsoft O365, Power Apps, Salesforce, ServiceNow, etc.
- Preferred: experience with security products (data and DLP, CASB security, SASE) and integration with third-party APIs and services.
- 5+ years of experience running workloads on cloud-based architectures (AWS/GCP experience preferred).
- 5+ years of experience with cloud technologies such as ElasticSearch, Redis, Kafka, MongoDB, and Spring Boot.
- Experience with Docker and Kubernetes or other container orchestration platforms.
- Excellent troubleshooting abilities; able to isolate issues found during testing and verify bug fixes once they are resolved.
- Experience with backend development (REST APIs, databases, and serverless computing) of distributed cloud applications.
- Experience building and delivering services and workflows at scale, leveraging microservices architectures.
- Experience with the Agile process and working with software development teams building full-stack products deployed on the cloud at scale.
- Good understanding of public cloud design considerations and limitations in areas of microservice architectures, security, global network infrastructure, distributed systems, and load balancing.
- Strong understanding of the principles of DevOps and continuous delivery.
- Can-do attitude and the ability to make trade-off judgements with data-driven decision-making.
- High energy and the ability to work in a fast-paced environment.
- Strong collaboration and communication skills; enjoys working with many different teams.
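Connector work against SaaS platforms like the ones listed above has to tolerate throttling and transient failures, so a retry-with-backoff skeleton is a core building block. The role itself is Java-centric (Spring Retry or Resilience4j would be the natural tools); the sketch below shows the same pattern language-agnostically in Python, with a hypothetical zero-argument callable standing in for the connector call.

```python
import time

def call_with_retries(fn, max_attempts: int = 3, base_delay: float = 0.0):
    """Invoke a flaky connector call, retrying with exponential backoff.

    fn is any zero-argument callable raising ConnectionError on transient
    failure; base_delay is kept at 0 here so the sketch runs instantly,
    but a real connector would use something like 0.5s plus jitter.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the failure to the caller
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff
```

Production connectors layer more on top of this skeleton: honoring Retry-After headers on 429 responses, distinguishing retryable from permanent errors, and capping total elapsed time.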