11 Job openings at Procallisto Solutions
Travel Consultant

Greater Kolkata Area

3 years

Not disclosed

On-site

Full Time

Job Title: Sales Manager / Sales Executive

Locations:
- Kolkata – Sales Manager
- Hyderabad – Sales Manager
- Pune – Sales Executive
- Raipur – Sales Executive
- Indore – Sales Executive

About the Role:
We are looking for dynamic and driven individuals to join our sales team across multiple locations. Ideal candidates will have a strong background in sales, preferably within the travel and tourism industry, and a passion for driving business growth.

Key Responsibilities:

Sales Manager (SM):
- Lead and manage sales operations in the assigned region.
- Develop and implement strategic sales plans to achieve business objectives.
- Build and maintain strong relationships with travel agencies, corporate clients, and other key stakeholders.
- Monitor market trends and competitor activities to identify new business opportunities.
- Mentor and support sales executives in achieving their targets.
- Report on sales performance, forecasts, and budgets to senior management.

Sales Executive (SE):
- Identify and generate leads through field visits, cold calls, and networking.
- Promote travel products and services to potential customers and clients.
- Maintain strong client relationships for repeat business and referrals.
- Support the Sales Manager in implementing local sales strategies.
- Achieve individual sales targets and contribute to regional goals.
- Prepare daily reports and maintain CRM systems for sales tracking.

Preferred Candidate Profile:
- Experience: Minimum 1–3 years in a similar sales role. Experience in the travel and tourism industry is highly preferred.
- Skills: Strong communication and negotiation skills. Ability to work independently and as part of a team. Goal-oriented mindset with a passion for sales. Proficiency in local language and English.

Compensation:
- Sales Manager: ₹30,000 – ₹40,000/month (based on experience)
- Sales Executive: ₹20,000/month
Note: Salary is negotiable for eligible candidates with relevant experience.

How to Apply:
Interested candidates may share their updated resume, mentioning the position and location in the subject line.
Email ID: Careers@procallistosolutions.co.in
Reference: Anjali Jain
Thank You

Collibra Data Engineer - Data Governance

Noida, Uttar Pradesh, India

0 years

Not disclosed

On-site

Full Time

Job Description
- Experience in designing and implementing business process workflows using Collibra Workflow Designer.
- Understanding of Collibra Data Governance Center (DGC) and its modules, including Data Catalog, Business Glossary, and Policy Manager.
- Experience in metadata harvesting, lineage tracking, and governance to improve data visibility.
- Proficiency in using Collibra REST APIs for workflow automation, data exchange, and custom integrations with other tools.
- Familiarity with Collibra Data Quality & Observability, including setting up data quality rules and configuring DQ workflows.
- Familiarity with Groovy and Java for developing custom workflows and scripts within Collibra.
- Ability to write Python and SQL for data validation, integration scripts, and automation.
- Understanding of ETL processes and integrating Collibra with cloud/on-prem databases.
- Familiarity with data governance frameworks (e.g., GDPR, CCPA, HIPAA) and best practices.
- Experience in managing technical and business metadata effectively.
- Ability to track data lineage and assess downstream/upstream data impacts.
(ref:hirist.tech)
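To make the REST API requirement above concrete, here is a minimal Python sketch of the kind of automation the role describes: querying assets from a Collibra DGC instance. The base URL, credentials, and query parameters are illustrative assumptions modelled on Collibra's v2 REST conventions, not details from the posting.

import requests

BASE_URL = "https://your-instance.collibra.com/rest/2.0"  # hypothetical instance
session = requests.Session()
session.auth = ("api_user", "api_password")  # placeholder credentials

def find_assets(name_fragment, limit=10):
    """Search assets by name and return the raw result list."""
    resp = session.get(
        f"{BASE_URL}/assets",
        params={"name": name_fragment, "nameMatchMode": "ANYWHERE", "limit": limit},
    )
    resp.raise_for_status()  # fail loudly on auth or endpoint errors
    return resp.json().get("results", [])

for asset in find_assets("Customer"):
    print(asset.get("id"), asset.get("name"))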

React Native Developer - Android/iOS Platforms

Hyderabad, Telangana, India

7 years

Not disclosed

On-site

Full Time

Role: React Native Lead
Experience: 7+ Years

Must Have Skills
Here are some of the key technologies that make up our environment. While we do not expect you to have a detailed understanding of each, the more of these you are familiar with, the better.
- 8+ years of experience in software development.
- 5+ years of relevant experience with React Native and Redux.
- Hands-on experience working with React Native on the front end to create Android/iOS apps.
- Thorough understanding of React Native and its core principles.
- Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model.
- Firm grasp of the JavaScript (and TypeScript or ClojureScript) language and its nuances, including ES6+ syntax.
- Rock solid at working with third-party dependencies and debugging dependency conflicts.
- Familiarity with native build tools such as Xcode, Gradle, Android Studio, and IntelliJ.
- Understanding of REST APIs, the document request model, and offline storage.
- Experience with popular React workflows (such as Flux or Redux).
- Experience creating, optimising, and integrating Application Programming Interface (API) calls; background in API development preferred.
- Experience with Agile/Scrum methodologies.
- Familiarity with Git and version control systems.
- Experience with mobile CI/CD tools (Fastlane, Bitrise, or GitHub Actions).
- Knowledge of mobile security practices and data protection.
- Experience with automated testing frameworks (Jest, Detox, Appium).
- Published apps on the App Store or Google Play Store.
(ref:hirist.tech)

Solution Architect - MS Dynamics F&O

Greater Kolkata Area

0 years

Not disclosed

On-site

Full Time

Job Description
Looking for a Solution Architect with expertise in D365 FO. Must have experience across multiple modules (Finance, Inventory, SCM, etc.), full-cycle implementations, and managing stakeholder expectations. Knowledge of process manufacturing, planning optimization, costing, and logistics integration is required.

Responsibilities:
- Lead the design and architecture of D365 FO solutions to meet client business needs.
- Work closely with business stakeholders, project managers, and technical teams to define system requirements and deliver high-quality solutions.
- Translate business requirements into technical specifications and solution designs.
- Provide technical leadership and oversight to development teams during project delivery.
- Ensure solution integrity and consistency throughout the project lifecycle.
- Conduct solution reviews, identify gaps, and recommend improvements.
- Collaborate with other architects and integration teams to ensure seamless system integration.
- Stay up to date with the latest features and capabilities of D365 FO and related Microsoft technologies.

Skills and Experience:
- Proven experience as a Solution Architect or Technical Lead on Microsoft D365 FO projects.
- Strong understanding of D365 FO modules (Finance, Supply Chain, Manufacturing, etc.).
- Experience with D365 customizations, extensions, and integrations (Power Platform, Azure services, Data Management Framework).
- In-depth knowledge of the software development lifecycle (SDLC) and Agile methodologies.
- Strong problem-solving and communication skills.
- Experience with data migration and performance tuning.
- Microsoft certifications in D365 FO (e.g., MB-700: Microsoft Dynamics 365 Finance and Operations Apps Solution Architect).

Qualifications:
- Experience in global implementations or multi-entity rollouts.
- Familiarity with other D365 modules or Microsoft ecosystems such as Power BI, Power Apps, and Azure DevOps.
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
(ref:hirist.tech)

Senior Guidewire Cloud Engineer - ClaimCenter

Greater Kolkata Area

5 years

Not disclosed

Remote

Contractual

Project Description
The project involves migrating a Claims Payout Accelerator from on-premise infrastructure to the Guidewire Cloud Platform, with an emphasis on ClaimCenter. The engagement focuses on designing scalable, cloud-native solutions, integrating modern APIs, and transforming legacy architecture for improved operational agility and customer experience.

Role Overview:
We are seeking a Senior Guidewire Cloud Engineer who will lead cloud migration and platform modernization efforts. This individual must be experienced in Guidewire Cloud implementations, ClaimCenter, and agile-based delivery models. The role is ideal for engineers who excel in greenfield MVP environments and have a strong foundation in cloud-native microservices and Guidewire SDKs.

Key Responsibilities:
- Lead the end-to-end migration of a Claims Payout Accelerator from on-premise to the Guidewire Cloud Platform (CAP).
- Design, configure, and extend Guidewire ClaimCenter using Guidewire-provided SDKs and DevConnect APIs.
- Develop scalable and maintainable components using Gosu and Java/Spring Boot.
- Implement cloud-native architecture best practices, including microservices integration, CI/CD pipelines, and infrastructure automation.
- Participate actively in Agile ceremonies, including sprint planning, retrospectives, and daily stand-ups.
- Collaborate with product managers, solution architects, QA teams, and business stakeholders to ensure timely and high-quality delivery.
- Document technical specifications, deployment pipelines, and system architecture diagrams.
- Assist with UAT defect triage, resolution, and knowledge transfer to internal client engineering teams.

Skills & Qualifications:
- 5+ years of hands-on development experience with Guidewire products, with a strong focus on ClaimCenter.
- Proven experience with Guidewire Cloud implementations and the Guidewire Cloud Application Platform (CAP).
- Strong command of the Gosu programming language, Java, and Spring Boot frameworks.
- Good understanding of data modelling and analysis.
- Experience with Guidewire DevConnect and cloud-native APIs for integrations.
- Familiarity with CI/CD pipelines and infrastructure tools (e.g., Terraform, Jenkins).
- Experience working in Agile delivery teams in fast-paced environments.
- Excellent communication skills with the ability to interact with technical and non-technical stakeholders.

Preferred Qualifications:
- Guidewire certifications: ClaimCenter Cloud Developer, Cloud Integration Developer, or equivalent.
- Prior experience migrating legacy Guidewire Accelerators to the cloud-native format.
- Exposure to multi-region cloud deployments across Canada and the US.
- Familiarity with additional Guidewire modules such as PolicyCenter or BillingCenter.

Why Join This Opportunity:
- Opportunity to lead a greenfield Guidewire Cloud project for a major insurance platform.
- Work within a culture of innovation, autonomy, and agile delivery.
- Flexibility to work remotely and collaborate with a global engineering team.
- Attractive and competitive contract rates for top-tier talent.
(ref:hirist.tech)

Python & Databricks Engineer - Data Pipeline

Pune, Maharashtra, India

5 years

Not disclosed

On-site

Full Time

We are seeking an experienced Python + Databricks Developer to join our data engineering team. The ideal candidate will have a strong background in Python programming, data processing, and hands-on experience with Databricks for building and optimizing data pipelines.

Key Responsibilities
- Design, develop, and maintain scalable data pipelines using Databricks and Apache Spark.
- Write efficient Python code for data transformation, cleansing, and analytics.
- Collaborate with data scientists, analysts, and engineers to understand data needs and deliver high-performance solutions.
- Optimize and tune data pipelines for performance and cost efficiency.
- Implement data validation, quality checks, and monitoring.
- Work with cloud platforms (preferably Azure or AWS) to manage data workflows.
- Ensure best practices in code quality, version control, and documentation.

Required Skills & Experience
- 5+ years of professional experience in Python development.
- 3+ years of hands-on experience with Databricks (including notebooks, clusters, Delta Lake, and job orchestration).
- Strong experience with Spark (PySpark preferred).
- Proficient in working with large-scale data processing and ETL/ELT pipelines.
- Solid understanding of data warehousing concepts and SQL.
- Experience with Azure Data Factory, AWS Glue, or other data orchestration tools is a plus.
- Familiarity with version control tools like Git.
- Excellent problem-solving and communication skills.
(ref:hirist.tech)
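As a rough illustration of the pipeline work described above, here is a minimal PySpark sketch that reads raw data, applies simple cleansing, and writes a partitioned Delta table (assuming a Databricks cluster where Delta Lake is available). All paths, table names, and column names are placeholder assumptions, not details from the posting.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Hypothetical landing zone; on Databricks this might be a mounted cloud path.
raw = spark.read.json("/mnt/raw/orders")

cleaned = (
    raw.dropDuplicates(["order_id"])                     # basic de-duplication
       .filter(F.col("amount") > 0)                      # drop invalid records
       .withColumn("order_date", F.to_date("order_ts"))  # normalise the timestamp
)

# Publish as a partitioned Delta table for downstream consumers.
(cleaned.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .saveAsTable("analytics.orders_clean"))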

Python & Databricks Engineer

Pune, Maharashtra

3 - 7 years

Not disclosed

On-site

Full Time

You are an experienced Python + Databricks Developer who will be a valuable addition to our data engineering team. Your expertise in Python programming, data processing, and hands-on experience with Databricks will be instrumental in building and optimizing data pipelines.

Your key responsibilities will include designing, developing, and maintaining scalable data pipelines using Databricks and Apache Spark. You will be expected to write efficient Python code for data transformation, cleansing, and analytics. Collaboration with data scientists, analysts, and engineers is essential to understand data needs and deliver high-performance solutions. Optimizing and tuning data pipelines for performance and cost efficiency, implementing data validation, quality checks, and monitoring, and working with cloud platforms (preferably Azure or AWS) to manage data workflows are crucial aspects of the role. Ensuring best practices in code quality, version control, and documentation will also be part of your responsibilities.

To be successful in this role, you should have 5+ years of professional experience in Python development and at least 3 years of hands-on experience with Databricks, including notebooks, clusters, Delta Lake, and job orchestration. Strong experience with Spark, especially PySpark, is required. Proficiency in working with large-scale data processing and ETL/ELT pipelines, a solid understanding of data warehousing concepts and SQL, and experience with Azure Data Factory, AWS Glue, or other data orchestration tools will be beneficial. Familiarity with version control tools like Git and excellent problem-solving and communication skills are also essential.

If you are looking to leverage your Python and Databricks expertise to contribute to building robust data pipelines and optimizing data workflows, this role is a great fit for you.

Python & Databricks Engineer - Data Pipeline

Pune, Maharashtra

3 - 7 years

Not disclosed

On-site

Full Time

You will be joining our data engineering team as an experienced Python + Databricks Developer. Your role will involve designing, developing, and maintaining scalable data pipelines using Databricks and Apache Spark. You will write efficient Python code for data transformation, cleansing, and analytics. Collaboration with data scientists, analysts, and engineers to understand data needs and deliver high-performance solutions is a key part of this role. Additionally, you will optimize and tune data pipelines for performance and cost efficiency, and implement data validation, quality checks, and monitoring. Working with cloud platforms, preferably Azure or AWS, to manage data workflows will also be part of your responsibilities. Ensuring best practices in code quality, version control, and documentation is essential for this role.

To be successful in this position, you should have at least 5 years of professional experience in Python development and 3 years of hands-on experience with Databricks, including notebooks, clusters, Delta Lake, and job orchestration. Strong experience with Spark, particularly PySpark, is required. Proficiency in working with large-scale data processing and ETL/ELT pipelines is necessary, along with a solid understanding of data warehousing concepts and SQL. Experience with Azure Data Factory, AWS Glue, or other data orchestration tools would be advantageous. Familiarity with version control tools like Git is also desired. Excellent problem-solving and communication skills are important for this role.

Python + Databricks Engineer

India

5 years

Not disclosed

Remote

Full Time

Responsibilities:
- Design, develop, and maintain scalable data pipelines using Databricks and Apache Spark.
- Write efficient Python code for data transformation, cleansing, and analytics.
- Collaborate with data scientists, analysts, and engineers to understand data needs and deliver high-performance solutions.
- Optimize and tune data pipelines for performance and cost efficiency.
- Implement data validation, quality checks, and monitoring.
- Work with cloud platforms (preferably Azure or AWS) to manage data workflows.
- Ensure best practices in code quality, version control, and documentation.

Required Skills & Experience:
- 5+ years of professional experience in Python development.
- 3+ years of hands-on experience with Databricks (including notebooks, clusters, Delta Lake, and job orchestration).
- Strong experience with Spark (PySpark preferred).
- Proficient in working with large-scale data processing and ETL/ELT pipelines.
- Solid understanding of data warehousing concepts and SQL.
- Experience with Azure Data Factory, AWS Glue, or other data orchestration tools is a plus.
- Familiarity with version control tools like Git.
- Excellent problem-solving and communication skills.

Job Type: Full-time
Pay: ₹2,000,000.00 – ₹3,000,000.00 per year
Schedule: Monday to Friday
Experience:
- Spark: 6 years (Required)
- Azure Data: 5 years (Required)
- Git: 5 years (Required)
- CI/CD: 5 years (Required)
- DevOps: 5 years (Required)
- Databricks: 5 years (Required)
- Python: 5 years (Required)
Work Location: Remote
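To make the "data validation, quality checks, and monitoring" duty concrete, here is a small PySpark sketch that asserts batch-level quality rules before a table is published. The rules, thresholds, key column, and function name are illustrative assumptions, not part of the posting.

from pyspark.sql import DataFrame, functions as F

def check_quality(df: DataFrame, key: str = "order_id") -> None:
    """Raise if a batch violates basic quality rules (hypothetical rules)."""
    total = df.count()
    if total == 0:
        raise ValueError("empty batch")

    null_keys = df.filter(F.col(key).isNull()).count()
    if null_keys > 0:
        raise ValueError(f"{null_keys} rows have a null {key}")

    duplicates = total - df.dropDuplicates([key]).count()
    if duplicates / total > 0.01:  # tolerate at most 1% duplicate keys
        raise ValueError(f"duplicate ratio too high: {duplicates}/{total}")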

Collibra Data Engineer - Data Governance

Noida, Uttar Pradesh

3 - 7 years

Not disclosed

On-site

Full Time

The ideal candidate should have experience in designing and implementing business process workflows using Collibra Workflow Designer. You should have a good understanding of Collibra Data Governance Center (DGC) and its modules, including Data Catalog, Business Glossary, and Policy Manager. Your expertise should include metadata harvesting, lineage tracking, and governance to enhance data visibility. Proficiency in using Collibra REST APIs for workflow automation, data exchange, and custom integrations with other tools is essential.

It is important to be familiar with Collibra Data Quality & Observability, including setting up data quality rules and configuring DQ workflows. Knowledge of Groovy and Java for developing custom workflows and scripts within Collibra is required. You should be able to write Python and SQL for data validation, integration scripts, and automation. An understanding of ETL processes and integrating Collibra with cloud/on-prem databases is a plus.

Familiarity with data governance frameworks such as GDPR, CCPA, and HIPAA, along with best practices, is highly desirable. Experience in effectively managing technical and business metadata is important, as is the ability to track data lineage and assess downstream/upstream data impacts.

SAP TRM & Cash Management

Karnataka

5 - 9 years

Not disclosed

On-site

Full Time

You will be responsible for implementing and configuring SAP TRM modules to meet specific business requirements. This includes setting up the Transaction Manager and Risk Analyzer (optional) modules according to the needs of the organization. You should have a strong understanding of various financial instruments, such as bonds, loans, investments, and derivatives, along with their associated risks. Your expertise in financial instruments, treasury operations, and risk management principles will be crucial for this role.

In this position, you will need comprehensive knowledge of SAP TRM modules, including Transaction Manager, Risk Analyzer, and Cash Management. Experience with SAP S/4HANA and the ability to integrate with other SAP modules such as FI, CO, and Cash Management will be beneficial. Your excellent communication, collaboration, and problem-solving skills will be essential for effectively translating business requirements into SAP TRM solutions.

Additionally, you should have a minimum of 5 years of experience in Cash Management within SAP. Your responsibilities will include cash forecasting, managing cash inflows and outflows as per operational needs, and potentially overseeing liquidity management. Experience in bank relationship management, cash operations, electronic banking, bank operations, and bank reconciliation will be advantageous.

As a candidate, you should also be prepared to lead projects, mentor team members, and travel to project sites for extended periods. Your ability to lead projects and guide team members will be crucial for the successful delivery of SAP TRM solutions.
