Pune, Maharashtra, India
Not disclosed
On-site
Full Time
We are hiring: ETL & Data Warehousing Developer (SQL + Qlik / BI Tools)
Location: Pune / Hyderabad
Experience: 5+ Years
Priority Hire – Immediate Joining Preferred

Key Responsibilities:
- Design, build, and optimize robust ETL workflows to integrate data from diverse sources into our data warehouse.
- Implement data quality and validation checks to ensure accuracy and consistency.
- Translate business requirements into effective data models and transformation logic.
- Write complex SQL queries to extract, transform, and load data efficiently.
- Develop star and snowflake schema data models for analytical reporting.
- Partner with BI teams to integrate ETL pipelines with Qlik Sense or other BI tools.
- Maintain and enhance the current data warehouse infrastructure for scalability and performance.
- Troubleshoot data anomalies and recommend long-term solutions.

Required Skills:
- Minimum 5 years of hands-on experience in ETL development and data warehousing.
- Strong command of SQL and RDBMS (Oracle, SQL Server, PostgreSQL, etc.).
- Experience with leading ETL tools such as Informatica, Talend, or DataStage.
- Solid knowledge of dimensional modeling, star/snowflake schemas, and data integration best practices.
- Strong performance-tuning experience for both SQL and ETL jobs.

Good to Have:
- Experience with Qlik Sense, QlikView, or Qlik Replicate.
- Exposure to Power BI, Tableau, Looker, or other visualization tools.
- Familiarity with cloud platforms (AWS, Azure, GCP).
- Knowledge of data governance, data cataloging, and metadata management.

Educational Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
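The posting above asks for data quality and validation checks on staged data before it reaches the warehouse. As an illustrative sketch only (not part of the listing, and with all names invented for the example), such a pre-load check might look like this in plain Python:

```python
def validate_batch(rows, required_cols, key_col):
    """Run basic data-quality checks on a staged batch (a list of dicts)
    before loading it into the warehouse. Returns a dict of issues found."""
    issues = {}
    null_counts = {c: 0 for c in required_cols}
    seen, dupes = set(), 0
    for row in rows:
        # Count NULLs in the columns the load contract requires.
        for c in required_cols:
            if row.get(c) is None:
                null_counts[c] += 1
        # Count duplicate business keys, which would break a merge/upsert.
        key = row.get(key_col)
        if key in seen:
            dupes += 1
        seen.add(key)
    null_counts = {c: n for c, n in null_counts.items() if n}
    if null_counts:
        issues["null_counts"] = null_counts
    if dupes:
        issues["duplicate_keys"] = dupes
    return issues
```

In practice a check like this would gate the load: a non-empty result quarantines the batch instead of loading it.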
Gurugram, Haryana
INR Not disclosed
On-site
Full Time
Job Title: Data Engineer / Analyst – ELT & Feature Store
Type: Full-Time
Industry: IT / Software
Location: Gurgaon, Haryana, India (Required)
Experience: 2 Years (Required)

Job Summary:
We are seeking a hands-on Data Engineer / Analyst to build and optimize ELT pipelines and feature stores that power forecasting and machine learning workflows. You'll transform multi-source datasets into trusted, auditable data products while ensuring data quality, performance, and documentation standards.

Key Responsibilities:
- Build scalable ELT pipelines and star-schema models using TypeScript + Drizzle ORM.
- Implement data quality checks (nulls, schema drift, outliers) and post-COVID re-weighting logic.
- Develop and manage feature stores (e.g., holiday events, promo tracking, customer segments).
- Optimize ingestion and transformation performance using chunked uploads and materialized views.
- Maintain up-to-date data dictionaries, ERDs, and lineage documentation.

Required Skills:
- Advanced SQL & PostgreSQL tuning
- TypeScript and/or Python
- dbt-style modular transformations
- Git, Docker, and CI/CD basics

Good to Have:
- Experience with Airflow or Airbyte
- Familiarity with Parquet or columnar formats
- BI-level validation and support

Job Type: Full-time
Pay: ₹700,000.00 - ₹1,100,000.00 per year

Application Question(s):
- Do you have at least 2 years of professional experience in a data engineering or analytics role?
- Are you proficient in writing and optimizing Advanced SQL queries?
- Have you worked with PostgreSQL and performed query tuning?
- Are you experienced in TypeScript or Python for data scripting and transformation?
- Have you built ELT pipelines or transformations using dbt-style (modular, version-controlled) approaches?
- Have you implemented any data quality checks such as null checks, schema drift detection, or outlier handling?
- Are you familiar with data orchestration tools like Airflow or Airbyte?

Work Location: In person
Application Deadline: 07/06/2025
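The posting above lists schema-drift detection among the data quality checks. As a hedged illustration of what that term means (the function and schema dicts below are invented for the example), the core of such a check is just a comparison of the expected column/type contract against what an incoming batch actually carries:

```python
def detect_schema_drift(expected, actual):
    """Compare an expected schema {column: type} against the schema of an
    incoming batch and report columns added, removed, or retyped."""
    added = sorted(set(actual) - set(expected))
    removed = sorted(set(expected) - set(actual))
    changed = sorted(c for c in expected
                     if c in actual and expected[c] != actual[c])
    return {"added": added, "removed": removed, "type_changed": changed}
```

A pipeline would typically fail fast on `removed` or `type_changed` entries and merely log `added` ones.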
Gurugram, Haryana
INR Not disclosed
On-site
Full Time
Business Analyst – Insurance Domain
Location: Gurgaon, Haryana, India (Required)
Type: Full-Time
Industry: Banking & Finance (Required)
Experience: 3–5 years (Required)

Job Overview:
We are hiring a skilled Business Analyst with strong experience in the insurance domain (life, health, P&C). You will act as a bridge between business and technology teams, driving requirement gathering, process improvements, and digital solution delivery.

Key Responsibilities:
- Gather and analyze business requirements; prepare BRDs and FSDs
- Coordinate with stakeholders, IT teams, and vendors
- Drive UAT, change requests, and SDLC activities
- Identify process gaps and suggest improvements using Agile/Waterfall
- Perform data analysis, customer journey mapping, and reporting

Required Skills:
- Strong knowledge of insurance products and regulations
- Proficient in MS Office, JIRA, and Visio; understanding of HTML/CSS/JS is a plus
- Excellent communication and stakeholder management skills
- Familiarity with platforms such as Life Asia, Premia, InsureMO, TCS BaNCS, Symbiosys, etc.

Qualifications:
- Bachelor's degree in Business, IT, or Finance
- Certification in CBAP, CCBA, or PMP preferred
- Fluent in English and Swahili

Job Type: Full-time
Pay: ₹800,000.00 - ₹1,300,000.00 per year

Application Question(s):
- Do you have 3–5 years of experience as a Business Analyst?
- Have you worked in the insurance domain (life, health, P&C)?
- Are you familiar with BRD and FSD documentation?
- Have you worked in an Agile or Waterfall SDLC environment?
- Have you used or worked on any core insurance platforms (e.g., Life Asia, Premia, TCS BaNCS)?
- Do you have working knowledge of HTML, CSS, or JavaScript?

License/Certification:
- CBAP (Preferred)
- CCBA (Preferred)
- PMP (Preferred)

Work Location: In person
Application Deadline: 07/06/2025
Expected Start Date: 07/06/2025
Gurugram, Haryana
INR Not disclosed
On-site
Full Time
Position: AI / ML Engineer
Job Type: Full-Time
Location: Gurgaon, Haryana, India
Experience: 2 Years
Industry: Information Technology
Domain: Demand Forecasting in Retail/Manufacturing

Job Summary:
We are seeking a skilled Time Series Forecasting Engineer to enhance existing Python microservices into a modular, scalable forecasting engine. The ideal candidate has a strong statistical background, expertise in handling multi-seasonal and intermittent data, and a passion for model interpretability and real-time insights.

Key Responsibilities:
- Develop and integrate advanced time-series models: MSTL, Croston, TSB, Box-Cox.
- Implement rolling-origin cross-validation and hyperparameter tuning.
- Blend models such as ARIMA, Prophet, and XGBoost for improved accuracy.
- Generate SHAP-based driver insights and deliver them to a React dashboard via GraphQL.
- Monitor forecast performance with Prometheus and Grafana; trigger alerts on degradation.

Core Technical Skills:
- Languages: Python (pandas, statsmodels, scikit-learn)
- Time Series: ARIMA, MSTL, Croston, Prophet, TSB
- Tools: Docker, REST API, GraphQL, Git-flow, unit testing
- Database: PostgreSQL
- Monitoring: Prometheus, Grafana
- Nice-to-Have: MLflow, ONNX, TensorFlow Probability

Soft Skills:
- Strong communication and collaboration skills
- Ability to explain statistical models in layman's terms
- Proactive problem-solving attitude
- Comfort working cross-functionally in iterative development environments

Job Type: Full-time
Pay: ₹400,000.00 - ₹800,000.00 per year

Application Question(s):
- Do you have at least 2 years of hands-on experience in Python-based time series forecasting?
- Have you worked in retail or manufacturing domains where demand forecasting was a core responsibility?
- Are you currently authorized to work in India without sponsorship?
- Have you implemented or used ARIMA, Prophet, or MSTL in any of your projects?
- Have you used Croston or TSB models for forecasting intermittent demand?
- Are you familiar with SHAP for model interpretability?
- Have you containerized a forecasting pipeline using Docker and exposed it through a REST or GraphQL API?
- Have you used Prometheus and Grafana to monitor model performance in production?

Work Location: In person
Application Deadline: 05/06/2025
Expected Start Date: 05/06/2025
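The posting above names Croston's method for intermittent demand. For candidates unfamiliar with it, here is a minimal plain-Python sketch of the classical form (illustrative only; the smoothing constant and variable names are this example's choices): nonzero demand sizes and the intervals between them are smoothed separately, and the forecast per period is their ratio.

```python
def croston(demand, alpha=0.1):
    """Classical Croston's method for intermittent demand.
    Smooths nonzero demand sizes (z) and inter-demand intervals (p)
    with simple exponential smoothing; the per-period forecast is z / p."""
    z = p = None   # smoothed demand size, smoothed interval
    q = 1          # periods elapsed since the last nonzero demand
    for d in demand:
        if d > 0:
            if z is None:          # initialise on the first nonzero demand
                z, p = d, q
            else:
                z = z + alpha * (d - z)
                p = p + alpha * (q - p)
            q = 1
        else:
            q += 1
    if z is None:                  # series contained no demand at all
        return 0.0
    return z / p
```

The TSB variant mentioned alongside it differs mainly in smoothing a demand probability instead of an interval, which lets forecasts decay when demand stops.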
Pune, Maharashtra
INR Not disclosed
On-site
Full Time
DevOps Engineer (8–10 Yrs Exp)
Location: Pune, Maharashtra, India
Industry: Banking & Finance (Required)
Job Type: Full-Time

Key Responsibilities:
- Lead and manage CI/CD pipelines using Jenkins, GitHub, and shell scripting.
- Automate routine tasks with Control-M, Connect:Direct, and custom scripts.
- Administer and optimize Oracle databases for performance and reliability.
- Perform system administration and troubleshooting on Unix/Linux servers.
- Support middleware systems such as WebSphere (WAS) and IBM MQ.
- Collaborate with development, QA, and operations teams to streamline DevOps processes.
- Champion automation and system reliability across environments.

Must-Have Skills:
- 8–10 years of experience in DevOps/system administration.
- Strong expertise in Unix/Linux, shell scripting, and Oracle DB administration.
- Hands-on experience with CI/CD tools: Jenkins, GitHub.
- Knowledge of Control-M, Connect:Direct, WebSphere, and IBM MQ.
- Solid understanding of networking, system monitoring, and infrastructure troubleshooting.

Preferred Qualifications:
- Certifications in DevOps, Oracle DBA, or Linux administration.
- Exposure to AWS, Azure, or GCP cloud platforms.
- Experience with Infrastructure as Code (IaC) tools like Terraform or Ansible.

Job Type: Full-time
Pay: ₹2,000,000.00 - ₹3,000,000.00 per year

Application Question(s):
- Do you have 8 to 10 years of experience in DevOps or System Administration?
- Are you proficient in working with CI/CD tools like Jenkins and GitHub?
- How many years of hands-on experience do you have with Oracle Database administration?
- Do you have experience with Unix/Linux system administration and shell scripting?
- Are you familiar with Control-M, Connect:Direct (C:D), WebSphere (WAS), or IBM MQ?

License/Certification:
- Oracle DBA (Preferred)
- Linux system administration (Preferred)

Work Location: In person
Speak with the employer: +91 9909030155
Application Deadline: 07/06/2025
Expected Start Date: 07/06/2025
Hyderabad
INR 22.0 - 28.0 Lacs P.A.
On-site
Full Time
Job Title: TM1 Developer (7–8 Years)
Domain: Banking & Finance (Required)
Location: Hyderabad, Telangana
Experience: 7–8 Years (Minimum 5 Years in TM1 Development)
Notice Period: Immediate to 15 Days

Job Summary:
We are hiring a skilled TM1 Developer to support a high-impact project in the Banking and Finance domain. This role involves full-cycle TM1 development, DevOps integration, and secure, scalable architecture implementation. If you're passionate about performance optimization, automation, and cloud-native tools, this opportunity is for you.

Key Responsibilities:
- Design and develop IBM Cognos TM1 Cubes, Dimensions, Business Rules, and TI Processes
- Optimize TM1 models using feeders, skip checks, and advanced configuration
- Integrate TM1 with CI/CD pipelines using tools like Jenkins, Git, and Ansible
- Automate deployments using Python and shell scripting
- Collaborate on data platform integration (Spark, Delta Lake, PostgreSQL)
- Ensure system security, compliance, and performance

Required Skills:
- IBM Cognos TM1: 5+ years of hands-on experience
- Scripting & Automation: Python, Shell, SQL, MDX, Excel, Visual Basic
- CI/CD Tools: Jenkins, Git, Ansible
- Containerization: Docker, Kubernetes, ZooKeeper
- Databases: PostgreSQL, Oracle, SQL Server, DB2
- DevSecOps: Familiarity with security best practices

Preferred Qualifications:
- Experience in large-scale financial or risk data platforms
- Exposure to the Spark ecosystem and container orchestration
- Certification in Agile, DevOps, or TM1 (preferred but not mandatory)

Job Type: Full-time
Pay: ₹2,200,000.00 - ₹2,800,000.00 per year

Application Question(s):
- Do you have a minimum of 5 years of hands-on experience in IBM Cognos TM1 (Cubes, TI Processes, Business Rules, Security)?
- Which of the following DevOps tools have you worked with?
- How many years of experience do you have in Python and Shell scripting for automation?
- Are you currently located in Hyderabad or willing to relocate within 15 days?
- What is your current notice period?

Work Location: In person
Application Deadline: 13/06/2025
Expected Start Date: 16/06/2025
Pune, Maharashtra
INR Not disclosed
On-site
Full Time
Job Title: Cloud Data Engineer – GCP + Python
Job Type: Full-Time
Industry: Banking & Finance
Location: Pune, Maharashtra, India (Hybrid/On-site)

Job Summary:
We are hiring a skilled Cloud Data Engineer with expertise in Google Cloud Platform (GCP), Python, and advanced SQL. You'll build scalable, cloud-native data pipelines and automate data workflows for enterprise-scale analytics and banking projects.

Key Responsibilities:
- Build and maintain robust ETL pipelines using Python and PySpark
- Develop data workflows using BigQuery, Cloud Composer, Dataflow, and Cloud Storage
- Write and optimize complex SQL queries for transformation and reporting
- Automate workflows with Airflow/Cloud Composer
- Collaborate with analysts, architects, and business teams
- Ensure code quality, reliability, and secure data practices
- Contribute to scalable, high-performance cloud data architecture

Requirements:
- 5–8 years in data engineering or cloud data roles
- Strong hands-on experience with GCP services (BigQuery, Cloud Storage, Composer)
- Proficiency in Python and PySpark
- Advanced SQL skills and experience with CI/CD tools
- Working knowledge of workflow orchestration (Airflow preferred)

Job Type: Full-time
Pay: ₹1,200,000.00 - ₹2,000,000.00 per year

Application Question(s):
- How many years of hands-on experience do you have with Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, or Dataflow?
- Are you proficient in Python for building ETL pipelines and automation workflows?
- Do you have prior experience working in the banking or financial services domain?
- Which orchestration tool(s) have you used in a production environment?
- Rate your expertise in writing and optimizing SQL for data transformation.

Work Location: In person
Application Deadline: 13/06/2025
Expected Start Date: 16/06/2025
Pune, Maharashtra
INR Not disclosed
On-site
Full Time
DevOps Engineer (8–10 Yrs Exp)
Location: Pune, Maharashtra, India
Industry: Banking & Finance (Required)
Job Type: Full-Time

Key Responsibilities:
- Lead and manage CI/CD pipelines using Jenkins, GitHub, and shell scripting.
- Automate routine tasks with Control-M, Connect:Direct, and custom scripts.
- Administer and optimize Oracle databases for performance and reliability.
- Perform system administration and troubleshooting on Unix/Linux servers.
- Support middleware systems such as WebSphere (WAS) and IBM MQ.
- Collaborate with development, QA, and operations teams to streamline DevOps processes.
- Champion automation and system reliability across environments.

Must-Have Skills:
- 8–10 years of experience in DevOps/system administration.
- Strong expertise in Unix/Linux, shell scripting, and Oracle DB administration.
- Hands-on experience with CI/CD tools: Jenkins, GitHub.
- Knowledge of Control-M, Connect:Direct, WebSphere, and IBM MQ.
- Solid understanding of networking, system monitoring, and infrastructure troubleshooting.

Preferred Qualifications:
- Certifications in DevOps, Oracle DBA, or Linux administration.
- Exposure to AWS, Azure, or GCP cloud platforms.
- Experience with Infrastructure as Code (IaC) tools like Terraform or Ansible.

License/Certification:
- Oracle DBA (Preferred)
- Linux system administration (Preferred)

Job Type: Full-time
Pay: ₹2,000,000.00 - ₹3,000,000.00 per year

Application Question(s):
- Do you have 8 to 10 years of experience in DevOps or System Administration?
- Which of the following CI/CD tools have you worked with professionally?
- How many years of hands-on experience do you have with Oracle Database administration?
- Do you have working experience with any of the following?
- Are you currently located in Pune or willing to relocate there immediately?

Work Location: In person
Application Deadline: 13/06/2025
Expected Start Date: 16/06/2025
Pune District, Maharashtra
INR Not disclosed
On-site
Full Time
Job Title: Spark & Delta Lake Developer
Job Type: Full-Time
Location: Pune, Maharashtra, India
Experience: 5–8 years
Industry: Banking & Finance

Job Summary:
We're hiring an experienced Spark & Delta Lake Developer to build high-performance data pipelines and cloud-native solutions for a global banking project. If you have strong hands-on experience with Apache Spark, Delta Lake, and cloud-based lakehouse architecture, this role is for you.

Key Responsibilities:
- Develop and optimize Apache Spark pipelines for batch/streaming data
- Work with Delta Lake to enable scalable and reliable data workflows
- Design and maintain cloud-based data lakehouse architectures
- Collaborate with data architects and DevOps to deploy enterprise-grade data solutions
- Implement robust data ingestion, transformation, and governance practices
- Participate in code reviews and CI/CD processes

Required Skills:
- 5–8 years in big data / distributed systems
- Strong knowledge of Apache Spark (RDD, DataFrame, SQL, Streaming)
- Hands-on experience with Delta Lake architecture
- Programming with PySpark or Scala
- Experience with cloud platforms (AWS, Azure, or GCP)
- Familiarity with data security, governance, and performance tuning

Job Type: Full-time
Pay: ₹1,800,000.00 - ₹2,200,000.00 per year

Application Question(s):
- How many years of hands-on experience do you have with Apache Spark (RDD, DataFrame, SQL, Streaming)?
- Have you worked on Delta Lake architecture in a production environment?
- Which programming language have you used with Spark?
- Which cloud platform(s) have you used for big data or data lakehouse projects?
- Do you have experience with implementing data governance or security practices in large-scale data pipelines?

Work Location: On the road
Application Deadline: 13/06/2025
Expected Start Date: 16/06/2025
Hyderabad
INR 22.0 - 28.0 Lacs P.A.
On-site
Full Time
Hiring: Senior Cognos TM1 Developer with DevSecOps Expertise
Location: Hyderabad - Full-Time
Domain: Banking & Finance
Experience: 7–8 Years (Required)
Notice Period: Immediate to 15 Days

We're looking for a seasoned TM1 Developer with strong DevSecOps experience to join a high-performing tech team in a BFSI environment. You'll drive TM1 development, automate CI/CD pipelines, and lead secure data platform integrations.

Key Skills:
- IBM Cognos TM1 (Cubes, Rules, Processes, Security, REST API)
- DevOps Tools: Jenkins, Ansible, Git
- Scripting: Python, Shell
- Databases: PostgreSQL, Oracle, SQL Server, DB2
- Containers: Docker, Kubernetes
- Agile, Security Compliance

What You'll Do:
- Build and optimize TM1 applications
- Automate deployments and container orchestration
- Lead platform integration with Spark and Delta Lake
- Ensure secure, scalable data architecture

Job Type: Full-time
Pay: ₹2,200,000.00 - ₹2,800,000.00 per year

Application Question(s):
- How many years of hands-on experience do you have in developing Cognos TM1 Cubes, Rules, and TI Processes (without using wizards)?
- Have you implemented CI/CD pipelines using tools like Jenkins, Ansible, or Git in a production environment?
- What scripting languages have you used for automation and integration tasks in TM1 projects (e.g., Python, Shell)?
- Describe your experience with containerization and orchestration using Docker and Kubernetes.
- How do you ensure security compliance and control in TM1 deployments, especially in regulated industries like BFSI?

Work Location: In person
Application Deadline: 19/06/2025
Gurugram, Haryana
INR Not disclosed
On-site
Full Time
Job Title: Solution Architect – CRM & Contact Center (Banking/Financial Services)
Job Type: Full-Time
Industry: Banking & Financial Services
Location: Gurgaon, Haryana, India
Experience Required: 7 to 12 years (Minimum 4 years in Banking/Financial Services domain)

Key Responsibilities:
- Design end-to-end CRM and Contact Center solutions aligned to retail and corporate banking requirements.
- Translate business requirements into scalable architecture designs, system integrations, and data flows.
- Lead discussions with internal teams and vendors to ensure technical alignment with business strategy.
- Ensure compliance with banking regulations (e.g., GDPR, KYC, AML) and security/data privacy standards (e.g., PCI DSS).
- Integrate and optimize CRM platforms including Salesforce, Microsoft Dynamics, and Siebel, and Contact Center solutions like Genesys Cloud, NICE, Amazon Connect, Cisco, and Avaya.
- Architect intelligent IVR systems, CTI integrations, SIP/VoIP, and omnichannel routing (chat, email, voice, social).
- Drive implementation of AI/ML-powered chatbots, voice bots, and predictive engagement tools.
- Define and optimize data models, campaign automation, and customer analytics within CRM platforms.
- Lead design for high-availability, fault-tolerant systems and cloud integration strategies (AWS, Azure).
- Provide hands-on technical leadership across implementation, QA, and deployment phases.

Required Skills & Qualifications:
- Bachelor's/Master's degree in Computer Science, Information Systems, or a related field.
- 8+ years of overall experience, with 4+ years in the banking/financial domain.
- Strong domain knowledge in customer service, digital transformation, and banking products.
- Expertise in CRM technologies: Salesforce, Microsoft Dynamics, Siebel.
- Proven experience with Contact Center platforms: Genesys (Cloud & Engage), Cisco UCCE, NICE, Avaya, Amazon Connect.
- Solid understanding of CTI, SIP, IVR flows, APIs, middleware, ESB, and microservices architecture.
- Hands-on experience with AI-based automation for customer service (chatbots, routing engines).
- Knowledge of GDPR, PCI DSS, KYC/AML, and related compliance standards.
- Familiarity with TOGAF or similar enterprise architecture frameworks.

Preferred Certifications:
- TOGAF Certified
- Salesforce Architect Certification
- AWS / Azure Cloud Architect
- Genesys / Cisco Contact Center Certifications

Job Type: Full-time
Pay: Up to ₹3,000,000.00 per year

Application Question(s):
- How many years of experience do you have in Solution Architecture specifically within the banking or financial services domain?
- Do you have hands-on implementation experience with at least two of the following CRM platforms: Salesforce, Microsoft Dynamics, or Siebel?
- Have you worked with any Contact Center solutions such as Genesys (Cloud or Engage), Cisco UCCE, NICE, Avaya, or Amazon Connect in a technical or architectural capacity?
- How many years of experience do you have designing cloud-native or hybrid solutions using AWS and/or Azure?
- Are you familiar with data privacy, security, and compliance standards such as GDPR, PCI DSS, and KYC/AML in enterprise implementations?

License/Certification:
- TOGAF (Preferred)
- Salesforce Architect Certification (Preferred)
- AWS / Azure Cloud Architect (Preferred)
- Genesys / Cisco Contact Center Certifications (Preferred)

Work Location: In person
Speak with the employer: +91 9909030155
Application Deadline: 04/07/2025
Expected Start Date: 07/07/2025
Hyderabad, Telangana
INR 1.5 - 2.4 Lacs P.A.
On-site
Full Time
Position: Credit Counselor / Sr. Credit Counselor / Subject Matter Expert (SME)
Location: Hyderabad
Working Hours: 9:00 AM to 6:00 PM | Working Days: 6 days/week

Key Requirements:
- Language Proficiency: Fluency in Hindi, Telugu, Tamil, Kannada, or Malayalam
- Qualification: Minimum 12th pass; diploma holders, graduates, or those currently pursuing graduation
- Certification: Must be DRA Certified (Debt Recovery Agent)
- Technical Skills: Basic to intermediate proficiency in MS Excel

Job Responsibilities:
- Proactively engage with assigned customers to remind and recover overdue/outstanding payments.
- Educate customers about their upcoming dues and repayment options.
- Track and escalate suspected fraud or skip cases to the reporting manager.
- Monitor and ensure promise-to-pay (PTP) commitments are fulfilled on time.
- Handle and resolve customer queries with a focus on providing a positive experience.
- Lead settlement discussions for seriously delinquent or specific accounts.
- Prepare and deliver performance and recovery reports as required.

Compensation Structure:
- Statutory Benefits: ESI & PF coverage
- Growth Opportunity: Appraisal cycle every 6 months based on performance

Why Join Us?
- Stable opportunity with long-term growth potential
- Incentivized performance environment
- Exposure to pan-India customer portfolios
- Structured career progression pathway with SME-level mentoring

Position: Credit Counselor / Sr. Credit Counselor / Subject Matter Expert (SME)
Location: Noida
Working Hours: 9:00 AM to 6:00 PM | Working Days: 6 days/week
(Same requirements, responsibilities, compensation structure, and benefits as the Hyderabad role above.)

Job Type: Full-time
Pay: ₹150,000.00 - ₹240,000.00 per year
Schedule: Day shift
Supplemental Pay: Performance bonus
License/Certification: Debt Recovery Agent (Required)
Work Location: In person
Application Deadline: 01/07/2025
Expected Start Date: 01/07/2025
Hyderabad
INR 1.44 - 2.4 Lacs P.A.
Work from Office
Full Time
We are hiring Credit Counselors in Hyderabad. Must be DRA certified, 12th pass, and fluent in Hindi or South Indian languages. Handle collections, resolve queries, and ensure recoveries. Basic Excel skills required. Growth and incentives offered. Provident fund.