Drive end-to-end business analysis by gathering requirements, managing stakeholders, improving processes, and supporting full SDLC execution using Agile, Waterfall, and Lean methodologies. Fluent in English and Swahili (both written and spoken).
Roles and Responsibilities: Design, develop, test, deploy, and maintain complex Cognos TM1 solutions using tools such as MDX, SQL, Excel, Visual Basic, and DevOps tooling (Ansible, Jenkins, Git, configuration management). Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions on time. Troubleshoot issues related to TM1 cubes, dimensions, business rules, and security settings in the TM1 development environment. Ensure seamless integration of TM1 applications with other systems through REST APIs (illustrated in the sketch below).

Desired Candidate Profile: 7-10 years of experience in Cognos TM1 development, with expertise in TM1 cubes, dimensions, and business rules. Strong command of SQL for querying databases such as PostgreSQL, Oracle, SQL Server, or DB2. Experience with scripting and automation tools such as Python, shell scripting, Puppet, or Chef.
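For orientation, a minimal Python sketch of the kind of REST API integration described above, listing cubes through TM1's OData endpoint; the host, port, and credentials are placeholder assumptions:

```python
import requests

# Hypothetical TM1 server details; adjust host, port, and credentials
# for your environment.
BASE_URL = "https://tm1-server.example.com:8010/api/v1"
AUTH = ("admin", "password")  # TM1 basic authentication

# The TM1 REST API is an OData v4 service; /Cubes returns all cubes.
resp = requests.get(f"{BASE_URL}/Cubes", auth=AUTH, verify=False)
resp.raise_for_status()

for cube in resp.json().get("value", []):
    print(cube["Name"])
```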
Roles and Responsibilities: Design, develop, test, deploy, and maintain automation solutions using UiPath.

Desired Candidate Profile: Strong understanding of Robotic Process Automation (RPA), the REFramework, and UiPath Orchestrator concepts. 2-5 years of experience in RPA development using UiPath or similar tools such as Automation Anywhere.
8–10 years of hands-on experience in DevOps, System Administration, or related roles. Strong expertise in Unix/Linux systems and Shell Scripting. Solid experience with CI/CD tools – particularly Jenkins and GitHub.
Handle end-to-end recruitment cycle including sourcing, screening, scheduling, interviewing, and onboarding. Source candidates through various channels such as job portals (Naukri, LinkedIn, Indeed), employee referrals, and recruitment agencies.
Seeking a skilled ServiceNow Developer with expertise in Service Catalog and Incident Management. Join our ITSM team to design, develop, and optimize applications, automate workflows, and improve IT service delivery in collaboration with key stakeholders.
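As a rough illustration of the kind of integration work involved, a minimal Python sketch querying incidents through the ServiceNow Table API; the instance URL and credentials are placeholders:

```python
import requests

# Hypothetical instance and credentials; substitute your own.
INSTANCE = "https://dev12345.service-now.com"
AUTH = ("api_user", "api_password")

# Query active incidents through the ServiceNow Table API.
resp = requests.get(
    f"{INSTANCE}/api/now/table/incident",
    params={"sysparm_query": "active=true", "sysparm_limit": 5},
    auth=AUTH,
    headers={"Accept": "application/json"},
)
resp.raise_for_status()

for incident in resp.json()["result"]:
    print(incident["number"], incident["short_description"])
```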
Develop, optimize, and maintain scalable data pipelines using Python and PySpark. Design and implement data processing workflows leveraging GCP services such as BigQuery, Dataflow, Cloud Functions, and Cloud Storage.
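For a sense of what such a pipeline looks like, here is a minimal PySpark sketch that reads raw JSON events from Cloud Storage, aggregates them, and writes the result to BigQuery; the bucket, project, dataset, and table names are placeholders, and the spark-bigquery connector is assumed to be available (as on Dataproc):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

# Read raw event files from a (placeholder) GCS bucket.
raw = spark.read.json("gs://example-bucket/raw/events/*.json")

# Aggregate events per day and type.
daily = (
    raw.groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
       .agg(F.count("*").alias("events"))
)

# Write to BigQuery via the spark-bigquery connector; the temporary
# bucket is required for the indirect write path.
(daily.write.format("bigquery")
      .option("table", "example-project.analytics.daily_events")
      .option("temporaryGcsBucket", "example-bucket-tmp")
      .mode("overwrite")
      .save())
```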
Seeking a full-stack developer skilled in Java (Spring Boot) and Angular to build scalable web apps. Must have experience with end-to-end development, relational databases, Git, Kubernetes, and CI/CD pipelines.
We are seeking a skilled and motivated DevOps Engineer to join our dynamic team. The ideal candidate will have a strong background in CI/CD pipelines, cloud infrastructure, containerization, and automation, along with basic programming knowledge.
Job Summary:
We are looking for a highly skilled and experienced Platform & DevOps Engineer to join our team. The ideal candidate will be responsible for managing and supporting DevOps tools, ensuring smooth CI/CD pipeline implementation, and maintaining infrastructure on Google Cloud Platform (GCP). This role requires expertise in Jenkins, Terraform, Docker, Kubernetes (GKE), and security best practices. Experience in the banking industry is a plus. The candidate should be able to work independently, troubleshoot production issues efficiently, and be flexible with work shifts.

Key Responsibilities:
- Design, implement, and maintain CI/CD pipelines using Jenkins and other DevOps tools.
- Manage and support Terraform-based infrastructure as code (IaC) for scalable deployments.
- Work with GCP products such as GCE, GKE, BigQuery, Pub/Sub, Monitoring, and Alerting (see the Pub/Sub sketch after this listing).
- Collaborate with development and operations teams to enhance integration and deployment processes.
- Build and manage container images using Packer and Docker, ensuring efficient image rotation strategies.
- Monitor systems, respond to alerts, and troubleshoot production issues promptly.
- Ensure infrastructure security, compliance, and best practices are maintained.
- Provide technical guidance to development teams on DevOps tools and processes.
- Implement and support GitOps best practices, including repository configurations such as code owners and webhooks.
- Document processes, configurations, and best practices for operational efficiency.
- Stay updated with the latest DevOps technologies and trends, continuously improving existing practices.

Required Skills & Qualifications:
- Proficiency in scripting and automation using Bash, Python, or Groovy.
- Hands-on experience with Jenkins, Terraform, and GCP infrastructure management.
- Strong knowledge of containerization (Docker) and orchestration tools such as Kubernetes (GKE) and Helm.
- Familiarity with disaster recovery, backups, and troubleshooting production issues.
- Solid understanding of infrastructure security, compliance, and monitoring best practices.
- Experience with image creation and management using Packer and Docker.
- Prior exposure to banking industry processes and regulations is an advantage.
- Excellent problem-solving, communication, and teamwork skills.
- Ability to work independently and handle multiple priorities in a fast-paced environment.
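As a small illustration of working with the GCP products named above, a minimal Python sketch publishing a deployment event to Pub/Sub; the project and topic names are placeholders, and google-cloud-pubsub with valid credentials is assumed:

```python
from google.cloud import pubsub_v1

# Placeholder project and topic; requires google-cloud-pubsub
# and application default credentials.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "deploy-events")

# Publish a deployment event with a string attribute; result()
# blocks until Pub/Sub returns the message ID.
future = publisher.publish(topic_path, b"deployment finished", pipeline="jenkins")
print("published message", future.result())
```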
Coordinate software releases across Dev, QA, DevOps & Infrastructure. Ensure on-time, risk-free deployments with ITIL & DevOps best practices. Strong governance & collaboration skills required.
Job Summary:
We are seeking a highly experienced and motivated Snowflake Data Architect & ETL Specialist to join our growing Data & Analytics team. The ideal candidate will be responsible for designing scalable Snowflake-based data architectures, developing robust ETL/ELT pipelines, and ensuring data quality, performance, and security across multiple data environments. You will work closely with business stakeholders, data engineers, and analysts to drive actionable insights and ensure data-driven decision-making.

Key Responsibilities:
- Design, develop, and implement scalable Snowflake-based data architectures.
- Build and maintain ETL/ELT pipelines using tools such as Informatica, Talend, Apache NiFi, Matillion, or custom Python/SQL scripts (see the sketch after this listing).
- Optimize Snowflake performance through clustering, partitioning, and caching strategies.
- Collaborate with cross-functional teams to gather data requirements and deliver business-ready solutions.
- Ensure data quality, governance, integrity, and security across all platforms.
- Migrate legacy data warehouses (e.g., Teradata, Oracle, SQL Server) to Snowflake.
- Automate data workflows and support CI/CD deployment practices.
- Implement data modeling techniques including dimensional modeling, star/snowflake schemas, and normalization/denormalization.
- Support and promote metadata management and data governance best practices.

Technical Skills (Hard Skills):
- Expertise in Snowflake: architecture design, performance tuning, cost optimization.
- Strong proficiency in SQL, Python, and scripting for data engineering tasks.
- Hands-on experience with ETL tools: Informatica, Talend, Apache NiFi, Matillion, or similar.
- Proficient in data modeling (dimensional, relational, star/snowflake schema).
- Good knowledge of cloud platforms: AWS, Azure, or GCP.
- Familiar with orchestration and workflow tools such as Apache Airflow, dbt, or DataOps frameworks.
- Experience with CI/CD tools and version control systems (e.g., Git).
- Knowledge of BI tools such as Tableau, Power BI, or Looker.

Certifications (Preferred/Required):
- Snowflake SnowPro Core Certification: required or highly preferred.
- SnowPro Advanced Architect Certification: preferred.
- Cloud certifications (e.g., AWS Certified Data Analytics Specialty, Azure Data Engineer Associate): preferred.
- ETL tool certifications (e.g., Talend, Matillion): optional but a plus.

Soft Skills:
- Strong analytical and problem-solving capabilities.
- Excellent communication and collaboration skills.
- Ability to translate technical concepts into business-friendly language.
- Proactive, detail-oriented, and highly organized.
- Capable of multitasking in a fast-paced, dynamic environment.
- Passionate about continuous learning and adopting new technologies.

Why Join Us?
- Work on cutting-edge data platforms and cloud technologies.
- Collaborate with industry leaders in analytics and digital transformation.
- Be part of a data-first organization focused on innovation and impact.
- Enjoy a flexible, inclusive, and collaborative work culture.
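As a rough sketch of the custom Python/SQL pipeline option mentioned above, the snippet below loads staged files into Snowflake and upserts a dimension table in an ELT pattern; the connection parameters, stage, and table names are illustrative assumptions:

```python
import snowflake.connector

# Connection parameters are placeholders for illustration.
conn = snowflake.connector.connect(
    user="etl_user",
    password="***",
    account="xy12345.us-east-1",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load staged CSV files into a raw table.
    cur.execute(
        "COPY INTO STAGING.RAW_ORDERS FROM @orders_stage "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Upsert the customer dimension from the freshly loaded rows.
    cur.execute("""
        MERGE INTO ANALYTICS.DIM_CUSTOMER d
        USING STAGING.RAW_ORDERS s ON d.customer_id = s.customer_id
        WHEN MATCHED THEN UPDATE SET d.last_order_at = s.order_ts
        WHEN NOT MATCHED THEN
            INSERT (customer_id, last_order_at) VALUES (s.customer_id, s.order_ts)
    """)
finally:
    conn.close()
```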
SharePoint and Power Platform development (Canvas and Model-Driven apps), Power Automate, and Power BI. Azure services, including Azure DevOps (CI/CD pipelines, repositories, etc.). Knowledge of governance, security, and compliance in Microsoft environments.
Skilled Java + GCP Developer. The ideal candidate should have hands-on experience in Java, Spring Boot, and Google Cloud Platform (GCP), along with shell scripting, Python, and BigQuery.
Payments domain expertise. Proficiency in JIRA, Confluence, MS Word, Excel, PowerPoint, and MS Visio. Payment engine management; MT and MX ISO messages, notifications, and statements. Payment schemes: SWIFT, RTGS, ACH, RTP. Solution design, QA test plans, and test cases.
Manual testing within the payments domain: testing payment gateways, acquiring/issuing systems, or the full transaction lifecycle; ISO 8583, SWIFT, or card scheme protocols (Visa, MasterCard); testing REST/SOAP APIs using Postman, SoapUI, or similar tools (a sketch follows below).
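For illustration, a minimal Python sketch of the kind of REST API assertion a Postman or SoapUI test would make; the sandbox endpoint, payload schema, and response fields are hypothetical:

```python
import requests

# Hypothetical sandbox gateway; real gateways define their own
# endpoints and payload schemas.
BASE_URL = "https://sandbox.gateway.example.com"

payload = {"amount": 1000, "currency": "USD", "card_token": "tok_test_visa"}
resp = requests.post(f"{BASE_URL}/v1/authorize", json=payload, timeout=10)

# Assert on status code and response body, as a Postman test script would.
assert resp.status_code == 200, resp.text
body = resp.json()
assert body["status"] == "APPROVED"
print("authorization id:", body["auth_id"])
```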
Senior Project/Program Manager with expertise in Payments/Liquidity, project financial management, and portfolio governance. Skilled in leading Agile PODs/Scrum teams, driving large-scale program delivery, and managing end-to-end global project execution.