BPI development experience is mandatory (at least a couple of projects with BPI and progressive experience in the Telecom domain). Full-stack Java developers preferred, including AngularJS and REST APIs. Database: Neo4j and Postgres. Workflow: Camunda.
Description: We are looking for a skilled Power BI Developer with Snowflake expertise to join our Analytics and Insights Team. As a Power BI Developer, you will be responsible for designing, developing and maintaining business intelligence solutions using Power BI and Snowflake, enabling data-driven decision-making within the Enterprise Network Services organization. Skills Required: Hands-on Power BI developer with 3+ years of experience in report creation, data modelling and data visualization leveraging Power BI and Snowflake. Strong understanding of DAX and Power BI's data modelling features. 1+ year of experience working with Snowflake, with a strong understanding of database design and development concepts and Snowflake's SQL dialect. Hands-on experience in analyzing complex data sets and building critical business insights. Strong understanding of SDLC methodology and DevOps concepts. Experience working as an Agile team member with a good understanding of Agile principles. Excellent problem-solving and communication skills.
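The role itself is Power BI and DAX centric, but as a rough illustration of the Snowflake side, here is a minimal Python sketch (assuming the snowflake-connector-python package; the account, credentials, and table names are placeholders) that runs the kind of aggregation in Snowflake's SQL dialect a Power BI model might sit on top of.

```python
# Minimal sketch: previewing Snowflake data that could feed a Power BI model.
# Account, credentials, and table/column names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # placeholder Snowflake account identifier
    user="analytics_user",       # placeholder
    password="***",              # placeholder; use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="ENS_DB",
    schema="REPORTING",
)

try:
    cur = conn.cursor()
    # Example aggregation in Snowflake SQL.
    cur.execute(
        """
        SELECT region, DATE_TRUNC('month', created_at) AS month, COUNT(*) AS tickets
        FROM network_tickets
        GROUP BY region, month
        ORDER BY month
        """
    )
    for region, month, tickets in cur.fetchall():
        print(region, month, tickets)
finally:
    conn.close()
```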
Description: We are looking for a talented, agile and self-driven Data Engineer to join the Enterprise Network Services (ENS) Analytics & Insights team. As a Data Engineer, you will be developing analytics solutions for critical Network Infrastructure Governance initiatives, including inventorying assets in the DMZ network and automating critical processes and workflows associated with IT Governance, leveraging powerful technologies and tools including Python, Splunk, MongoDB and other modern technologies. This role requires strong collaboration with various stakeholders across ENS teams, prioritizing workload requirements, and executing on critical deliverables. It also requires collaborating with several different technology teams across Enterprise Technology Services, meeting on a regular cadence with stakeholders to provide updates on existing requirements and gather new ones. Skills Required: Hands-on Splunk developer with a minimum of 3 years of experience in Splunk solutions development, including dashboards, reports and alerts. Hands-on experience in analysing complex data sets and building critical business insights. Experience working with SQL and NoSQL databases. Strong understanding of SDLC methodology and DevOps concepts. Knowledge of networking (infrastructure) basics. Experience working as an Agile team member with a good understanding of Agile principles. Excellent problem-solving and communication skills.
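As a rough illustration of the Splunk-plus-Python work described here, a minimal sketch follows that runs an SPL search through Splunk's REST export endpoint. The host, credentials, index, and search are placeholders, not the team's actual setup; real dashboards, reports, and alerts would be built in Splunk itself.

```python
# Minimal sketch: running a Splunk search via the REST API and printing results.
# Host, credentials, and the SPL query are hypothetical placeholders.
import json
import requests

SPLUNK_BASE = "https://splunk.example.com:8089"   # placeholder search head

resp = requests.post(
    f"{SPLUNK_BASE}/services/search/jobs/export",
    auth=("svc_account", "***"),                   # placeholder credentials
    data={
        "search": "search index=dmz_assets | stats count by host",  # hypothetical index
        "output_mode": "json",
        "earliest_time": "-24h",
    },
    stream=True,
)
resp.raise_for_status()

# The export endpoint streams one JSON object per line.
for line in resp.iter_lines():
    if line:
        event = json.loads(line)
        print(event.get("result", {}))
```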
Description: We are looking for a talented, agile and self-driven Data Engineer to join the Enterprise Network Services (ENS) Analytics & Insights team. As a Data Engineer, you will be developing analytics solutions for critical IT Asset Management functions, including Hardware & Software Inventory Management and end-to-end hardware lifecycle management, leveraging powerful technologies and tools including Python, Splunk, MongoDB and other modern technologies. This role requires strong collaboration with various stakeholders across ENS teams, prioritizing workload requirements, and executing on critical deliverables. It also requires collaborating with several different technology teams across Enterprise Technology Services, meeting on a regular cadence with stakeholders to provide updates on existing requirements and gather new ones. Skills Required: Hands-on programmer with a minimum of 5 years of experience in Python programming. Hands-on experience in analysing complex data sets and building critical business insights. Hands-on experience in developing complex dashboards / reports / visualizations using any data visualization tool. Experience working with SQL and NoSQL databases. Strong understanding of SDLC methodology and DevOps concepts. Knowledge of networking (infrastructure) basics. Experience working as an Agile team member with a good understanding of Agile principles. Excellent problem-solving and communication skills. Skills Desired: Experience developing high-performance dashboards using Splunk or Power BI. Ability to analyse volumes of data and develop data insights to support critical business decisions. Ability to build scalable, extensible and reusable software solutions.
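To give a flavour of the Python-plus-MongoDB inventory work this posting describes, here is a minimal sketch using pymongo. The connection string, database, collection, and field names are invented for illustration.

```python
# Minimal sketch: summarising a hypothetical hardware inventory collection in
# MongoDB with an aggregation pipeline. URI, database, and field names are
# placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # placeholder URI
inventory = client["ens_itam"]["hardware_assets"]   # hypothetical db/collection

# Count non-retired assets by lifecycle state and model, e.g. to feed a report.
pipeline = [
    {"$match": {"status": {"$ne": "retired"}}},
    {"$group": {"_id": {"state": "$lifecycle_state", "model": "$model"},
                "count": {"$sum": 1}}},
    {"$sort": {"count": -1}},
]

for row in inventory.aggregate(pipeline):
    print(row["_id"], row["count"])
```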
Roles and Responsibilities Design, develop, test, deploy, and maintain large-scale Java applications using a Spring Boot microservices architecture. Collaborate with cross-functional teams to identify requirements and implement solutions that meet business needs. Ensure high-quality code by following best practices in software engineering, testing, and debugging. Participate in code reviews to improve the overall quality of the application. Troubleshoot issues related to application performance, scalability, and reliability.
A bachelor's degree in Computer Science or a related field. 5-7 years of experience working as a hands-on developer in Sybase, DB2, and ETL technologies. Worked extensively on data integration, designing, and developing reusable interfaces. Advanced experience in Python, DB2, Sybase, shell scripting, Unix, Perl scripting, database platforms, and database design and modeling. Expert-level understanding of data warehouses, core database concepts and relational database design. Experience in writing stored procedures, optimization, and performance tuning. Strong technology acumen and a deep strategic mindset. Proven track record of delivering results. Proven analytical skills and experience making decisions based on hard and soft data. A desire and openness to learning and continuous improvement, both of yourself and your team members. Hands-on experience in API development is a plus. Good to have: experience with Business Intelligence tools, Source-to-Pay applications such as SAP Ariba, and Accounts Payable systems. Skills Required: Familiarity with Postgres and Python is a plus.
Key Responsibilities: Design, develop, and execute advanced automated test scripts using Tricentis TOSCA. Work extensively with SAP modules, particularly in Warehousing and Logistics (Supply Chain), to ensure robust test coverage and business process validation. Collaborate with functional and technical teams to understand requirements, identify automation opportunities and ensure high-quality test deliverables. Troubleshoot and resolve issues in automation scripts, frameworks, and integrations. Act as a subject matter expert for TOSCA automation within the team, mentoring and providing technical support to other automation engineers. Maintain and optimise automation frameworks for scalability, reusability and efficiency. Participate in test planning, estimation and reporting activities. Ensure alignment with best practices, quality standards and project timelines. Requirements: Proven, advanced-level experience with Tricentis TOSCA automation tools. Solid working experience with SAP, specifically in Warehousing and Logistics (Supply Chain) modules. Strong understanding of end-to-end SAP business processes. Ability to start contributing immediately with minimal onboarding. Experience in mentoring, coaching, or supporting other automation engineers. Strong problem-solving skills and attention to detail. Excellent communication and stakeholder engagement skills. Preferred Skills: Knowledge of SAP S/4HANA. Experience with API testing and test data management. Familiarity with continuous integration tools and DevOps practices. Personal Attributes: Self-motivated and proactive. Collaborative team player with a knowledge-sharing mindset. Ability to work under pressure and manage multiple priorities effectively.
Primary Responsibilities: Develop, test, and deploy Public Cloud Controls across multi-cloud environments, with a primary focus on Google Cloud Platform (GCP). Provide security recommendations and solutions for cloud-native and migrated applications across major cloud providers (GCP, AWS, Azure). Act as a subject matter expert for cloud security tools, DevOps practices, and secure cloud architecture across all major cloud platforms. Implement and manage security controls specific to GCP, including organizational policies, VPC Service Controls, IAM policies, and API security. Collaborate with cross-functional teams and vendors to design, deploy, and validate cloud security services. Monitor and respond to security posture changes and configuration drifts in the cloud environment; coordinate remediation efforts with relevant stakeholders. Integrate, configure, and document compliant cloud infrastructure and supporting services. Troubleshoot and analyze root causes of security issues or bugs introduced by security solutions; drive fixes or mitigations. Work closely with internal risk management, security architecture, and incident response teams to ensure cloud services are compliant with required security controls. Participate in a globally distributed team environment to design and implement scalable, secure cloud solutions. Required Skills: 5+ years of experience in software engineering and/or cloud platform engineering, with a strong focus on Google Cloud Platform. Deep understanding of the Shared Responsibility Model and associated cloud security risks. Hands-on experience developing security controls across the cloud security lifecycle (prevent, detect, respond, remediate). Experience configuring native security features on GCP, such as Org Policies, VPC SC, IAM, and API controls. Proficiency with event-driven and serverless security solutions across cloud providers (e.g., Google Cloud Functions, AWS Lambda, Azure Functions, Automation Runbooks). Strong background in DevOps processes and secure software development lifecycle (Secure SDLC). Proficiency with Infrastructure as Code (IaC) tools, particularly Terraform. Familiarity with logging, monitoring, and data pipeline architectures in cloud environments. Strong scripting skills (Python, PowerShell, Bash, or Go). Ability to produce high-quality technical documentation, including architectural designs and runbooks. Experience building and securing CI/CD pipelines. Hands-on experience with GitHub Actions, Jenkins, or similar CI/CD tools. Familiarity with IT Service Management (ITSM) practices and tools. Strong interpersonal and communication skills, with the ability to explain complex technical issues to non-technical stakeholders. Experience with risk control frameworks and interfacing with risk or regulatory teams. Experience in regulated industries such as finance is a plus. Knowledge of Policy-as-Code concepts and platforms (e.g., Rego/OPA) is a plus. Cloud security certifications (e.g., GCP Professional Cloud Security Engineer, AWS Security Specialty, Azure Security Engineer) are a plus.
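As a small illustration of the detect side of the control lifecycle this posting describes, here is a minimal Python sketch that flags publicly exposed Cloud Storage buckets. It assumes the google-cloud-storage client library and ambient GCP credentials; the project ID is a placeholder, and a real control would feed findings into a remediation workflow rather than print them.

```python
# Minimal sketch of a detect-style control: flag GCS buckets whose IAM policy
# grants access to allUsers or allAuthenticatedUsers. Project ID is a placeholder.
from google.cloud import storage

PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}

client = storage.Client(project="my-gcp-project")   # placeholder project ID

for bucket in client.list_buckets():
    policy = bucket.get_iam_policy(requested_policy_version=3)
    for binding in policy.bindings:
        exposed = PUBLIC_MEMBERS & set(binding.get("members", []))
        if exposed:
            # A real control would raise a finding or trigger remediation here.
            print(f"Bucket {bucket.name}: role {binding['role']} granted to {exposed}")
```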
5+ years of experience in Java development with Spring Boot and JPA, substantial experience in RESTful API development, good experience with Kubernetes (building services that will run within Kubernetes, with a good working knowledge of K8s manifests), and reasonable experience with relational databases.
We are currently in search of an experienced DevOps engineer to join the Perimeter Email (security engineering) DevOps team. The successful candidate will supplement the existing DevOps team in the design, delivery and maintenance of the firm's Perimeter Email infrastructure. Core Responsibilities As an experienced automation engineer, you will work with a global team of colleagues to analyze, design, plan and deploy automation for 'toil' reduction/elimination. You will contribute to transforming existing systems into infrastructure as code, config as code and monitoring as code, built on common approaches, enterprise products/solutions and best practices. Introduce anomaly and fault detection to improve service stability, security and service levels. Reimagine and build the next generation of secure, mission-critical email infrastructure for a global financial powerhouse. Work with a diverse and passionate group of individuals from across the globe. Provide architecture assurance on messaging initiatives. Provide a secure environment by implementing controls to manage and mitigate risks. Provide consultancy services to IT Security teams. Develop automated metrics reporting capabilities. Create, review, maintain and update documentation, including documenting and publishing fixes in a central knowledge base. Qualifications and Skills Unix networking. Familiarity with SDLC concepts and toolsets (Git, Jenkins, ALM). Familiarity with Agile principles and DevOps practices. Intermediate or above knowledge of automation (Ansible). GNU/Linux operating system and GNU tools. Infrastructure protocols such as DNS, LDAP, NTP, SNMP, SMTP. SMTP email-related protocols and standards: SMTP, DMARC, DKIM, SPF. Programmer mindset, with at minimum intermediate-level skill in Python and shell scripting. Familiarity with email security techniques (anti-malware, anti-spam). Soft skills Self-learner. Communicator. Team player. Preferred Professional Experience Bachelor's degree in Management Information Systems, Information Technology, or equivalent work experience. 8+ years of experience with RHEL/Linux systems. 5+ years' experience using Python and/or Bash scripting to solve and automate common system tasks. 3+ years of experience with automation using Ansible. Experience developing, testing, deploying, and maintaining software using SDLC management systems such as Git, Jira, ALM, Jenkins. Experience with Linux system administration. Familiarity with email security products from Proofpoint® and/or Fortinet®. Fundamental working knowledge of networking, DNS, LDAP, SSL/TLS. Experience with large-scale SMTP email gateway deployment, email hygiene and email authentication frameworks (DMARC, DKIM, SPF).
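As an example of the small Python automation this role calls for, here is a minimal sketch that checks a domain's SPF and DMARC TXT records. It assumes the dnspython package; the domain is a placeholder and a production check would cover DKIM selectors and policy parsing as well.

```python
# Minimal sketch: look up a domain's SPF and DMARC TXT records with dnspython.
# The domain is a placeholder.
import dns.resolver


def txt_records(name: str) -> list[str]:
    """Return all TXT record strings for a DNS name, or [] if none exist."""
    try:
        answers = dns.resolver.resolve(name, "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return []
    return [b"".join(r.strings).decode() for r in answers]


domain = "example.com"   # placeholder

spf = [r for r in txt_records(domain) if r.startswith("v=spf1")]
dmarc = [r for r in txt_records(f"_dmarc.{domain}") if r.startswith("v=DMARC1")]

print("SPF:", spf or "missing")
print("DMARC:", dmarc or "missing")
```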
About the Role: We're looking for a passionate Python Developer with a strong foundation in backend development using FastAPI (latest version) and modern Python tools. You'll be building scalable APIs and working closely with a team that values clean, efficient code. Tech Stack: Languages: Python 3.x Framework: FastAPI Database: MongoDB ORM/ODM: Beanie Validation: Pydantic Key Responsibilities: Develop and maintain RESTful APIs using FastAPI Integrate MongoDB with Beanie ODM for seamless data operations Design and validate models using Pydantic Write clean, testable, and well-documented code Collaborate with cross-functional teams for feature development and deployment Requirements: 1–4 years of hands-on experience with Python development Solid understanding of FastAPI and asynchronous programming Experience working with MongoDB and Beanie Familiarity with Pydantic for data validation Strong debugging and problem-solving skills
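A minimal sketch of the stack named above (FastAPI + Beanie over MongoDB + Pydantic validation) follows. The connection string, model, and routes are illustrative placeholders rather than the team's actual API.

```python
# Minimal sketch: FastAPI app with a Beanie Document backed by MongoDB and a
# Pydantic-validated request model. URI, model, and routes are placeholders.
from contextlib import asynccontextmanager

from beanie import Document, init_beanie
from fastapi import FastAPI
from motor.motor_asyncio import AsyncIOMotorClient
from pydantic import BaseModel, Field


class Item(Document):              # Beanie Document = Pydantic model + Mongo collection
    name: str
    price: float = Field(gt=0)

    class Settings:
        name = "items"             # collection name


class ItemIn(BaseModel):           # request body validated by Pydantic
    name: str
    price: float = Field(gt=0)


@asynccontextmanager
async def lifespan(app: FastAPI):
    client = AsyncIOMotorClient("mongodb://localhost:27017")   # placeholder URI
    await init_beanie(database=client["shop"], document_models=[Item])
    yield


app = FastAPI(lifespan=lifespan)


@app.post("/items", response_model=Item)
async def create_item(payload: ItemIn) -> Item:
    return await Item(**payload.model_dump()).insert()


@app.get("/items", response_model=list[Item])
async def list_items() -> list[Item]:
    return await Item.find_all().to_list()
```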
Key Responsibilities: - Design, develop, and implement solutions using Microsoft Dynamics 365 CE. - Customize and configure Dynamics 365 CE applications to meet business requirements. - Design and implement C# plugins, workflows, and custom business logic in Dynamics 365 CE. - Work with Dataverse to create, update, and manage data models and relationships. - Integrate Dynamics 365 CE with other systems and applications using Power Automate, Azure Functions, Web APIs, and connectors. - Perform data migration and ensure data integrity. - Perform unit testing and provide technical support and troubleshooting to ensure reliability and performance. - Create and maintain technical documentation. - Stay updated with the latest features and updates in Dynamics 365 CE. - Maintain best practices in ALM and source control (Azure DevOps / GitHub Actions). Skills Required: - Bachelor's degree in Computer Science, Information Technology, or a related field. - Proficiency in customizing and configuring Dynamics 365 CE applications. - Strong knowledge of C#, .NET, JavaScript, and SQL. - Experience with Dynamics 365 CE integration using REST APIs and web services. - Experience working with Dataverse, Power Automate and Azure services (Azure Functions, Logic Apps). - Familiarity with the security model in Dynamics 365 CE, including authentication via Azure AD and OAuth. - Experience with DevOps for packaging and deploying solutions using GitHub pipelines, CI/CD pipelines and automated deployments is a plus. - Strong analytical and problem-solving skills.
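The plugin work above is C#-based, but Dataverse also exposes an OData Web API; as an illustration of the integration side, here is a minimal Python sketch that reads and creates rows through it. The organization URL, table, and bearer token (normally obtained from Azure AD via OAuth) are placeholders.

```python
# Minimal sketch: reading and creating Dataverse rows over the OData Web API.
# Org URL and token are placeholders; acquire the token via Azure AD in practice.
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"    # placeholder environment URL
TOKEN = "<azure-ad-bearer-token>"               # placeholder credential

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
    "Content-Type": "application/json",
}

# Read the first five accounts (name only).
resp = requests.get(
    f"{ORG_URL}/api/data/v9.2/accounts",
    params={"$select": "name", "$top": "5"},
    headers=headers,
)
resp.raise_for_status()
for row in resp.json().get("value", []):
    print(row["name"])

# Create a new account row.
resp = requests.post(
    f"{ORG_URL}/api/data/v9.2/accounts",
    json={"name": "Contoso Ltd"},
    headers=headers,
)
resp.raise_for_status()
```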
Key Responsibilities Develop and configure Power Pages for user engagement. Design and implement Power Pages web templates, entity lists, and forms. Work with Dataverse to manage data models and relationships. Develop custom JavaScript, TypeScript, and Liquid templates. Ensure security compliance with Azure AD, OAuth, and RBAC authentication. Optimize Power Pages performance and troubleshoot issues. Maintain best practices in ALM and source control. Skills Required Bachelor's degree in Computer Science, IT, or a related field. 3-5 years of experience in Power Pages & Power Apps Portal development. Proficiency in JavaScript, TypeScript, Liquid, and CSS. Experience with REST APIs, Web Services, Power Automate, Azure Functions. Familiarity with OAuth and Azure AD authentication. DevOps experience (CI/CD pipelines, GitHub Actions) is a plus. Strong analytical and problem-solving skills. Excellent communication skills to collaborate with teams globally.
Required Skills 5+ years of experience in software and/or cloud platform engineering, with a strong emphasis on Service Mesh technologies (e.g., Istio, Linkerd, Consul, AWS App Mesh). Deep understanding of the Shared Responsibility Model and cloud-native security risks. Experience across the security assurance lifecycle: prevent, detect, respond, and remediate. Familiarity with Cloud Security Posture Management (CSPM) tools such as Wiz, Prisma, Check Point CloudGuard. Experience with Policy-as-Code using platforms like OPA and Rego. Hands-on experience with event-driven serverless security controls (e.g., AWS Lambda, Azure Functions). Strong grasp of DevOps workflows, Secure SDLC, and Infrastructure as Code (IaC) tools like Terraform. Working knowledge of logging and data pipeline architectures in cloud environments. Proficiency in scripting languages such as PowerShell, Python, Bash, or Go. Familiarity with Agile methodologies and CI/CD pipelines (GitHub Actions, Jenkins). Experience with ITSM processes and risk control frameworks. Ability to communicate complex technical concepts to non-technical stakeholders. Experience in the financial industry and cloud certifications are a plus.
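To illustrate the "event-driven serverless security control" pattern mentioned above, here is a minimal Python sketch of an AWS Lambda handler that re-applies the S3 public access block to a bucket named in an incoming event. The event shape and wiring (e.g., via EventBridge or Config findings) are assumptions, not a prescribed design.

```python
# Minimal sketch of an event-driven remediation control: a Lambda handler that
# re-applies the S3 public access block to a bucket named in the event payload.
# The event shape is a hypothetical placeholder.
import boto3

s3 = boto3.client("s3")


def lambda_handler(event, context):
    bucket = event["detail"]["bucketName"]   # hypothetical event payload
    s3.put_public_access_block(
        Bucket=bucket,
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )
    return {"remediated": bucket}
```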
Roles and Responsibilities Design, develop, test, and deploy chatbots using Python and Linux. Stay up-to-date with industry trends in artificial intelligence (AI), machine learning (ML), and deep learning (DL). Collaborate with cross-functional teams to integrate machine learning models into chatbot applications. Develop scalable and efficient algorithms for natural language processing (NLP) tasks such as text classification, sentiment analysis, and entity resolution. Troubleshoot issues related to chatbot performance, data quality, and integration with other systems.
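As a small illustration of the text-classification work this role mentions, here is a minimal intent classifier sketch using scikit-learn. The example utterances and labels are made up; a real chatbot pipeline would train on far more data and likely use a richer NLP stack.

```python
# Minimal sketch: a tiny intent classifier for chatbot-style text classification.
# Training utterances and labels are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

utterances = [
    "reset my password", "I forgot my password",
    "what is my account balance", "show me my balance",
    "talk to a human agent", "connect me to support",
]
intents = ["password", "password", "balance", "balance", "agent", "agent"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(utterances, intents)

print(model.predict(["I can't remember my password"]))   # expected: ['password']
```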