BPI development experience is mandatory (at least a couple of projects with BPI and progressive experience in the Telecom domain). Full-stack Java developers preferred, including AngularJS and REST APIs. Database: Neo4j and Postgres. Workflow: Camunda.
Description: We are looking for a skilled Power BI Developer with Snowflake expertise to join our Analytics and Insights team. As a Power BI Developer, you will be responsible for designing, developing, and maintaining business intelligence solutions using Power BI and Snowflake, enabling data-driven decision-making within the Enterprise Network Services organization.
Skills Required:
- Hands-on Power BI Developer with 3+ years of experience in report creation, data modelling, and data visualization leveraging Power BI and Snowflake.
- Strong understanding of DAX and Power BI's data modelling features.
- 1+ year of experience working with the Snowflake database, with a strong understanding of database design and development concepts and Snowflake's SQL dialect.
- Hands-on experience analyzing complex data sets and building critical business insights.
- Strong understanding of SDLC methodology and DevOps concepts.
- Experience working as an Agile team member with a good understanding of Agile principles.
- Excellent problem-solving and communication skills.
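The Snowflake side of the role above involves querying modelled data that Power BI reports will consume. The sketch below is a rough illustration only, using the snowflake-connector-python package to run a query and pull the result into a pandas DataFrame; the account, credentials, warehouse, and table names are hypothetical placeholders, not details from the posting.

```python
# Illustrative sketch only: query Snowflake and load the result into pandas.
# All connection parameters and the example table are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # placeholder
    user="report_user",         # placeholder
    password="***",             # use a secrets manager in practice
    warehouse="ANALYTICS_WH",   # placeholder
    database="ENS_DB",          # placeholder
    schema="REPORTING",         # placeholder
)
try:
    cur = conn.cursor()
    # Aggregate the kind of data a Power BI dataset might consume.
    cur.execute(
        "SELECT region, SUM(incident_count) AS incidents "
        "FROM network_incidents GROUP BY region"
    )
    df = cur.fetch_pandas_all()  # requires the connector's pandas extras
    print(df.head())
finally:
    conn.close()
```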
Description: We are looking for a talented, agile, and self-driven Data Engineer to join the Enterprise Network Services (ENS) Analytics & Insights team. As a Data Engineer, you will develop analytics solutions for critical Network Infrastructure Governance initiatives, including inventorying assets in the DMZ network and automating critical processes and workflows associated with IT Governance, leveraging powerful technologies and tools including Python, Splunk, MongoDB, and other modern technologies. This role requires strong collaboration with stakeholders across ENS teams, prioritizing workload requirements, and executing on critical deliverables. It also requires collaborating with several technology teams across Enterprise Technology Services, meeting on a regular cadence with stakeholders to provide updates on existing requirements and gather new ones.
Skills Required:
- Hands-on Splunk Developer with a minimum of 3 years of experience in Splunk solutions development, including dashboards, reports, and alerts.
- Hands-on experience analyzing complex data sets and building critical business insights.
- Experience working with SQL and NoSQL databases.
- Strong understanding of SDLC methodology and DevOps concepts.
- Knowledge of networking (infrastructure) basics.
- Experience working as an Agile team member with a good understanding of Agile principles.
- Excellent problem-solving and communication skills.
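The posting above pairs Splunk development with inventory-style analytics in Python. As a minimal sketch only, the example below runs a one-off search against Splunk's REST export endpoint with the requests library and prints the streamed results; the host, credentials, index, and field names are hypothetical placeholders.

```python
# Illustrative sketch only: run a one-off search against Splunk's REST API
# (/services/search/jobs/export) and stream the JSON results.
# Host, credentials, index, and field names are hypothetical placeholders.
import json
import requests

SPLUNK_URL = "https://splunk.example.com:8089/services/search/jobs/export"

resp = requests.post(
    SPLUNK_URL,
    auth=("svc_analytics", "***"),  # placeholder credentials
    data={
        "search": "search index=network sourcetype=asset_scan "
                  "| stats count BY dmz_zone",
        "output_mode": "json",
    },
    stream=True,
)
resp.raise_for_status()

for line in resp.iter_lines():
    if not line:
        continue
    event = json.loads(line)
    # Each streamed line carries one result row under "result".
    print(event.get("result"))
```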
Description: We are looking for a talented, agile, and self-driven Data Engineer to join the Enterprise Network Services (ENS) Analytics & Insights team. As a Data Engineer, you will develop analytics solutions for critical IT Asset Management functions, including hardware and software inventory management and end-to-end hardware lifecycle management, leveraging powerful technologies and tools including Python, Splunk, MongoDB, and other modern technologies. This role requires strong collaboration with stakeholders across ENS teams, prioritizing workload requirements, and executing on critical deliverables. It also requires collaborating with several technology teams across Enterprise Technology Services, meeting on a regular cadence with stakeholders to provide updates on existing requirements and gather new ones.
Skills Required:
- Hands-on programmer with a minimum of 5 years of experience in Python programming.
- Hands-on experience analyzing complex data sets and building critical business insights.
- Hands-on experience developing complex dashboards, reports, and visualizations using any data visualization tool.
- Experience working with SQL and NoSQL databases.
- Strong understanding of SDLC methodology and DevOps concepts.
- Knowledge of networking (infrastructure) basics.
- Experience working as an Agile team member with a good understanding of Agile principles.
- Excellent problem-solving and communication skills.
Skills Desired:
- Experience developing high-performance dashboards using Splunk or Power BI.
- Ability to analyze large volumes of data and develop data insights to support critical business decisions.
- Ability to build scalable, extensible, and reusable software solutions.
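Day-to-day work in the role above is likely to involve summarizing inventory data held in MongoDB. The snippet below is a minimal sketch under assumed collection and field names (hardware_assets, site, lifecycle_stage), using pymongo and pandas to pivot asset counts per site and lifecycle stage; it is an illustration, not the team's actual data model.

```python
# Illustrative sketch only: summarize a hardware asset inventory stored in
# MongoDB. Host, database, collection, and field names are hypothetical.
import pandas as pd
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
assets = client["ens_itam"]["hardware_assets"]     # placeholder names

# Count assets per site and lifecycle stage with an aggregation pipeline.
pipeline = [
    {"$group": {"_id": {"site": "$site", "stage": "$lifecycle_stage"},
                "count": {"$sum": 1}}},
]
rows = [
    {"site": doc["_id"]["site"],
     "stage": doc["_id"]["stage"],
     "count": doc["count"]}
    for doc in assets.aggregate(pipeline)
]

summary = pd.DataFrame(rows).pivot_table(
    index="site", columns="stage", values="count", fill_value=0
)
print(summary)
```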
Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale Java applications using a Spring Boot microservices architecture.
- Collaborate with cross-functional teams to identify requirements and implement solutions that meet business needs.
- Ensure high-quality code by following best practices in software engineering, testing, and debugging techniques.
- Participate in code reviews to improve the overall quality of the application.
- Troubleshoot issues related to application performance, scalability, and reliability.
- A bachelor's degree in Computer Science or a related field.
- 5-7 years of experience working as a hands-on developer in Sybase, DB2, and ETL technologies.
- Extensive work on data integration and on designing and developing reusable interfaces.
- Advanced experience in Python, DB2, Sybase, shell scripting, Unix, Perl scripting, database platforms, and database design and modeling.
- Expert-level understanding of data warehousing, core database concepts, and relational database design.
- Experience in writing stored procedures, optimization, and performance tuning.
- Strong technology acumen and a deep strategic mindset.
- Proven track record of delivering results.
- Proven analytical skills and experience making decisions based on hard and soft data.
- A desire and openness to learning and continuous improvement, both of yourself and your team members.
- Hands-on experience in development of APIs is a plus.
- Good to have: experience with Business Intelligence tools, Source-to-Pay applications such as SAP Ariba, and Accounts Payable systems.
Skills Required:
- Familiarity with Postgres and Python is a plus.
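Since the role above combines Python with database-centric ETL work, here is a minimal extract-transform-load sketch. It assumes SQLAlchemy-compatible connection strings and hypothetical invoices and stg_open_invoices tables (loosely echoing the Accounts Payable context); it illustrates the general pattern, not the actual interfaces involved.

```python
# Illustrative ETL sketch only: extract rows, apply a small transform, and
# load them into Postgres. Connection strings and table names are placeholders.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql://etl_user:***@source-host/finance")
target = create_engine("postgresql://etl_user:***@dwh-host/warehouse")

# Extract: open invoices from the source system.
df = pd.read_sql(
    "SELECT invoice_id, vendor, amount, currency, created_at "
    "FROM invoices WHERE status = 'OPEN'",
    source,
)

# Transform: normalize vendor names and stamp the load time.
df["vendor"] = df["vendor"].str.strip().str.upper()
df["loaded_at"] = pd.Timestamp.now(tz="UTC")

# Load: append into a staging table in the warehouse.
df.to_sql("stg_open_invoices", target, if_exists="append", index=False)
print(f"Loaded {len(df)} rows")
```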
Key Responsibilities:
- Design, develop, and execute advanced automated test scripts using Tricentis TOSCA.
- Work extensively with SAP modules, particularly in Warehousing and Logistics (Supply Chain), to ensure robust test coverage and business process validation.
- Collaborate with functional and technical teams to understand requirements, identify automation opportunities, and ensure high-quality test deliverables.
- Troubleshoot and resolve issues in automation scripts, frameworks, and integrations.
- Act as a subject matter expert for TOSCA automation within the team, mentoring and providing technical support to other automation engineers.
- Maintain and optimise automation frameworks for scalability, reusability, and efficiency.
- Participate in test planning, estimation, and reporting activities.
- Ensure alignment with best practices, quality standards, and project timelines.
Requirements:
- Proven, advanced-level experience with Tricentis TOSCA automation tools.
- Solid working experience with SAP, specifically in Warehousing and Logistics (Supply Chain) modules.
- Strong understanding of end-to-end SAP business processes.
- Ability to start contributing immediately with minimal onboarding.
- Experience in mentoring, coaching, or supporting other automation engineers.
- Strong problem-solving skills and attention to detail.
- Excellent communication and stakeholder engagement skills.
Preferred Skills:
- Knowledge of SAP S/4HANA.
- Experience with API testing and test data management.
- Familiarity with continuous integration tools and DevOps practices.
Personal Attributes:
- Self-motivated and proactive.
- Collaborative team player with a knowledge-sharing mindset.
- Ability to work under pressure and manage multiple priorities effectively.
Primary Responsibilities:
- Develop, test, and deploy Public Cloud Controls across multi-cloud environments, with a primary focus on Google Cloud Platform (GCP).
- Provide security recommendations and solutions for cloud-native and migrated applications across major cloud providers (GCP, AWS, Azure).
- Act as a subject matter expert for cloud security tools, DevOps practices, and secure cloud architecture across all major cloud platforms.
- Implement and manage security controls specific to GCP, including organizational policies, VPC Service Controls, IAM policies, and API security.
- Collaborate with cross-functional teams and vendors to design, deploy, and validate cloud security services.
- Monitor and respond to security posture changes and configuration drift in the cloud environment; coordinate remediation efforts with relevant stakeholders.
- Integrate, configure, and document compliant cloud infrastructure and supporting services.
- Troubleshoot and analyze root causes of security issues or bugs introduced by security solutions; drive fixes or mitigations.
- Work closely with internal risk management, security architecture, and incident response teams to ensure cloud services are compliant with required security controls.
- Participate in a globally distributed team environment to design and implement scalable, secure cloud solutions.
Required Skills:
- 5+ years of experience in software engineering and/or cloud platform engineering, with a strong focus on Google Cloud Platform.
- Deep understanding of the Shared Responsibility Model and associated cloud security risks.
- Hands-on experience developing security controls across the cloud security lifecycle (prevent, detect, respond, remediate).
- Experience configuring native security features on GCP, such as Org Policies, VPC Service Controls, IAM, and API controls.
- Proficiency with event-driven and serverless security solutions across cloud providers (e.g., Google Cloud Functions, AWS Lambda, Azure Functions, Automation Runbooks).
- Strong background in DevOps processes and the secure software development lifecycle (Secure SDLC).
- Proficiency with Infrastructure as Code (IaC) tools, particularly Terraform.
- Familiarity with logging, monitoring, and data pipeline architectures in cloud environments.
- Strong scripting skills (Python, PowerShell, Bash, or Go).
- Ability to produce high-quality technical documentation, including architectural designs and runbooks.
- Experience building and securing CI/CD pipelines.
- Hands-on experience with GitHub Actions, Jenkins, or similar CI/CD tools.
- Familiarity with IT Service Management (ITSM) practices and tools.
- Strong interpersonal and communication skills, with the ability to explain complex technical issues to non-technical stakeholders.
- Experience with risk control frameworks and interfacing with risk or regulatory teams.
- Experience in regulated industries such as finance is a plus.
- Knowledge of Policy-as-Code concepts and platforms (e.g., Rego/OPA) is a plus.
- Cloud security certifications (e.g., GCP Professional Cloud Security Engineer, AWS Security Specialty, Azure Security Engineer) are a plus.
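To give a flavour of the scripted, detective side of the controls work described above, the sketch below uses the google-cloud-storage client to flag buckets whose IAM policy grants access to allUsers or allAuthenticatedUsers. The project ID is a placeholder, and this is only an illustrative complement to preventive controls such as organization policies and VPC Service Controls, not a detail taken from the posting.

```python
# Illustrative detective-control sketch only: flag GCS buckets that grant
# access to allUsers or allAuthenticatedUsers. The project ID is a
# placeholder; preventive controls (Org Policies, VPC SC) still apply.
from google.cloud import storage

PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}

def find_public_buckets(project_id: str) -> list[str]:
    client = storage.Client(project=project_id)
    flagged = []
    for bucket in client.list_buckets():
        policy = bucket.get_iam_policy(requested_policy_version=3)
        for binding in policy.bindings:
            # Each binding maps a role to a set of members.
            if PUBLIC_MEMBERS & set(binding.get("members", [])):
                flagged.append(f"{bucket.name}: {binding['role']}")
    return flagged

if __name__ == "__main__":
    for finding in find_public_buckets("my-gcp-project"):  # placeholder
        print("PUBLIC:", finding)
```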
5+ years of experience in Java development with Spring Boot and JPA, substantial experience with RESTful API development, good experience with Kubernetes (building services that will run on Kubernetes, plus a good working knowledge of K8s manifests), and reasonable experience with relational databases.