
2209 Shell Scripting Jobs - Page 47

JobPe aggregates these listings for easy access, but you apply directly on the original job portal.

7.0 - 10.0 years

11 - 16 Lacs

Bengaluru

Work from Office


Job Description
Are you a seasoned DevOps professional with a passion for scalable cloud infrastructure and automation? Join our team and help shape resilient, high-performing systems that power the future of our tech stack! We're looking for a Senior DevOps Engineer with 7-10 years of hands-on experience in designing, building, and managing cloud environments, particularly on AWS, and driving automation through Terraform, Ansible, and Azure DevOps CI/CD pipelines.
TITLE: Senior DevOps Engineer
REPORTING TO: IT Operations Manager
WORKING LOCATION: Bangalore, India
JOB RESPONSIBILITIES:
Cloud Infrastructure Design & Management: Design scalable and highly available cloud infrastructure, networks, and security. Provision and maintain AWS infrastructure, including EC2, RDS, and other services. Understand and manage AWS infrastructure components such as IAM, EC2, VPCs, Subnets, Route Tables, ALBs, NLBs, and DNS, amongst other cloud objects. Optimize cloud resources for cost-efficiency and performance. Prepare and analyse infrastructure-related architecture documentation.
Automation and CI/CD: Automate infrastructure management using Infrastructure as Code (IaC) tools such as Terraform and Ansible. Implement and manage CI/CD pipelines using Azure DevOps. Automate tasks including deployments, monitoring, and backups using PowerShell, Bash, and Python.
Security and Monitoring: Integrate security best practices into DevOps pipelines to ensure secure application delivery. Implement infrastructure monitoring using tools like ICINGA.
System Administration: Administer Windows and Linux systems, including group policies, patch management, and log management, amongst others.
Collaboration and Training: Work closely with development teams to ensure smooth application deployment. Provide training and mentoring to junior engineers on cloud deployment and management.
Disaster Recovery and Kubernetes Management: Design and test disaster recovery plans for business continuity. Manage Kubernetes clusters for container orchestration and scalability.
REQUIRED EXPERIENCE AND SKILLS:
- 7-10 years of experience in DevOps or related roles.
- Proven expertise in designing and managing scalable, highly available cloud infrastructure.
- Hands-on experience with IaC tools such as Terraform and Ansible.
- Extensive experience with Azure DevOps for CI/CD pipeline automation.
- Strong scripting skills in PowerShell, Bash, shell scripting, and Python.
- Experience integrating security best practices into DevOps pipelines.
- Proficiency in monitoring tools, preferably ICINGA.
- Expertise in AWS services including EC2, RDS, IAM, VPCs, Subnets, Route Tables, ALBs, NLBs, and DNS.
- Solid background in Windows and Linux administration, including patch management.
- Ability to prepare and analyze infrastructure architecture documentation.
- Experience in cloud cost optimization and infrastructure performance tuning.
- Solid understanding of disaster recovery planning and testing.
- Experience in Kubernetes cluster management.
- Strong knowledge of GitHub and experience in managing repositories.
Technology skills: Cloud Infrastructure Administration: AWS, Azure, and other cloud platforms. CI/CD Pipelines: Azure DevOps, Octopus Deploy.
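For readers comparing the scripting expectations in listings like this one, here is a minimal, illustrative Bash sketch of the kind of backup automation described above. It is not part of the posting: the region, the `Backup=true` tag, and the assumption that AWS CLI credentials are already configured are all placeholders.

```bash
#!/usr/bin/env bash
# Illustrative sketch: snapshot EBS volumes attached to instances tagged Backup=true.
# Region and tag name are assumptions; requires a configured AWS CLI.
set -euo pipefail

REGION="${AWS_REGION:-ap-south-1}"

# Collect volume IDs from instances carrying the (hypothetical) Backup=true tag.
volumes=$(aws ec2 describe-instances \
  --region "$REGION" \
  --filters "Name=tag:Backup,Values=true" \
  --query "Reservations[].Instances[].BlockDeviceMappings[].Ebs.VolumeId" \
  --output text)

for vol in $volumes; do
  echo "Snapshotting $vol"
  aws ec2 create-snapshot \
    --region "$REGION" \
    --volume-id "$vol" \
    --description "Nightly backup $(date +%F)" >/dev/null
done
```

A script like this would typically run from a scheduler (cron or an SSM maintenance window) rather than by hand.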

Posted 3 weeks ago

Apply

4.0 - 5.0 years

22 - 27 Lacs

Hyderabad

Work from Office


About Sanofi: We are an innovative global healthcare company, driven by one purpose: we chase the miracles of science to improve people's lives. Our team, across some 100 countries, is dedicated to transforming the practice of medicine by working to turn the impossible into the possible. We provide potentially life-changing treatment options and life-saving vaccine protection to millions of people globally, while putting sustainability and social responsibility at the center of our ambitions. Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions that will accelerate Manufacturing Supply performance and help bring drugs and vaccines to patients faster, to improve health and save lives.
Who You Are: You are a dynamic Data Engineer interested in challenging the status quo to design and develop globally scalable solutions that are needed by Sanofi's advanced analytics, AI, and ML initiatives for the betterment of our global patients and customers. You are a valued influencer and leader who has contributed to making key datasets available to data scientists, analysts, and consumers throughout the enterprise to meet vital business needs. You have a keen eye for improvement opportunities while continuing to fully comply with all data quality, security, and governance standards.
Our vision for digital, data analytics and AI: Join us on our journey in enabling Sanofi's digital transformation by becoming an AI-first organization. This means:
- AI Factory - Versatile Teams Operating in Cross Functional Pods: utilizing digital and data resources to develop AI products, bringing data management, AI, and product development skills to products, programs, and projects to create an agile, fulfilling, and meaningful work environment.
- Leading Edge Tech Stack: experience building products that will be deployed globally on a leading-edge tech stack.
- World Class Mentorship and Training: working with renowned leaders and academics in machine learning to further develop your skillsets.
There are multiple vacancies across our Digital profiles and NA region. Further assessments will be completed to determine the specific function and level of hired candidates.
Job Highlights:
- Propose and establish technical designs to meet business and technical requirements
- Develop and maintain data engineering solutions based on requirements and design specifications using appropriate tools and technologies
- Create data pipelines / ETL pipelines and optimize performance
- Test and validate developed solutions to ensure they meet requirements
- Create design and development documentation based on standards for knowledge transfer, training, and maintenance
- Work with business and product teams to understand requirements, and translate them into technical needs
- Adhere to and promote best practices and standards for code management, automated testing, and deployments
- Leverage existing or create new standard data pipelines within Sanofi to bring value through business use cases
- Develop automated tests for CI/CD pipelines
- Gather/organize large, complex data assets, and perform relevant analysis
- Conduct peer reviews for quality, consistency, and rigor for production-level solutions
- Actively contribute to the Data Engineering community and define leading practices and frameworks
- Communicate results and findings in a clear, structured manner to stakeholders
- Remain up to date on the company's standards, industry practices, and emerging technologies
Key Functional Requirements & Qualifications:
- Experience working with cross-functional teams to solve complex data architecture and engineering problems
- Demonstrated ability to learn new data and software engineering technologies in a short amount of time
- Good understanding of agile/scrum development processes and concepts
- Able to work in a fast-paced, constantly evolving environment and manage multiple priorities
- Strong technical analysis and problem-solving skills related to data and technology solutions
- Excellent written, verbal, and interpersonal skills with the ability to communicate ideas, concepts, and solutions to peers and leaders
- Pragmatic and capable of solving complex issues, with technical intuition and attention to detail
- Service-oriented, flexible, and approachable team player
- Fluent in English (other languages a plus)
Key Technical Requirements & Qualifications:
- Bachelor's degree or equivalent in Computer Science, Engineering, or a relevant field
- 4 to 5+ years of experience in data engineering, integration, data warehousing, business intelligence, business analytics, or a comparable role with relevant technologies and tools, such as Spark/Scala, Informatica/IICS/dbt
- Understanding of data structures and algorithms
- Working knowledge of scripting languages (Python, shell scripting)
- Experience in cloud-based data platforms (Snowflake is a plus)
- Experience with job scheduling and orchestration (Airflow is a plus)
- Good knowledge of SQL and relational database technologies/concepts
- Experience working with data models and query tuning
Nice to haves:
- Experience working in the life sciences/pharmaceutical industry is a plus
- Familiarity with data ingestion through batch, near real-time, and streaming environments
- Familiarity with data warehouse concepts and architectures (data mesh a plus)
- Familiarity with source code management tools (GitHub a plus)
Pursue Progress. Discover Extraordinary. Better is out there. Better medications, better outcomes, better science. But progress doesn't happen without people - people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let's be those people.
Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!
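The data engineering listing above pairs ETL pipelines with working knowledge of shell scripting. As an illustrative sketch only, the wrapper below shows the sort of pre-load validation a shell script typically handles around a pipeline step; the file path, log location, and the `load_to_warehouse.py` loader are hypothetical placeholders.

```bash
#!/usr/bin/env bash
# Illustrative pipeline-step wrapper: validate an incoming file before loading it.
# Paths and the loader command are hypothetical placeholders.
set -euo pipefail

SRC="/data/inbound/sales_$(date +%F).csv"
LOG="/var/log/etl/sales_load.log"

log() { echo "$(date '+%F %T') $*" | tee -a "$LOG"; }

[[ -s "$SRC" ]] || { log "ERROR: $SRC missing or empty"; exit 1; }

# Basic sanity check: a header plus at least one data row.
rows=$(wc -l < "$SRC")
(( rows > 1 )) || { log "ERROR: no data rows in $SRC"; exit 1; }

log "Loading $SRC ($((rows - 1)) rows)"
python load_to_warehouse.py --file "$SRC" >>"$LOG" 2>&1   # hypothetical loader
log "Load finished"
```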

Posted 3 weeks ago

Apply

7.0 - 9.0 years

27 - 42 Lacs

Coimbatore

Work from Office


Skill: MuleSoft platform admin with RTF. Level: SA/M. Client Round (Yes/No): Yes. Location constraint, if any: PAN India. Shift timing: 2 PM to 11 PM.
JD: Expertise on the MuleSoft Platform and operational work – including setting up the environment, upgrading, certificate renewal, and overall maintenance of the platform. Expertise on Runtime Fabric (RTF) on BYOK is a must. Knowledge of Azure (AKS) is preferred. Good understanding of the alerting, monitoring, and logging mechanisms of the RTF platform.
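Certificate renewal duties like the ones named above are commonly backed by a small expiry-watch script. The following is a hedged Bash sketch, not part of the posting: the hostnames are placeholders standing in for whatever endpoints the RTF platform actually exposes.

```bash
#!/usr/bin/env bash
# Hedged sketch: warn when a TLS certificate expires within 30 days.
# Hostnames are placeholders; adapt to the real platform endpoints.
set -euo pipefail

HOSTS=("api.example.internal:443" "console.example.internal:443")
WARN_DAYS=30

for host in "${HOSTS[@]}"; do
  expiry=$(echo | openssl s_client -connect "$host" -servername "${host%%:*}" 2>/dev/null \
           | openssl x509 -noout -enddate | cut -d= -f2)
  expiry_s=$(date -d "$expiry" +%s)
  days_left=$(( (expiry_s - $(date +%s)) / 86400 ))
  echo "$host expires in $days_left days"
  (( days_left > WARN_DAYS )) || echo "WARNING: renew certificate for $host"
done
```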

Posted 3 weeks ago

Apply

6.0 - 10.0 years

27 - 42 Lacs

Noida

Work from Office


Terraform, Jenkins, Artifactory, ELK stack for monitoring, GCP cloud aware, Kubernetes, Anthos, GCP Administration
Job Summary: Join our dynamic team as an Infra Dev Specialist, where you will leverage your expertise in Artifactory, Anthos, the ELK Stack, Kubernetes, Jenkins, GCP, Ansible, Terraform, and DevOps to drive innovation in the Consumer Lending domain. With a hybrid work model and rotational shifts, you will play a crucial role in optimizing infrastructure and enhancing system performance, contributing to our mission of delivering exceptional financial solutions.
Responsibilities:
- Implement and manage infrastructure solutions using Artifactory, Anthos, and the ELK Stack to ensure seamless integration and efficient operations.
- Collaborate with cross-functional teams to design and deploy Kubernetes clusters, enhancing scalability and reliability of applications.
- Utilize Jenkins for continuous integration and continuous deployment processes, streamlining development workflows and reducing time-to-market.
- Optimize cloud infrastructure on GCP, ensuring cost-effective and secure solutions that align with business objectives.
- Develop and maintain automation scripts using Ansible and Terraform, improving system provisioning and configuration management.
- Drive DevOps practices to enhance collaboration between development and operations teams, fostering a culture of continuous improvement.
- Analyze and troubleshoot system issues, providing timely resolutions to minimize downtime and ensure business continuity.
- Monitor system performance and implement enhancements to improve efficiency and user experience.
- Collaborate with stakeholders in the Consumer Lending domain to understand requirements and deliver tailored solutions that meet business needs.
- Participate in rotational shifts to provide 24/7 support, ensuring high availability and reliability of infrastructure services.
- Contribute to the development of best practices and standards for infrastructure management, promoting consistency and quality across projects.
- Engage in ongoing learning and development to stay updated with the latest technologies and industry trends.
- Support hybrid work model initiatives, balancing remote and on-site work to maximize productivity and team collaboration.
Qualifications:
- Possess strong technical skills in Artifactory, Anthos, the ELK Stack, Kubernetes, Jenkins, GCP, Ansible, Terraform, and DevOps, essential for effective infrastructure management.
- Demonstrate expertise in the Consumer Lending domain, understanding industry-specific requirements and challenges.
- Exhibit proficiency in cloud solutions, particularly GCP, to design and implement scalable and secure infrastructure.
- Showcase experience in automation tools like Ansible and Terraform, crucial for efficient system provisioning and configuration.
- Display knowledge of DevOps practices, fostering collaboration and continuous improvement within teams.
- Hold a minimum of 6 and a maximum of 10 years of relevant experience, ensuring a solid foundation in infrastructure development.
- Adapt to rotational shifts, providing consistent support and maintaining high availability of services.
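Troubleshooting Kubernetes/GKE clusters, as called out above, often starts with a quick scripted health sweep. A minimal, illustrative Bash example follows; it assumes kubectl is already authenticated against the target cluster and is not drawn from the posting itself.

```bash
#!/usr/bin/env bash
# Illustrative cluster health sweep: flag NotReady nodes and non-running pods.
# Assumes kubectl is already pointed at the target GKE/Anthos cluster.
set -euo pipefail

echo "== Nodes not Ready =="
kubectl get nodes --no-headers | awk '$2 != "Ready" {print}'

echo "== Pods not Running/Completed =="
kubectl get pods --all-namespaces --no-headers \
  | awk '$4 != "Running" && $4 != "Completed" {print $1, $2, $4}'
```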

Posted 3 weeks ago

Apply

6.0 - 10.0 years

27 - 42 Lacs

Kolkata

Work from Office


Primary Skill & JD: IaC – Ansible, Terraform, Jenkins with AWS/GCP; DevOps and scripting

Posted 3 weeks ago

Apply

6.0 - 10.0 years

27 - 42 Lacs

Noida

Work from Office


Terraform, Jenkins, Artifactory, Kubernetes, GCP
Job Summary: Join our dynamic team as an Infra Dev Specialist, where you will leverage your expertise in Artifactory, Anthos, the ELK Stack, Kubernetes, Jenkins, GCP, Ansible, Terraform, and DevOps to drive innovation in the Consumer Lending domain. With a hybrid work model and rotational shifts, you will play a crucial role in optimizing infrastructure and enhancing system performance, contributing to our mission of delivering exceptional financial solutions.
Responsibilities:
- Implement and manage infrastructure solutions using Artifactory, Anthos, and the ELK Stack to ensure seamless integration and efficient operations.
- Collaborate with cross-functional teams to design and deploy Kubernetes clusters, enhancing scalability and reliability of applications.
- Utilize Jenkins for continuous integration and continuous deployment processes, streamlining development workflows and reducing time-to-market.
- Optimize cloud infrastructure on GCP, ensuring cost-effective and secure solutions that align with business objectives.
- Develop and maintain automation scripts using Ansible and Terraform, improving system provisioning and configuration management.
- Drive DevOps practices to enhance collaboration between development and operations teams, fostering a culture of continuous improvement.
- Analyze and troubleshoot system issues, providing timely resolutions to minimize downtime and ensure business continuity.
- Monitor system performance and implement enhancements to improve efficiency and user experience.
- Collaborate with stakeholders in the Consumer Lending domain to understand requirements and deliver tailored solutions that meet business needs.
- Participate in rotational shifts to provide 24/7 support, ensuring high availability and reliability of infrastructure services.
- Contribute to the development of best practices and standards for infrastructure management, promoting consistency and quality across projects.
- Engage in ongoing learning and development to stay updated with the latest technologies and industry trends.
- Support hybrid work model initiatives, balancing remote and on-site work to maximize productivity and team collaboration.
Qualifications:
- Possess strong technical skills in Artifactory, Anthos, the ELK Stack, Kubernetes, Jenkins, GCP, Ansible, Terraform, and DevOps, essential for effective infrastructure management.
- Demonstrate expertise in the Consumer Lending domain, understanding industry-specific requirements and challenges.
- Exhibit proficiency in cloud solutions, particularly GCP, to design and implement scalable and secure infrastructure.
- Showcase experience in automation tools like Ansible and Terraform, crucial for efficient system provisioning and configuration.
- Display knowledge of DevOps practices, fostering collaboration and continuous improvement within teams.
- Hold a minimum of 6 and a maximum of 10 years of relevant experience, ensuring a solid foundation in infrastructure development.
- Adapt to rotational shifts, providing consistent support and maintaining high availability of services.
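This listing also names Artifactory alongside the CI/CD tooling. As a hedged illustration only, the snippet below publishes a build artifact to an Artifactory repository using its standard deploy-by-PUT REST pattern; the server URL, repository path, artifact name, and credential variables are all placeholders.

```bash
#!/usr/bin/env bash
# Hedged sketch: publish a build artifact to Artifactory over its REST API.
# URL, repository, artifact, and credentials are placeholders.
set -euo pipefail

ARTIFACT="build/libs/app-1.0.0.jar"
TARGET="https://artifactory.example.com/artifactory/libs-release-local/app/1.0.0/app-1.0.0.jar"

# A checksum header lets the server verify the upload.
sha1=$(sha1sum "$ARTIFACT" | awk '{print $1}')

curl --fail -u "$ARTIFACTORY_USER:$ARTIFACTORY_TOKEN" \
     -H "X-Checksum-Sha1: $sha1" \
     -T "$ARTIFACT" "$TARGET"
```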

Posted 3 weeks ago

Apply

3.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office


Oracle Analytics Cloud team is hiring! You will design, develop, troubleshoot, debug, and deliver innovative analytic features that involve Java, Python, Terraform, JavaScript, Docker, Kubernetes, and other Cloud technologies. You will work closely with your peer developers located across the world, including Mexico, the UK, India, and the USA.
Key responsibilities:
- Design, develop, test, and deliver new features on a world-class analytics platform suitable for deployment to both the Oracle Cloud and on-premise environments
- Lead the creation of formal design specifications and coding of complex systems
- Lead DevOps activities to maintain our fleet in Oracle Cloud
- Contribute to product infrastructure deployment in new regions across the globe
- Contribute to automation and the CI/CD framework using the Oracle Cloud Infrastructure tools
- Work with the support team to address customer issues
- Build software applications following established secure coding standards
- Communicate continually with the project teams, explaining progress on the development effort
- Contribute to continuous improvement by suggesting improvements to software and deployment architecture or recommending new technologies
- Ensure quality of work through development standards and QA procedures
- Perform maintenance and enhancements on existing software
Key Qualifications:
- BS/MS in Computer Science or a related major
- Strong written and verbal English communication skills
- Self-motivated and passionate about developing high-quality software
- Basic understanding of Agile/Scrum development methodologies
- DevOps mindset to address real-time customer issues in production
- Solid skills utilizing one or more object-oriented languages - Java, Python
- Solid background in Cloud technologies and IaC (Infrastructure as Code) with Terraform
- Strong CI/CD skills
- Strong fundamentals and hands-on experience working with containerization technologies such as Docker and Kubernetes
- Knowledge of Linux and shell scripting
- Hands-on experience using source control tools such as Git and build technologies such as Maven/Gradle
- Knowledge of Configuration Management tools (Ansible, Chef) is a plus
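The Docker/Kubernetes and CI/CD items above usually boil down to a build-push-rollout step. Here is a minimal, illustrative sketch, not the team's actual pipeline: the registry, image, and deployment names are invented for the example, and it assumes docker and kubectl are already authenticated.

```bash
#!/usr/bin/env bash
# Illustrative CI step: build, push an image, then roll a Kubernetes deployment.
# Registry, image, and deployment names are placeholders.
set -euo pipefail

TAG="$(git rev-parse --short HEAD)"
IMAGE="registry.example.com/analytics/web:$TAG"

docker build -t "$IMAGE" .
docker push "$IMAGE"

# Point the running deployment at the new image and wait for the rollout.
kubectl set image deployment/analytics-web web="$IMAGE"
kubectl rollout status deployment/analytics-web --timeout=300s
```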

Posted 3 weeks ago

Apply

3.0 - 7.0 years

14 - 16 Lacs

Bengaluru

Work from Office


Technology Deployment and Management Service (TDMS) organization is a critical arm of the Oracle FLEXCUBE consulting group. TDMS delivers Oracle Technology services for FSGBU product customers, while the applications team focuses on the application customization and setup. We are looking for a highly capable, self-motivated, and independent Cloud Operations Engineer based in India. If you are passionate about Oracle technology as well as cloud computing, this is the ideal role you've been waiting for. Our team supports technologies that are available both in the Cloud and on-premise.
- Extensive experience with multiple Public Cloud providers (OCI, Azure, AWS, GCP)
- Extensive experience supporting Cloud or PaaS / SaaS production environments
- Experience with Cloud Services and Cloud Automation solutions
- Manage and administer cloud platforms on OCI / Azure / AWS hosting enterprise applications and databases of Oracle / MySQL on Linux / Windows environments, and hosting infrastructure in accordance with company security guidelines
- Experience in providing Level 2/3 support on Public Cloud (OCI, AWS, Azure, etc.)
- Strong analysis and troubleshooting skills and experience
- Experience in carrying out cost analysis
- Automation - experience in the likes of Ansible or CloudFormation
- Scripting experience in Python, PowerShell, or Ansible
- Platform experience with the likes of RedHat, Linux, or Windows advantageous
- Experience in Containers/VMware
- Knowledge of ITIL best practices
- Responsible for developing processes for enforcing cloud governance, architecture, operating procedures, monitoring, and system standards
- Respond to incidents, own them and drive them to completion, and participate in root cause analysis
- Orchestrating and automating cloud-based platforms with a primary focus on OCI, AWS, and Azure
- Deploying and debugging cloud initiatives as needed in accordance with best practices throughout the development lifecycle
- Employing exceptional problem-solving skills, with the ability to see and solve issues before they snowball into problems
- Educating teams on the implementation of new cloud-based initiatives and writing SOPs (Standard Operating Procedures) to accomplish repetitive tasks
Requirements:
- Graduate in Computer Science or Engineering
- Certification in OCI / AWS / Azure as Solutions Architect given a high priority; any Cloud Security certification a plus
- Experience in infrastructure setup, services operation, monitoring, and governance in public cloud environments (OCI, AWS, Azure)
- Strong experience working with enterprise application architectures and Databases (Oracle), clustering, and High Availability
- Extensive knowledge of Linux / Windows based systems including hardware, software, networking, cloud storage, and fault-tolerant designs
- Very strong in writing Puppet modules for deployment automation, Terraform, and scripting languages like Perl, Python, and PowerShell
- Experience in DevOps setup procedures and processes, workflow automation, and CI/CD pipeline development
- Excellent communication and written skills and the ability to generate and evangelize architectural documentation / diagrams across many teams
- Skilled at working in tandem with a team of engineers, or alone as required
Career Level - IC2
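Routine patching and automation duties like those in the listing above are often driven by Ansible ad-hoc commands. The following is a hedged sketch, not the team's procedure: the inventory path and the `linux_prod` host group are placeholder assumptions, and the example targets yum-based hosts only.

```bash
#!/usr/bin/env bash
# Hedged sketch: OS security-patch sweep via Ansible ad-hoc commands, with a log trail.
# The inventory file and the "linux_prod" group are placeholders.
set -euo pipefail

INVENTORY="inventories/prod/hosts.ini"
LOG="/var/log/ops/patch_$(date +%F).log"

# List the hosts that would be touched before making any change.
ansible linux_prod -i "$INVENTORY" --list-hosts | tee -a "$LOG"

# Apply security updates on yum-based hosts and capture the output.
ansible linux_prod -i "$INVENTORY" -b \
  -m yum -a "name=* state=latest security=yes" | tee -a "$LOG"
```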

Posted 3 weeks ago

Apply

5.0 - 7.0 years

5 - 8 Lacs

Chennai

Work from Office


DESCRIPTION: Python Developer
Location: San Ramon, CA, USA - On-site. Hiring Company Name: ACHNET Demo. Hiring Company Location: San Ramon, CA, USA. Job Type: Full-time. Experience Level: Mid-Senior Level - 5-7 years.
Job Overview: ACHNET Demo is seeking a skilled and experienced Python Developer to join our dynamic team. The ideal candidate will be responsible for designing, developing, and implementing Python-based applications, with a focus on solutions for the banking and insurance industries. This role requires a strong understanding of software development principles, excellent problem-solving skills, and the ability to work collaboratively in a fast-paced environment.
Key Responsibilities:
- Design, develop, and maintain Python applications and services.
- Write clean, efficient, and well-documented code.
- Develop and maintain shell scripts for automation and system administration tasks.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Participate in code reviews and contribute to improving code quality.
- Troubleshoot, debug, and resolve software defects and issues.
- Stay up-to-date with the latest industry trends and technologies.
Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.
Must-Have Skills:
- 5-7 years of experience in Python development.
- Proficiency in shell scripting (e.g., Bash).
- Experience with software development methodologies (e.g., Agile, Scrum).
- Strong understanding of object-oriented programming principles.
- Experience with version control systems (e.g., Git).
Good-to-Have Skills:
- Experience with banking or insurance industry applications.
- Knowledge of database technologies (e.g., SQL, NoSQL).
- Experience with cloud platforms (e.g., AWS, Azure, GCP).
- Familiarity with DevOps practices.
Pay Rate/Salary: Commensurate with experience. Number of Openings: 1.
QUALIFICATIONS: Must-Have Skills: Python, shell scripting, Bash, Agile, Scrum, object-oriented programming, Git. Good-To-Have Skills: SQL, NoSQL, AWS, Azure, GCP, DevOps. Minimum Education Level: Bachelor's or equivalent. Years of Experience: 5-7 years.
ADDITIONAL INFORMATION: Pay Range: Commensurate with experience. Work Type: Full-time. Location: Chennai, Tamil Nadu, India. Job ID: Achnet-Pyt-FA1679. Address: 3130 Crow Canyon Pl, Ste 205, San Ramon, CA 94583.
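Since the role above explicitly pairs Python work with "shell scripts for automation and system administration tasks", here is a small, hedged Bash example of that kind of housekeeping; the log directory and retention windows are placeholder assumptions, not details from the posting.

```bash
#!/usr/bin/env bash
# Illustrative admin task: compress and prune application logs.
# Directory and retention windows are placeholder assumptions.
set -euo pipefail

LOG_DIR="/var/log/myapp"   # hypothetical application log directory

# Compress logs older than 1 day that are not already compressed.
find "$LOG_DIR" -name "*.log" -type f -mtime +1 -exec gzip -f {} \;

# Delete compressed logs older than 30 days.
find "$LOG_DIR" -name "*.log.gz" -type f -mtime +30 -delete

echo "Log maintenance completed for $LOG_DIR"
```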

Posted 3 weeks ago

Apply

4.0 - 6.0 years

27 - 42 Lacs

Chennai

Work from Office


Skill: AKS, Istio service mesh. Shift timing: Afternoon shift. Location: Chennai, Kolkata, Bangalore. Excellent AKS, GKE, or Kubernetes admin experience. Good troubleshooting experience on Istio service mesh and connectivity issues. Experience with GitHub Actions or a similar CI/CD tool to build pipelines. Working experience on any cloud, preferably Azure or Google, with good networking knowledge. Experience in Python or shell scripting. Experience in building dashboards and configuring alerts using Prometheus and Grafana.
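Two of the checks implied above (mesh coverage and alert status) can be scripted. The sketch below is illustrative only: the Prometheus URL is a placeholder, it assumes kubectl and jq are installed and authenticated, and it simply lists pods with no istio-proxy container plus currently firing alerts from the standard Prometheus /api/v1/alerts endpoint.

```bash
#!/usr/bin/env bash
# Hedged sketch: list pods missing the Istio sidecar and show firing alerts.
# Prometheus URL is a placeholder; kubectl and jq must already be configured.
set -euo pipefail

echo "== Pods without an istio-proxy container =="
kubectl get pods --all-namespaces -o json \
  | jq -r '.items[] | select([.spec.containers[].name] | index("istio-proxy") | not)
           | "\(.metadata.namespace)/\(.metadata.name)"'

echo "== Currently firing Prometheus alerts =="
curl -s "http://prometheus.example.internal:9090/api/v1/alerts" \
  | jq -r '.data.alerts[] | select(.state=="firing") | .labels.alertname'
```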

Posted 3 weeks ago

Apply

5.0 - 8.0 years

12 - 18 Lacs

Pune, Delhi / NCR

Hybrid


5+ years of experience in deploying, enhancing, and troubleshooting AWS services (EC2, S3, RDS, VPC, CloudTrail, CloudFront, Lambda, EKS, ECS). 3+ years of experience with serverless technologies and services, Docker, and Kubernetes. Experience in JavaScript, Bash, Python, and TypeScript.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

20 - 27 Lacs

Bengaluru

Work from Office


We help the world run better.
What you'll do: As a Linux Cloud Platform DevOps Engineer, you will be in charge of providing platform support for SAP's key cloud products, which run on Linux OS on various cloud platforms and on containerized Kubernetes clusters.
In your day-to-day you will:
- Build and maintain the life cycle of Linux servers running in native and cloud environments (Google, Azure, AWS, Ali Converged Cloud), as well as various infra components in containerized Kubernetes clusters.
- Act as Subject Matter Expert (SME) for various Linux and Cloud topics.
- Diagnose and troubleshoot Linux OS issues and perform performance tuning.
- Use DevOps tools to develop and maintain automations.
- Be a team player in an international set-up, acting as a bridge to many Application, Infrastructure, and Tools teams, and collaborate closely with team members.
- Engage in various Projects/Proof of Concepts and Continuous Improvement initiatives.
- Contribute to the implementation and improvement of various ITIL processes.
What you bring: Do you have a Cloud mindset and innovative thinking? Are you analytical and self-motivated, and do you enjoy problem solving? Are you a service- and customer-oriented team player? Do you enjoy continuous learning and working efficiently in a fast-paced environment? If this sounds like you, do you also bring:
- 8-12 years of total experience
- Bachelor's, Master's, or equivalent degree, with hands-on Linux administration experience
- Cloud technology experience in OpenStack / Google / Azure / AWS / Alibaba Cloud
- Good understanding of server virtualization, networking, and storage
- Experience with DNS, LDAP, mail setup, user access management, and security groups
- Automation experience in a DevOps environment, preferably using Ansible
- Knowledge and working experience with GitHub Actions for CI/CD
- Proficiency in programming with shell scripting and Python
- Knowledge and hands-on experience in containerizing applications and orchestrating them on Kubernetes
- AI knowledge would be preferable
- Strong communication skills, both written and spoken, in English
Meet your team: As part of the Platform Engineering Delivery (PED) team in Bangalore, which is part of Public Cloud ERP Delivery, you will manage the infrastructure of SAP's various cloud products like S/4HANA Public Cloud, IBP, ByD, C4C, etc. As a part of this team, you will work with the latest cloud technologies, gaining experience in DevOps and Site Reliability Engineering. The team, spread across India, Germany, and Hungary, operates on a hybrid model, with 3 days per week in the office and the rest of the days worked remotely from the base location.
Bring out your best: SAP's culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone - regardless of background - feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world. SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. Successful candidates might be required to undergo a background verification with an external vendor.
Requisition ID: 427384 | Work Area: Software-Development Operations | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid | Posted Date: May 30, 2025
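The daily platform checks described in the SAP listing above (DNS, LDAP, general server health) are natural shell-script territory. This is a hedged, illustrative sketch only: the LDAP host name and the 85% disk threshold are assumptions, not values from the posting.

```bash
#!/usr/bin/env bash
# Illustrative daily check for a Linux fleet host: DNS, LDAP reachability, disk.
# Host names and thresholds are placeholder assumptions.
set -euo pipefail

echo "== DNS resolution =="
getent hosts ldap.example.corp >/dev/null && echo "DNS OK" || echo "DNS FAILED"

echo "== LDAPS port reachability =="
timeout 5 bash -c '</dev/tcp/ldap.example.corp/636' \
  && echo "LDAPS reachable" || echo "LDAPS unreachable"

echo "== Filesystems above 85% =="
df -hP | awk 'NR>1 && int($5) > 85 {print $6, $5}'
```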

Posted 3 weeks ago

Apply

2.0 - 9.0 years

25 - 27 Lacs

Pune

Work from Office


Nice to meet you! We're a leader in data and AI. Through our software and services, we inspire customers around the world to transform data into intelligence - and questions into answers. We're also a debt-free multi-billion-dollar organization on our path to IPO-readiness. If you're looking for a dynamic, fulfilling career coupled with flexibility and a world-class employee experience, you'll find it here.
About the role: The role is based in the Pune R&D Center, at the SAS R&D Pune facility. For this role, you will join the SAS9 ARD team. We are looking for a C developer to help develop and maintain the SAS 9 platform-based solutions. You will help with plan and design work and take responsibility for large parts of the application. Furthermore, you will be joining a friendly team with a broad range of experience, to develop and maintain SAS 9 solutions.
Primary Responsibilities: Your primary responsibility will be to develop and maintain the SAS Platform products code base, facilitate the selection of correct technical choices via collaboration, and implement configuration and download tools for scale, performance, and usability. You will be mentored by senior developers to ensure the seamless development of installation and configuration tools for deploying SAS software. You would be expected to contribute to the enhancement of deployment automation tools and scripts. You will be involved in troubleshooting issues, bug fixing, and providing support to ensure a smooth user experience.
Requirements:
- Good understanding of OOP and SOLID principles.
- Ability to program in C and Java in a Linux and/or Windows environment.
- Experience working on a variety of languages like C, C++, TK, Java, Python.
- Experience programming in a threaded kernel environment (TK/MVA Multi Vendor Architecture).
- Experience in SAS programming would be a huge advantage.
- Programming skills with Core Java, AWT, and Swing.
- Understanding of XML, JSON, and REST.
- Experience with Agile software development methodologies.
- Knowledge of software development processes and quality standards.
- Familiarity with Agile methodology and with tools such as Jira and Confluence.
- Working knowledge of tools like Git and Gerrit.
- Comfortable working on Windows and Linux operating systems.
- Good debugging skills and logical problem-solving capability; a quick self-learner able to adopt and use new technologies.
- Ability to use Test-Driven and CI/CD development methodologies (unit testing and integration testing).
- Demonstrated experience with disciplined unit, regression, and integration testing.
- Experience with a scripting language (Python, Perl, shell scripting) or Golang will be helpful.
Mandatory Technical Skills:
- Good exposure to design and development of C applications, REST APIs, and Java.
- Good design and programming skills in Java or Golang using an IDE.
- Sound OOPS/OOAD concepts, knowledge of SQL concepts, and exposure to implementing design patterns.
- Experience working on a variety of languages such as C, C++, TK, Go, Java, Python.
- Proficiency in C, Java, Spring, design patterns, Gradle, Jenkins, JavaScript, Unix/LAX.
Total Years of Relevant Experience: 2-9 years of relevant experience (there are multiple positions).
Education Preference: Bachelor's in Computer Science or a relevant field.
Additional Information: SAS only sends emails from verified sas.com email addresses and never asks for sensitive, personal information or money.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

12 - 16 Lacs

Bengaluru

Work from Office


At PowerSchool, we are a dedicated team of innovators guided by our shared purpose of powering personalized education for students around the world. From the central office to the classroom to the home, PowerSchool supports the entire educational ecosystem as the global leader of cloud-based software for K-12 education. Our employees make it all possible, and a career with us means you're joining a successful team committed to engaging, empowering, and improving the K-12 education experience everywhere.
Team Overview: Our Hosting team manages and supports the infrastructure of all our platforms, from hardware to software, to operating systems to PowerSchool products. This collaborative team helps meet the needs of an evolving technology and business model with a specialized focus on protecting customer data and keeping information secure.
Responsibilities: A Cloud Operations engineer is responsible for designing, deploying, managing, and optimizing cloud infrastructure and services. They ensure the availability, performance, and security of cloud-based applications and resources while monitoring and responding to incidents and issues. This role requires expertise in cloud platforms, automation, and best practices. Your day-to-day job will consist of:
- Executing day-to-day tasks related to monitoring and managing moderately complex database systems and infrastructure to ensure service availability and data security
- Executing ticket triage, investigation, and resolution of reported incidents; participating in 24x7 on-call rotations to resolve incidents in support of production systems
- Monitoring and configuring system resources (memory, disk space, CPU utilization)
- Improving customer experience with performance tuning and database optimization
- Managing and performing database server software patches and updates
- Performing root cause analysis on trended database incidents and major outages up through the application stack
- Developing automation that can trigger off a variety of industry-standard monitoring tools to resolve common issues in the environment or maintain operating levels
Qualifications
Minimum Qualifications:
- Minimum of 4 years' experience as an Oracle DBA
- Bachelor's degree in Computer Science, Information Systems, or an equivalent degree
- Strong working knowledge of RDBMS
- Strong working knowledge of Windows, Linux, or Ubuntu
- Experience with Oracle Enterprise Manager (OEM)
Preferred Qualifications:
- Experience with database replication and high availability
- Experience with MS SQL
- Experience with cloud compute infrastructure such as Azure and AWS
- Experience with cloud providers and their associated database services, such as AWS, AWS RDS, or Azure
- Experience with data warehouses and Extract, Transform, Load (ETL)
- Experience with Site Reliability Engineering
- Experience with shell scripting, batch scripting, and Perl scripting
EEO Commitment: PowerSchool is committed to a diverse and inclusive workplace. PowerSchool is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status. Our inclusive culture empowers PowerSchoolers to deliver the best results for our customers. We not only celebrate the diversity of our workforce, we celebrate the diverse ways we work.
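The "monitor system resources (memory, disk space, CPU utilization)" duty above is the classic use case for a small shell check on a database host. This is an illustrative sketch only; the thresholds are assumptions, not PowerSchool's operating levels.

```bash
#!/usr/bin/env bash
# Hedged sketch: warn on high load, low memory, or full filesystems on a DB host.
# Thresholds are illustrative assumptions.
set -euo pipefail

# 1-minute load average compared against CPU count.
cores=$(nproc)
load=$(awk '{print $1}' /proc/loadavg)
awk -v l="$load" -v c="$cores" 'BEGIN { if (l+0 > c+0) print "WARNING: load", l, "exceeds", c, "cores" }'

# Available memory in MiB.
mem_avail=$(awk '/MemAvailable/ {print int($2/1024)}' /proc/meminfo)
(( mem_avail > 1024 )) || echo "WARNING: only ${mem_avail} MiB memory available"

# Filesystems over 90% full.
df -hP | awk 'NR>1 && int($5) > 90 {print "WARNING:", $6, "is", $5, "full"}'
```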

Posted 3 weeks ago

Apply

2.0 - 9.0 years

25 - 27 Lacs

Pune

Work from Office


Nice to meet you! We're a leader in data and AI. Through our software and services, we inspire customers around the world to transform data into intelligence - and questions into answers. We're also a debt-free multi-billion-dollar organization on our path to IPO-readiness. If you're looking for a dynamic, fulfilling career coupled with flexibility and a world-class employee experience, you'll find it here.
About the role: The role is based in the Pune R&D Center, at the SAS R&D Pune facility. For this role, you will join the SAS9 ARD team. We are looking for a C developer to help develop and maintain the SAS 9 platform-based solutions. You will help with plan and design work and take responsibility for large parts of the application. Furthermore, you will be joining a friendly team with a broad range of experience, to develop and maintain SAS 9 solutions.
Primary Responsibilities: Your primary responsibility will be to develop and maintain the SAS Platform products code base, facilitate the selection of correct technical choices via collaboration, and implement configuration and download tools for scale, performance, and usability. You will be mentored by senior developers to ensure the seamless development of installation and configuration tools for deploying SAS software. You would be expected to contribute to the enhancement of deployment automation tools and scripts. You will be involved in troubleshooting issues, bug fixing, and providing support to ensure a smooth user experience.
Requirements:
- Good understanding of OOP and SOLID principles.
- Ability to program in C and Java in a Linux and/or Windows environment.
- Experience working on a variety of languages like C, C++, TK, Java, Python.
- Experience programming in a threaded kernel environment (TK/MVA Multi Vendor Architecture).
- Experience in SAS programming would be a huge advantage.
- Programming skills with Core Java, AWT, and Swing.
- Understanding of XML, JSON, and REST.
- Experience with Agile software development methodologies.
- Knowledge of software development processes and quality standards.
- Familiarity with Agile methodology and with tools such as Jira and Confluence.
- Working knowledge of tools like Git and Gerrit.
- Comfortable working on Windows and Linux operating systems.
- Good debugging skills and logical problem-solving capability; a quick self-learner able to adopt and use new technologies.
- Ability to use Test-Driven and CI/CD development methodologies (unit testing and integration testing).
- Demonstrated experience with disciplined unit, regression, and integration testing.
- Experience with a scripting language (Python, Perl, shell scripting) or Golang will be helpful.
Mandatory Technical Skills:
- Good exposure to design and development of C applications, REST APIs, and Java.
- Good design and programming skills in Java or Golang using an IDE.
- Sound OOPS/OOAD concepts, knowledge of SQL concepts, and exposure to implementing design patterns.
- Experience working on a variety of languages such as C, C++, TK, Go, Java, Python.
- Proficiency in C, Java, Spring, design patterns, Gradle, Jenkins, JavaScript, Unix/LAX.
Total Years of Relevant Experience: 2-9 years of relevant experience (there are multiple positions).
Education Preference: Bachelor's in Computer Science or a relevant field.
Diverse and Inclusive: At SAS, it's not about fitting into our culture - it's about adding to it. We believe our people make the difference. Our diverse workforce brings together unique talents and inspires teams to create amazing software that reflects the diversity of our users and customers. Our commitment to diversity is a priority to our leadership, all the way up to the top; and it's essential to who we are. To put it plainly: you are welcome here.
Additional Information: SAS only sends emails from verified sas.com email addresses and never asks for sensitive, personal information or money.

Posted 3 weeks ago

Apply

3.0 - 6.0 years

4 - 9 Lacs

Noida, Greater Noida, Delhi / NCR

Work from Office


Role & responsibilities:
- Build and manage robust CI/CD pipelines using tools like AWS CodePipeline, Jenkins, or GitHub Actions.
- Deploy and maintain infrastructure using Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation.
- Design, implement, and manage scalable cloud infrastructure on AWS (EC2, ECS, EKS, RDS, Lambda, S3, etc.).
- Containerize applications using Docker and orchestrate with Amazon ECS or EKS (Kubernetes).
- Monitor infrastructure using AWS CloudWatch and CloudTrail, and integrate with tools like Prometheus and Grafana.
- Ensure security best practices across AWS resources, including IAM, VPC, encryption, and backups.
- Collaborate with development and QA teams to streamline deployments and ensure system reliability.
- Automate repetitive tasks and deployments to improve efficiency and reduce human error.
Must-Have Skills:
- Hands-on experience with AWS services (EC2, S3, IAM, Lambda, RDS, VPC, CloudWatch, etc.).
- Proficiency with CI/CD tools like Jenkins, AWS CodeBuild/CodeDeploy, GitLab CI/CD, or GitHub Actions.
- Strong scripting skills in Bash, Python, or shell.
- Experience with Docker and container orchestration using ECS or EKS.
- Working knowledge of Infrastructure as Code with Terraform or CloudFormation.
- Git and version control best practices.
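The containerization and CI/CD duties above typically reduce to an ECR push step. Purely as an illustrative sketch, not part of the posting, the snippet below uses the standard ECR authentication flow; the account ID, region, repository name, and the `GIT_COMMIT` variable (commonly set by CI servers) are placeholders.

```bash
#!/usr/bin/env bash
# Illustrative CI snippet: authenticate to ECR, build, and push an image.
# Account ID, region, and repository name are placeholders.
set -euo pipefail

ACCOUNT_ID="123456789012"
REGION="ap-south-1"
REPO="payments-api"
TAG="${GIT_COMMIT:-latest}"
REGISTRY="$ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com"

# Standard ECR authentication flow.
aws ecr get-login-password --region "$REGION" \
  | docker login --username AWS --password-stdin "$REGISTRY"

docker build -t "$REGISTRY/$REPO:$TAG" .
docker push "$REGISTRY/$REPO:$TAG"
```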

Posted 3 weeks ago

Apply

1.0 - 6.0 years

4 - 6 Lacs

Hyderabad

Work from Office


SUMMARY Mason Job Description We are seeking a skilled Mason to join our team in constructing, repairing, and maintaining various structures using masonry materials. The ideal candidate should possess precision, strength, and a deep understanding of building techniques and materials. Key Responsibilities: Interpret blueprints, drawings, and specifications. Lay bricks, concrete blocks, and other building blocks in mortar. Shape bricks and stones to fit specific spaces. Mix mortar or grout and apply it to surfaces. Construct and repair walls, partitions, arches, fireplaces, chimneys, and other structures. Utilize hand and power tools to cut and shape materials. Ensure structures are level, plumb, and square. Clean surfaces and remove excess mortar. Collaborate with other construction professionals to complete projects. Adhere to safety standards and regulations at all times. Requirements Requirements: Education Background: Open to all categories Experience: Minimum 0.6 months experience as a Mason Age: No restrictions Visa Type: Work Visa Language: Basic English proficiency Benefits Relocation Support: Free Visa . Free furnished shared accommodation will be provided. Daily travel to work will be covered. International Work Experience: Boost your resume with Dubai industry expertise. Limited openings! Apply now to meet an employer for interview and migrating to Dubai!

Posted 3 weeks ago

Apply

4.0 - 8.0 years

9 - 11 Lacs

Hyderabad

Remote


Role: Data Engineer (ETL Processes, SSIS, AWS). Duration: Full-time. Location: Remote. Working hours: 4:30 AM to 10:30 AM IST shift. Note: We need an ETL engineer for MS SQL Server Integration Services (SSIS) working the 4:30 AM to 10:30 AM IST shift.
Roles & Responsibilities:
- Design, develop, and maintain ETL processes using SQL Server Integration Services (SSIS).
- Create and optimize complex SQL queries, stored procedures, and data transformation logic on Oracle and SQL Server databases.
- Build scalable and reliable data pipelines using AWS services (e.g., S3, Glue, Lambda, RDS, Redshift).
- Develop and maintain Linux shell scripts to automate data workflows and perform system-level tasks.
- Schedule, monitor, and troubleshoot batch jobs using tools like Control-M, AutoSys, or cron.
- Collaborate with stakeholders to understand data requirements and deliver high-quality integration solutions.
- Ensure data quality, consistency, and security across systems.
- Maintain detailed documentation of ETL processes, job flows, and technical specifications.
- Experience with job scheduling tools such as Control-M and/or AutoSys.
- Exposure to version control tools (e.g., Git) and CI/CD pipelines.
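Since this listing explicitly asks for Linux shell scripts that automate data workflows under cron or Control-M, here is a minimal, hedged wrapper pattern; the job name, paths, and the `run_extract.sh` step are placeholders, not the client's actual pipeline.

```bash
#!/usr/bin/env bash
# Hedged sketch: cron-driven data workflow wrapper with logging.
# A non-zero exit propagates to the scheduler (cron/Control-M) so it can alert.
# Paths and the pipeline command are placeholders.
set -euo pipefail

JOB="daily_orders_extract"
LOG="/var/log/etl/${JOB}_$(date +%F).log"

{
  echo "[$(date '+%F %T')] $JOB starting"
  # Hypothetical pipeline step; replace with the real extract/load command.
  /opt/etl/bin/run_extract.sh --job "$JOB"
  echo "[$(date '+%F %T')] $JOB finished"
} >>"$LOG" 2>&1
```

A matching crontab entry might look like `30 1 * * * /opt/etl/bin/daily_orders_wrapper.sh`, with the wrapper name again being a placeholder.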

Posted 3 weeks ago

Apply

12.0 - 20.0 years

30 - 35 Lacs

Navi Mumbai

Work from Office


Job Title: Big Data Developer and Project Support & Mentorship. Location: Mumbai. Employment Type: Full-Time/Contract. Department: Engineering & Delivery.
Position Overview: We are seeking a skilled Big Data Developer to join our growing delivery team, with a dual focus on hands-on project support and mentoring junior engineers. This role is ideal for a developer who not only thrives in a technical, fast-paced environment but is also passionate about coaching and developing the next generation of talent. You will work on live client projects, provide technical support, contribute to solution delivery, and serve as a go-to technical mentor for less experienced team members.
Key Responsibilities:
- Perform hands-on Big Data development work, including coding, testing, troubleshooting, and deploying solutions.
- Support ongoing client projects, addressing technical challenges and ensuring smooth delivery.
- Collaborate with junior engineers to guide them on coding standards, best practices, debugging, and project execution.
- Review code and provide feedback to junior engineers to maintain high-quality and scalable solutions.
- Assist in designing and implementing solutions using Hadoop, Spark, Hive, HDFS, and Kafka.
- Lead by example in object-oriented development, particularly using Scala and Java.
- Translate complex requirements into clear, actionable technical tasks for the team.
- Contribute to the development of ETL processes for integrating data from various sources.
- Document technical approaches, best practices, and workflows for knowledge sharing within the team.
Required Skills and Qualifications:
- 8+ years of professional experience in Big Data development and engineering.
- Strong hands-on expertise with Hadoop, Hive, HDFS, Apache Spark, and Kafka.
- Solid object-oriented development experience with Scala and Java.
- Strong SQL skills with experience working with large data sets.
- Practical experience designing, installing, configuring, and supporting Big Data clusters.
- Deep understanding of ETL processes and data integration strategies.
- Proven experience mentoring or supporting junior engineers in a team setting.
- Strong problem-solving, troubleshooting, and analytical skills.
- Excellent communication and interpersonal skills.
Preferred Qualifications:
- Professional certifications in Big Data technologies (Cloudera, Databricks, AWS Big Data Specialty, etc.).
- Experience with cloud Big Data platforms (AWS EMR, Azure HDInsight, or GCP Dataproc).
- Exposure to Agile or DevOps practices in Big Data project environments.
What We Offer:
- Opportunity to work on challenging, high-impact Big Data projects.
- A leadership role in shaping and mentoring the next generation of engineers.
- Supportive and collaborative team culture.
- Flexible working environment.
- Competitive compensation and professional growth opportunities.
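Deploying Spark solutions on a Hadoop cluster, as described above, is usually wrapped in a submit script. The following is an illustrative sketch only: the jar, main class, HDFS path, and resource sizes are invented placeholders, not details from the posting.

```bash
#!/usr/bin/env bash
# Illustrative spark-submit wrapper for a daily batch job on YARN.
# The jar, main class, and input path are placeholders.
set -euo pipefail

RUN_DATE="${1:-$(date +%F)}"
INPUT="hdfs:///data/raw/events/dt=$RUN_DATE"

# Fail fast if the day's partition has not landed yet.
hdfs dfs -test -d "$INPUT" || { echo "Missing input: $INPUT"; exit 1; }

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 8 \
  --executor-memory 4g \
  --class com.example.events.DailyAggregation \
  /opt/jobs/events-aggregation.jar "$INPUT"
```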

Posted 3 weeks ago

Apply

5.0 - 7.0 years

10 - 20 Lacs

Bengaluru

Hybrid


Greetings from BCT Consulting! We are hiring a Linux Admin. JD: Shell Scripting, SQL, Ansible, Jenkins. Location: Bangalore. Experience: 5+ years. CTC: Best in the industry. Notice period: Less than 30 days. Interested candidates can reach out to syed.s@bct-consulting.com.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

12 - 17 Lacs

Chennai, Guindy

Work from Office


Test Automation Architect
Chennai - Guindy, India | Information Technology | 16551
Overview: We are seeking a highly skilled Automation Architect with 5 to 7 years of experience to join our dynamic team. The ideal candidate will have a strong background in designing and implementing robust automation frameworks in Katalon, possess exceptional problem-solving skills, and be proficient in various programming languages and automation tools. The Automation Architect will play a critical role in optimizing business processes through automation and leading automation projects to successful completion.
Responsibilities:
- Automation Framework Design: Design and implement scalable and maintainable automation frameworks tailored to our business needs.
- System Integration: Integrate various systems and automate end-to-end processes to ensure seamless operations.
- Process Optimization: Analyze and optimize business processes for automation, enhancing efficiency and productivity.
- Project Management: Manage automation projects, including planning, execution, monitoring, and delivering results on time and within budget.
- Technical Leadership: Provide technical guidance and mentorship to junior team members and collaborate with cross-functional teams.
- Documentation: Create and maintain comprehensive documentation for automation frameworks, processes, and best practices.
Requirements:
- Katalon: Proficiency in using Katalon Studio for test automation.
- Selenium: Expertise in Selenium WebDriver for web application testing.
- Jira Management: Experience in using Jira for test management, including creating, tracking, and managing test cases and defect reports.
- GitHub and Bitbucket: Familiarity with version control systems like GitHub and Bitbucket for code repository management.
- Scripting: Proficiency in scripting languages such as Shell, PowerShell, or Perl.
- Data Analysis: Ability to analyze data and generate insights to improve automation processes.
- API Integration: Experience in integrating APIs for seamless data exchange between systems.
- Version Control: Familiarity with version control systems like Git.
- Networking: Understanding of networking concepts to ensure seamless communication between automated systems.
- Debugging: Strong debugging skills to identify and resolve issues in automation scripts.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Senior Java Developer
Bangalore, India | Human Resources | 17136
Overview: We are seeking a highly skilled and experienced Senior Java Developer to join our dynamic team. The ideal candidate will have a strong background in Java development (Java 11/17/21), Spring Core, and various Java build tools. You will be responsible for designing, developing, and maintaining high-quality software solutions that meet our clients' needs.
Key Responsibilities:
- Develop and maintain Java-based applications using Spring Core.
- Utilize Java build tools such as Maven or Gradle for efficient project management.
- Implement unit testing and mocking frameworks to ensure code quality and reliability.
- Design and manage SQL databases, ensuring optimal performance and security.
- Collaborate with cross-functional teams to integrate REST APIs.
- Use Git for version control and collaborate on code repositories.
- Participate in code reviews and provide constructive feedback to team members.
- Troubleshoot and resolve software defects and issues.
Secondary Responsibilities:
- Implement CI/CD pipelines to automate deployment processes.
- Manage Bitbucket repositories and work with shell scripting.
- Develop and maintain frontend applications using HTML, CSS, JavaScript, and React.js.
- Collaborate with frontend developers to ensure seamless integration of backend services.
Requirements:
- Proficiency in Spring Core, Maven/Gradle, unit testing, SQL, Git, and REST APIs.
- Experience with CI/CD, Bitbucket, shell scripting, HTML/CSS/JS, and React.js.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.

Posted 3 weeks ago

Apply

7.0 - 11.0 years

11 - 15 Lacs

Chennai, Guindy

Work from Office


Senior Technical Lead
Chennai - Guindy, India | Information Technology | 17096
Overview: We are seeking an experienced Senior Technical Lead to join our team for a critical migration project. This role will focus on migrating data and services from on-premise or legacy systems to cloud platforms (preferably AWS). The ideal candidate will have a solid background in software engineering and cloud technologies, especially AWS, a solid understanding of database technologies and data migration processes, and hands-on experience with data and application migration projects.
Key Responsibilities:
- Lead data migration efforts from legacy systems (e.g., on-premises databases) to cloud-based platforms on AWS.
- Collaborate with cross-functional teams to gather requirements and define migration strategies.
- Develop and implement migration processes to move legacy applications and data to cloud platforms like AWS.
- Write scripts and automation to support data migration, system configuration, and cloud infrastructure provisioning.
- Optimize existing data structures and processes for performance and scalability in the new environment.
- Ensure the migration adheres to performance, security, and compliance standards.
- Identify potential issues, troubleshoot, and implement fixes during the migration process.
- Maintain documentation of migration processes and post-migration maintenance plans.
- Provide technical support post-migration to ensure smooth operation of the migrated systems.
Primary Skills (Required):
- Proven experience in leading data migration projects and migrating applications, services, or data to cloud platforms (preferably AWS).
- Knowledge of migration tools such as AWS Database Migration Service (DMS), AWS Server Migration Service (SMS), and AWS Migration Hub.
- Expertise in data mapping, validation, transformation, and ETL processes.
- Proficiency in Python, Java, or similar programming languages.
- Experience with scripting languages such as Shell, PowerShell, or Bash.
- Cloud Technologies (AWS focus): strong knowledge of AWS services relevant to data migration (e.g., S3, Redshift, Lambda, RDS, DMS, Glue).
- Experience working with CI/CD pipelines (Jenkins, GitLab CI/CD) and infrastructure as code (IaC) using Terraform or AWS CloudFormation.
- Experience in database management and migrating relational (e.g., MySQL, PostgreSQL, Oracle) and non-relational (e.g., MongoDB) databases.
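File-level staging into S3 is one of the simplest migration steps this kind of role scripts. The snippet below is a hedged illustration, not the project's procedure: the source path and bucket are placeholders, and running the dry run first is just a cautious default.

```bash
#!/usr/bin/env bash
# Hedged sketch: stage on-premises export files into S3 ahead of a cutover.
# Source path and bucket are placeholders; run the dry run first.
set -euo pipefail

SRC="/mnt/legacy/exports"
DEST="s3://example-migration-staging/exports"

# Preview what would be copied.
aws s3 sync "$SRC" "$DEST" --dryrun

# Real copy, excluding temp files; --size-only keeps re-runs incremental.
aws s3 sync "$SRC" "$DEST" --exclude "*.tmp" --size-only
```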

Posted 3 weeks ago

Apply

2.0 - 6.0 years

3 - 6 Lacs

Chennai, Guindy

Work from Office


Ericsson OCS Support Engineer
Chennai - Guindy, India | Operations | 16873

Overview
24/7 operation and maintenance support of the Ericsson IN/OCS platform, covering business configurations, L2 node support, CDR handling, and reporting (detailed responsibilities below).

Responsibilities
1. Provide 24/7 support for operation and maintenance of Ericsson IN/OCS and handle business configurations.
2. Provide L2 operations support for the Ericsson IN/OCS nodes OCC, CCN, SDP, ECMS, NgCRS, NgVS, and AIR.
3. Analyse and resolve prepaid call related issues.
4. Perform daily health checks of the IN/OCS system.
5. Provide application support during server upgrades.
6. Support the NOC team and the Ericsson team during outages and service-affecting incidents.
7. Analyse various IN/OCS reports and prepare revenue-related reports.
8. Perform TCP dump analysis using Wireshark.
9. Maintain CDR flow and CDR transfer between IN/OCS nodes, and keep track of CDR duplication or CDR loss (see the sketch after this posting).
10. Configure signalling links and GT routing.

Requirements
1. Flexible to work night shifts and handle on-call duties for 24/7 support.
2. Operations and support experience across all Ericsson OCS nodes (OCC, CCN, SDP, ECMS, NgCRS, NgVS, AIR).
3. Knowledge of business configuration in Ericsson IN/OCS.
4. Experience in debugging/troubleshooting traces using Wireshark.
5. Knowledge of prepaid call flows.
6. Knowledge of Diameter, SIP, TCP/IP, MAP, SMPP, SCCP, and MTP.
7. Experience in Linux and shell scripting.
8. Experience working with SLAs/KPIs.
9. Strong written and verbal communication skills.
10. Python is an added advantage.
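Item 9 above (tracking CDR duplication or loss between IN/OCS nodes) is the kind of task that is often scripted on the mediation side. Purely as an illustrative sketch, and assuming CDR file names carry a numeric sequence number (a common but not universal convention; the directory path and naming pattern here are hypothetical), the Python fragment below flags duplicate and missing sequence numbers in a transfer directory.

```python
"""Illustrative sketch: flag duplicate or missing CDR files by sequence number.

Assumes file names like CDR_000123.dat where the numeric part is a running
sequence number -- the actual naming convention differs per IN/OCS node.
"""
import re
from pathlib import Path

CDR_DIR = Path("/var/spool/cdr/outgoing")   # hypothetical transfer directory
SEQ_PATTERN = re.compile(r"CDR_(\d+)\.dat$")

seen = {}
for path in sorted(CDR_DIR.glob("CDR_*.dat")):
    match = SEQ_PATTERN.search(path.name)
    if not match:
        continue
    seq = int(match.group(1))
    seen.setdefault(seq, []).append(path.name)

if seen:
    duplicates = {s: names for s, names in seen.items() if len(names) > 1}
    missing = sorted(set(range(min(seen), max(seen) + 1)) - set(seen))
    print(f"Duplicate sequence numbers: {duplicates or 'none'}")
    print(f"Missing sequence numbers:   {missing or 'none'}")
else:
    print("No CDR files matched the expected naming pattern.")
```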

Posted 3 weeks ago

Apply

3.0 - 8.0 years

1 - 5 Lacs

Gurugram

Work from Office


Project Role: Application Tech Support Practitioner
Project Role Description: Act as the ongoing interface between the client and the system or application. Dedicated to quality, using exceptional communication skills to keep our world-class systems running. Can accurately define a client issue and can interpret and design a resolution based on deep product knowledge.
Must-Have Skills: DevOps
Good-to-Have Skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years full-time education

Summary:
As an Application Tech Support Practitioner, you will serve as a vital link between clients and the systems or applications they use. Your day will involve engaging with clients to understand their needs, addressing their concerns, and ensuring that our high-quality systems operate seamlessly. You will leverage your exceptional communication skills to provide clarity and support, while also using your in-depth product knowledge to diagnose issues and design effective resolutions. Your commitment to quality will be evident as you work diligently to maintain the integrity of our systems and enhance client satisfaction.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Facilitate training sessions for team members to enhance their understanding of system functionalities.
- Develop and maintain comprehensive documentation for troubleshooting processes and client interactions.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in DevOps.
- Strong understanding of continuous integration and continuous deployment practices.
- Experience with cloud platforms such as AWS or Azure.
- Familiarity with containerization technologies like Docker and Kubernetes.
- Knowledge of scripting languages such as Python or Bash (see the sketch after this posting).

Additional Information:
- The candidate should have a minimum of 3 years of experience in DevOps.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.
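The skills list above pairs DevOps application support with scripting in Python or Bash; a routine use is automating basic health checks during support shifts. The sketch below is a generic example using only the Python standard library; the endpoint URL, retry count, and delay are placeholders, not details from the posting.

```python
"""Generic health-check sketch for application support automation.

Polls an HTTP health endpoint a few times and exits non-zero if it never
returns 200, so it can be wired into a cron job or a CI/CD gate.
The URL and retry settings are placeholders.
"""
import sys
import time
import urllib.error
import urllib.request

HEALTH_URL = "https://example.internal/app/health"  # placeholder endpoint
RETRIES = 3
DELAY_SECONDS = 10

for attempt in range(1, RETRIES + 1):
    try:
        with urllib.request.urlopen(HEALTH_URL, timeout=5) as resp:
            if resp.status == 200:
                print(f"Healthy on attempt {attempt}")
                sys.exit(0)
            print(f"Attempt {attempt}: unexpected status {resp.status}")
    except urllib.error.URLError as exc:
        print(f"Attempt {attempt}: request failed ({exc.reason})")
    time.sleep(DELAY_SECONDS)

print("Health check failed after all retries")
sys.exit(1)
```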

Posted 3 weeks ago

Apply