
7 Cloud Logging Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

Salary not disclosed

Karnataka

On-site

As a Support Engineer with experience maintaining and supporting solutions in a cloud-based environment (GCP or AWS), you will be responsible for ensuring the smooth operation of monitoring tools such as ELK, Dynatrace, CloudWatch, Cloud Logging, Cloud Monitoring, and New Relic. Your primary focus will be implementing and maintaining monitoring and self-healing strategies to proactively prevent production incidents. You will also conduct root cause analysis of production issues and design on-call and escalation processes.

In addition, you will participate in the design and implementation of serviceability solutions for monitoring and alerting, and debug production issues across services and levels of the stack. Collaborating closely with the platform engineering team, you will help establish and improve production support approaches and participate in defining SLIs and SLOs to demonstrate efficiency and value to business partners. Your responsibilities will also include interacting with and testing APIs, participating in out-of-business-hours deployments and support on rotation with team members, and working comfortably with agile development techniques. L3 support experience is considered an asset for this role.

In return, we offer competitive salaries, comprehensive health benefits, flexible work hours, remote work options, professional development and training opportunities, and a supportive and inclusive work environment.
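As a hedged illustration of the root-cause-analysis work this listing describes, the sketch below pulls recent ERROR-severity entries from Cloud Logging using the google-cloud-logging Python client. The project ID and lookback window are placeholders, not details from the listing.

```python
from datetime import datetime, timedelta, timezone

from google.cloud import logging  # pip install google-cloud-logging

def recent_errors(project_id: str, lookback_minutes: int = 30) -> None:
    """List recent ERROR-level log entries, a typical first step in an RCA."""
    client = logging.Client(project=project_id)
    cutoff = datetime.now(timezone.utc) - timedelta(minutes=lookback_minutes)
    # Cloud Logging filter syntax: severity plus a timestamp lower bound.
    log_filter = f'severity>=ERROR AND timestamp>="{cutoff.isoformat()}"'
    for entry in client.list_entries(filter_=log_filter, order_by=logging.DESCENDING):
        print(entry.timestamp, entry.severity, entry.payload)

if __name__ == "__main__":
    recent_errors("my-sample-project")  # hypothetical project ID
```

A self-healing extension might feed such a query into an alerting or remediation pipeline rather than printing to stdout.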

Posted 1 week ago

Apply

12.0 - 16.0 years

Salary not disclosed

Noida, Uttar Pradesh

On-site

As a Principal Site Reliability Engineer, you will lead all infrastructure aspects of a new cloud-native, microservice-based security platform. The platform is fully multi-tenant, runs on Kubernetes, and uses the latest cloud-native CNCF technologies such as Istio, Envoy, NATS, Fluentd, Jaeger, and Prometheus. You will technically lead an SRE team to ensure a high-quality SLA for a global solution running in multiple regions.

Your responsibilities will include building tools and frameworks that enhance developer efficiency on the platform and abstract away infrastructure complexities, and developing automation and utilities to streamline service operation and monitoring. The platform ingests large amounts of machine-generated data daily and is designed to manage terabytes of data from numerous customers. You will actively participate in platform design discussions with development teams, providing infrastructure insights and managing technology and business tradeoffs. Collaboration with global engineering teams will be crucial as you contribute to shaping the future of cybersecurity.

At GlobalLogic, we prioritize a culture of caring, where people come first. You will experience an inclusive environment that promotes acceptance, belonging, and meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Continuous learning and development are essential at GlobalLogic: you will have access to numerous opportunities to expand your skills, advance your career, and grow personally and professionally through programs, training curricula, and hands-on experience.

GlobalLogic is recognized for engineering impactful solutions worldwide. Joining our team means working on projects that make a difference and stimulate your curiosity and problem-solving skills. We value balance and flexibility, offering various career paths, roles, and work arrangements to help you achieve a harmonious work-life balance. Integrity is key: we uphold a high-trust environment focused on ethics and reliability, and you can trust us to provide a safe, honest, and ethical workplace dedicated to both employees and clients. GlobalLogic, a Hitachi Group Company, is a leading digital engineering partner to top global companies. With a history of digital innovation since 2000, we collaborate with clients to create innovative digital products and experiences, driving business transformation and industry redefinition through intelligent solutions.
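To ground the SLA work this role mentions, here is a minimal sketch of an error-budget and burn calculation in plain Python. The 99.9% target and request counts are invented for illustration; in practice the SLI would come from Prometheus or similar telemetry on the platform.

```python
def error_budget_report(total_requests: int, failed_requests: int,
                        slo_target: float = 0.999) -> None:
    """Compare an observed availability SLI against an SLO and its error budget."""
    sli = 1 - failed_requests / total_requests               # observed success ratio
    budget = 1 - slo_target                                  # allowed failure ratio
    consumed = (failed_requests / total_requests) / budget   # fraction of budget burned
    print(f"SLI:           {sli:.5f}")
    print(f"SLO target:    {slo_target:.5f}")
    print(f"Budget burned: {consumed:.1%}")
    if consumed > 1.0:
        print("Error budget exhausted: freeze risky deploys, focus on reliability.")

# Hypothetical numbers: 2M requests in the window, 2,500 failures.
error_budget_report(total_requests=2_000_000, failed_requests=2_500)
```

With these sample numbers the failure ratio (0.125%) exceeds the 0.1% budget, so the budget is 125% consumed and the report flags it.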

Posted 2 weeks ago

Apply

Experience not specified

Salary not disclosed

Bengaluru, Karnataka, India

Remote

Job Description

- Experience maintaining and supporting solutions in a cloud-based environment (GCP or AWS)
- Experience working with various monitoring tools (ELK, Dynatrace, CloudWatch, Cloud Logging, Cloud Monitoring, New Relic)
- Ensure monitoring and self-healing strategies are implemented and maintained to proactively prevent production incidents
- Perform root cause analysis of production issues
- Design and manage on-call and escalation processes
- Participate in the design and implementation of serviceability and observability solutions for monitoring and alerting
- Debug production issues across services and levels of the stack
- Participate in the definition of SLIs and SLOs to demonstrate maturity, efficiency, and value to our business partners
- Collaborate closely with the platform engineering team to establish and improve production support approaches
- Participate in out-of-business-hours deployments and support (on rotation with team members)
- Familiarity and comfort with agile development techniques
- Experience interacting with and testing APIs
- L3 support experience is an asset

What We Offer

- Competitive salaries and comprehensive health benefits
- Flexible work hours and remote work options
- Professional development and training opportunities
- A supportive and inclusive work environment
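Since this role calls for interacting with and testing APIs, below is a minimal sketch of an HTTP smoke test using Python's requests library. The endpoint URL and the /healthz path are placeholders, not part of this listing.

```python
import requests  # pip install requests

def smoke_test(base_url: str, timeout: float = 5.0) -> bool:
    """Hit a health endpoint and verify the status code, reporting latency."""
    resp = requests.get(f"{base_url}/healthz", timeout=timeout)
    print(f"GET /healthz -> {resp.status_code} in {resp.elapsed.total_seconds():.3f}s")
    return resp.status_code == 200

if __name__ == "__main__":
    # Hypothetical service URL; a real check would run on a schedule and alert on failure.
    assert smoke_test("https://api.example.com"), "health check failed"
```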

Posted 2 weeks ago

Apply

3.0 - 7.0 years

Salary not disclosed

Karnataka

On-site

As a Data Specialist, you will be responsible for applying your expertise in ETL fundamentals, SQL, BigQuery, Dataproc, Python, Data Catalog, data warehousing, and various other tools to the successful implementation of data projects. Your role will involve working with technologies such as Cloud Trace, Cloud Logging, Cloud Storage, and Data Fusion to build and maintain a modern data platform.

To excel in this position, you should possess a minimum of 5 years of experience in data engineering, with a focus on the GCP cloud data implementation suite, including BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, and Cloud Storage. Your strong understanding of very large-scale data architecture and hands-on experience with data warehouses, data lakes, and analytics platforms will be crucial to the success of our projects.

Key Requirements:
- Minimum 5 years of experience in data engineering
- Hands-on experience with the GCP cloud data implementation suite
- Strong expertise in BigQuery, Python, Apache Airflow, and SQL (BigQuery dialect preferred)
- Extensive hands-on experience with SQL and Python for working with data

If you are passionate about data and have a proven track record of delivering results in a fast-paced environment, we invite you to apply for this exciting opportunity to be part of our dynamic team.
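As a hedged sketch of the BigQuery work this role describes, the snippet below runs a parameterized query with the google-cloud-bigquery Python client. The project, dataset, and table names are invented for illustration.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

def daily_event_counts(project_id: str, event_date: str) -> None:
    """Run a parameterized BigQuery query and iterate over the result rows."""
    client = bigquery.Client(project=project_id)
    sql = """
        SELECT event_name, COUNT(*) AS n
        FROM `my_dataset.events`          -- hypothetical table
        WHERE DATE(event_ts) = @event_date
        GROUP BY event_name
        ORDER BY n DESC
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("event_date", "DATE", event_date),
        ]
    )
    for row in client.query(sql, job_config=job_config).result():
        print(row.event_name, row.n)

daily_event_counts("my-sample-project", "2024-01-01")
```

Query parameters are used instead of string interpolation, which is the idiomatic way to keep such queries safe and cacheable.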

Posted 3 weeks ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Bengaluru

Work from Office

- Proficiency in Google Cloud Platform (GCP) services, including Dataflow, Datastream, Dataproc, BigQuery, and Cloud Storage
- Strong experience with Apache Spark and Apache Flink for distributed data processing
- Knowledge of real-time data streaming technologies (e.g., Apache Kafka, Pub/Sub)
- Familiarity with data orchestration tools like Apache Airflow or Cloud Composer
- Expertise in Infrastructure as Code (IaC) tools like Terraform or Cloud Deployment Manager
- Experience with CI/CD tools like Jenkins, GitLab CI/CD, or Cloud Build
- Knowledge of containerization and orchestration tools like Docker and Kubernetes
- Strong scripting skills for automation (e.g., Bash, Python)
- Experience with monitoring tools like Cloud Monitoring, Prometheus, and Grafana
- Familiarity with logging tools like Cloud Logging or the ELK Stack
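To illustrate the Dataflow-style processing this listing asks for, here is a minimal Apache Beam pipeline in Python that counts words from an in-memory source on the default DirectRunner. Real sources (Pub/Sub, GCS) and Dataflow runner options are omitted; this is a sketch, not a production pipeline.

```python
import apache_beam as beam  # pip install apache-beam

def run() -> None:
    """Tiny batch pipeline: tokenize lines, count words, print the results."""
    with beam.Pipeline() as p:  # DirectRunner by default; Dataflow via pipeline options
        (
            p
            | "Create" >> beam.Create(["cloud logging jobs", "cloud monitoring jobs"])
            | "Split" >> beam.FlatMap(str.split)
            | "PairWithOne" >> beam.Map(lambda w: (w, 1))
            | "Count" >> beam.CombinePerKey(sum)
            | "Print" >> beam.Map(print)
        )

if __name__ == "__main__":
    run()
```

The same pipeline shape scales from this toy input to a streaming Pub/Sub source by swapping the Create transform and runner configuration.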

Posted 1 month ago

Apply

7.0 - 12.0 years

7 - 17 Lacs

Pune, Chennai, Bengaluru

Work from Office

Role & Responsibilities: As an API Management (Apigee) developer, you should be able to work with REST APIs, Node.js and its test frameworks, Istio, Apigee, Dynatrace, Cloud Logging, Jenkins (Groovy), Spinnaker, Harness, Jest/Mocha, Azure, and GCP.

Experience:
- Proficiency in programming languages: Java, Node.js
- Expertise in REST APIs, Node.js test frameworks, and Istio
- Active experience with Apigee, Dynatrace, and Cloud Logging
- Tools such as Jenkins and Groovy
- Experience with Spinnaker, Harness, and Jest/Mocha
- Knowledge of DevOps tools like Jenkins, GitHub, and Terraform is desirable
- Cloud capabilities in Azure and GCP
- Familiarity with cloud storage solutions
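As a hedged illustration of working against an Apigee-managed API, the sketch below calls a proxied endpoint with an API key and backs off when a quota policy returns HTTP 429. The host, path, and key header name are placeholders; the actual names depend on how a given proxy is configured.

```python
import time

import requests  # pip install requests

def call_with_quota_retry(url: str, api_key: str, retries: int = 3) -> requests.Response:
    """GET an Apigee-proxied endpoint, retrying with backoff on quota (429) responses."""
    for attempt in range(retries):
        # Apigee proxies commonly expect the key in a header or query param;
        # "x-apikey" here is a placeholder for this sketch.
        resp = requests.get(url, headers={"x-apikey": api_key}, timeout=5)
        if resp.status_code != 429:
            return resp
        wait = 2 ** attempt  # simple exponential backoff: 1s, 2s, 4s
        print(f"Quota exceeded, retrying in {wait}s...")
        time.sleep(wait)
    raise RuntimeError("quota still exceeded after retries")

# Hypothetical proxy URL and key:
# resp = call_with_quota_retry("https://org-env.apigee.net/v1/orders", "MY_KEY")
```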

Posted 1 month ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Noida

Hybrid

Data Engineer (SaaS-Based) | 5-7 years | Noida | 3 PM-12 AM IST shift

Location: Noida (in-office/hybrid; client site if required)
Experience: 5-7 years
Type: Full-time | Immediate joiners preferred
Shift: 3 PM to 12 AM IST
Client: Leading Canadian-based tech company
Good to have: GCP Certified Data Engineer

Overview of the role: As a GCP Data Engineer, you'll focus on solving problems and creating value for the business by building solutions that are reliable and scalable enough for the size and scope of the company. You will create custom-built pipelines and migrate on-prem data pipelines to the GCP stack, as part of a team tackling intricate problems by designing and deploying reliable, scalable solutions tailored to the company's data landscape.

Required Skills:
- 5+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience manipulating, processing, and extracting value from datasets
- Extensive experience in requirement discovery, analysis, and data pipeline solution design
- Design, build, and deploy internal applications to support the technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among others
- Build modular code for reusable pipelines and ingestion frameworks that simplify loading data into a data lake or data warehouse from multiple sources (see the sketch after this listing)
- Work closely with analysts and business process owners to translate business requirements into technical solutions
- Coding experience in scripting and languages (Python, SQL, PySpark)
- Expertise in Google Cloud Platform (GCP) data warehousing technologies (BigQuery, GCP Workflows, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM)
- Exposure to Google Dataproc and Dataflow
- Maintain the highest levels of development practice, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code with repeatable quality and predictability
- Understanding of CI/CD processes using Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker
- Experience with SAS/SQL Server/SSIS is an added advantage

Qualifications:
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience
- GCP Certified Data Engineer (preferred)
- Excellent verbal and written communication skills, with the ability to effectively advocate technical solutions to other engineering teams and business audiences

Job Type: Full-time
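As a minimal sketch of the modular ingestion idea mentioned above, the function below loads newline-delimited JSON files from Cloud Storage into a BigQuery table with the google-cloud-bigquery client. The bucket, table, and schema-autodetect settings are illustrative assumptions, not details from the listing.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

def ingest_gcs_to_bq(gcs_uri: str, table_id: str) -> int:
    """Load newline-delimited JSON from GCS into BigQuery; return total rows in the table.

    gcs_uri:  e.g. "gs://my-bucket/events/*.json"   (hypothetical bucket)
    table_id: e.g. "my-project.my_dataset.events"   (hypothetical table)
    """
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,                      # infer schema; real pipelines pin schemas
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()  # block until the load job finishes
    return client.get_table(table_id).num_rows
```

A reusable framework would wrap such a function with per-source configuration, validation, and logging rather than hard-coding any one dataset.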

Posted 2 months ago

Apply