5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Lead / Staff Software Engineer on the Black Duck SRE team, you will play a key role in transforming our R&D products through the adoption of advanced cloud, containerization, microservices, modern software delivery, and other cutting-edge technologies. You will be a key member of the team, working independently to develop tools and scripts for automated provisioning, deployment, and monitoring. The position is based in Bangalore (near Dairy Circle Flyover) with a hybrid work mode.

Key Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Minimum of 5-7 years of experience in Site Reliability Engineering / DevOps Engineering.
- Strong hands-on experience with containerization and orchestration using Docker, Kubernetes (K8s), and Helm - secure, optimize, and scale K8s.
- Deep understanding of cloud platforms and services on AWS / GCP / Azure (preferably GCP) - optimize cost, security, and performance.
- Solid experience with Infrastructure as Code (IaC) using Terraform / CloudFormation / Pulumi (preferably Terraform) - write modules, manage state.
- Proficient in scripting and automation using Bash, Python / Golang - automate tasks, handle errors (a minimal sketch follows this listing).
- Experienced in CI/CD pipelines and GitOps using Git / GitHub / GitLab / Bitbucket / ArgoCD, Harness.io - implement GitOps for deployments.
- Strong background in monitoring and observability using Prometheus / Grafana / ELK Stack / Datadog / New Relic - configure alerts, analyze trends.
- Good understanding of networking and security using firewalls, VPN, IAM, RBAC, TLS, SSO, Zero Trust - implement IAM, TLS, logging.
- Experience with backup and disaster recovery using Velero, snapshots, DR planning - implement backup solutions.
- Basic understanding of messaging concepts using RabbitMQ / Kafka / Pub/Sub / SQS.
- Familiarity with configuration management using Ansible / Chef / Puppet / SaltStack - run existing playbooks.

Key Responsibilities:
- Design and develop scalable, modular solutions that promote reuse and integrate easily into our diverse product suite.
- Collaborate with cross-functional teams to understand their needs and incorporate user feedback into development.
- Establish best practices for modern software architecture, including microservices, serverless computing, and API-first strategies.
- Drive the strategy for containerization and orchestration using Docker, Kubernetes, or equivalent technologies.
- Ensure the platform's infrastructure is robust, secure, and compliant with industry standards.

What We Offer:
- An opportunity to be part of a dynamic and innovative team committed to making a difference in the technology landscape.
- Competitive compensation package, including benefits and flexible work arrangements.
- A collaborative, inclusive, and diverse work environment where creativity and innovation are valued.
- Continuous learning and professional development opportunities to grow your expertise within the industry.
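The scripting-and-automation bullet above describes the kind of task a short Python probe can illustrate. The following is a minimal sketch only, assuming the official `kubernetes` Python client and a reachable cluster; the namespace, exit codes, and health criteria are illustrative assumptions, not part of the posting.

```python
# Minimal sketch: report pods that are not Running/Succeeded, the kind of
# health probe that could feed provisioning or alerting automation.
# Assumes the `kubernetes` client library (pip install kubernetes) and a
# local kubeconfig; the namespace filter is an illustrative choice.
import sys

from kubernetes import client, config


def unhealthy_pods(namespace: str = "default") -> list[str]:
    """Return names of pods in the namespace that are not Running or Succeeded."""
    config.load_kube_config()  # uses local kubeconfig credentials
    v1 = client.CoreV1Api()
    bad = []
    for pod in v1.list_namespaced_pod(namespace).items:
        if pod.status.phase not in ("Running", "Succeeded"):
            bad.append(f"{pod.metadata.name} ({pod.status.phase})")
    return bad


if __name__ == "__main__":
    try:
        problems = unhealthy_pods()
    except Exception as exc:  # surface API/auth failures instead of passing silently
        print(f"cluster check failed: {exc}", file=sys.stderr)
        sys.exit(2)
    if problems:
        print("unhealthy pods:", ", ".join(problems), file=sys.stderr)
        sys.exit(1)
    print("all pods healthy")
```

A script like this could run as a CronJob or CI step and page only on a non-zero exit, which is the "automate tasks, handle errors" pattern the posting points at.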
Posted 3 days ago
5.0 - 7.0 years
4 - 6 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Your role and responsibilities
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers.

Required education
Bachelor's Degree

Preferred education
Master's Degree

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform (a minimal sketch follows this listing).
- Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge by attending educational workshops and reviewing publications.
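To make the BigQuery-on-GCP requirement concrete, here is a minimal sketch of a parameterized query run from Python. It assumes the `google-cloud-bigquery` client library and application-default credentials; the project, dataset, table, and column names are placeholders, not anything specified in the posting.

```python
# Minimal sketch of day-to-day BigQuery analysis work: run a parameterized
# query and read the results. Project/dataset/table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project ID

sql = """
    SELECT status, COUNT(*) AS orders
    FROM `example-project.sales.orders`
    WHERE order_date >= @since
    GROUP BY status
    ORDER BY orders DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("since", "DATE", "2024-01-01")]
)

rows = client.query(sql, job_config=job_config).result()  # blocks until the job finishes
for row in rows:
    print(row.status, row.orders)
```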
Posted 1 month ago
8 - 13 years
10 - 15 Lacs
Jaipur, Rajasthan
Work from Office
Job Summary
Auriga is looking for a Data Engineer to design and maintain cloud-native data pipelines supporting real-time analytics and machine learning. You'll work with cross-functional teams to build scalable, secure data solutions using GCP (BigQuery, Looker), SQL, Python, and orchestration tools like Dagster and DBT. Mentoring junior engineers and ensuring data best practices will also be part of your role.

WHAT YOU'LL DO:
- Design, build, and maintain scalable data pipelines and architectures to support analytical and operational workloads.
- Develop and optimize ETL/ELT pipelines, ensuring efficient data extraction, transformation, and loading from various sources (a minimal orchestration sketch follows this listing).
- Work closely with backend and platform engineers to integrate data pipelines into cloud-native applications.
- Manage and optimize cloud data warehouses, primarily BigQuery, ensuring performance, scalability, and cost efficiency.
- Implement data governance, security, and privacy best practices, ensuring compliance with company policies and regulations.
- Collaborate with analytics teams to define data models and enable self-service reporting and BI capabilities.
- Develop and maintain data documentation, including data dictionaries, lineage tracking, and metadata management.
- Monitor, troubleshoot, and optimize data pipelines, ensuring high availability and reliability.
- Stay up to date with emerging data engineering technologies and best practices, continuously improving our data infrastructure.

WHAT WE'RE LOOKING FOR:
- Strong proficiency in English (written and verbal communication) is required.
- Experience working with remote teams across North America and Latin America, ensuring smooth collaboration across time zones.
- 5+ years of experience in data engineering, with expertise in building scalable data pipelines and cloud-native data architectures.
- Strong proficiency in SQL for data modeling, transformation, and performance optimization.
- Experience with BI and data visualization tools (e.g., Looker, Tableau, or Google Data Studio).
- Expertise in Python for data processing, automation, and pipeline development.
- Experience with cloud data platforms, particularly Google Cloud Platform (GCP).
- Hands-on experience with Google BigQuery, Cloud Storage, and Pub/Sub.
- Strong knowledge of ETL/ELT frameworks such as DBT, Dataflow, or Apache Beam.
- Familiarity with workflow orchestration tools like Dagster, Apache Airflow, or Google Cloud Workflows.
- Understanding of data privacy, security, and compliance best practices.
- Strong problem-solving skills, with the ability to debug and optimize complex data workflows.
- Excellent communication and collaboration skills.

NICE TO HAVES:
- Experience with real-time data streaming solutions (e.g., Kafka, Pub/Sub, or Kinesis).
- Familiarity with machine learning workflows and MLOps best practices.
- Knowledge of Terraform for Infrastructure as Code (IaC) in data environments.
- Familiarity with data integrations involving Contentful, Algolia, Segment, and .
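As an illustration of the ETL/ELT and orchestration work described above, here is a minimal sketch of two pipeline steps expressed as Dagster assets, one of the tools the posting names. The source data, asset names, and transformation are placeholders; a real pipeline would read from GCS or Pub/Sub and write to BigQuery through an I/O manager.

```python
# Minimal sketch of pipeline orchestration with Dagster assets.
# The extract and transform logic are placeholders for illustration only.
from dagster import asset, materialize


@asset
def raw_orders() -> list[dict]:
    """Extract step: stand-in for pulling raw records from a source system."""
    return [
        {"order_id": 1, "amount": "120.50", "status": "complete"},
        {"order_id": 2, "amount": "80.00", "status": "cancelled"},
    ]


@asset
def completed_order_totals(raw_orders: list[dict]) -> float:
    """Transform step: keep completed orders and sum their amounts."""
    return sum(float(o["amount"]) for o in raw_orders if o["status"] == "complete")


if __name__ == "__main__":
    # Materialize both assets in-process; in production these would be
    # scheduled and their outputs persisted to a warehouse such as BigQuery.
    result = materialize([raw_orders, completed_order_totals])
    assert result.success
```

Declaring each step as an asset gives the lineage tracking and monitoring hooks the responsibilities list calls out, since the orchestrator records every materialization.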
Posted 2 months ago