5.0 - 7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain, and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high-priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration, and continuous integration). You will conduct quality control tests to ensure full compliance with specified standards and end-user requirements. You will execute tests using established plans and scripts, document problems in an issues log, and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend, and implement changes to enhance the effectiveness of QA strategies.

What you will do
- Independently develop scalable and reliable automated tests and frameworks for testing software solutions
- Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes, and environments
- Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model
- Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations

What Experience You Need
- Bachelor's degree in a STEM major or equivalent experience
- 5-7 years of software testing experience
- Able to create and review test automation according to specifications
- Ability to write, debug, and troubleshoot code in Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
- Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others with respect to software validation
- Created test strategies and plans
- Led complex testing efforts or projects
- Participated in Sprint Planning as the Test Lead
- Collaborated with Product Owners, SREs, and Technical Architects to define testing strategies and plans
- Design and development of microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
- Deploy and release software using Jenkins CI/CD pipelines; understand infrastructure-as-code concepts, Helm charts, and Terraform constructs
- Cloud certification strongly preferred

What could set you apart
An ability to demonstrate successful performance of our Success Profile skills, including:
- Attention to Detail - Define test case candidates for automation that are outside of product specifications (e.g., negative testing); create thorough and accurate documentation of all work, including status updates to summarize project highlights; validate that processes operate properly and conform to standards
- Automation - Automate defined test cases and test suites per project
- Collaboration - Collaborate with Product Owners and the development team to plan and assist with user acceptance testing; collaborate with product owners, development leads, and architects on functional and non-functional test strategies and plans
- Execution - Develop scalable and reliable automated tests; develop performance testing scripts to assure products adhere to the documented SLOs/SLIs/SLAs; specify the need for test data types for automated testing; create automated tests and test data for projects; develop automated regression suites; integrate automated regression tests into the CI/CD pipeline; work with teams on E2E testing strategies and plans against multiple product integration points
- Quality Control - Perform defect analysis and in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and improve processes; analyze results of functional and non-functional tests and make recommendations for improvements
- Performance / Resilience - Understand application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform; conduct performance and resilience testing to ensure the products meet SLAs/SLOs
- Quality Focus - Review test cases for complete functional coverage; review the quality section of the Production Readiness Review for completeness; recommend changes to existing testing methodologies to improve the effectiveness and efficiency of product validation; ensure communications are thorough and accurate for all work documentation, including status and project updates
- Risk Mitigation - Work with Product Owners, QE, and development team leads to track and determine prioritization of defect fixes
Posted 2 weeks ago
5.0 years
0 Lacs
India
On-site
About TransFi
TransFi powers the world's payments, helping businesses and individuals access better ways to move money. Combining industry-leading coverage of currencies and payment methods, we deliver compliant payouts, collections, and ramp services across Asia, Europe, LatAm, Africa, and North America. In our pursuit to advance economic prosperity through borderless finance, TransFi's enterprise-grade, developer-friendly platform is quietly transforming global payments, making international money movement faster, simpler, less expensive, and more reliable than ever. We believe the best talent comes from across geographies. We are building for the long term and look for strong owners, builders, and big thinkers. Join us if you want to build the future of Web3.

About the Role:
We are seeking a talented and experienced Data Engineer Lead with deep expertise in fintech and strong data architecture skills. This role involves designing and leading the development of scalable, robust data pipelines and platforms, supporting live reporting systems, real-time dashboards, and advanced analytics/ML initiatives. You will also build and mentor a high-performing team of data engineers.

Responsibilities:
- Design and lead a modern, layered architecture across batch and streaming data
- Build scalable pipelines (Dataflow, Kafka, Pub/Sub, Spark)
- Implement CDC, metadata governance, and orchestration (Airflow, Prefect)
- Support real-time dashboards and advanced analytics/ML use cases
- Build a growing team and mentor junior engineers

Requirements:
- 5+ years of data engineering experience
- Strong hands-on skills with AWS, Python, and streaming frameworks
- Experience building end-to-end data platforms
- Strong communicator and problem-solver with a product mindset
- Experience with LLMs and ML pipelines would be an added advantage

What We Offer:
- Competitive salary package
- Opportunities for growth
- A truly agile, fast-paced environment without complex decision-making
- Be a part of a truly global and winning team
Posted 2 weeks ago
0.0 - 3.0 years
10 - 15 Lacs
Hyderabad, Telangana
On-site
Job Title: GCP Data Engineer
Experience: 4+ years
Location: Hyderabad / Pune
Open Positions: 5

Job Description:
We are looking for a skilled GCP Data Engineer with strong experience in cloud-based data engineering to join our dynamic team.

Key Responsibilities:
- Collaborate with technical stakeholders to deliver solutions with business impact
- Work within an agile, multidisciplinary DevOps team
- Migrate and re-engineer existing services from on-premises data centers to the cloud (GCP/AWS)
- Understand business requirements and deliver real-time, scalable solutions
- Utilize project development tools such as JIRA, Confluence, and Git
- Write Python/Shell scripts to automate operations and server management tasks
- Build and maintain tools for monitoring, alerting, trending, and analysis
- Define, create, test, and execute operational procedures
- Document current and future configuration processes and policies

Required Qualifications:
- 2 to 6 years of experience as a Data Engineer/Developer working with cloud platforms (preferably GCP)
- Hands-on experience in an agile DevOps environment
- Good understanding of Hadoop and data ingestion tools like NiFi or Kafka
- Proficiency in Python programming, Dataflow, Pub/Sub, and BigQuery
- Strong experience with key GCP components: GCS, BigQuery, Airflow, Cloud SQL, Pub/Sub/Kafka, Dataflow, Google Cloud SDK
- Familiarity with Terraform and Shell scripting
- Experience with relational databases (RDBMS)

Preferred Qualifications:
- GCP Data Engineer Certification is a plus

Job Type: Full-time
Pay: ₹1,000,000.00 - ₹1,500,000.00 per year
Benefits: Provident Fund
Schedule: Day shift, Monday to Friday
Ability to commute/relocate: Hyderabad, Telangana: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): Are you an immediate joiner?
Experience: GCP Data: 4 years (Required); Python: 3 years (Required)
Work Location: In person
Posted 2 weeks ago
0 years
2 - 9 Lacs
Hyderābād
On-site
Job description
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will:
- Drive operational excellence across our businesses worldwide, achieved through the consolidation, simplification and continuous improvement of processes across the full range of HSBC operations
- Manage multiple GCP projects
- Collaborate with the application packaging team and assist in resolving any technical challenges that arise during the integration phase
- Communicate with the relevant project teams to resolve technical/application issues related to delivery of GCP solutions
- Work with Engineering and Operations to ensure our environment is monitored appropriately; ensure SLA commitments are met and escalate accordingly
- Perform deployment activities to create and maintain development, test, UAT and production environments in the project deployment phase
- Establish, document and implement best practices in the end-to-end application initiation and deployment processes
- Work towards continuous improvement to achieve customer satisfaction
- Remain flexible as per project needs

Requirements
To be successful in this role, you should meet the following requirements:
- 6+ years of experience is required
- Experience in driving GCP Data Analytics projects/ecosystems independently
- Experience in GCP IaaS such as GCE, GAE, GKE, VPC, DNS, Interconnect VPN, CDN, Cloud Storage, Filestore, Firebase, Deployment Manager, Stackdriver
- Experience in GCP services such as Cloud Endpoints, Dataflow, Dataproc, Datalab, Dataprep, Cloud Composer, Pub/Sub, Cloud Functions
- Experience with Terraform and DevOps (CI/CD pipelines)
- Experience in publishing GCP cost dashboards, alerting and monitoring
- Experience working in an agile and DevOps environment using team collaboration tools such as Confluence and JIRA
- Programming skills and hands-on experience in Python desirable
- Proficiency in working with cloud-based native data stores/databases
- Knowledge of design patterns for GCP third-party tools setup and native tools usage
- Experience and ability to manage a small team of tech specialists
- Excellent multitasking ability: must be able to track multiple issues, effectively manage time and competing priorities, and drive results through partner organizations
- Strong communication skills (verbal, written, and presentation of complex information and data)

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment.
Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSDI
Posted 2 weeks ago
0.0 - 3.0 years
6 - 18 Lacs
Chennai, Tamil Nadu
Remote
We are seeking a mid-level GCP Data Engineer with 4+ years of experience in ETL, Data Warehousing, and Data Engineering. The ideal candidate will have hands-on experience with GCP tools, solid data analysis skills, and a strong understanding of Data Warehousing principles.

Roles & Responsibilities:
- Analyze the different source systems, profile data, and understand, document & fix data quality issues
- Gather requirements and business process knowledge in order to transform the data in a way that is geared towards the needs of end users
- Write complex SQLs to extract & format source data for ETL/data pipelines
- Create design documents, source-to-target mapping documents, and any supporting documents needed for deployment/migration
- Design, develop, and test ETL/data pipelines
- Design & build metadata-based frameworks needed for data pipelines
- Write unit test cases, execute unit testing, and document unit test results; deploy ETL/data pipelines
- Use DevOps tools to version, push/pull code, and deploy across environments
- Support the team during troubleshooting & debugging of defects & bug fixes, business requests, environment migrations & other ad hoc requests
- Do production support, enhancements, and bug fixes
- Work with business and technology stakeholders to communicate EDW incidents/problems and manage their expectations
- Leverage ITIL concepts to circumvent incidents, manage problems, and document knowledge
- Perform data cleaning, transformation, and validation to ensure accuracy and consistency across various data sources
- Stay current on industry best practices and emerging technologies in data analysis and cloud computing, particularly within the GCP ecosystem

Qualifications:
- 4+ years of experience in ETL & Data Warehousing
- Should have excellent leadership & communication skills
- Should have experience in developing Data Engineering solutions using Airflow and GCP (BigQuery, Cloud Storage, Dataflow, Cloud Functions, Pub/Sub, Cloud Run, etc.)
- Should have built solution automations in any of the above ETL tools
- Should have executed at least 2 GCP Cloud Data Warehousing projects
- Should have worked on at least 2 projects using Agile/SAFe methodology
- Should have mid-level experience in PySpark and Teradata
- Should have working experience with DevOps tools like GitHub, Jenkins, Cloud Native, etc., with semi-structured data formats like JSON, Parquet and/or XML files, and have written complex SQL queries for data analysis and extraction
- Should have in-depth understanding of Data Warehousing, Data Analysis, Data Profiling, Data Quality & Data Mapping

Education: B.Tech./B.E. in Computer Science or related field.
Certifications: Google Cloud Professional Data Engineer Certification.

Job Types: Full-time, Permanent
Pay: ₹600,000.00 - ₹1,800,000.00 per year
Benefits: Flexible schedule, Work from home

Application Question(s):
- How many years of experience do you have as a GCP Data Engineer?
- Are you ready to relocate to Chennai?
- What is your current location?
- What is your current CTC?
- What is your expected CTC?
- Are you an immediate joiner?
- What is your notice period?
- How soon can you join? (Mention days here)

Experience: GCP Data Engineer: 3 years (Required)
Location: Chennai, Tamil Nadu (Preferred)
Work Location: In person
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Greater Noida
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Faridabad
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Chittoor
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Ghaziabad
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Gurugram
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Hassan
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Navi Mumbai
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Mysuru
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Mandya
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Thane
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Nashik
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Khammam
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Nizamabad
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Karimnagar
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Vijayawada
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Warangal
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Noida
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Mumbai
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Bengaluru
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago
8.0 - 12.0 years
35 - 50 Lacs
Hyderabad
Work from Office
Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design, develop, and enhance data, analytics, and AI/Gen AI-powered services on the SaaS platform
- Design, develop, and enhance telemetry and metrics pipelines and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
Posted 2 weeks ago