
181 Pub/Sub Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 10.0 years

20 - 27 Lacs

Noida

Work from Office

Job Responsibilities:

Technical Leadership:
• Provide technical leadership and mentorship to a team of data engineers.
• Design, architect, and implement highly scalable, resilient, and performant data pipelines; experience with GCP technologies (e.g., Dataproc, Cloud Composer, Pub/Sub, BigQuery) is a plus.
• Guide the team in adopting data engineering best practices, including CI/CD, infrastructure-as-code, and automated testing.
• Conduct code reviews and design reviews, and provide constructive feedback to team members.
• Stay up to date with the latest technologies and trends in data engineering.

Data Pipeline Development:
• Develop and maintain robust, efficient data pipelines to ingest, process, and transform large volumes of structured and unstructured data from various sources.
• Implement data quality checks and monitoring systems to ensure data accuracy and integrity.
• Collaborate with cross-functional teams and business stakeholders to understand data requirements and deliver data solutions that meet their needs.

Platform Building & Maintenance:
• Design and implement secure and scalable data storage solutions.
• Manage and optimize cloud infrastructure costs related to data engineering workloads.
• Contribute to the development and maintenance of data engineering tooling and infrastructure to improve team productivity and efficiency.

Collaboration & Communication:
• Effectively communicate technical designs and concepts to both technical and non-technical audiences.
• Collaborate effectively with other engineering teams, product managers, and business stakeholders.
• Contribute to knowledge sharing within the team and across the organization.

Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 7+ years of experience in data engineering and software development.
• 7+ years of coding experience in SQL and Python/Java.
• 3+ years of hands-on experience building and managing data pipelines in a cloud environment such as GCP.
• Strong programming skills in Python or Java, with experience developing data-intensive applications.
• Expertise in SQL and data modeling techniques for both transactional and analytical workloads.
• Experience with CI/CD pipelines and automated testing frameworks.
• Excellent communication, interpersonal, and problem-solving skills.
• Experience leading or mentoring a team of engineers.
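To make the stack in this listing concrete, here is a minimal sketch of a streaming pipeline in the Apache Beam Python SDK that reads messages from Pub/Sub and appends them to a BigQuery table. The project, topic, bucket, and table names are hypothetical, and the apache-beam[gcp] package is assumed; a production pipeline would add dead-lettering, schema management, and monitoring.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming options; project, region, and bucket are hypothetical.
    options = PipelineOptions(
        streaming=True,
        project="my-project",
        region="us-central1",
        runner="DataflowRunner",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Pub/Sub delivers raw bytes; the topic name is hypothetical.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")
            | "ParseJson" >> beam.Map(lambda b: json.loads(b.decode("utf-8")))
            # Append parsed rows to an existing BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )


if __name__ == "__main__":
    run()
```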

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

About GlobalLogic
GlobalLogic, a leader in digital product engineering with over 30,000 employees, helps brands worldwide design and develop innovative products, platforms, and digital experiences. By integrating experience design, complex engineering, and data expertise, GlobalLogic assists clients in envisioning possibilities and accelerating their transition into the digital businesses of tomorrow. Operating design studios and engineering centers globally, GlobalLogic extends its deep expertise to customers in industries such as communications, financial services, automotive, healthcare, technology, media, manufacturing, and semiconductor. GlobalLogic is a Hitachi Group Company.

Requirements

Leadership & Strategy
As part of GlobalLogic, you will lead and mentor a team of cloud engineers, providing technical guidance and support for career development. You will define cloud architecture standards and best practices across the organization and collaborate with senior leadership to develop a cloud strategy and roadmap aligned with business objectives. Your responsibilities will include driving technical decision-making for complex cloud infrastructure projects and establishing and maintaining cloud governance frameworks and operational procedures.

Leadership Experience
With a minimum of 3 years in technical leadership roles managing engineering teams, you should have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management and resource planning, along with strong presentation and communication skills for executive-level reporting, is essential for this role.

Certifications (Preferred)
Preferred certifications include Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications.

Technical Excellence
You should have over 10 years of experience designing and implementing enterprise-scale cloud solutions using GCP services. As a technical expert, you will architect and oversee the development of sophisticated cloud solutions using Python and advanced GCP services. Your role will involve leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services. Additionally, you will design complex integrations with multiple data sources and systems, implement security best practices, and troubleshoot and resolve technical issues while establishing preventive measures.

Technical Skills
Your expertise should include expert-level proficiency in Python and experience in additional languages such as Java, Go, or Scala. Deep knowledge of GCP services like Dataflow, Compute Engine, BigQuery, Cloud Functions, and others is required. Advanced knowledge of Docker, Kubernetes, and container orchestration patterns, along with experience in cloud security, infrastructure as code, and CI/CD practices, will be crucial for this role.

Cross-functional Collaboration
Collaborating with C-level executives, senior architects, and product leadership to translate business requirements into technical solutions, leading cross-functional project teams, presenting technical recommendations to executive leadership, and establishing relationships with GCP technical account managers are key aspects of this role.

What We Offer
At GlobalLogic, we prioritize a culture of caring, continuous learning and development, interesting and meaningful work, balance and flexibility, and a high-trust organization. Join us to experience an inclusive culture, opportunities for growth and advancement, impactful projects, work-life balance, and a safe, reliable, and ethical global company. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences since 2000. Collaborating with forward-thinking companies globally, GlobalLogic continues to transform businesses and redefine industries through intelligent products, platforms, and services.

Posted 1 week ago

Apply

7.0 - 12.0 years

0 Lacs

Goa

On-site

As a Senior Backend Developer at Siemens in Goa, India, you will be part of a passionate group of solution innovators, UX devotees, techies, data scientists/AI experts, software lovers, AR/VR experts, visual artists, and architects working in a lean startup concept. Your role will involve solving complex problems in domains ranging from industry, energy, mobility, and buildings to smart cities by leveraging data analytics, artificial intelligence, simulations, and interactive visualization.

Your responsibilities will include designing, developing, and maintaining robust backend services and APIs. You will collaborate with architects and product owners to translate requirements into technical solutions. It will be essential for you to implement clean, maintainable, and testable code following best practices and coding standards. You will also participate in system design discussions, code reviews, and performance tuning to ensure integration with frontend components and external systems.

To qualify for this role, you should possess a Master's/Bachelor's degree in Computer Science or a related discipline from a reputed institute, along with 7-12 years of experience in backend development for enterprise-grade applications. Proficiency in backend technologies such as Java Spring Boot, Python, and Node.js is required, as well as a deep understanding of SOLID principles, design patterns, and system design. Experience with SQL and NoSQL databases, including handling large-scale and time-series data, is essential.

Moreover, you should have a strong grasp of backend methodologies such as RESTful API design and familiarity with event-driven systems using MQTT, WebSocket, or Pub/Sub. Exposure to cloud-native development, CI/CD pipelines, and cloud platforms like AWS is beneficial. Knowledge of unit testing, mocking, test automation frameworks, version control systems like Git, and maintaining code quality through tools like SonarQube is also necessary. Understanding security architecture, data privacy compliance, and DevOps culture will be advantageous for this role.

Your ability to work effectively in agile, globally distributed teams, along with strong debugging, problem-solving, and communication skills, will be crucial. Siemens values diversity and equality, welcoming applications that reflect the diversity of the communities it works in across gender, LGBTQ+, abilities, and ethnicity. If you are passionate about shaping the future with your technical expertise, join Siemens in making a real impact in the world.

Posted 1 week ago

Apply

5.0 - 10.0 years

0 - 0 Lacs

Hyderabad, Chennai

Hybrid

JD:
• Design and develop robust ETL pipelines using Python, PySpark, and GCP services.
• Build and optimize data models and queries in BigQuery for analytics and reporting.
• Ingest, transform, and load structured and semi-structured data from various sources.
• Collaborate with data analysts, scientists, and business teams to understand data requirements.
• Ensure data quality, integrity, and security across cloud-based data platforms.
• Monitor and troubleshoot data workflows and performance issues.
• Automate data validation and transformation processes using scripting and orchestration tools.

Required Skills & Qualifications:
• Hands-on experience with Google Cloud Platform (GCP), especially BigQuery.
• Strong programming skills in Python and/or PySpark.
• Experience in designing and implementing ETL workflows and data pipelines.
• Proficiency in SQL and data modeling for analytics.
• Familiarity with GCP services such as Cloud Storage, Dataflow, Pub/Sub, and Composer.
• Understanding of data governance, security, and compliance in cloud environments.
• Experience with version control (Git) and agile development practices.
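A typical pipeline matching the stack above, sketched below: a PySpark batch job that ingests semi-structured JSON from Cloud Storage, applies a quality filter, and loads the result into BigQuery. The bucket, path, and table names are hypothetical, and the spark-bigquery connector (standard on Dataproc) is assumed.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcs-to-bigquery-etl").getOrCreate()

# Ingest semi-structured JSON landed in Cloud Storage (path is hypothetical).
raw = spark.read.json("gs://my-bucket/raw/orders/*.json")

# A simple data-quality gate plus a derived column.
clean = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("created_at"))
)

# Load into BigQuery via the spark-bigquery connector (names are hypothetical).
(clean.write.format("bigquery")
      .option("table", "my-project.analytics.orders")
      .option("temporaryGcsBucket", "my-bucket-tmp")
      .mode("append")
      .save())
```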

Posted 1 week ago

Apply

4.0 - 9.0 years

5 - 14 Lacs

Pune, Chennai, Bengaluru

Work from Office

Dear Candidate,

This is with reference to your profile on the job portal. Deloitte India Consulting has an immediate requirement for the following role.

Job Summary: We are looking for a skilled GCP Data Engineer to design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform. The ideal candidate will have hands-on experience with GCP services, data warehousing, ETL processes, and big data technologies.

Key Responsibilities:
• Design and implement scalable data pipelines using Cloud Dataflow, Apache Beam, and Cloud Composer.
• Develop and maintain data models and data marts in BigQuery.
• Build ETL/ELT workflows to ingest, transform, and load data from various sources.
• Optimize data storage and query performance in BigQuery and other GCP services.
• Collaborate with data scientists, analysts, and business stakeholders to understand data requirements.
• Ensure data quality, integrity, and security across all data solutions.
• Monitor and troubleshoot data pipeline issues and implement improvements.

Required Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 3+ years of experience in data engineering, with at least 1-2 years on Google Cloud Platform.
• Proficiency in SQL, Python, and Apache Beam.
• Hands-on experience with GCP services like BigQuery, Cloud Storage, Cloud Pub/Sub, Cloud Dataflow, and Cloud Composer.
• Experience with data modeling, data warehousing, and ETL/ELT processes.
• Familiarity with CI/CD pipelines, Terraform, and Git.
• Strong problem-solving and communication skills.

Nice to Have:
• GCP certifications (e.g., Professional Data Engineer).

If you are interested, please share your updated resume along with the following details (mandatory) to smouni@deloitte.com: Candidate Name, Mobile No., Email ID, Skill, Total Experience, Education Details, Current Location, Requested Location, Current Firm, Current CTC, Expected CTC, Notice Period/LWD, Feedback.
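Since Cloud Composer orchestration is central to this role, here is a minimal sketch of an Airflow DAG that runs a daily BigQuery transformation, assuming the apache-airflow Google provider package; the DAG ID, datasets, and SQL are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_orders_elt",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Rebuild a small reporting mart from staged data inside BigQuery.
    build_mart = BigQueryInsertJobOperator(
        task_id="build_orders_mart",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.orders_mart AS
                    SELECT order_date, SUM(amount) AS revenue
                    FROM staging.orders
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )
```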

Posted 1 week ago

Apply

12.0 - 20.0 years

25 - 40 Lacs

Kolkata, Hyderabad, Pune

Work from Office

GCP Data Architect

Posted 1 week ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Kolkata, Hyderabad, Pune

Work from Office

GCP Engineer, Lead GCP Engineer

Posted 1 week ago

Apply

3.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Google Cloud Architect in Pune (Hybrid) with over 10 years of experience, including 3+ years specifically on GCP, you will play a crucial role in leading the design and delivery of comprehensive cloud solutions on Google Cloud Platform. You will collaborate with data engineering, DevOps, and architecture teams to create scalable, secure, and cost-effective cloud platforms.

Your key responsibilities will include designing scalable data and application architectures utilizing tools such as BigQuery, Dataflow, Composer, Cloud Run, Pub/Sub, and related GCP services. You will lead cloud migration, modernization, and CI/CD automation using technologies like Terraform, Jenkins, GitHub, and Cloud Build. Additionally, you will implement real-time and batch data pipelines and chatbot applications using LLMs (Gemini, Claude), and automate reconciliation and monitoring processes. You will also collaborate closely with stakeholders to ensure technical solutions align with business objectives.

The ideal candidate has a minimum of 3 years of experience working with GCP and strong proficiency in key tools such as BigQuery, Dataflow, Cloud Run, Airflow, GKE, and Cloud Functions. Hands-on experience with Terraform, Kubernetes, Jenkins, GitHub, and cloud-native CI/CD is essential. In addition, you should have a solid understanding of DevSecOps practices, networking, and data architecture concepts like Data Lake, Lakehouse, and Mesh, along with proficiency in Python, SQL, and ETL frameworks such as Ab Initio.

Preferred qualifications for this role include GCP certifications (Cloud Architect, DevOps, ML Engineer), experience with Azure or hybrid environments, and domain expertise in sectors like Banking, Telecom, or Retail.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

You will be part of a dynamic team at Equifax, where we are seeking creative, high-energy, and driven software engineers with hands-on development skills to contribute to various significant projects. As a software engineer at Equifax, you will have the opportunity to work with cutting-edge technology alongside a talented group of engineers. This role is perfect for you if you are a forward-thinking, committed, and enthusiastic individual who is passionate about technology.

Your responsibilities will include designing, developing, and operating high-scale applications across the entire engineering stack. You will be involved in all aspects of software development, from design and testing to deployment, maintenance, and continuous improvement. By utilizing modern software development practices such as serverless computing, microservices architecture, CI/CD, and infrastructure-as-code, you will contribute to the integration of our systems with existing internal systems and tools. Additionally, you will participate in technology roadmap discussions and architecture planning to translate business requirements and vision into actionable solutions.

Working within a closely-knit, globally distributed engineering team, you will be responsible for triaging product or system issues and resolving them efficiently to ensure the smooth operation and quality of our services. Managing project priorities, deadlines, and deliverables will be a key part of your role, along with researching, creating, and enhancing software applications to advance Equifax solutions.

To excel in this position, you should have a Bachelor's degree or equivalent experience, along with at least 7 years of software engineering experience. Proficiency in mainstream Java, Spring Boot, and TypeScript/JavaScript, as well as hands-on experience with cloud technologies such as GCP, AWS, or Azure, is essential. You should also have a solid background in designing and developing cloud-native solutions and microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes. Experience deploying and releasing software using Jenkins CI/CD pipelines, infrastructure-as-code concepts, Helm charts, and Terraform constructs is highly valued. Being a self-starter who can adapt to changing priorities with minimal supervision could also set you apart in this role.

Additional advantageous skills include designing big data processing solutions, UI development, backend technologies like Java/J2EE and Spring Boot, source code control management systems, build tools, working in Agile environments, relational databases, and automated testing.

If you are ready to take on this exciting opportunity and contribute to Equifax's innovative projects, apply now and be part of our team of forward-thinking software engineers.

Posted 1 week ago

Apply

7.0 - 12.0 years

0 Lacs

Maharashtra

On-site

Prudential's purpose is to be partners for every life and protectors for every future. Our purpose guides everything we do by creating a culture in which diversity is celebrated and inclusion assured, for our people, customers, and partners. We provide a platform for our people to do their best work and make an impact on the business, and we support our people's career ambitions. We pledge to make Prudential a place where you can Connect, Grow, and Succeed.

At Prudential Health India (PHI), we are on a mission to make Indians healthier while bridging the health protection gap. This is a Zero to One team undertaking a greenfield health insurance deployment in India, committed to building journeys that truly empathize with the customer and offer a differentiated, bespoke experience. To partner with us in this mission, we are looking for a talented candidate for the role of Technology Solution Designing.

Note: The title will depend on (1) experience, (2) expertise, and (3) performance, so it could be Solution Designer, Senior Solution Designer, or (Asst. Director) Solution Designer. This is an individual contributor role. Experience: 7-12 years. Location: Mumbai only.

Job Profile Summary: PHI intends to build a cloud-native, microservices-oriented, loosely coupled, and open technology platform, tightly aligned to the health-insurance domain and built to be reused while anticipating change. The PHI platform will be made up of multiple applications supporting different business functions, well-integrated and well-orchestrated. The applications could be COTS (Common-Off-The-Shelf) vendor software, Prudential Group software capabilities, or software built by the in-house PHI engineering team. All applications need to adopt common services, platforms, architectural principles, and design patterns.

The right candidate will be accountable for converting business requirements into detailed, low-level technology requirements in the fastest time possible, with the fewest gaps, and with clarity on how the technology and business requirements can be tested at pace and with the best quality. Requirement gaps, change requests, and non-integrated journeys would all signal poor-quality deliverables by this candidate.

Job Description:
- Deeply understand the long-term architectural direction, with emphasis on reusable components and the interactions between the various applications.
- Evaluate business requirements and convert them into solution designs considering functionality, interoperability, performance, scalability, reliability, availability, and other applicable criteria.
- Identify and evaluate alternative design options and trade-offs to handle both functional and non-functional requirements; prepare designs that consider current applications and architecture, operating models, and the end target-state architecture.
- Be responsible for high- and low-level designs of various systems and applications in line with the PHI Tech Vision.
- Own the preparation of detailed design specifications to form the basis for development and modification of applications.
- Ensure adoption and implementation of defined solutions; provide technical expertise to enable application design and iteration; collaborate with technical teams to develop and agree system integration plans and review progress.
- Support change programs/projects through technical plans and application of design principles that comply with enterprise and solution architecture standards.
- Lead governance forums required to create harmony in application designs.
- Identify and document application/integration components for recording in a suitable EA tool; recommend and implement improvements to these processes/tools.
- Ensure that extensions and additions to the defined designs are undertaken frequently, in line with the ever-evolving business and technology landscape, but go through rigorous vetting processes.
- Ensure that the collaborative nature of capabilities such as design and engineering is coupled with tight governance around non-negotiables like security, availability, and performance.

Who we are looking for:

Technical skills & work experience:
- Proven experience as a technology solution leader with overall experience between 8-18 years.
- Proven ability with UML (Unified Modelling Language); should be able to produce sequence diagrams, activity diagrams, state diagrams, etc.
- Demonstrated ability to understand business strategy and processes and translate them successfully into technology low-level design.
- Excellent knowledge of modern design patterns like CQRS and SAGA, including microservices.

Good to Have:
- Deep end-to-end architecture understanding, including the enterprise Java stack, front-end technologies, mobile applications (Android and iOS), middleware, databases (SQL and NoSQL), data warehouses, RESTful APIs, and pub-sub.
- Experience in streaming technologies or frameworks such as Kafka.
- Deep understanding of enterprise integration and messaging patterns and SOA governance practices.
- Familiarity with one or more architecture frameworks (e.g., TOGAF).
- Good understanding of DevOps.

Personal Traits:
- First and foremost, be an exceptional builder of great people.
- The highest standards of collaboration and teamwork are critical to this role.
- Strong communication skills and the ability to engage senior management on strategic plans, lead project steering committees, and provide status updates.
- Excellent problem-analysis skills; innovative and creative in developing solutions.
- Ability and willingness to be hands-on; strong attention to detail.
- Ability to work independently and handle multiple concurrent initiatives.
- Excellent organizational, vendor management, negotiation, and prioritization skills.

Education:
- Bachelor's in Computer Science, Computer Engineering, or equivalent; suitable certifications for key skills.

Language:
- Fluent written and spoken English.

Posted 1 week ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Bengaluru

Remote

Job Description:

Job Title: Apache Beam Software Engineer
Work Mode: Remote
Base Location: Bengaluru
Experience Required: 4 to 6 Years

Job Summary: We are looking for a Software Engineer with hands-on experience in Apache Beam, Google Cloud Dataflow, and Dataproc, focusing on building reusable data processing frameworks. This is not a traditional data engineering role. The ideal candidate will have strong software development skills in Java or Python and experience building scalable, modular data processing components and frameworks for batch and streaming use cases.

Key Responsibilities:
- Design and develop framework-level components using Apache Beam, GCP Dataflow, and Dataproc.
- Build scalable, reusable libraries and abstractions in Python or Java for distributed data processing.
- Work closely with architects to implement best practices for designing high-performance data frameworks.
- Ensure software reliability, maintainability, and testability through strong coding and automation practices.
- Participate in code reviews, architectural discussions, and performance tuning initiatives.
- Contribute to internal tooling or SDK development for data engineering platforms.

Required Skills:
- 4 to 6 years of experience as a Software Engineer working on distributed systems or data processing frameworks.
- Strong programming skills in Java and/or Python.
- Deep experience with Apache Beam and GCP Dataflow.
- Hands-on experience with GCP Dataproc, especially for building scalable custom batch or streaming jobs.
- Solid understanding of streaming vs. batch processing concepts.
- Familiarity with CI/CD pipelines, GitHub, and test automation.

Preferred Skills:
- Experience with workflow orchestration tools such as Airflow (Composer).
- Exposure to Pub/Sub and BigQuery (from a system integration perspective).
- Understanding of observability, logging, and error handling in distributed data pipelines.
- Experience building internal libraries, SDKs, or tools to support data teams.

Tech Stack:
- Cloud: GCP (Dataflow, Dataproc, Pub/Sub, Composer)
- Programming: Java, Python
- Frameworks: Apache Beam
- DevOps: GitHub, CI/CD (Cloud Build, Jenkins)
- Focus Areas: Framework/library development, scalable distributed data processing, component-based architecture
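The framework-building focus here differs from pipeline authoring: the deliverable is a reusable component rather than a single job. Below is a minimal sketch of one such component as a composite Beam PTransform in Python; the element shape and field names are hypothetical.

```python
import json

import apache_beam as beam


class ParseAndValidate(beam.PTransform):
    """Composite transform: decode JSON records and drop incomplete ones.

    Packaged as a PTransform so any batch or streaming pipeline can reuse it.
    """

    def __init__(self, required_fields):
        super().__init__()
        self.required_fields = required_fields

    def expand(self, pcoll):
        return (
            pcoll
            | "Decode" >> beam.Map(lambda b: json.loads(b.decode("utf-8")))
            | "Validate" >> beam.Filter(
                lambda rec: all(f in rec for f in self.required_fields))
        )
```

A pipeline would then apply it as `raw_bytes | ParseAndValidate(required_fields=["id", "ts"])`; the field names are purely illustrative.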

Posted 1 week ago

Apply

4.0 - 8.0 years

10 - 14 Lacs

Chennai

Work from Office

Role Description

Provides leadership for the overall architecture, design, development, and deployment of a full-stack, cloud-native data analytics platform:
- Design and augment solution architecture for data ingestion, data preparation, data transformation, data load, ML & simulation modelling, Java BE & FE, state machine, API management, and intelligence consumption using data products, on cloud.
- Understand business requirements and help develop high-level and low-level data engineering and data processing documentation for the cloud-native architecture.
- Develop conceptual, logical, and physical target-state architecture, engineering, and operational specs.
- Work with the customer, users, technical architects, and application designers to define the solution requirements and structure for the platform.
- Model and design the application data structure, storage, and integration.
- Lead the database analysis, design, and build effort.
- Work with the application architects and designers to design the integration solution.
- Ensure that the database designs fulfill the requirements, including data volume, frequency needs, and long-term data growth.
- Able to perform data engineering tasks using Spark.
- Knowledge of developing efficient frameworks for development and testing (Sqoop/NiFi/Kafka/Spark/Streaming/WebHDFS/Python) to enable seamless data ingestion onto the Hadoop/BigQuery platforms.
- Enable data governance and data discovery.
- Exposure to job monitoring frameworks along with validation automation.
- Exposure to handling structured, unstructured, and streaming data.

Technical Skills
- Experience building data platforms on cloud (data lake, data warehouse environment, Databricks).
- Strong technical understanding of data modeling, design, and architecture principles and techniques across master data, transaction data, and derived/analytic data.
- Proven background of designing and implementing architectural solutions that solve strategic and tactical business needs.
- Deep knowledge of best practices through relevant experience across data-related disciplines and technologies, particularly for enterprise-wide data architectures, data management, data governance, and data warehousing.
- Highly competent with database design and data modeling.
- Strong data warehousing and business intelligence skills, including handling ELT and scalability issues for enterprise-level data warehouses and creating ETLs/ELTs to handle data from various data sources and formats.
- Strong hands-on experience with programming languages like Python and Scala with Spark and Beam.
- Solid hands-on and solution-architecting experience in cloud technologies: AWS, Azure, and GCP (GCP preferred).
- Hands-on experience with data processing at scale with event-driven systems and message queues (Kafka/Flink/Spark Streaming).
- Hands-on experience with GCP services like BigQuery, Dataproc, Pub/Sub, Dataflow, Cloud Composer, API Gateway, Data Lake, Bigtable, Spark, and Apache Beam, plus feature engineering/data processing for model development.
- Experience gathering and processing raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.).
- Experience building data pipelines for structured/unstructured, real-time/batch, and event-driven/synchronous/asynchronous data using MQ, Kafka, and stream processing.
- Hands-on experience analyzing source system data and data flows, working with structured and unstructured data.
- Must be very strong in writing Spark SQL queries.
- Strong organizational skills, with the ability to work autonomously as well as lead a team.
- Pleasant personality and strong communication and interpersonal skills.

Qualifications
A bachelor's degree in computer science, computer engineering, or a related discipline is required to work as a technical lead. Certification in GCP would be a big plus. Individuals in this field can further display their leadership skills by completing the Project Management Professional certification offered by the Project Management Institute.
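As one concrete instance of the event-driven, message-queue processing this listing calls for, below is a minimal Spark Structured Streaming sketch that consumes a Kafka topic and maintains per-minute counts. The broker address and topic name are hypothetical, and the Spark-Kafka connector is assumed to be on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Subscribe to a Kafka topic (broker and topic names are hypothetical).
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "events")
         .load()
)

# Kafka delivers bytes; cast the payload and count events per minute.
counts = (
    events.selectExpr("CAST(value AS STRING) AS value", "timestamp")
          .groupBy(F.window("timestamp", "1 minute"))
          .count()
)

# Write the running counts to the console for demonstration purposes.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```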

Posted 1 week ago

Apply

7.0 - 10.0 years

7 - 11 Lacs

Chennai

Work from Office

Responsibilities:
- Responsible for pre- and post-settlement of OTC derivatives trades.
- Ensure all post-settlement breaks are cleared within a reasonable time.
- Understand the various manual payments made via SWIFT (103, 202, 202COV); good knowledge of manual SWIFT payment creation.
- Understand the various breaks occurring in Nostro (statement vs. ledger reconciliation).
- Understand the lifecycle of FIC products; should have knowledge of FIC products, e.g., FX, MM, IRD, CDS, options, etc.
- Any ad-hoc duties that may be assigned by management.
- Ability to handle, escalate, and resolve queries in a timely manner.
- Build a strong repository of operations and domain knowledge in the team.
- Be actively involved in creating and capturing process updates and workflows pertaining to business processes.
- Perform root-cause analysis of errors and suggest/implement controls in consultation with the supervisor/manager.
- Provide training sessions to new joiners and existing partners in the team.
- Actively work on identifying and implementing process improvements in consultation with the supervisor.
- Initiate and lead team-building activities within the team.
- Actively participate in audits/BCP exercises and help the supervisor ensure awareness in the team.
- Manage sensitive clients and escalate to managers/FO in order to meet the Fed target.
- Perform affirmation of deal details with counterparties via phone (ensure that the contact person's name, direct number, and the date and time of affirmation are clearly indicated on the ticket).
- Ensure that all payment messages are generated successfully, and that all relevant payment messages are authorized and released before the currency cut-off time.
- Create and release any new, amended, or cancelled payments, whenever necessary.
- Coordinate closely with MO TSU on any payment-related issues, e.g., rollover deals, unusual payment instructions, etc.
- Cross-train and support other members of the team; provide backup to other members during holidays.
- Participate in projects or UATs, whenever required.
- Ensure quality of supporting data for permanent supervision of accounts, including monthly reconciliation of all treasury subsystems (including spreadsheets) to the general ledger.
- Ensure that policies, procedures, best practices, and the mission and objectives set by management are adhered to in the daily discharge of duties.

Key Skill Areas & Knowledge Required:
- Minimum 7-10 years of settlement experience in high-volume, exception-based/manual, and time-bound processing.
- Sound knowledge of FX, MM, and derivative products like equity, rates, credit, and options.
- Sound knowledge of payment processes and currency/country specificities.
- Competency in using the various applications in end-to-end processing of settlements.
- Strong understanding of the trade lifecycle and the cross-impact of various stakeholders at each stage.
- Familiarity with common desktop applications such as Microsoft Excel, Word, Outlook, and Internet browsers.
- Good problem-solving skills.
- Keen and quick to learn, inquisitive, meticulous, and motivated.

Posted 1 week ago

Apply

8.0 - 12.0 years

22 - 32 Lacs

Noida, Pune, Bengaluru

Hybrid

• Build and optimize ELT/ETL pipelines using BigQuery, GCS, Dataflow, Pub/Sub, and orchestration services (Composer/Airflow).
• Hands-on experience building ETL/ELT pipelines and developing software in Python.
• Experience working with data warehouses, data warehouse technical architectures, and reporting/analytic tools.
• Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data.
• Demonstrate extensive skills and success in the implementation of technology projects within a professional environment, with a particular focus on data engineering.
• Eager to learn and explore new services within GCP to enhance skills and contribution to projects.
• Demonstrated excellent communication, presentation, and problem-solving skills.
• Prior experience with an ETL tool such as DBT, Talend, etc.

Good to have skills:
• AI/ML and Gen AI background.
• IAM, Cloud Logging, and Monitoring.
• The Data Engineer coaches junior data engineering personnel, bringing them up to speed and helping them gain a better understanding of the overall data ecosystem.
• Working experience with Agile methodologies and CI/CD tools like Terraform/Jenkins.
• Working on solution decks, IP builds, and client meetings for requirement gathering.
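As an illustration of the Pub/Sub-fed ingestion this role describes, below is a minimal streaming-pull subscriber sketch using the google-cloud-pubsub Python client; the project and subscription IDs are hypothetical.

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
# Project and subscription IDs are hypothetical.
subscription_path = subscriber.subscription_path("my-project", "orders-sub")


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print(f"Received: {message.data!r}")
    message.ack()  # acknowledge so Pub/Sub does not redeliver


streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        streaming_pull.result(timeout=30)  # bounded demo window
    except TimeoutError:
        streaming_pull.cancel()
        streaming_pull.result()  # block until shutdown completes
```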

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 18 Lacs

Hyderabad

Work from Office

Role: GCP Data Engineer
Location: Hyderabad
Duration: Full time

Roles & Responsibilities:
* Design, develop, and maintain scalable and reliable data pipelines using Apache Airflow to orchestrate complex workflows.
* Utilize Google BigQuery for large-scale data warehousing, analysis, and querying of structured and semi-structured data.
* Leverage the Google Cloud Platform (GCP) ecosystem, including services like Cloud Storage, Compute Engine, AI Platform, and Dataflow, to build and deploy data science solutions.
* Develop, train, and deploy machine learning models to solve business problems such as forecasting, customer segmentation, and recommendation systems.
* Write clean, efficient, and well-documented code in Python for data analysis, modeling, and automation.
* Use Docker to containerize applications and create reproducible research environments, ensuring consistency across development, testing, and production.
* Perform exploratory data analysis to identify trends, patterns, and anomalies, and effectively communicate findings to both technical and non-technical audiences.
* Collaborate with data engineers to ensure data quality and integrity.
* Stay current with the latest advancements in data science, machine learning, and big data technologies.
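For the exploratory-analysis side of this role, here is a minimal sketch of the query-to-DataFrame pattern with the google-cloud-bigquery client. The project and table names are hypothetical, and the pandas/db-dtypes extras are assumed to be installed.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

sql = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `my-project.analytics.orders`
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 100
"""
# to_dataframe() requires the pandas/db-dtypes extras.
df = client.query(sql).to_dataframe()
print(df.describe())
```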

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

As a member of the JM Financial team, you will be part of a culture that values recognition and rewards for the hard work and dedication of its employees. We believe that a motivated workforce is essential for the growth of our organization. Our management team acknowledges and appreciates the efforts of our personnel through promotions, bonuses, awards, and public recognition. By fostering an atmosphere of success, we celebrate achievements such as successful deals, good client ratings, and customer reviews.

Nurturing talent is a key focus at JM Financial. We aim to prepare our employees for future leadership roles by creating succession plans and encouraging direct interactions with clients. Knowledge sharing and cross-functional interactions are integral to our business environment, fostering inclusivity and growth opportunities for our team members.

Attracting and managing top talent is a priority for JM Financial. We have successfully built a diverse talent pool with expertise, new perspectives, and enthusiasm. Our strong brand presence in the market enables us to leverage the expertise of our business partners to attract the best talent. Trust is fundamental to our organization, binding our programs, people, and clients together. We prioritize transparency, two-way communication, and trust across all levels of the organization.

Opportunities for growth and development are abundant at JM Financial. We believe in growing alongside our employees and providing them with opportunities to advance their careers. Our commitment to nurturing talent has led to the appointment of promising employees to leadership positions within the organization. With a focus on employee retention and a supportive environment for skill development, we aim to create a strong future leadership team.

Emphasizing teamwork, we value both individual performance and collaborative group efforts. In a fast-paced corporate environment, teamwork is essential for achieving our common vision. By fostering open communication channels and facilitating information sharing, we ensure that every member of our team contributes to delivering value to our clients.

As a Java Developer at JM Financial, your responsibilities will include designing, modeling, and building services to support new features and products. You will work on an integrated central platform that powers various web applications, developing a robust backend framework and implementing features across different products using a combination of technologies. Researching and implementing new technologies to enhance our services will be a key part of your role.

To excel in this position, you should have a BTech degree in Computer Science or equivalent experience, with at least 3 years of experience building Java-based web applications in Linux/Unix environments. Proficiency in scripting languages such as JavaScript, Ruby, or Python, along with compiled languages like Java or C/C++, is required. Experience with Google Cloud Platform services, knowledge of design methodologies for backend services, and experience building scalable infrastructure are essential skills for this role.

Our technology stack includes JavaScript, Angular, React, NextJS, HTML5/CSS3/Bootstrap, Windows/Linux/OSX Bash, Kookoo telephony, SMS Gupshup, Sendgrid, Optimizely, Mixpanel, Google Analytics, Firebase, Git, Bash, NPM, Browser Dev Console, NoSQL, Google Cloud Datastore, and Google Cloud Platform (App Engine, Pub/Sub, Cloud Functions, Bigtable, Cloud Endpoints).

If you are passionate about technology and innovation, and thrive in a collaborative environment, we welcome you to join our team at JM Financial.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As a Senior DevOps Engineer at TechBlocks, you will be responsible for designing and managing robust, scalable CI/CD pipelines, automating infrastructure with Terraform, and improving deployment efficiency across GCP-hosted environments. With 5-8 years of experience in DevOps engineering roles, your expertise in CI/CD, infrastructure automation, and Kubernetes will be crucial for the success of our projects. In this role, you will own the CI/CD strategy and configuration, implement DevSecOps practices, and drive an automation-first culture within the team.

Your key responsibilities will include designing and implementing end-to-end CI/CD pipelines using tools like Jenkins, GitHub Actions, and Argo CD for production-grade deployments. You will also define branching strategies and workflow templates for development teams, automate infrastructure provisioning using Terraform, Helm, and Kubernetes manifests, and manage the secrets lifecycle using Vault for secure deployments. Collaborating with engineering leads, you will review deployment readiness, ensure quality gates are met, and integrate DevSecOps tools like Trivy, SonarQube, and JFrog into CI/CD workflows.

Monitoring infrastructure health and capacity planning using tools like Prometheus, Grafana, and Datadog, you will implement alerting rules, auto-scaling, self-healing, and resilience strategies in Kubernetes. Additionally, you will drive process documentation, review peer automation scripts, and mentor junior DevOps engineers. Your role will be pivotal in ensuring the reliability, scalability, and security of our systems while fostering a culture of innovation and continuous learning within the team.

TechBlocks is a global digital product engineering company with 16+ years of experience, helping Fortune 500 enterprises and high-growth brands accelerate innovation, modernize technology, and drive digital transformation. We believe in the power of technology and the impact it can have when coupled with a talented team. Join us at TechBlocks and be part of a dynamic, fast-moving environment where big ideas turn into real impact, shaping the future of digital transformation.
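One pattern behind the Vault-managed secrets lifecycle mentioned above, sketched with the hvac Python client: read a secret at deploy time instead of baking credentials into images. The Vault address, mount, path, and key are all hypothetical, and in practice the token would come from an auth method rather than source code.

```python
import hvac

# Vault address, path, and key are hypothetical; in production the token
# comes from an auth method (Kubernetes, AppRole), never from source.
client = hvac.Client(url="https://vault.example.com:8200")
client.token = "s.example-token"

# Read a KV v2 secret from the default "secret/" mount.
secret = client.secrets.kv.v2.read_secret_version(path="ci/deploy")
db_password = secret["data"]["data"]["db_password"]  # hypothetical key
```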

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Kochi, Kerala

On-site

Beinex is seeking a skilled and motivated Google Cloud Consultant to join our dynamic team. As a Google Cloud Consultant, you will play a pivotal role in assisting our clients in harnessing the power of Google Cloud technologies to drive innovation and transformation. If you are passionate about cloud solutions, client collaboration, and cutting-edge technology, we invite you to join our journey.

Responsibilities
- Collaborate with clients to understand their business objectives and technology needs, translating them into effective Google Cloud solutions
- Design, implement, and manage Google Cloud Platform (GCP) architectures, ensuring scalability, security, and performance
- Provide technical expertise and guidance to clients on GCP services, best practices, and cloud-native solutions, and adopt an Infrastructure as Code (IaC) approach to establish an advanced infrastructure for both internal and external stakeholders
- Conduct cloud assessments and create migration strategies for clients looking to transition their applications and workloads to GCP
- Work with cross-functional teams to plan, execute, and optimise cloud migrations, deployments, and upgrades
- Assist clients in optimising their GCP usage by analysing resource utilisation, recommending cost-saving measures, and enhancing overall efficiency
- Collaborate with development teams to integrate cloud-native technologies and solutions into application design and development processes
- Stay updated with the latest trends, features, and updates in the Google Cloud ecosystem and provide thought leadership to clients
- Troubleshoot and resolve technical issues related to GCP services and configurations
- Create and maintain documentation for GCP architectures, solutions, and best practices
- Conduct training sessions and workshops for clients to enhance their understanding of GCP technologies and usage

Key Skills Requirements
- Profound expertise in Google Cloud Platform services, including but not limited to Compute Engine, App Engine, Kubernetes Engine, Cloud Storage, BigQuery, Pub/Sub, Cloud Functions, VPC, IAM, and Cloud Security
- Strong understanding of GCP networking concepts, including VPC peering, firewall rules, VPN, and hybrid cloud configurations
- Experience with Infrastructure as Code (IaC) tools such as Terraform or Google Cloud Deployment Manager
- Hands-on experience with containerisation technologies like Docker and Kubernetes
- Proficiency in scripting languages such as Python and Bash
- Familiarity with cloud monitoring, logging, and observability tools and practices
- Knowledge of DevOps principles and practices, including CI/CD pipelines and automation
- Strong problem-solving skills and the ability to troubleshoot complex technical issues
- Excellent communication skills to interact effectively with clients, team members, and stakeholders
- Previous consulting or client-facing experience is a plus
- Relevant Google Cloud certifications are highly desirable

Perks: Careers at Beinex
- Comprehensive health plans
- Learning and development
- Workation and outdoor training
- Hybrid working environment
- On-site travel opportunity
- Beinex branded merchandise

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be joining as a GCP Data Architect at TechMango, a rapidly growing IT Services and SaaS Product company located in Madurai and Chennai. Over 12 years of experience is expected, along with an immediate start and work from the office. TechMango specializes in assisting global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. In this role, you will lead data modernization efforts for a prestigious client, Livingston, in a highly strategic project.

As a GCP Data Architect, your primary responsibility will be to design and implement scalable, high-performance data solutions on Google Cloud Platform. You will collaborate closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities:
- Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
- Define data strategy, standards, and best practices for cloud data engineering and analytics
- Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery
- Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
- Architect data lakes, warehouses, and real-time data platforms
- Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, and DLP)
- Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
- Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
- Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications:
- 10+ years of experience in data architecture, data engineering, or enterprise data platforms
- Minimum 3-5 years of hands-on experience with GCP data services
- Proficiency in BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, and Cloud SQL/Spanner
- Python / Java / SQL
- Data modeling (OLTP, OLAP, star/snowflake schema)
- Experience with real-time data processing, streaming architectures, and batch ETL pipelines
- Good understanding of IAM, networking, security models, and cost optimization on GCP
- Prior experience leading cloud data transformation projects
- Excellent communication and stakeholder management skills

Preferred Qualifications:
- GCP Professional Data Engineer / Architect certification
- Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
- Exposure to AI/ML use cases and MLOps on GCP
- Experience working in agile environments and client-facing roles

What We Offer:
- Opportunity to work on large-scale data modernization projects with global clients
- A fast-growing company with a strong tech and people culture
- Competitive salary, benefits, and flexibility
- Collaborative environment that values innovation and leadership

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Modeling Engineer specializing in near real-time reporting, you will be responsible for creating robust, optimized schemas to facilitate near real-time data flows for operational and analytical purposes within Google Cloud environments. Your primary focus will be on designing models that ensure agility, speed, and scalability to support high-throughput, low-latency data access needs.

Your key responsibilities will include designing data models that align with streaming pipelines, developing logical and physical models tailored for near real-time reporting, implementing strategies such as caching, indexing, and materialized views to enhance performance, and ensuring data integrity, consistency, and schema quality during rapid changes.

To excel in this role, you must have experience building data models for real-time or near real-time reporting systems, hands-on expertise with GCP platforms such as BigQuery, Cloud SQL, and AlloyDB, and a solid understanding of Pub/Sub, streaming ingestion frameworks, and event-driven design. Proficiency in indexing strategies and adapting schemas in high-velocity environments is also crucial.

Preferred skills for this position include exposure to monitoring, alerting, and observability tools, as well as functional familiarity with financial reporting workflows. Soft skills like proactive adaptability in fast-paced data environments, effective verbal and written communication, and a collaborative, solution-focused mindset will be highly valued.

By joining our team, you will have the opportunity to design the foundational schema for mission-critical real-time systems, contribute to the performance and reliability of enterprise data workflows, and be part of a dynamic GCP-focused engineering team.

Skills required: streaming ingestion frameworks, BigQuery, reporting, modeling, AlloyDB, Pub/Sub, Cloud SQL, Google Cloud Platform (GCP), data management, real-time reporting, indexing strategies, and event-driven design.
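One of the performance strategies named above, materialized views, can be sketched in a few lines with the google-cloud-bigquery Python client. The dataset, table, and column names are hypothetical; BigQuery materialized views support aggregations like these and are refreshed incrementally, so dashboards avoid rescanning the raw streaming table.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

# Pre-aggregate a streaming table per minute; BigQuery keeps the view
# incrementally refreshed for low-latency reporting queries.
ddl = """
    CREATE MATERIALIZED VIEW IF NOT EXISTS analytics.orders_by_minute AS
    SELECT TIMESTAMP_TRUNC(event_ts, MINUTE) AS minute,
           COUNT(*) AS orders,
           SUM(amount) AS revenue
    FROM analytics.orders_stream
    GROUP BY minute
"""
client.query(ddl).result()
```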

Posted 2 weeks ago

Apply

5.0 - 8.0 years

6 - 14 Lacs

Pune

Work from Office

Job Summary: We are looking for a highly skilled and experienced Platform & DevOps Engineer to join our team. The ideal candidate will be responsible for managing and supporting DevOps tools, ensuring smooth CI/CD pipeline implementation, and maintaining infrastructure on Google Cloud Platform (GCP). This role requires expertise in Jenkins, Terraform, Docker, Kubernetes (GKE), and security best practices. Experience in the banking industry is a plus. The candidate should be able to work independently, troubleshoot production issues efficiently, and be flexible with work shifts.

Key Responsibilities:
- Design, implement, and maintain CI/CD pipelines using Jenkins and other DevOps tools.
- Manage and support Terraform-based infrastructure as code (IaC) for scalable deployments.
- Work with GCP products such as GCE, GKE, BigQuery, Pub/Sub, Monitoring, and Alerting.
- Collaborate with development and operations teams to enhance integration and deployment processes.
- Build and manage container images using Packer and Docker, ensuring efficient image rotation strategies.
- Monitor systems, respond to alerts, and troubleshoot production issues promptly.
- Ensure infrastructure security, compliance, and best practices are maintained.
- Provide technical guidance to development teams on DevOps tools and processes.
- Implement and support GitOps best practices, including repository configurations like code owners and webhooks.
- Document processes, configurations, and best practices for operational efficiency.
- Stay updated with the latest DevOps technologies and trends, continuously improving existing practices.

Required Skills & Qualifications:
- Proficiency in scripting and automation using Bash, Python, or Groovy.
- Hands-on experience with Jenkins, Terraform, and GCP infrastructure management.
- Strong knowledge of containerization (Docker) and orchestration tools like Kubernetes (GKE) and Helm.
- Familiarity with disaster recovery, backups, and troubleshooting production issues.
- Solid understanding of infrastructure security, compliance, and monitoring best practices.
- Experience with image creation and management using Packer and Docker.
- Prior exposure to banking industry processes and regulations is an advantage.
- Excellent problem-solving, communication, and teamwork skills.
- Ability to work independently and handle multiple priorities in a fast-paced environment.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

35 - 50 Lacs

Greater Noida

Work from Office

Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design/develop/enhance data, analytics, and AI/Gen AI powered services on the SaaS platform
- Design/develop/enhance the telemetry and metrics pipeline and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD
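To illustrate the telemetry and metrics pipeline work this listing describes, here is a minimal sketch using the prometheus-client Python package. The metric names and tenant label are illustrative, not F5's actual telemetry schema.

```python
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

# Metric and label names are hypothetical examples.
EVENTS = Counter("events_processed_total", "Events processed", ["tenant"])
LATENCY = Histogram("event_processing_seconds", "Per-event processing time")


def process(tenant: str) -> None:
    with LATENCY.time():            # observe per-event latency
        time.sleep(random.random() / 100)
    EVENTS.labels(tenant=tenant).inc()


if __name__ == "__main__":
    start_http_server(8000)         # exposes /metrics for Prometheus scraping
    while True:
        process("tenant-a")
```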

Posted 2 weeks ago

Apply

8.0 - 12.0 years

35 - 50 Lacs

Faridabad

Work from Office

Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design/develop/enhance data, analytics, and AI/Gen AI powered services on the SaaS platform
- Design/develop/enhance the telemetry and metrics pipeline and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD

Posted 2 weeks ago

Apply

8.0 - 12.0 years

35 - 50 Lacs

Chittoor

Work from Office

Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design/develop/enhance data, analytics, and AI/Gen AI powered services on the SaaS platform
- Design/develop/enhance the telemetry and metrics pipeline and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD

Posted 2 weeks ago

Apply

8.0 - 12.0 years

35 - 50 Lacs

Ghaziabad

Work from Office

Role & responsibilities
- Build F5 Distributed Cloud data systems and management systems
- Design/develop/enhance data, analytics, and AI/Gen AI powered services on the SaaS platform
- Design/develop/enhance the telemetry and metrics pipeline and services
- Work closely with product, marketing, operations, platform, and customer support teams to create innovative solutions for Cloud product delivery

Preferred candidate profile
- Bachelor's degree in computer science or equivalent professional experience (7+ years)
- Proficiency in cloud-native development and programming languages such as Go, Java, and Python
- Experience with data/stream processing (e.g., Kafka, Pub/Sub, Dataflow, Vector, Spark, Flink) and databases/data warehouses (ClickHouse, BigQuery, StarRocks, Elasticsearch, Redis)
- Experience with logs, metrics, telemetry, Prometheus, and OpenTelemetry
- Experience with data system quality, monitoring, and performance
- Experience in SaaS multi-tenancy, onboarding, metering & billing, and monitoring & alerting
- Experience with container and orchestration technologies, Kubernetes, and microservices
- Experience with automation and cloud infrastructure, tooling, workloads, and modern CI/CD

Posted 2 weeks ago

Apply