
334 Pubsub Jobs

JobPe aggregates job results for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

12 - 22 Lacs

Chennai

Work from Office

Role & responsibilities

Job Summary: As a GCP Data Engineer you will be responsible for developing, optimizing, and maintaining data pipelines and infrastructure. Your expertise in SQL and Python will be instrumental in managing and transforming data, while your familiarity with cloud technologies will be an asset as we explore opportunities to enhance data engineering processes.

Job Description:
- Building scalable data pipelines: Design, implement, and maintain end-to-end data pipelines to efficiently extract, transform, and load (ETL) data from diverse sources, as sketched after this listing. Ensure data pipelines are reliable, scalable, and performance-oriented.
- SQL expertise: Write and optimize complex SQL queries for data extraction, transformation, and reporting. Collaborate with analysts and data scientists to provide structured data for analysis.
- Cloud platform experience: Utilize cloud services to enhance data processing and storage capabilities. Work towards the integration of tools into the data ecosystem.
- Documentation and collaboration: Document data pipelines, procedures, and best practices to facilitate knowledge sharing. Collaborate closely with cross-functional teams to understand data requirements and deliver solutions.

Required skills:
- 4+ years of experience with SQL and Python; 4+ years with GCP BigQuery, Dataflow, GCS, and Dataproc.
- 4+ years of experience building out data pipelines from scratch in a highly distributed and fault-tolerant manner.
- Comfortable with a broad array of relational and non-relational databases.
- Proven track record of building applications in a data-focused role (cloud and traditional data warehouse).
- Experience with Cloud SQL, Cloud Functions, Pub/Sub, Cloud Composer, etc.
- Inquisitive, proactive, and interested in learning new tools and techniques.
- Familiarity with big data and machine learning tools and platforms; comfortable with open-source technologies including Apache Spark, Hadoop, and Kafka.
- Strong oral, written, and interpersonal communication skills.
- Comfortable working in a dynamic environment where problems are not always well-defined.

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.
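
For illustration, here is a minimal Python sketch of the kind of pipeline step this role describes: pulling messages from a Pub/Sub subscription and streaming them into BigQuery. All project, subscription, and table names are invented placeholders, not details from the posting.

```python
import json
from concurrent.futures import TimeoutError

from google.cloud import bigquery, pubsub_v1

PROJECT_ID = "my-project"                 # hypothetical project
SUBSCRIPTION_ID = "orders-sub"            # hypothetical subscription
TABLE_ID = "my-project.analytics.orders"  # hypothetical table

bq_client = bigquery.Client(project=PROJECT_ID)
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    row = json.loads(message.data)        # the "transform" step would go here
    errors = bq_client.insert_rows_json(TABLE_ID, [row])
    if not errors:
        message.ack()                     # ack only after a clean write
    else:
        message.nack()                    # let Pub/Sub redeliver on failure

streaming_pull = subscriber.subscribe(sub_path, callback=handle)
with subscriber:
    try:
        streaming_pull.result(timeout=60)  # run for a minute in this demo
    except TimeoutError:
        streaming_pull.cancel()
```

Acking only after the BigQuery write succeeds is what gives the at-least-once reliability these postings keep asking for; a production pipeline would add dead-lettering and batching on top.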

Posted 19 hours ago

Apply

8.0 - 12.0 years

30 - 42 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

We are seeking a Technical Lead with strong application development expertise in Google Cloud Platform (GCP). The successful candidate will provide technical leadership in designing and implementing robust, scalable cloud-based solutions. If you are an experienced professional passionate about GCP technologies and committed to staying abreast of emerging trends, apply today.

Responsibilities:
- Design, develop, and deploy cloud-based solutions using GCP, establishing and adhering to cloud architecture standards and best practices.
- Hands-on coding experience building Java applications using GCP native services like GKE, Cloud Run, Functions, Firestore, Cloud SQL, Pub/Sub, etc.
- Develop low-level application architecture designs based on enterprise standards.
- Choose appropriate GCP services that meet functional and non-functional requirements.
- Demonstrate comprehensive knowledge of GCP PaaS, serverless, and database services.
- Provide technical leadership to development and infrastructure teams, guiding them throughout the project lifecycle.
- Ensure all cloud-based solutions comply with security and regulatory standards.
- Enhance cloud-based solutions to optimize performance, cost, and scalability.
- Stay up to date with the latest cloud technologies and trends in the industry.
- Familiarity with GCP GenAI solutions and models, including Vertex AI, Codebison, and Gemini models, is preferred but not required.
- Hands-on experience with front-end technologies like Angular or React is an added advantage.

Requirements:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a similar field.
- 8+ years of extensive experience in designing, implementing, and maintaining applications on GCP.
- Comprehensive expertise in GCP services such as GKE, Cloud Run, Functions, Cloud SQL, Firestore, Firebase, Apigee, App Engine, Gemini Code Assist, Vertex AI, Spanner, Memorystore, Service Mesh, and Cloud Monitoring.
- Solid understanding of cloud security best practices and experience implementing security controls in GCP.
- Thorough understanding of cloud architecture principles and best practices.
- Experience with automation and configuration management tools like Terraform, and a sound understanding of DevOps principles.
- Proven leadership skills and the ability to mentor and guide a technical team.

Posted 19 hours ago

Apply

7.0 - 12.0 years

7 - 17 Lacs

Pune, Chennai, Bengaluru

Work from Office

- Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, and GCP (see the sketch after this listing)
- Building data pipelines for huge volumes of data
- Dataflow, Dataproc, and BigQuery
- Deep understanding of ETL concepts
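
As a hedged sketch of the PySpark-on-GCP work listed above, the snippet below reads a BigQuery table through the spark-bigquery connector (preinstalled on Dataproc), aggregates it, and writes the result back. Table and bucket names are illustrative assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-demo").getOrCreate()

# Read a source table (hypothetical name) via the spark-bigquery connector.
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.sales.orders")
    .load()
)

# A simple transform: daily revenue totals.
daily = orders.groupBy(F.to_date("created_at").alias("day")).agg(
    F.sum("amount").alias("revenue")
)

# Write back to BigQuery; the connector stages through GCS (hypothetical bucket).
(daily.write.format("bigquery")
    .option("table", "my-project.sales.daily_revenue")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save())
```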

Posted 20 hours ago

Apply

8.0 - 12.0 years

25 - 37 Lacs

Pune, Bengaluru

Work from Office

We're looking for an experienced GCP Technical Lead to architect, design, and lead the development of scalable cloud-based solutions. The ideal candidate should have strong expertise in Google Cloud Platform (GCP), data engineering, and modern cloud-native architectures, along with the ability to mentor a team of engineers.

Key Responsibilities:
- Lead the design and development of GCP-based solutions (BigQuery, Dataflow, Composer, Pub/Sub, GKE, etc.); a small Dataflow sketch follows this listing.
- Define cloud architecture best practices and ensure adherence to security, scalability, and performance standards.
- Collaborate with stakeholders to understand requirements and translate them into technical designs and roadmaps.
- Lead and mentor a team of cloud/data engineers, providing guidance on technical challenges.
- Implement and optimize ETL/ELT pipelines, data lake, and data warehouse solutions on GCP.
- Drive DevOps/CI-CD practices using Cloud Build, Terraform, or similar tools.
- Ensure cost optimization, monitoring, and governance within GCP environments.
- Work with cross-functional teams on cloud migrations and modernization projects.

Required Skills & Qualifications:
- Strong experience in GCP services: BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, GKE, etc.
- Expertise in data engineering, ETL, and cloud-native development.
- Hands-on experience with Python, SQL, and shell scripting.
- Knowledge of Terraform, Kubernetes, and CI/CD pipelines.
- Familiarity with data security, IAM, and compliance on GCP.
- Proven experience leading technical teams and delivering large-scale cloud solutions.
- Excellent problem-solving, communication, and leadership skills.

Preferred:
- GCP Professional Cloud Architect / Data Engineer certification.
- Experience with machine learning pipelines (Vertex AI, AI Platform).
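
To make the Dataflow piece concrete, here is a minimal Apache Beam pipeline in Python of the shape this role would lead: Pub/Sub in, fixed windows, BigQuery out. Resource names and the schema are invented for the sketch.

```python
import json

import apache_beam as beam
from apache_beam import window
from apache_beam.options.pipeline_options import PipelineOptions

opts = PipelineOptions(streaming=True)  # run on Dataflow by adding runner flags

with beam.Pipeline(options=opts) as p:
    (
        p
        # Hypothetical subscription path.
        | "Read" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub")
        | "Parse" >> beam.Map(json.loads)
        | "Window" >> beam.WindowInto(window.FixedWindows(60))  # 1-minute windows
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",  # hypothetical table
            schema="user_id:STRING,ts:TIMESTAMP,action:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```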

Posted 20 hours ago

Apply

14.0 - 20.0 years

0 Lacs

Maharashtra

On-site

As a Senior Architect - Data & Cloud at our company, you will be responsible for architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets. You will need more than 15 years of experience in technical, solutioning, and analytical roles, with 5+ years specifically in building and managing data lakes, data warehouses, data integration, data migration, and business intelligence/artificial intelligence solutions on cloud platforms like GCP, AWS, or Azure.

Key Responsibilities:
- Translate business requirements into functional and non-functional areas, defining boundaries in terms of availability, scalability, performance, security, and resilience.
- Architect and design scalable data warehouse solutions on cloud platforms like BigQuery or Redshift.
- Work with various data integration and ETL technologies on cloud, such as Spark, PySpark/Scala, Dataflow, Dataproc, EMR, etc.
- Deep knowledge of cloud and on-premise databases like Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc.
- Exposure to NoSQL databases like MongoDB, CouchDB, Cassandra, graph databases, etc.
- Experience using traditional ETL tools like Informatica, DataStage, OWB, Talend, etc.
- Collaborate with internal and external stakeholders to design optimized data analytics solutions.
- Mentor young talent within the team and contribute to building assets and accelerators.

Qualifications Required:
- 14-20 years of relevant experience in the field.
- Strong understanding of cloud solutions for IaaS, PaaS, SaaS, containers, and microservices architecture and design.
- Experience with BI reporting and dashboarding tools like Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc.
- Knowledge of security features and policies in cloud environments like GCP, AWS, or Azure.
- Ability to compare products and tools across technology stacks on Google, AWS, and Azure clouds.

In this role, you will lead multiple data engagements on GCP for data lakes, data engineering, data migration, data warehouses, and business intelligence. You will interface with multiple stakeholders within IT and business to understand data requirements and take complete responsibility for the successful delivery of projects. Additionally, you will have the opportunity to work in a high-growth startup environment, contribute to the digital transformation journey of customers, and collaborate with a diverse and proactive team of techies. Please note that flexible, remote working options are available to foster productivity and work-life balance.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

Role Overview: You will be responsible for providing design and implementation services for projects utilizing Salesforce B2B Commerce Cloud. Your role will involve documenting and analyzing eCommerce features, integrations, and key business processes. Additionally, you will design, develop, configure, and deploy Salesforce B2B Commerce Cloud modules and components according to specific business needs. Building reports and dashboards to measure project success and providing training to customers on new implementations and customizations will also be part of your responsibilities. Effective management of customer expectations and communication of project status updates to various stakeholders will be crucial.

Key Responsibilities:
- Provide design and implementation services for projects using Salesforce B2B Commerce Cloud.
- Document and analyze eCommerce features, integrations, and key business processes.
- Design, develop, configure, and deploy Salesforce B2B Commerce Cloud modules and components tailored to specific business requirements.
- Create reports and dashboards to evaluate project success.
- Conduct training sessions for customers on new implementations and customizations.
- Manage customer expectations effectively and communicate project status updates to internal and external stakeholders of varying management levels.

Qualifications Required:
- Solid hands-on experience implementing Salesforce B2B Commerce Cloud.
- Proficiency in Salesforce B2B Commerce product features such as product hierarchy, storefront association, Account Group, price list, page labels, and indexing.
- Experience designing responsive storefront buyer experiences.
- Ability to develop and deliver training sessions effectively.
- Ability to customize the Salesforce B2B Commerce Out of the Box (OOTB) flow.
- Familiarity with Salesforce B2B Commerce views, models, pubsub, and page label modification as per requirements.

Posted 3 days ago

Apply

3.0 - 6.0 years

6 - 13 Lacs

Gurugram

Work from Office

Design, code, unit test, system test, performance test, debug, implement, and support Salesforce.com applications and integrations. Excellent hands-on knowledge of LWC (Lightning Web Components), events, the PubSub model, the Aura PubSub model, and the component lifecycle.

Posted 3 days ago

Apply

2.0 - 4.0 years

4 - 9 Lacs

Gurugram

Work from Office

Job Application Link: https://app.fabrichq.ai/jobs/b41f52d2-09e3-4f5e-9bf2-c587f6a0551f

Job Summary: Junior Data Engineer role at Aays Analytics focused on designing and implementing data and analytics solutions on Google Cloud Platform (GCP). The position involves working with clients to reinvent their corporate finance functions through advanced analytics. Key responsibilities include architecture design, data modeling, and mentoring teams on GCP-based solutions.

Key Responsibilities:
- Design and drive end-to-end data and analytics solution architecture from concept to delivery on Google Cloud Platform (GCP).
- Design, develop, and support conceptual, logical, and physical data models for advanced analytics and ML-driven solutions.
- Ensure integration of industry-accepted data architecture principles, standards, guidelines, and concepts.
- Drive the design, sizing, provisioning, and setup of GCP environments and related services.
- Provide mentoring and guidance on GCP-based data architecture to engineering, analytics, and business teams.
- Review solution requirements and architecture for appropriate technology selection and integration.
- Advise on emerging GCP trends and services, and recommend adoption strategies.
- Participate in pre-sales engagements and PoCs, and contribute to thought leadership content.
- Collaborate with founders and the leadership team on cloud and data strategy.

Must-Have Skills:
- Azure cloud knowledge
- Data modeling techniques (relational, star, snowflake, or Data Vault)
- Data engineering and ETL pipelines
- SQL and Python programming (a short example follows this listing)
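
As a small, hedged illustration of the SQL-plus-Python pairing in the skills list above, this snippet runs a parameterized BigQuery query from Python; the dataset, table, and columns are invented.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `my-project.finance.invoices`   -- hypothetical table
    WHERE invoice_date >= @since
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("since", "DATE", "2024-01-01"),
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.total_spend)
```

Query parameters (rather than string formatting) keep the SQL safe from injection and let BigQuery cache the query plan.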

Posted 3 days ago

Apply

14.0 - 20.0 years

0 Lacs

Maharashtra

On-site

Role Overview: As a Principal Architect - Data & Cloud at Quantiphi, you will leverage your extensive experience in technical, solutioning, and analytical roles to architect and design end-to-end data pipelines and data integration solutions for structured and unstructured data sources and targets. You will play a crucial role in building and managing data lakes, data warehouses, data integration, and business intelligence/artificial intelligence solutions on cloud platforms like GCP, AWS, and Azure. Your expertise will be instrumental in designing scalable data warehouse solutions on BigQuery or Redshift and working with various data integration, storage, and pipeline tools on cloud. Additionally, you will serve as a trusted technical advisor to customers, lead multiple data engagements on GCP, and contribute to the development of assets and accelerators.

Key Responsibilities:
- Possess more than 15 years of experience in technical, solutioning, and analytical roles.
- Have 5+ years of experience building and managing data lakes, data warehouses, data integration, and business intelligence/artificial intelligence solutions on cloud platforms like GCP, AWS, and Azure.
- Understand business requirements, translate them into functional and non-functional areas, and define boundaries in terms of availability, scalability, performance, security, and resilience.
- Architect, design, and implement end-to-end data pipelines and data integration solutions for structured and unstructured data sources and targets.
- Work with distributed computing and enterprise environments like Hadoop and cloud platforms.
- Proficiency in various data integration and ETL technologies on cloud, such as Spark, PySpark/Scala, Dataflow, Dataproc, EMR, etc.
- Deep knowledge of cloud and on-premise databases like Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc.
- Exposure to NoSQL databases like MongoDB, CouchDB, Cassandra, graph databases, etc.
- Design scalable data warehouse solutions on cloud with tools like S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, Dataproc, Airflow, Composer, Spark SQL, Presto, EMRFS, etc.
- Experience with machine learning frameworks like TensorFlow and PyTorch.
- Understand cloud solutions for IaaS, PaaS, SaaS, containers, and microservices architecture and design.
- Good understanding of BI reporting and dashboarding tools like Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc.
- Knowledge of security features and policies in cloud environments like GCP, AWS, and Azure.
- Work on business transformation projects moving on-premise data solutions to cloud platforms.
- Serve as a trusted technical advisor to customers on complex cloud and data-related technical challenges.
- Be a thought leader in architecture design and development of cloud data analytics solutions.
- Liaise with internal and external stakeholders to design optimized data analytics solutions.
- Collaborate with SMEs and solutions architects from leading cloud providers to present solutions to customers.
- Support Quantiphi sales and GTM teams from a technical perspective in building proposals and SOWs.
- Lead discovery and design workshops with potential customers globally.
- Design and deliver thought leadership webinars and tech talks with customers and partners.
- Identify areas for productization and feature enhancement for Quantiphi's product assets.

Qualifications Required:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 14-20 years of experience in technical, solutioning, and analytical roles.
- Strong expertise in building and managing data lakes, data warehouses, data integration, and business intelligence/artificial intelligence solutions on cloud platforms like GCP, AWS, and Azure.
- Proficiency in various data integration and ETL technologies on cloud, and in cloud and on-premise databases.
- Experience with cloud solutions for IaaS, PaaS, SaaS, containers, and microservices architecture and design.
- Knowledge of BI reporting and dashboarding tools and of security features in cloud environments.

Additional Company Details: While technology is the heart of Quantiphi's business, the company attributes its success to its global and diverse culture built on transparency, diversity, integrity, learning, and growth. Working at Quantiphi gives you the opportunity to be part of a culture that encourages innovation, excellence, and personal growth, fostering a work environment where you can thrive both professionally and personally. Joining Quantiphi means being part of a dynamic team of tech enthusiasts dedicated to translating data into tangible business value for clients. Flexible remote working options are available to promote productivity and work-life balance.

Posted 4 days ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Bengaluru

Hybrid

EPAM has a presence across 40+ countries globally, with 55,000+ professionals and numerous delivery centers. Key locations are North America, Eastern Europe, Central Europe, Western Europe, APAC, and the Middle East, with development centers in India (Hyderabad, Pune & Bangalore).

Location: Bengaluru
Work Mode: Hybrid (2-3 days in office per week), full time

Job Description:
- 5-14 years of experience in Big Data and data-related technologies
- Expert-level understanding of distributed computing principles
- Expert-level knowledge of and experience in Apache Spark
- Hands-on programming with Python
- Experience building stream-processing systems using technologies such as Apache Storm or Spark Streaming (see the sketch after this listing)
- Experience integrating data from multiple sources such as RDBMS (SQL Server, Oracle), ERP, and files
- Good understanding of SQL queries, joins, stored procedures, and relational schemas
- Experience with NoSQL databases such as HBase, Cassandra, and MongoDB
- Knowledge of ETL techniques and frameworks
- Performance tuning of Spark jobs
- Experience with native cloud data services on GCP
- Ability to lead a team efficiently
- Experience designing and implementing Big Data solutions
- Practitioner of Agile methodology

WE OFFER:
- Opportunity to work on technical challenges that may impact across geographies
- Vast opportunities for self-development: online university, global knowledge-sharing opportunities, learning through external certifications
- Opportunity to share your ideas on international platforms
- Sponsored tech talks and hackathons
- Possibility to relocate to any EPAM office for short- and long-term projects
- Focused individual development
- Benefit package: health and medical benefits, retirement benefits, paid time off, flexible benefits
- Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)
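
A brief, hedged sketch of the stream-processing experience described above, using Spark Structured Streaming with a Kafka source. This assumes the spark-sql-kafka package is available at runtime; broker and topic names are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

# Continuously read a Kafka topic (hypothetical broker/topic names).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .load()
    .select(F.col("value").cast("string").alias("payload"))
)

# Console sink for the demo; swap for parquet/BigQuery/etc. in practice.
query = (
    events.writeStream.format("console")
    .outputMode("append")
    .option("checkpointLocation", "/tmp/chk")  # checkpointing enables recovery
    .start()
)
query.awaitTermination()
```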

Posted 4 days ago

Apply

2.0 - 5.0 years

5 - 7 Lacs

Noida

Work from Office

Key Responsibilities:
- Develop and maintain scalable full-stack applications using Java, Spring Boot, and Angular, building rich UI screens and custom/reusable components.
- Design and implement cloud-based solutions leveraging Google Cloud Platform (GCP) services such as BigQuery, Google Cloud Storage, Cloud Run, and Pub/Sub (a publish-side sketch follows this listing).
- Manage and optimize CI/CD pipelines using Tekton to ensure smooth and efficient development workflows.
- Deploy and manage Google Cloud services using Terraform, following infrastructure-as-code principles.
- Mentor and guide junior software engineers, fostering professional development and promoting systemic change across the development team.
- Collaborate with cross-functional teams to design, build, and maintain efficient, reusable, and reliable code.
- Drive best practices and improvements in software engineering processes, including coding standards, testing, and deployment strategies.

Required Skills:
- Java/Spring Boot (5+ years): in-depth experience developing backend services and APIs using Java and Spring Boot.
- Angular (3+ years): proven ability to build rich, dynamic user interfaces and custom/reusable components using Angular.
- Google Cloud Platform (2+ years): hands-on experience with GCP services like BigQuery, Google Cloud Storage, Cloud Run, and Pub/Sub.
- CI/CD pipelines (2+ years): experience with tools like Tekton for automating build and deployment processes.
- Terraform (1-2 years): experience deploying and managing GCP services using Terraform.
- J2EE (5+ years): strong experience in Java Enterprise Edition for building large-scale applications.
- Experience mentoring and delivering organizational change within a software development team.

Mandatory Key Skills: gcp, terraform, continuous integration, ci/cd, bigquery, google, google analytics, pubsub, software testing, software development, python, spark, sql, hadoop, hive, spring, hibernate, javascript, airflow, kafka, maven, microservices, api*, j2ee*, java*, spring boot*, angular*
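
This listing centers on Java/Spring Boot, but the Pub/Sub publish pattern it names looks much the same in any client library (and Python appears in the skills list). As a minimal sketch, with invented project and topic names:

```python
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "order-events")  # hypothetical

payload = json.dumps({"order_id": 42, "status": "CREATED"}).encode("utf-8")

# Attributes (here "source") must be strings; they support server-side filtering.
future = publisher.publish(topic_path, payload, source="checkout")
print("published message id:", future.result())  # blocks until the broker acks
```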

Posted 4 days ago

Apply

4.0 - 8.0 years

3 - 3 Lacs

Bengaluru

Work from Office

Key Responsibilities:
- Develop and maintain scalable full-stack applications using Java, Spring Boot, and Angular, building rich UI screens and custom/reusable components.
- Design and implement cloud-based solutions leveraging Google Cloud Platform (GCP) services such as BigQuery, Google Cloud Storage, Cloud Run, and Pub/Sub.
- Manage and optimize CI/CD pipelines using Tekton to ensure smooth and efficient development workflows.
- Deploy and manage Google Cloud services using Terraform, following infrastructure-as-code principles.
- Mentor and guide junior software engineers, fostering professional development and promoting systemic change across the development team.
- Collaborate with cross-functional teams to design, build, and maintain efficient, reusable, and reliable code.
- Drive best practices and improvements in software engineering processes, including coding standards, testing, and deployment strategies.

Required Skills:
- Java/Spring Boot (5+ years): in-depth experience developing backend services and APIs using Java and Spring Boot.
- Angular (3+ years): proven ability to build rich, dynamic user interfaces and custom/reusable components using Angular.
- Google Cloud Platform (2+ years): hands-on experience with GCP services like BigQuery, Google Cloud Storage, Cloud Run, and Pub/Sub.
- CI/CD pipelines (2+ years): experience with tools like Tekton for automating build and deployment processes.
- Terraform (1-2 years): experience deploying and managing GCP services using Terraform.
- J2EE (5+ years): strong experience in Java Enterprise Edition for building large-scale applications.
- Experience mentoring and delivering organizational change within a software development team.

Mandatory Key Skills: gcp, terraform, continuous integration, ci/cd, bigquery, google, google analytics, pubsub, software testing, software development, python, spark, sql, hadoop, hive, spring, hibernate, javascript, airflow, kafka, maven, microservices, api*, j2ee*, java*, spring boot*, angular*

Posted 4 days ago

Apply

5.0 - 9.0 years

13 - 16 Lacs

Chennai

Work from Office

Key Responsibilities:
- Develop and maintain scalable full-stack applications using Java, Spring Boot, and Angular, building rich UI screens and custom/reusable components.
- Design and implement cloud-based solutions leveraging Google Cloud Platform (GCP) services such as BigQuery, Google Cloud Storage, Cloud Run, and Pub/Sub.
- Manage and optimize CI/CD pipelines using Tekton to ensure smooth and efficient development workflows.
- Deploy and manage Google Cloud services using Terraform, following infrastructure-as-code principles.
- Mentor and guide junior software engineers, fostering professional development and promoting systemic change across the development team.
- Collaborate with cross-functional teams to design, build, and maintain efficient, reusable, and reliable code.
- Drive best practices and improvements in software engineering processes, including coding standards, testing, and deployment strategies.

Required Skills:
- Java/Spring Boot (5+ years): in-depth experience developing backend services and APIs using Java and Spring Boot.
- Angular (3+ years): proven ability to build rich, dynamic user interfaces and custom/reusable components using Angular.
- Google Cloud Platform (2+ years): hands-on experience with GCP services like BigQuery, Google Cloud Storage, Cloud Run, and Pub/Sub.
- CI/CD pipelines (2+ years): experience with tools like Tekton for automating build and deployment processes.
- Terraform (1-2 years): experience deploying and managing GCP services using Terraform.
- J2EE (5+ years): strong experience in Java Enterprise Edition for building large-scale applications.
- Experience mentoring and delivering organizational change within a software development team.

Mandatory Key Skills: gcp, terraform, continuous integration, ci/cd, bigquery, google, google analytics, pubsub, software testing, software development, python, spark, sql, hadoop, hive, spring, hibernate, javascript, airflow, kafka, maven, microservices, api*, j2ee*, java*, spring boot*, angular*

Posted 4 days ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyze data for functional business requirements and to interface directly with customers. (A small GCS-to-BigQuery load sketch follows this listing.)

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge gained by attending educational workshops and reviewing publications.
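
As a hedged illustration of the GCS-plus-BigQuery work described above, the snippet below runs a batch load job from a Cloud Storage URI into a BigQuery table; the bucket, path, and table names are assumptions.

```python
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # let BigQuery infer the schema for the demo
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/events-*.json",  # hypothetical GCS path
    "my-project.analytics.events",           # hypothetical table
    job_config=job_config,
)
load_job.result()  # wait for completion; raises on failure

table = client.get_table("my-project.analytics.events")
print(f"loaded {table.num_rows} rows")
```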

Posted 5 days ago

Apply

2.0 - 4.0 years

5 - 10 Lacs

Mumbai, Mumbai Suburban

Work from Office

Backend Developer (Node.js | Microservices & Event-Driven Systems)

Position: Backend Developer (Node.js)
Experience Required: 2+ years of backend product development
Location: Borivali West, Mumbai (Work from Office)
Education: Bachelor's degree in Engineering

About DigiPlus IT: At DigiPlus IT, we innovate and deliver enterprise-grade OSS/BSS, banking and financial services, and gaming backends on cloud-native, microservices-driven architectures. We're problem-solvers who specialize in designing platforms that scale to millions of users with low latency, high availability, and secure backend systems.

Role Overview: We are looking for a Node.js Backend Developer who enjoys architecting and building secure, large-scale distributed systems. You will work on microservices, pub/sub event pipelines, and resilient architectures deployed on AWS while migrating existing services from Java to Node.js. This role involves designing systems that harness Kafka, RabbitMQ, Cassandra, and advanced architecture patterns (Pub/Sub, CQRS, Event Sourcing, Saga, API Gateway) to enable scalability, resilience, and fault tolerance.

What You'll Do:
- Design and implement microservices in Node.js following domain-driven design (DDD) principles.
- Build asynchronous event-driven backends leveraging Kafka/RabbitMQ.
- Architect systems using Pub/Sub, Event Sourcing, CQRS, and Saga orchestration for reliability and scale (a small event-sourcing sketch follows this listing).
- Deploy and scale backend services on AWS (ECS, EKS, Lambda, S3, CloudWatch, API Gateway).
- Design data pipelines and manage distributed storage using Cassandra and NoSQL stores.
- Ensure high security and low latency for real-time applications like large-scale multiplayer games.
- Work on analytics integration and third-party APIs for insights and personalization.
- Lead the migration of Java services to Node.js microservices with zero downtime.
- Embed best practices in testing, CI/CD automation, observability, logging, monitoring, and DevOps.

What We're Looking For:
- 2+ years of backend product development experience (Node.js).
- Strong understanding of distributed systems and architecture patterns: pub/sub messaging (AWS SNS/SQS, Kafka, RabbitMQ), CQRS and Event Sourcing, Saga for distributed transaction management, and API Gateway plus service registry (Consul, Eureka, etc.).
- Hands-on experience with Cassandra or other distributed databases.
- Cloud-native experience with AWS deployments and CI/CD workflows.
- Experience with scalability, resilience, and security best practices (FinTech/gaming exposure is a huge plus).
- Strong foundation in system design, data pipelines, and API-first architecture.
- A Mumbai-based developer willing to work onsite from our Borivali office.

Why Join DigiPlus IT?
- Work on secure, cloud-native architectures serving millions of users.
- Build gaming and financial platforms that demand scale, speed, and reliability.
- Gain hands-on exposure to microservices, distributed messaging, and advanced architecture patterns.
- A culture of innovation, tech depth, and ownership.
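
The role targets Node.js, but the Event Sourcing pattern it names is language-agnostic: state is never stored directly, only derived by replaying an append-only event log. A toy in-memory sketch in Python:

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    kind: str
    amount: int = 0

@dataclass
class Wallet:
    events: list[Event] = field(default_factory=list)  # the append-only log

    def deposit(self, amount: int) -> None:
        self.events.append(Event("deposited", amount))  # record, don't mutate

    def withdraw(self, amount: int) -> None:
        if amount > self.balance:                       # invariant check
            raise ValueError("insufficient funds")
        self.events.append(Event("withdrawn", amount))

    @property
    def balance(self) -> int:
        # Current state = a fold over history.
        return sum(e.amount if e.kind == "deposited" else -e.amount
                   for e in self.events)

w = Wallet()
w.deposit(100)
w.withdraw(30)
print(w.balance)  # 70, reconstructed purely from the event log
```

A production system would persist the log in a durable store (for example Kafka or Cassandra, both named in the listing) and snapshot periodically instead of replaying from the beginning.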

Posted 5 days ago

Apply

2.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Senior Data Engineer with 5-8 years of IT experience, including 2-3 years focused on GCP data services, you will be a valuable addition to our dynamic data and analytics team. Your primary responsibility will be to design, develop, and implement robust and insightful data-intensive solutions using GCP cloud services. The role requires a deep understanding of data engineering, proficiency in SQL, and extensive experience with GCP services such as BigQuery, Dataflow, Datastream, Pub/Sub, Dataproc, Cloud Storage, and other key GCP services for data pipeline orchestration. You will be instrumental in the construction of a GCP-native cloud data platform.

Key Responsibilities:
- Lead and contribute to the development, deployment, and lifecycle management of applications on GCP, utilizing services like Compute Engine, Kubernetes Engine (GKE), Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud SQL, Cloud Storage, and more.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 5-8 years of overall IT experience, with hands-on experience designing and developing data applications on GCP.
- In-depth expertise in GCP services and architectures, including compute, storage and databases, data and analytics, and operations and monitoring.
- Proven ability to translate business requirements into technical solutions.
- Strong analytical, problem-solving, and critical thinking skills.
- Effective communication and interpersonal skills for collaboration with technical and non-technical stakeholders.
- Experience in Agile development methodology.
- Ability to work independently, manage multiple priorities, and meet deadlines.

Preferred Skills (Nice to Have):
- Experience with other hyperscalers.
- Proficiency in Python or other scripting languages for data manipulation and automation.

If you are a highly skilled and experienced data engineer with a passion for leveraging GCP data services to drive innovation, we invite you to apply for this exciting opportunity in Gurugram or Hyderabad.

Posted 6 days ago

Apply

1.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a GCP Data Engineer, you will play a crucial role in the development, optimization, and maintenance of data pipelines and infrastructure. Your proficiency in SQL and Python will be pivotal in the management and transformation of data, and your familiarity with cloud technologies will be highly beneficial as we strive to improve our data engineering processes.

You will be responsible for building scalable data pipelines. This involves designing, implementing, and maintaining end-to-end data pipelines to efficiently extract, transform, and load (ETL) data from various sources. It is essential that these data pipelines are reliable, scalable, and performance-oriented.

Your expertise in SQL will be put to use as you write and optimize complex SQL queries for data extraction, transformation, and reporting purposes. Collaboration with analysts and data scientists will be necessary to provide structured data for analysis.

Experience with cloud platforms, particularly GCP services such as BigQuery, Dataflow, GCS, and Postgres, will be valuable. Leveraging cloud services to enhance data processing and storage capabilities, as well as integrating tools into the data ecosystem, will be part of your responsibilities. Documenting data pipelines, procedures, and best practices will be essential for knowledge sharing within the team, and you will collaborate closely with cross-functional teams to understand data requirements and deliver effective solutions.

The ideal candidate should have at least 3 years of experience with SQL and Python, along with a minimum of 1 year of experience with GCP services like BigQuery, Dataflow, GCS, and Postgres. Additionally, 2+ years of experience building data pipelines from scratch in a highly distributed and fault-tolerant manner is required. Comfort with a variety of relational and non-relational databases is essential. Proven experience building applications in a data-focused role, in both cloud and traditional data warehouse environments, is preferred. Familiarity with Cloud SQL, Cloud Functions, Pub/Sub, and Cloud Composer, along with a willingness to learn new tools and techniques, is desired. Being comfortable with big data and machine learning tools and platforms, including open-source technologies like Apache Spark, Hadoop, and Kafka, will be advantageous. Strong oral, written, and interpersonal communication skills are crucial for effective collaboration in a dynamic environment with undefined problems.

If you are an inquisitive, proactive individual with a passion for data engineering and a desire to continuously learn and grow, we invite you to join our team in Chennai, Tamil Nadu, India.

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a senior C# software engineer, you will collaborate with front-end developers, enterprise architects, and technical product owners within a dynamic and diverse international team. Your primary role will involve developing, managing, and aligning application architecture to meet business needs, ensuring the creation of a robust and adaptable global application. You will leverage techniques from software engineering, IT architecture, and agile project management to achieve these objectives. Your focus will be on software engineering in C#, with an exclusive emphasis on cloud applications rather than on-premise tool stacks.

To excel in this role, you should have a completed education or degree in computer science along with a minimum of 5 years of professional experience. Your expertise should include a strong command of the object-oriented programming language C# and ASP.NET Core 3.1 & 6, as well as a deep understanding of microservices, pub-sub, REST, and SOLID principles. Familiarity with the Dapr framework (dapr.io), Docker, infrastructure as code, Git, CI/CD pipelines, and related orchestration solutions such as Kubernetes is essential (a Dapr pub-sub sketch follows this listing). Additionally, knowledge of common cloud patterns, distributed tracing (W3C), and various database technologies (both relational and non-relational) is required.

In addition to technical skills, personal qualities such as curiosity, a passion for new technologies, a proactive approach to staying informed, and a willingness to innovate and push boundaries are highly valued. You should enjoy troubleshooting errors, possess excellent communication and coordination skills, demonstrate a high level of responsibility, have strong comprehension abilities, and be committed to continuous learning and growth in your role.
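
Dapr's building blocks are polyglot, so although this role is C#-centric, the pub-sub block the posting mentions can be sketched with the Dapr Python SDK. This assumes a running Dapr sidecar and a configured pub/sub component conventionally named "pubsub"; the topic and payload are invented.

```python
import json

from dapr.clients import DaprClient

order = {"order_id": 7, "status": "PLACED"}

with DaprClient() as client:
    client.publish_event(
        pubsub_name="pubsub",          # component name from the Dapr config
        topic_name="orders",           # hypothetical topic
        data=json.dumps(order),
        data_content_type="application/json",
    )
```

The same call in the C# SDK is a one-liner on DaprClient as well; the point of Dapr is that the broker behind the component (Redis, Kafka, Pub/Sub) can change without touching this code.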

Posted 6 days ago

Apply

5.0 - 10.0 years

18 - 33 Lacs

Japan, Chennai

Work from Office

C1X AdTech Pvt Ltd is a fast-growing product and engineering-driven AdTech company building next-generation advertising and marketing technology platforms. Our mission is to empower enterprise clients with the smartest marketing solutions, enabling seamless integration with personalization engines and delivering cross-channel marketing capabilities. We are dedicated to enhancing customer engagement and experiences while increasing Lifetime Value (LTV) through consistent messaging across all channels. Our engineering team spans front end (UI), back end (Java/Node.js APIs), Big Data, and DevOps, working together to deliver scalable, high-performance products for the digital advertising ecosystem.

Role Overview: As a Data Engineer, you will be a key member of our data engineering team, responsible for building and maintaining large-scale data products and infrastructure. You'll shape the next generation of our data analytics tech stack by leveraging modern big data technologies. This role involves working closely with business stakeholders, product managers, and engineering teams to meet diverse data requirements that drive business insights and product innovation.

Objectives:
- Design, build, and maintain scalable data infrastructure for collection, storage, and processing.
- Enable easy access to reliable data for data scientists, analysts, and business users.
- Support data-driven decision-making and improve organizational efficiency through high-quality data products.

Responsibilities:
- Build large-scale batch and real-time data pipelines using frameworks like Apache Spark on AWS or GCP.
- Design, manage, and automate data flows between multiple data sources.
- Implement best practices for continuous integration, testing, and data quality assurance.
- Maintain data documentation, definitions, and governance practices.
- Optimize the performance, scalability, and cost-effectiveness of data systems.
- Collaborate with stakeholders to translate business needs into data-driven solutions.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (exceptional coding performance on platforms like LeetCode/HackerRank may substitute).
- 2+ years of experience working on full-lifecycle Big Data projects.
- Strong foundation in data structures, algorithms, and software design principles.
- Proficiency in at least two programming languages; Python or Scala preferred.
- Experience with AWS services such as EMR, Lambda, S3, and DynamoDB (GCP equivalents also relevant).
- Hands-on experience with Databricks notebooks and the Jobs API.
- Strong expertise in big data frameworks: Spark, MapReduce, Hadoop, Sqoop, Hive, HDFS, Airflow, ZooKeeper.
- Familiarity with containerization (Docker) and workflow management tools (Apache Airflow); a small Airflow sketch follows this listing.
- Intermediate to advanced knowledge of SQL (relational plus NoSQL databases like Postgres, MySQL, Redshift, Redis).
- Experience with SQL tuning, schema design, and analytical programming.
- Proficient in Git (version control) and collaborative workflows.
- Comfortable working across diverse technologies in a fast-paced, results-oriented environment.
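
As a hedged sketch of the workflow-management skill listed above (Apache Airflow), here is a minimal daily DAG with two dependent tasks; the task logic and names are illustrative only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    print("pull raw data from source")   # a real task would hit an API or DB

def transform() -> None:
    print("clean and aggregate")

with DAG(
    dag_id="daily_etl_demo",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,       # don't backfill missed runs in this demo
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2  # transform runs only after extract succeeds
```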

Posted 6 days ago

Apply

10.0 - 18.0 years

45 - 60 Lacs

Bengaluru

Hybrid

We are looking for a Software Engineering Manager to join our team at Collective Data Solutions (The Collective), part of Kochava. This role involves technical contribution, product development, and team management, with a focus on building and scaling high-performance distributed systems for real-time data solutions in AdTech.

Responsibilities:
- Lead and mentor a team of backend engineers delivering scalable, high-quality solutions.
- Contribute technically to design, architecture, the product roadmap, and coding.
- Drive roadmap execution and ensure delivery timelines.
- Build and maintain distributed systems and data pipelines at scale.
- Collaborate with global stakeholders to define technical vision and priorities.
- Hire, coach, and develop engineering talent.
- Ensure best practices in observability, performance, and reliability.

Preferred Candidate Profile:
- 10+ years of software engineering experience, with at least 2+ years in leadership.
- Strong hands-on expertise in Golang, Java, Python, or JavaScript.
- Experience building distributed systems, microservices, and data pipelines.
- Familiarity with Kubernetes, Docker, and cloud environments (AWS/GCP).
- Strong problem-solving skills, an ownership mindset, and the ability to deliver at scale.
- Comfortable working in a global team environment.

Good to Have (Bonus Skills):
- Google Cloud tools (Beam, Pub/Sub, BigQuery, Dataflow).
- Spark/Hadoop.
- Contributions to open-source projects.

Posted 6 days ago

Apply

8.0 - 13.0 years

18 - 32 Lacs

Noida

Hybrid

Job Description: We are seeking a skilled Java Developer to join our dynamic team. The ideal candidate will have expertise in Java development with a strong understanding of Hibernate, JPA, Spring Boot, and microservices architecture. You will be responsible for designing, developing, and maintaining high-performance applications that are scalable and robust.

Key Responsibilities:
- Develop, test, and maintain Java-based applications using Spring Boot and JPA.
- Design and implement microservices and integrate them with RESTful APIs.
- Ensure code quality, performance, and responsiveness of applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Troubleshoot and resolve software defects and issues in a timely manner.
- Follow best practices in software development, security, and version control.

Mandatory Skills:
- Strong proficiency in Java.
- Expertise in Hibernate, JPA, and the Spring Boot framework.
- Hands-on experience with microservices and REST API development.
- Experience with Google Cloud Platform (GCP) services such as Pub/Sub and GKE (Google Kubernetes Engine).
- Knowledge of Kafka for event-driven architecture.

Posted 6 days ago

Apply

5.0 - 10.0 years

0 - 1 Lacs

Hyderabad

Work from Office

Job Title: Senior Data Analyst - AdTech (Team Lead)
Location: Hyderabad
Experience Level: 5+ years
Employment Type: Full-time
Shift Timings: 5 PM - 2 AM IST

About the Role: We are looking for a highly experienced and hands-on Senior Data Analyst (AdTech) to lead our analytics team. This role is ideal for someone with a strong background in log-level data handling, cross-platform data engineering, and a solid command of modern BI tools. You'll play a key role in building scalable data pipelines, leading analytics strategy, and mentoring a team of analysts.

Key Responsibilities:
- Lead and mentor a team of data analysts, ensuring quality delivery and technical upskilling.
- Design, develop, and maintain scalable ETL/ELT pipelines using GCP tools (BigQuery, Dataflow, Cloud Composer, Cloud Functions, Pub/Sub).
- Ingest and process log-level data from platforms like Google Ad Manager, Google Analytics (GA4/UA), DV360, and other advertising and marketing tech sources.
- Build and optimize data pipelines from diverse sources via APIs, cloud connectors, and third-party tools (e.g., Supermetrics, Fivetran, Stitch).
- Integrate and manage data across multiple cloud platforms and data warehouses such as BigQuery, Snowflake, DOMO, and AWS (Redshift, S3).
- Own the creation of data models, data marts, and analytical layers to support dashboards and deep-dive analyses.
- Build and maintain scalable, intuitive dashboards using Looker Studio, Tableau, Power BI, or Looker.
- Partner with engineering, product, revenue ops, and client teams to gather requirements and drive strategic insights from data.
- Ensure data governance, security, and quality standards are followed across the analytics ecosystem.

Required Qualifications:
- 5+ years of experience in data analytics or data engineering roles, with at least 1-2 years in a leadership capacity.
- Deep expertise working with log-level AdTech data: Google Ad Manager, Google Analytics (GA4), programmatic delivery logs, and campaign-level data.
- Strong knowledge of SQL and Google BigQuery for large-scale data querying and transformation (see the sketch after this listing).
- Hands-on experience building data pipelines using GCP tools (Dataflow, Composer, Cloud Functions, Pub/Sub, Cloud Storage).
- Proven experience integrating data from various APIs and third-party connectors.
- Experience working with multiple data warehouses: Snowflake, DOMO, AWS Redshift, etc.
- Strong skills in data visualization tools: Looker Studio, Tableau, Power BI, or Looker.
- Excellent stakeholder communication and documentation skills.

Preferred Qualifications:
- Scripting experience in Python or JavaScript for automation and custom ETL development.
- Familiarity with version control (e.g., Git), CI/CD pipelines, and workflow orchestration.
- Exposure to privacy regulations and consent-based data handling in digital advertising (GDPR, CCPA).
- Experience working in agile environments and managing delivery timelines across multiple stakeholders.
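
For a concrete, hedged example of the log-level querying this role involves: counting events across GA4 BigQuery export tables from Python. The dataset ID and date range are invented; the events_* layout follows the standard GA4 export.

```python
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT event_name, COUNT(*) AS events
    FROM `my-project.analytics_123456.events_*`   -- hypothetical GA4 export
    WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240107'
    GROUP BY event_name
    ORDER BY events DESC
"""
# to_dataframe() requires the pandas and db-dtypes packages.
df = client.query(sql).to_dataframe()
print(df.head())
```

Filtering on _TABLE_SUFFIX limits the scan to one week of the date-sharded export tables, which is the main cost lever when querying log-level data at this scale.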

Posted 6 days ago

Apply

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

As a Backend Engineer at our fast-growing startup, you will play a crucial role in building a robust and scalable data platform that supports key commerce functions such as marketing, pricing, and inventory management. Collaborating with various teams, you will design backend services, data pipelines, and APIs that can efficiently handle high-volume operations across multiple markets. Emphasizing clean architecture, system reliability, and integration with AI/ML components, you will contribute to use cases like dynamic pricing and demand forecasting.

Your key outcomes will include:
- Writing clean, efficient, and maintainable code using modern languages and frameworks like Python, Node.js, and Java, while demonstrating a strong understanding of data structures and algorithms.
- Developing robust applications with high test coverage through consistent unit and integration testing practices.
- Designing data structures that are scalable and adaptable to different geographies and evolving marketplace requirements.
- Leveraging AI tools for coding, debugging, and prototyping to enhance daily development workflows.
- Balancing priorities and delivering results across multiple initiatives in a fast-paced and collaborative environment.
- Embracing new technologies, including agentic systems, ML platforms, and domain-specific tools, for ongoing skill development.

Preferred qualifications:
- Previous experience building backend systems for commerce, retail, or logistics platforms.
- Exposure to distributed systems or event-driven architectures such as Kafka or Pub/Sub.
- Familiarity with cloud-native development on AWS or GCP, including the use of managed services.
- Understanding of MLOps workflows or experience collaborating with data science teams.
- Experience designing APIs that cater to diverse frontend and third-party integrations.
- Contributions to open-source projects, technical blogs, or internal tooling.
- Knowledge of agentic frameworks like LangChain and AutoGen, or a keen interest in AI-driven automation.

In addition to skills in Kafka, machine learning, AWS or GCP, MLOps, agentic frameworks (LangChain, AutoGen), Pub/Sub, Python, Node.js, and Java, you will thrive in this role by demonstrating a proactive and collaborative approach in a dynamic work environment. Join us in revolutionizing decision-making processes across various business functions with cutting-edge technology and innovative solutions.

Posted 1 week ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Chennai

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyze data for functional business requirements and to interface directly with customers.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge gained by attending educational workshops and reviewing publications.

Posted 1 week ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Chennai

Work from Office

About the Role: Required skills include NodeJS, NGINX, Elasticsearch, HAProxy, PubSub, CSS, AngularJS, and scripting (Python, Shell).

Posted 1 week ago

Apply