
125 Pub/Sub Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office


Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and work directly with the customer.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
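For a concrete sense of the Python and Pub/Sub experience this listing asks for, here is a minimal sketch of publishing a message with the google-cloud-pubsub client; the project and topic names are placeholders, not details from the posting:

```python
from google.cloud import pubsub_v1

# Placeholder project and topic; real names come from your GCP environment.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "my-topic")

# Data must be bytes; extra keyword arguments become message attributes.
future = publisher.publish(topic_path, b'{"event": "signup"}', origin="etl-job")
print("published message id:", future.result())
```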

Posted 20 hours ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Gurugram

Work from Office


Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and work directly with the customer.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 20 hours ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office


Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and work directly with the customer.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 4 days ago

Apply

5.0 - 8.0 years

15 - 27 Lacs

Bengaluru

Work from Office


Job Summary: Participates in reviewing, analyzing, and modifying client/server applications and systems.
Job Requirements: Develop and maintain integrations between CDM and other systems, such as CRM, order management, and other boundary systems. Customize and extend MDM functionality using Oracle tools, such as Oracle Integration Cloud, Oracle Application Composer, and Oracle Visual Builder. Perform data migration activities, including data extraction, transformation, and loading into the MDM system. Conduct unit testing, system testing, and support user acceptance testing for MDM-related developments. Troubleshoot and resolve technical issues related to MDM configuration, customization, and integration. Collaborate with infrastructure, operations and security teams to ensure the availability, performance, and security of the MDM system. Strong technical expertise in Oracle tools and technologies, including Oracle Integration Cloud (OIC), Oracle SQL, PL/SQL, Python and Groovy scripts (or a similar scripting language). Hands-on experience with Oracle ERP Cloud CDM configuration, customization, and extension. In-depth knowledge of data integration techniques, such as web services, REST APIs, pub-sub and file-based data imports/exports. Familiarity with data migration methodologies and tools, including data extraction, transformation, and loading (ETL) processes. Good to have: BI Publisher, OTBI, Product Data Hub, Contact Master and Enterprise Data Management skills.
Education: IC - typically requires a minimum of 5 years of related experience. Mgr & Exec - typically requires a minimum of 3 years of related experience.
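As an illustrative sketch of the REST-based integration work this listing describes, the snippet below pulls account records from an Oracle Cloud REST endpoint with Python's requests library; the host, credentials, and resource path are assumptions for illustration, not details from the posting:

```python
import requests

BASE = "https://example.fa.ocs.oraclecloud.com"  # hypothetical Oracle Cloud pod URL

# Hypothetical accounts resource; auth is a placeholder integration user.
resp = requests.get(
    f"{BASE}/crmRestApi/resources/11.13.18.05/accounts",
    params={"limit": 25},
    auth=("integration.user", "app-password"),
    timeout=30,
)
resp.raise_for_status()
for account in resp.json().get("items", []):
    print(account.get("PartyName"))
```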

Posted 4 days ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Hybrid


Role - GCP Data Engineer. Experience: 4+ years; data engineering background preferred. Location - Bangalore, Chennai, Pune, Gurgaon, Kolkata. Required skills - GCP data engineering experience, BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python ingestion, Dataflow + Pub/Sub (illustrated in the sketch below).
Job requirement: Has implemented and architected solutions on Google Cloud Platform using the components of GCP. Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines. Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning. Experience programming in Java, Python, etc. Expertise in at least two of these technologies: relational databases, analytical databases, NoSQL databases. Certification as a Google Professional Data Engineer/Solution Architect is a major advantage.
Skills required: 3-13 years of experience in IT, or professional services experience in IT delivery or large-scale IT analytics projects. 3+ years of expert knowledge of Google Cloud Platform; other cloud platforms are nice to have. Expert knowledge in SQL development. Expertise in building data integration and preparation tools using cloud technologies (like SnapLogic, Google Dataflow, Cloud Dataprep, Python, etc.). Identify downstream implications of data loads/migration (e.g., data quality, regulatory, etc.). Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, and provide best practices for pipeline operations. Capability to work in a rapidly changing business environment and to enable simplified user access to massive data by building scalable data solutions. Advanced SQL writing and experience in data mining (SQL, ETL, data warehouse, etc.) and using databases in a business environment with complex datasets.
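To ground the "Dataflow + Pub/Sub" requirement, here is a minimal Apache Beam (Python SDK) streaming sketch that reads from Pub/Sub and appends to an existing BigQuery table; the project, topic, and table names are placeholders:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Streaming mode; DataflowRunner/project/region flags are omitted in this sketch.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadPubSub" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
        | "Decode" >> beam.Map(lambda b: {"raw": b.decode("utf-8")})
        | "WriteBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",  # assumed to already exist
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```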

Posted 6 days ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Hyderabad

Work from Office


Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and work directly with the customer.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 1 week ago

Apply

3.0 - 6.0 years

10 - 14 Lacs

Gurugram

Work from Office


As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include: comprehensive feature development and issue resolution (working on end-to-end feature development and solving challenges faced in the implementation); stakeholder collaboration and issue resolution (collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and solving them as per defined SLAs); and continuous learning and technology integration (being eager to learn new technologies and implementing them in feature development).
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: SQL authoring, querying, and cost optimisation, primarily on BigQuery. Python as an object-oriented scripting language. Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming. Version control system: Git; knowledge of Infrastructure as Code (Terraform) preferable. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Experience performing root cause analysis on internal and external data and processes to answer specific business questions.
Preferred technical and professional experience: Experience building and optimising data pipelines, architectures, and data sets. Building processes supporting data transformation, data structures, metadata, dependency, and workload management. Working knowledge of message queuing, stream processing, and highly scalable data stores. Experience supporting and working with cross-functional teams in a dynamic environment. We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
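As a sketch of the "SQL authoring, querying, and cost optimisation primarily on BigQuery" requirement above, this snippet runs a parameterised query through the google-cloud-bigquery client with a cap on bytes billed; the project and table names are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

# Cap the bytes a query may bill: a simple, common cost-control lever.
job_config = bigquery.QueryJobConfig(
    maximum_bytes_billed=10 * 1024**3,  # 10 GiB
    query_parameters=[bigquery.ScalarQueryParameter("day", "DATE", "2024-01-01")],
)
query = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-project.analytics.events`  -- hypothetical table
    WHERE event_date = @day
    GROUP BY user_id
"""
for row in client.query(query, job_config=job_config).result():
    print(row.user_id, row.events)
```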

Posted 1 week ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office


Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and work directly with the customer.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 1 week ago

Apply

3.0 - 6.0 years

10 - 14 Lacs

Bengaluru

Work from Office


As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for: working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients; collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects; employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability; working with a variety of databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery); and creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: SQL authoring, querying, and cost optimisation, primarily on BigQuery. Python as an object-oriented scripting language. Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming. Version control system: Git; knowledge of Infrastructure as Code (Terraform) preferable. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Experience performing root cause analysis on internal and external data and processes to answer specific business questions.
Preferred technical and professional experience: Experience building and optimising data pipelines, architectures, and data sets. Building processes supporting data transformation, data structures, metadata, dependency, and workload management. Working knowledge of message queuing, stream processing, and highly scalable data stores. Experience supporting and working with cross-functional teams in a dynamic environment. We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.

Posted 1 week ago

Apply

3.0 - 5.0 years

32 - 40 Lacs

Pune

Work from Office


Job Title: Senior Engineer, VP. Location: Pune, India.
Role Description: The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes: planning and developing entire engineering solutions to accomplish business goals; building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle; ensuring maintainability and reusability of engineering solutions; ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow; reviewing engineering plans and quality to drive re-use and improve engineering capability; and participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank.
Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: best-in-class leave policy; gender-neutral parental leave; 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; an Employee Assistance Program for you and your family members; comprehensive hospitalization insurance for you and your dependents; accident and term life insurance; and complimentary health screening for those 35 yrs. and above.
Your key responsibilities: Hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities. Champion engineering best practices and guide/mentor the team to achieve high performance. Work closely with business stakeholders, the tribe lead, the product owner and the lead architect to successfully deliver the business outcomes. Acquire functional knowledge of the business capability being digitized/re-engineered. Demonstrate ownership, inspire others, bring innovative thinking and a growth mindset, and collaborate for success.
Your skills and experience: Minimum 15 years of IT industry experience in full stack development. Expert in Java, Spring Boot, NodeJS, ReactJS. Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc. Strong experience in Kubernetes and the OpenShift container platform. Experience in data streaming, i.e. Kafka, Pub/Sub, etc. Experience working on public cloud: GCP preferred, AWS or Azure. Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc. Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, Git Actions, etc. Experience leading teams and mentoring developers.
Key skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS. Advantageous: prior experience in the Banking/Finance domain; having worked on hybrid cloud solutions, preferably using GCP; having worked on product development.
How we'll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
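To illustrate the Kafka side of the data-streaming skills listed above, here is a minimal Python consumer sketch using confluent-kafka; the broker address, group id, and topic are placeholders:

```python
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker:9092",  # placeholder broker
    "group.id": "settlements-reader",    # placeholder consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["trade-events"])     # hypothetical topic

try:
    while True:
        msg = consumer.poll(1.0)         # wait up to 1s for a message
        if msg is None:
            continue
        if msg.error():
            print("consumer error:", msg.error())
            continue
        print(msg.key(), msg.value().decode("utf-8"))
finally:
    consumer.close()
```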

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office


Project Role: Cloud Services Engineer. Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends. Must have skills: Managed File Transfer. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.
Summary: As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders and explain any performance issues or risks, ensure cloud orchestration and automation capability is operating based on target SLAs with minimal downtime, and hold performance meetings to share performance and consumption data and trends.
Roles & Responsibilities: Expected to be an SME. Collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for the immediate team and across multiple teams. Ensure effective communication between client and operations teams. Analyze service delivery health and address performance issues. Conduct performance meetings to share data and trends.
Professional & Technical Skills: Must-have skills: proficiency in Managed File Transfer. Strong understanding of cloud orchestration and automation. Experience in SLA management and performance analysis. Knowledge of IT service delivery and escalation processes.
Additional Information: The candidate should have a minimum of 5 years of experience in Managed File Transfer. This position is based at our Pune office. A 15 years full time education is required.

Posted 1 week ago

Apply

6.0 - 9.0 years

0 - 3 Lacs

Pune, Chennai, Bengaluru

Hybrid


Preferred candidate profile: GCP Cloud Data Engineers having strong experience in cloud migrations and pipelines, with 5+ years of experience.
• Good understanding of database and data engineering concepts
• Experience in cloud migration is a must
• Experience in data ingestion and processing from different sources
• Conceptual knowledge of understanding data, building ETL pipelines, data integrations, and ODS/DW
• Hands-on experience in SQL and Python
• Experience in Java development is required
• Hands-on working experience in Google Cloud Platform: Dataflow, Data Transfer services, Airflow (see the DAG sketch below)
• Hands-on working experience in data preprocessing techniques using Dataflow, Dataproc, Dataprep
• Hands-on working experience in BigQuery
• Knowledge of Kafka, Pub/Sub, GCS & schedulers is required
• Proficiency with PostgreSQL is preferred
• Experience with both real-time and scheduled pipelines is preferred
• Cloud certification is a plus
• Experience in implementing ETL pipelines
• Familiarity with microservices or Enterprise Application Integration patterns is a plus
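As a sketch of the Airflow experience called for above, here is a minimal DAG that loads each day's files from GCS into BigQuery; the DAG id, bucket, and destination table are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bq_daily",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="my-landing-bucket",                 # placeholder bucket
        source_objects=["events/{{ ds }}/*.json"],  # templated by execution date
        destination_project_dataset_table="my-project.analytics.events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
    )
```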

Posted 1 week ago

Apply

0.0 - 2.0 years

0 Lacs

Mumbai

Work from Office


At Johnson & Johnson, we believe health is everything. Our strength in healthcare innovation empowers us to build a world where complex diseases are prevented, treated, and cured, where treatments are smarter and less invasive, and solutions are personal. Through our expertise in Innovative Medicine and MedTech, we are uniquely positioned to innovate across the full spectrum of healthcare solutions today to deliver the breakthroughs of tomorrow, and profoundly impact health for humanity. Learn more at
Job Function: Career Programs. Job Sub Function: Non-LDP Intern/Co-Op. Job Category: Career Program. All Job Posting Locations: Mumbai, India.
Job Description: This job has been posted to onboard pre-identified candidates. Please do not apply if not invited.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Noida

Remote


Role & responsibilities: As a Data Engineer with a focus on pipeline migration from SAS to Google Cloud Platform (GCP) technologies, you will tackle intricate problems and create value for our business by designing and deploying reliable, scalable solutions tailored to the company's data landscape. You will be responsible for the development of custom-built data pipelines on the GCP stack, ensuring seamless migration of existing SAS pipelines.
Responsibilities: Design, develop, and implement data pipelines on the GCP stack, with a focus on migrating existing pipelines from SAS to GCP technologies. Develop modular and reusable code to support complex ingestion frameworks, simplifying the process of loading data into data lakes or data warehouses from multiple sources. Collaborate with analysts and business process owners to translate business requirements into technical solutions. Utilize your coding expertise in scripting languages (Python, SQL, PySpark) to extract, manipulate, and process data effectively. Leverage your expertise in various GCP technologies, including BigQuery, Dataproc, GCP Workflows, Dataflow, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM, and Vertex AI, to enhance data warehousing solutions. Maintain high standards of development practices, including technical design, solution development, systems configuration, testing, documentation, issue identification, and resolution, writing clean, modular, and sustainable code. Understand and implement CI/CD processes using tools like Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker. Participate in data quality and validation processes to ensure data integrity and reliability. Optimize performance of data pipelines and storage solutions, addressing bottlenecks. Collaborate with security teams to ensure compliance with industry standards for data security and governance. Communicate technical solutions to engineering teams and business stakeholders.
Required Skills & Qualifications: 5-13 years of experience in software development, data engineering, business intelligence, or a related field, with a proven track record in manipulating, processing, and extracting value from large datasets. Extensive experience with GCP technologies in the data warehousing space, including BigQuery, Dataproc, GCP Workflows, Dataflow, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM, and Vertex AI. Proficient in Python, SQL, and PySpark for data manipulation and pipeline creation. Experience with SAS, SQL Server, and SSIS is a significant advantage, particularly for transitioning legacy systems to modern GCP solutions. Ability to develop reusable, modular code for complex ingestion frameworks and multi-use pipelines. Understanding of CI/CD processes and tools, such as Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker. Proven experience in migrating data pipelines from SAS to GCP technologies. Strong problem-solving abilities and a proactive approach to identifying and implementing solutions. Familiarity with industry best practices for data security, data governance, and compliance in cloud environments. Bachelor's degree in Computer Science, Information Technology, or a related technical field, or equivalent practical experience. GCP Certified Data Engineer (preferred). Excellent verbal and written communication skills, with the ability to advocate for technical solutions to a diverse audience including engineering teams and business stakeholders.
Willingness to work in the afternoon shift from 3 PM to 12 AM IST.
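For the SAS-to-GCP migration work described above, here is a minimal sketch of one common landing step: loading an exported file from Cloud Storage into a BigQuery staging table via the google-cloud-bigquery client. The bucket, file, and table names are assumptions for illustration:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the extract
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
load_job = client.load_table_from_uri(
    "gs://my-landing-bucket/sas_export/claims.csv",  # hypothetical SAS extract
    "my-project.staging.claims",
    job_config=job_config,
)
load_job.result()  # block until the load finishes
print(client.get_table("my-project.staging.claims").num_rows, "rows loaded")
```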

Posted 2 weeks ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Kochi, Bhubaneswar, Indore

Hybrid


6+ years of professional experience. Experience developing microservices and cloud-native apps using Java/J2EE, REST APIs, Spring Core, Spring MVC Framework, Spring Boot Framework, JPA (Java Persistence API) (or any other ORM), Spring Security, and similar tech stacks (open source and proprietary). Experience with unit testing using frameworks such as JUnit, Mockito, JBehave. Build and deploy services using Gradle, Maven, Jenkins, etc. as part of the CI/CD process. Experience working in Google Cloud Platform. Experience with any relational database (Oracle, PostgreSQL, etc.).

Posted 2 weeks ago

Apply

6.0 - 11.0 years

13 - 23 Lacs

Noida, Kolkata, Pune

Hybrid


6+ years of professional experience. Experience developing microservices and cloud-native apps using Java/J2EE, REST APIs, Spring Core, Spring MVC Framework, Spring Boot Framework, JPA (Java Persistence API) (or any other ORM), Spring Security, and similar tech stacks (open source and proprietary). Experience with unit testing using frameworks such as JUnit, Mockito, JBehave. Build and deploy services using Gradle, Maven, Jenkins, etc. as part of the CI/CD process. Experience working in Google Cloud Platform. Experience with any relational database (Oracle, PostgreSQL, etc.).

Posted 2 weeks ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


6+ years of professional experience. Experience developing microservices and cloud-native apps using Java/J2EE, REST APIs, Spring Core, Spring MVC Framework, Spring Boot Framework, JPA (Java Persistence API) (or any other ORM), Spring Security, and similar tech stacks (open source and proprietary). Experience with unit testing using frameworks such as JUnit, Mockito, JBehave. Build and deploy services using Gradle, Maven, Jenkins, etc. as part of the CI/CD process. Experience working in Google Cloud Platform. Experience with any relational database (Oracle, PostgreSQL, etc.).

Posted 2 weeks ago

Apply

15.0 - 20.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Project Role: Cloud Services Engineer. Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends. Must have skills: Cloud Based Service Management Process Design. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.
Summary: As a Cloud Services Engineer, you will act as a vital link between clients and Accenture's operations teams, facilitating support and managing escalations. Your typical day will involve communicating the health of service delivery to stakeholders, addressing performance issues, and ensuring that cloud orchestration and automation capabilities are functioning optimally. You will hold performance meetings to discuss data and trends, ensuring that services meet the expected service level agreements with minimal downtime, thereby contributing to the overall efficiency and effectiveness of cloud services.
Roles & Responsibilities: Expected to be an SME; collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for the immediate team and across multiple teams. Facilitate regular communication between clients and internal teams to ensure alignment on service delivery. Analyze performance metrics and prepare reports to inform stakeholders of service health and areas for improvement.
Professional & Technical Skills: Must-have skills: proficiency in Cloud Based Service Management Process Design. Strong understanding of cloud service models and deployment strategies. Experience with cloud orchestration tools and automation frameworks. Ability to analyze and interpret service performance data. Familiarity with incident management and escalation processes.
Additional Information: The candidate should have a minimum of 5 years of experience in Cloud Based Service Management Process Design. This position is based at our Bengaluru office. A 15 years full time education is required.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office


Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and work directly with the customer.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Hyderabad

Work from Office


Your role: In this role you will play a key part in developing and implementing Generative AI / AI solutions on Google Cloud Platform, working with cross-functional teams to design and deliver AI-powered products and services, developing, versioning and executing Python code, and deploying models as endpoints in a dev environment.
Must-have skills: Solid understanding of Python. Experience with deep learning frameworks such as TensorFlow, PyTorch, or JAX. Experience with natural language processing (NLP) and machine learning (ML). Experience with Cloud Storage, Compute Engine, Vertex AI, Cloud Functions, Pub/Sub, etc. Hands-on experience with Generative AI support in Vertex AI, specifically hands-on experience with Generative AI models like Gemini, Vertex AI Search, etc.
Your profile: 4-6+ years of experience in AI development. Experience with Google Cloud Platform, specifically delivering an AI solution on the Vertex AI platform. Experience in developing and deploying AI solutions.
What you'll love about working here: Choosing Capgemini means having the opportunity to make a difference, whether for the world's leading businesses or for society. It means that when the future doesn't look as bright as you'd like, you have the opportunity to make change, to rewrite it. When you join Capgemini, you don't just start a new job. You become part of something bigger. A diverse collective of free-thinkers, entrepreneurs and experts, all working together to unleash human energy through technology, for an inclusive and sustainable future. You can exponentially grow your career by being part of innovative projects and taking advantage of our extensive Learning & Development programs. With us, you will experience an inclusive, safe, healthy, and flexible work environment to bring out the best in you! You also get a chance to make positive social change and build a better world by taking an active role in our Corporate Social Responsibility and Sustainability initiatives. And whilst you make a difference, you will also have a lot of fun. About Capgemini:
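A minimal sketch of calling a Gemini model through the Vertex AI Python SDK, in the spirit of the "Generative AI support in Vertex AI" requirement above; the project, region, and model id are placeholders, and SDK module paths can vary between versions:

```python
import vertexai
from vertexai.generative_models import GenerativeModel  # recent SDK versions

vertexai.init(project="my-project", location="us-central1")  # placeholders

model = GenerativeModel("gemini-1.5-flash")  # model id may differ by environment
response = model.generate_content("Summarize Cloud Pub/Sub in two sentences.")
print(response.text)
```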

Posted 2 weeks ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Coimbatore

Work from Office


Project Role: AI / ML Engineer. Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Must be able to apply GenAI models as part of the solution; could also include, but is not limited to, deep learning, neural networks, chatbots, and image processing. Must have skills: Google Cloud Machine Learning Services. Good to have skills: GCP Dataflow, Google Pub/Sub, Google Dataproc. Minimum 2 year(s) of experience is required. Educational Qualification: 15 years full time education.
Summary: We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases.
Roles & Responsibilities: Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage. Optimize and monitor data workflows for performance, scalability, and reliability. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions. Implement data security and governance measures, ensuring compliance with industry standards. Automate data workflows and processes for operational efficiency. Troubleshoot and resolve technical issues related to data pipelines and platforms. Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing.
Professional & Technical Skills: a) Must have: Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage. Expertise in SQL and experience with data modeling and query optimization. Solid programming skills in Python for data processing and ETL development. Experience with CI/CD pipelines and version control systems (e.g., Git). Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming. Strong understanding of data security, encryption, and IAM policies on GCP. b) Good to have: Experience with Dialogflow or CCAI tools. Knowledge of machine learning pipelines and integration with AI/ML services on GCP. Certifications such as Google Professional Data Engineer or Google Cloud Architect.
Additional Information: The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services, with 3-5 years of overall experience. The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions. Qualifications: 15 years full time education.
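To make the Pub/Sub leg of such pipelines concrete, here is a minimal Python subscriber sketch using google-cloud-pubsub; the project and subscription names are placeholders:

```python
from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path("my-project", "events-sub")  # placeholders

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print("received:", message.data.decode("utf-8"))
    message.ack()  # acknowledge so the message is not redelivered

streaming_pull_future = subscriber.subscribe(sub_path, callback=callback)
with subscriber:
    try:
        streaming_pull_future.result(timeout=30)  # listen for 30 seconds
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()
```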

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Nagpur

Work from Office


Project Role: Advanced Application Engineer. Project Role Description: Utilize modular architectures, next-generation integration techniques and a cloud-first, mobile-first mindset to provide vision to application development teams. Work with an Agile mindset to create value across projects of multiple scopes and scale. Must have skills: SAP FI CO Finance. Good to have skills: SAP CO Product Cost Controlling. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.
About The Role: Sr. SAP S4H FICO Consultant. Job duties & responsibilities: In-depth SAP solutions and process knowledge, including industry best practices. Leads fit/gap and other types of working sessions to understand needs driven by business process requirements. Translates requirements into solutions, using SAP Best Practices or Navisite solutions as a baseline. Leader of their respective workstream on assigned projects. Works in conjunction with the Navisite Service Delivery Lead to establish the overall plan for their respective work for the customer. SAP configuration experience, primarily in the FI/CO modules. Configures SAP CO systems to meet client business requirements, including connection points with SD, PP, MM and other modules, and implementation of SAP best practices. At least two full lifecycle implementations as an SAP CO functional consultant and a minimum of 5 support projects. S4 HANA experience is a must. Applies strong knowledge of the business processes for designing, developing, and testing SAP functions associated with financial operations, including expertise in cost center accounting (CCA), internal order accounting (IOA), product cost controlling (CO-PC), profitability analysis (CO-PA), and profit center accounting (PCA). Focus on business process re-engineering efforts and technology enablement. Serves as the subject matter expert on product systems, processes, network architecture and interface capabilities. Should have in-depth understanding and execution skills in FI and CO sub-modules. SAP FI: General Ledger accounting, Accounts Receivable, Accounts Payable, Asset accounting. Experience in developing specifications for interfaces and custom reports. Creates functional specifications for development objects. Conducts unit testing on the overall solution, including technical objects. Supports integration testing and user acceptance testing with the customer. Explores new SAP applications as a subject matter expert and may be a first adopter for emerging SAP technologies. Supports Navisite Application Managed Services (AMS) by working and resolving tickets as assigned. Sustains adequate product knowledge through formal training, webinars, SAP publications, collaboration among colleagues and self-study. Enforces the core competencies and professional standards of Navisite in all client engagements. Supports internal projects as assigned. Collaborates with colleagues to grow product knowledge. Assists in the continual improvement of Navisite methods and tools. Adheres to Navisite professional standards. Willing to travel as per business needs.
Key competencies: customer focus; results driven; business acumen; trusted advisor; task management; problem-solving skills; communication skills; priority setting; presentation skills; mentorship and collaboration; ability to work regularly scheduled shifts, with after-hours coverage for critical issues as needed. Qualifications: 15 years full time education.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office


Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Google BigQuery. Good to have skills: No Function Specialty. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and scalable solutions using Google BigQuery. Your typical day will involve collaborating with the team, analyzing business requirements, designing and implementing application features, and ensuring the applications meet quality standards and performance goals.
Roles & Responsibilities: 1. Design, create, code, and support a variety of data pipelines and models on GCP cloud technology. 2. Strong hands-on exposure to GCP services like BigQuery, Composer, etc. 3. Partner with business/data analysts, architects, and other key project stakeholders to deliver data requirements. 4. Develop data integration and ETL (Extract, Transform, Load) processes. 5. Support existing data warehouses and related pipelines. 6. Ensure data quality, security, and compliance. 7. Optimize data processing and storage efficiency; troubleshoot issues in the data space. 8. Seek to learn new skills/tools utilized in the data space (e.g., dbt, Monte Carlo, etc.). 9. Excellent communication skills, verbal and written; excellent analytical skills with an Agile mindset. 10. Strong attention to detail and delivery accuracy. 11. Self-motivated team player with the ability to overcome challenges and achieve desired results. 12. Work effectively in a globally distributed environment.
Professional & Technical Skills: Skill proficiency expectation - Expert: Data Storage, BigQuery, SQL, Composer, Data Warehousing Concepts; Intermediate: Python; Basic/Preferred: DB, Kafka, Pub/Sub. Must-have skills: proficiency in Google BigQuery. Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.
Additional Information: The candidate should have a minimum of 5 years of experience in Google BigQuery. This position is based at our Hyderabad office. A 15 years full time education is required. Qualifications: 15 years full time education.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

20 - 22 Lacs

Chennai

Work from Office


Minimum 5 years of in-depth experience in Java/Spring Boot. Minimum 3 years of experience in Angular, with the ability to develop rich UI screens and custom/re-usable components. Minimum 2 years of GCP experience working with BigQuery, Google Cloud Storage, Cloud Run, and Pub/Sub. Minimum 2 years of experience using CI/CD pipelines like Tekton. 1-2 years of experience deploying Google Cloud services using Terraform. Experience mentoring other software engineers and delivering systemic change. 5+ years of experience in J2EE.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

0 - 3 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Required skills: Python, ETL, SQL, GCP, BigQuery, Pub/Sub, Airflow. Good to have: dbt, data mesh.
Job Title: Senior GCP Engineer - Data Mesh & Data Product Specialist. We are hiring a Senior GCP Developer to join our high-performance data engineering team. This is a mission-critical role where you will design, build, and maintain scalable ETL pipelines and frameworks in a data mesh architecture. You will work with modern tools like Python, dbt, BigQuery (GCP), and SQL to deliver high-quality data products that power decision-making across the organization. We are looking for a highly skilled professional who thrives in demanding environments, takes ownership of their work, and delivers results with precision and reliability.
Key Responsibilities:
* Design, build, and maintain ETL pipelines: develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the data mesh architecture.
* Data transformation with dbt: use dbt to build modular, reusable transformation workflows that align with the principles of data products.
* Cloud expertise: leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions.
* Data quality & governance: enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks (see the sketch after this list).
* Performance optimization: continuously optimize ETL pipelines for speed, scalability, and cost efficiency.
* Collaboration & ownership: work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations; take full ownership of your deliverables.
* Documentation & standards: maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to best practices in data engineering.
* Troubleshooting & issue resolution: proactively identify bottlenecks or issues in pipelines and resolve them quickly with minimal disruption.
Required Skills & Experience:
* 10+ years (Lead) or 7+ years (Dev) of hands-on experience in designing and implementing ETL workflows in large-scale environments.
* Advanced proficiency in Python for scripting, automation, and data processing.
* Expert-level knowledge of SQL for querying large datasets with performance optimization techniques.
* Deep experience working with modern transformation tools like dbt in production environments.
* Strong expertise in cloud platforms like Google Cloud Platform (GCP), with hands-on experience using BigQuery.
* Familiarity with data mesh principles and distributed data architectures is mandatory.
* Proven ability to handle complex projects under tight deadlines while maintaining high-quality standards.
* Exceptional problem-solving skills with a strong focus on delivering results.
What We Expect. This is a demanding role that requires:
1. A proactive mindset - you take initiative without waiting for instructions.
2. A commitment to excellence - no shortcuts or compromises on quality.
3. Accountability - you own your work end-to-end and deliver on time.
4. Attention to detail - precision matters; mistakes are not acceptable.
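As referenced in the data quality bullet above, here is a minimal sketch of the kind of validation check a BigQuery-backed data product might run on a schedule; the dataset, table, and column names are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

# Two simple checks on a hypothetical data product table: daily freshness
# (did anything land today?) and key completeness (no null business keys).
checks = {
    "rows_today": """
        SELECT COUNT(*) FROM `my-project.sales_domain.orders`
        WHERE DATE(ingested_at) = CURRENT_DATE()
    """,
    "null_keys": """
        SELECT COUNT(*) FROM `my-project.sales_domain.orders`
        WHERE order_id IS NULL
    """,
}
results = {}
for name, sql in checks.items():
    results[name] = list(client.query(sql).result())[0][0]
    print(f"{name}: {results[name]}")

# A real framework would alert or fail the pipeline when a check breaches
# its threshold, e.g. null_keys > 0 or rows_today == 0.
if results["null_keys"] > 0:
    raise ValueError("data quality check failed: null order_id values found")
```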

Posted 2 weeks ago

Apply