Home
Jobs
Companies
Resume

130 Pub/Sub Jobs - Page 4

Filter
Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3 - 6 years

10 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for: Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients. Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects. Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability. Working with a variety of relational databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery). Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: SQL authoring, query, and cost optimisation, primarily on BigQuery. Python as an object-oriented scripting language. Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming. Version control system: Git; knowledge of Infrastructure as Code (Terraform) is preferable. Advanced working SQL knowledge and experience working with relational databases, including query authoring, as well as working familiarity with a variety of databases.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions. Preferred technical and professional experience: Experience building and optimising data pipelines, architectures, and data sets. Building processes supporting data transformation, data structures, metadata, dependency, and workload management. Working knowledge of message queuing, stream processing, and highly scalable data stores. Experience supporting and working with cross-functional teams in a dynamic environment. We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
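The messaging stack this listing names (Pub/Sub, Dataflow, Spark Streaming) revolves around publish/subscribe decoupling: producers emit to a topic without knowing who consumes it. A minimal in-memory sketch of the pattern (illustrative only; `MiniPubSub` is an invented name, and real Pub/Sub adds durability, acknowledgement, and cross-process fan-out on top of this idea):

```python
from collections import defaultdict
from typing import Any, Callable


class MiniPubSub:
    """Toy in-memory publish/subscribe broker.

    A conceptual sketch of the decoupling pattern behind services
    such as Google Cloud Pub/Sub; not an API-compatible client.
    """

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        # Register a callback to receive every future message on `topic`.
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: Any) -> int:
        # Deliver `message` to all current subscribers of `topic`;
        # return the number of deliveries made.
        for callback in self._subscribers[topic]:
            callback(message)
        return len(self._subscribers[topic])


broker = MiniPubSub()
received: list[Any] = []
broker.subscribe("orders", received.append)
delivered = broker.publish("orders", "order-123")
```

The publisher never references the subscriber directly, which is exactly what lets pipeline stages scale independently.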

Posted 2 months ago

Apply

11 - 16 years

40 - 45 Lacs

Pune

Work from Office

Naukri logo

About the Role: Job Title: IT Architect Specialist, AVP. Location: Pune, India. Role Description: This role is for a Senior Business Functional Analyst for Group Architecture and will be instrumental in establishing and maintaining bank-wide data policies, principles, standards, and tool governance. The Senior Business Functional Analyst acts as a link between the business divisions and the data solution providers to align the target data architecture with the enterprise data architecture principles and to apply agreed best practices and patterns. Group Architecture partners with each division of the bank to ensure that architecture is defined, delivered, and managed in alignment with the bank's strategy and in accordance with the organization's architectural standards. What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best-in-class leave policy. Gender-neutral parental leave. 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complimentary health screening for those 35 years and above. Your key responsibilities: Data Architecture: The candidate will work closely with stakeholders to understand their data needs, break business requirements into implementable building blocks, and design the solution's target architecture. GCP Data Architecture & Migration: Strong working experience with GCP data architecture is a must (BigQuery, Dataplex, Cloud SQL, Dataflow, Apigee, Pub/Sub, ...), along with an appropriate GCP architecture-level certification. Experience in handling hybrid architectures and patterns addressing non-functional requirements such as data residency, compliance (e.g. GDPR), and security and access control.
Experience in developing reusable components and reference architecture using IaC (Infrastructure as Code) platforms such as Terraform. Data Mesh: The candidate is expected to have proficiency in Data Mesh design strategies that embrace the decentralized nature of data ownership, and must have good domain knowledge to ensure that the data products developed are aligned with business goals and provide real value. Data Management Tools: Assess various tools and solutions comprising data governance capabilities such as data catalogue, data modelling and design, metadata management, data quality and lineage, and fine-grained data access management. Assist in the development of the medium- to long-term target state of the technologies within the data governance domain. Collaborate with stakeholders, including business leaders, project managers, and development teams, to gather requirements and translate them into technical solutions. Your skills and experience: Extensive experience in data architecture within Financial Services. Strong technical knowledge of data integration patterns, batch and stream processing, data lake / data lakehouse / data warehouse / data mart, caching patterns, and policy-based fine-grained data access. Proven experience working on data management principles, data governance, data quality, data lineage, and data integration, with a focus on Data Mesh. Knowledge of data modelling concepts such as dimensional modelling and 3NF. Experience with systematic, structured review of data models to enforce conformance to standards. High-level understanding of data management solutions, e.g. Collibra, Informatica Data Governance. Proficiency in data modelling and experience with different data modelling tools. Very good understanding of streaming and non-streaming ETL and ELT approaches for data ingest. Strong analytical and problem-solving skills, with the ability to identify complex business requirements and translate them into technical solutions.
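The batch and stream processing patterns this role calls for frequently reduce to windowed aggregation. A tumbling-window sketch in plain Python (conceptual only, unrelated to any Deutsche Bank system; streaming engines such as Dataflow layer watermarks and late-data handling on top of this idea):

```python
from collections import defaultdict
from typing import Iterable


def tumbling_window_counts(
    events: Iterable[tuple[int, str]], window_secs: int
) -> dict[int, dict[str, int]]:
    """Count events per key inside fixed, non-overlapping time windows.

    `events` is an iterable of (epoch_seconds, key) pairs; the result
    maps each window's start time to per-key counts. Illustrative only.
    """
    counts: dict[int, dict[str, int]] = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Snap the timestamp down to the start of its window.
        window_start = ts - (ts % window_secs)
        counts[window_start][key] += 1
    return {w: dict(per_key) for w, per_key in counts.items()}


result = tumbling_window_counts(
    [(100, "trade"), (105, "trade"), (130, "quote")], window_secs=60
)
```

The same function works on a bounded batch or an unbounded stream consumed incrementally, which is the core of the batch/stream unification these roles describe.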
How we'll support you Training and development to help you excel in your career Coaching and support from experts in your team A culture of continuous learning to aid progression A range of flexible benefits that you can tailor to suit your needs About us and our teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 months ago

Apply

3 - 8 years

6 - 10 Lacs

Chennai

Work from Office

Naukri logo

Hands-on experience in data modelling for both OLTP and OLAP systems. In-depth knowledge of conceptual, logical, and physical data modelling. Strong understanding of indexing, partitioning, and data sharding, with practical experience. Experience in identifying and addressing factors affecting database performance for near-real-time reporting and application interaction. Proficiency with at least one data modelling tool (preferably DBSchema). Functional knowledge of the mutual fund industry is a plus. Familiarity with GCP databases like AlloyDB, Cloud SQL, and BigQuery. Willingness to work from Chennai (office presence is mandatory).
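Since this listing asks for practical sharding experience, here is a minimal hash-routing sketch of the idea (an illustration only, not the employer's scheme; `shard_for` is an invented name). Note that plain modulo hashing reshuffles most keys when the shard count changes, which is why production systems often prefer consistent hashing:

```python
import hashlib


def shard_for(key: str, num_shards: int) -> int:
    """Route a record key to a shard id in [0, num_shards).

    Uses a stable hash (MD5 of the UTF-8 key) rather than Python's
    built-in hash(), which is randomized per process and therefore
    unusable for routing that must survive restarts.
    """
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards


# The same key must always land on the same shard.
shard_a = shard_for("customer-42", 8)
shard_b = shard_for("customer-42", 8)
```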

Posted 2 months ago

Apply

3 - 6 years

10 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for: Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients. Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects. Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability. Working with a variety of relational databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery). Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: SQL authoring, query, and cost optimisation, primarily on BigQuery. Python as an object-oriented scripting language. Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming. Version control system: Git; knowledge of Infrastructure as Code (Terraform) is preferable. Advanced working SQL knowledge and experience working with relational databases, including query authoring, as well as working familiarity with a variety of databases. Experience performing root cause analysis on internal and external data and processes to answer specific business questions. Preferred technical and professional experience: Experience building and optimising data pipelines, architectures, and data sets.
Building processes supporting data transformation, data structures, metadata, dependency, and workload management. Working knowledge of message queuing, stream processing, and highly scalable data stores. Experience supporting and working with cross-functional teams in a dynamic environment. We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.

Posted 2 months ago

Apply


3 - 6 years

10 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include: Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation. Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and its features, and solving those issues within the defined SLAs. Continuous learning and technology integration: being eager to learn new technologies and implementing them in feature development. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: SQL authoring, query, and cost optimisation, primarily on BigQuery. Python as an object-oriented scripting language. Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming. Version control system: Git; knowledge of Infrastructure as Code (Terraform) is preferable. Advanced working SQL knowledge and experience working with relational databases, including query authoring, as well as working familiarity with a variety of databases. Experience performing root cause analysis on internal and external data and processes to answer specific business questions. Preferred technical and professional experience: Experience building and optimising data pipelines, architectures, and data sets. Building processes supporting data transformation, data structures, metadata, dependency, and workload management. Working knowledge of message queuing, stream processing, and highly scalable data stores. Experience supporting and working with cross-functional teams in a dynamic environment.
We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.

Posted 2 months ago

Apply

8 - 12 years

12 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

Position Description: The Boeing Company is currently seeking a high-performing, versatile Experienced Programmer Analyst / Data Engineer to join the Product Systems build team. The Product Systems build team provides comprehensive software solutions to rapidly access and visually transform complex engineering and manufacturing product data. Responsibilities include development and integration for a variety of Commercial Off-The-Shelf (COTS) and in-house software applications supporting our engineering/manufacturing teams. The job requires working within a diverse team of skilled and motivated co-workers to collaborate on results. Other qualities for this candidate are a positive attitude, self-motivation, the ability to work in a fast-paced, demanding environment, and the ability to adapt to changing priorities. Essential Job Functions/Responsibilities: Hands-on experience in understanding aerospace domain-specific data. Must coordinate with data scientists on data preparation, exploration, and making data ready. Must have a clear understanding of defining and monetizing data products. Must have experience in building self-service capabilities for users. Builds quality checks across the data lineage and is responsible for designing and implementing different data patterns. Works on prototyping and evaluates technical feasibility. Can influence different stakeholders to fund and build the vision of the product in terms of usage, productivity, and scalability of the solutions. Builds impactful, outcome-based solutions/products. Qualification: 7+ years of experience as a data engineer. Strong understanding of data warehouse concepts, data lakes, and data mesh. Familiarity with ETL tools and data ingestion patterns. Hands-on experience in building data pipelines using Azure.
Hands-on experience in writing complex SQL and NoSQL. Hands-on experience with data pipeline orchestration tools such as Azure Data Factory. Hands-on experience with data modelling. Experience with data visualization using Power BI or Tableau. Experience working with global teams with a global mindset. Mandatory Skills: Experience in Core Java/Python, SQL, data modelling, Airflow, and Spark. Experience in BigQuery, Cloud SQL, Pub/Sub, Bigtable, Terraform, DBMS, Dataflow, and GCS. Experience in Azure services. Education: Technical bachelor's degree and typically 8-12 years of related work experience. A technical degree is defined as any four-year degree, or greater, in a mathematic, scientific, or information technology field of study. Relocation: This position does offer relocation based on candidate eligibility within India.
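The "quality checks across the data lineage" responsibility above can be sketched as a small rule-based validator (the rule, field names, and `check_rows` helper are all hypothetical, for illustration only):

```python
from typing import Any


def check_rows(rows: list[dict[str, Any]], required: list[str]) -> dict[str, Any]:
    """Validate pipeline rows against one simple quality rule:
    every required field must be present and non-empty.

    Real pipelines layer on type, range, and referential checks
    at each lineage stage; this shows only the reporting shape.
    """
    failures = []
    for i, row in enumerate(rows):
        missing = [f for f in required if row.get(f) in (None, "")]
        if missing:
            failures.append({"row": i, "missing": missing})
    return {"total": len(rows), "failed": len(failures), "failures": failures}


report = check_rows(
    [{"part_no": "A1", "qty": 3}, {"part_no": "", "qty": 5}],
    required=["part_no", "qty"],
)
```

Emitting a structured report rather than raising on the first bad row lets downstream stages decide whether to quarantine, fix, or fail the batch.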

Posted 2 months ago

Apply

2 - 5 years

4 - 7 Lacs

Ahmedabad

Work from Office

Naukri logo

Role Overview: As a Senior Backend Developer (Node.js), you will play a key role in designing, developing, and optimizing backend infrastructure for real-time applications and Web3 solutions. You will collaborate with cross-functional teams to ensure the seamless performance, security, and scalability of our backend systems. Key Responsibilities: Backend development: design, develop, and maintain scalable backend systems using Node.js. Real-time communication: implement real-time data transmission using WebSocket, WebRTC, and other relevant protocols to enhance user experience. Database management: work with databases and services such as Redis, BigQuery, Bigtable, Pub/Sub, and Dataflow to efficiently manage and optimize application data. Scalability and performance: architect solutions that can handle high concurrency while ensuring optimal performance and reliability. Security: implement best practices for security to prevent vulnerabilities, data breaches, and unauthorized access. Collaboration: work closely with frontend developers, designers, and other stakeholders to seamlessly integrate backend solutions. Testing and debugging: ensure robustness and reliability through thorough testing and debugging processes. Documentation: maintain comprehensive documentation for APIs, backend services, and system architecture. Ideal Candidate Profile: Proven experience in backend development using Node.js. Hands-on experience in real-time applications or multiplayer game development is a plus. Strong expertise in WebSocket, WebRTC, and other real-time communication technologies. Deep understanding of databases and services like Redis, BigQuery, Bigtable, Pub/Sub, and Dataflow. Experience working with Google Cloud Platform (GCP). Familiarity with serverless architecture is a strong advantage. Strong problem-solving skills with a results-driven mindset. Excellent communication and teamwork skills. Fluency in English. Knowledge of the gaming or mobile apps industry is a plus.
Interest or experience in the Web3 industry is an added advantage.

Posted 2 months ago

Apply

2 - 7 years

4 - 9 Lacs

Karnataka

Work from Office

Naukri logo

Description: Skills: Proficiency in SQL is a must, including PL/SQL to understand the integration stored-procedure part. Experience in PostgreSQL is a must. Basic knowledge of Google Cloud Composer (or Apache Airflow); Composer is the managed GCP service for Apache Airflow, and all pipelines are orchestrated and scheduled through Composer. GCP basics: a high-level understanding of the GCP UI and of services such as Cloud SQL (PostgreSQL), Cloud Composer, Cloud Storage, and Dataproc. Airflow DAGs are written in Python, so basic knowledge of Python code for DAGs is needed. Dataproc is managed Spark in GCP, so a bit of PySpark knowledge is also nice to have. Named Job Posting? (if Yes - needs to be approved by SCSC): No. Additional Details: Global Grade: C. Level: To Be Defined. Remote work possibility: No. Global Role Family: 60235 (P) Sales, Account Management & Solution design. Local Role Name: 6480 Account Manager. Local Skills: 6170 SQL. Languages Required: English. Role Rarity: To Be Defined.
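Composer/Airflow schedules pipeline tasks by resolving DAG dependencies so that each task runs only after its upstreams. The underlying idea is a topological ordering, sketched here with the standard library (a conceptual illustration with a hypothetical extract/transform/validate/load pipeline, not Airflow's actual scheduler):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: transform and validate both depend on
# extract; load depends on both of them.
dag = {
    "transform": {"extract"},
    "validate": {"extract"},
    "load": {"transform", "validate"},
}

# static_order() yields tasks so every dependency precedes its dependents.
order = list(TopologicalSorter(dag).static_order())
```

Airflow expresses the same structure with operators and `>>` dependencies, and additionally handles retries, schedules, and parallel execution of independent tasks.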

Posted 2 months ago

Apply

12 - 20 years

30 - 45 Lacs

Hyderabad

Hybrid

Naukri logo

Job Description: We are seeking a highly experienced Data Architect with 15-20 years of experience to lead the design and implementation of data solutions at scale. The ideal candidate will have deep expertise in cloud technologies, particularly GCP, along with a broad skill set in SQL, BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, DLP, Dataproc, Cloud Composer, Python, ETL, and big data technologies like MapR/Hadoop, Hive, Spark, and Scala. Key Responsibilities: Lead the design and implementation of complex data architectures across cloud platforms, ensuring scalability, performance, and cost-efficiency. Architect data solutions using Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, Dataproc, Cloud Composer, and DLP. Design and optimize Ab Initio ETL processes and data pipelines using Python and related technologies, ensuring seamless data integration across multiple systems. Work with big data technologies including Hadoop (MapR), Hive, Spark, and Scala to build and manage large-scale, distributed data systems. Oversee the end-to-end data flow from ingestion to processing, transformation, and storage, ensuring high availability and disaster recovery. Lead and mentor a team of engineers, guiding them in adopting best practices in data architecture, security, and governance. Define and enforce data governance, security, and compliance standards to ensure data privacy and integrity. Collaborate with cross-functional teams to understand business requirements and translate them into data architecture and technical solutions. Design and implement data lake, data warehouse, and analytics solutions to support business intelligence and advanced analytics. Lead the integration of cloud-native tools and services for real-time and batch processing, using Pub/Sub, Dataproc, and Cloud Composer.
Conduct performance tuning and optimization for SQL, BigQuery, and big data technologies to ensure efficient query execution and resource usage. Provide strategic direction on new data technologies, trends, and best practices to ensure the organization remains competitive and innovative. Required Skills: 15-20 years of experience in data architecture, data engineering, or related roles, with a focus on cloud solutions. Extensive experience with Google Cloud Platform (GCP) services, particularly BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, Dataproc, Cloud Composer, and DLP. Strong experience in Ab Initio ETL. Proficiency in SQL and experience with cloud-native data storage and processing technologies (BigQuery, Hive, Hadoop, Spark). Expertise in Python for ETL pipeline development and data manipulation. Solid understanding of big data technologies such as MapR, Hadoop, Hive, Spark, and Scala. Experience in designing and implementing scalable, high-performance data architectures and data lakes/warehouses. Deep understanding of data governance, security, privacy (DLP), and compliance standards. Proven experience in leading teams and delivering large-scale data solutions in cloud environments. Excellent problem-solving, communication, and leadership skills. Ability to work with senior business and technical leaders to align data solutions with organizational goals. Preferred Skills: Experience with other cloud platforms (AWS, Azure). Knowledge of machine learning and AI data pipelines. Familiarity with containerized environments and orchestration tools (e.g., Kubernetes). Experience with advanced analytics or data science initiatives.

Posted 2 months ago

Apply

15 - 24 years

30 - 45 Lacs

Pune

Hybrid

Naukri logo

Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments. Experience with common GCP services such as BigQuery, Dataflow, Dataproc, GCS, Cloud Functions, and related CI/CD processes.

Posted 2 months ago

Apply

15 - 24 years

30 - 45 Lacs

Bengaluru

Hybrid

Naukri logo

Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments. Experience with common GCP services such as BigQuery, Dataflow, Dataproc, GCS, Cloud Functions, and related CI/CD processes.

Posted 2 months ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role : Cloud Services Engineer Project Role Description : Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends. Must have skills : SUSE Linux Administration Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders, explain any performance issues or risks, and ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends. Roles & Responsibilities: Expected to perform independently and become an SME. Required active participation/contribution in team discussions. Contribute in providing solutions to work related problems. Ensure effective communication between client and Accenture operations teams. Monitor and maintain Cloud orchestration and automation capability. Analyze performance data and trends to identify areas for improvement. Collaborate with stakeholders to address service delivery issues. Implement strategies to optimize service delivery efficiency. Professional & Technical Skills: Must To Have Skills: Proficiency in SUSE Linux Administration. Strong understanding of cloud orchestration and automation technologies. Experience in analyzing performance data and trends. Knowledge of SLAs and service delivery optimization techniques. 
Additional Information: The candidate should have a minimum of 3 years of experience in SUSE Linux Administration. This position is based at our Bengaluru office. A 15 years full time education is required. Qualifications 15 years full time education

Posted 2 months ago

Apply

6 - 10 years

18 - 25 Lacs

Chennai

Work from Office

Naukri logo

Full-stack development using Google Cloud Platform (GCP). Development in Cloud Run / Cloud Functions. Cloud Pub/Sub integration. BigQuery development/operations. Cloud Logging/Monitoring principles and operations. IAM. SQL. Tekton CI/CD on Google Cloud. Terraform development. Dataflow. Programming languages: Python (expert level); knowledge of Java Spring Boot is a plus. Software development acumen: Excellent problem-solving skills. Experience collaborating in a team environment and working independently. Ability to describe and teach coding best practices and production code standards. Ability to quickly propose GCP-based solutions aligned with the latest Google documentation, whitepapers, and community publications. Ability to trace and resolve technical issues with minimal supervision. Demonstrated proficiency in understanding and implementing process logic and workflows. Motivated and keen to work in a collaborative environment with a focus on team success over and above individual success. PRIMARY SKILLS REQUIRED: GCP (Professional Cloud Architect or Data Engineer certification is a must); Python, expert level. Nice to Have Skills: Java Spring Boot. EXPERIENCE REQUIRED: 4 years. EDUCATION REQUIRED: BE/B.Tech.
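The Cloud Run / Cloud Functions plus Pub/Sub combination listed above typically means handling push-delivery HTTP requests whose body wraps the message as base64-encoded JSON. A minimal decoder sketch (field names follow the Pub/Sub push envelope format as commonly documented; verify against current Google documentation before relying on them):

```python
import base64
import json


def decode_push_envelope(body: bytes) -> dict:
    """Extract payload and attributes from a Pub/Sub push request body.

    The push envelope carries the message payload base64-encoded
    under message.data, with string attributes alongside it.
    """
    envelope = json.loads(body)
    message = envelope["message"]
    payload = json.loads(base64.b64decode(message["data"]))
    return {"payload": payload, "attributes": message.get("attributes", {})}


# Simulated push request body, such as a Cloud Run handler would receive.
body = json.dumps({
    "message": {
        "data": base64.b64encode(json.dumps({"order": 7}).encode()).decode(),
        "attributes": {"source": "demo"},
    },
    "subscription": "projects/p/subscriptions/s",
}).encode()

decoded = decode_push_envelope(body)
```

In a real service this function would sit inside the HTTP handler, which must also return a 2xx status so Pub/Sub acknowledges the message instead of redelivering it.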

Posted 3 months ago

Apply

5 - 8 years

7 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

To ensure successful initiation, planning, execution, control, and completion of the project by guiding team members on technical aspects and conducting reviews of technical documents and artefacts. Lead project development, production support, and maintenance activities. Complete timesheets and the invoicing process, and ensure the team does the same, on or before the deadline. Lead the customer interface for the project on an everyday basis, proactively addressing any issues before they are escalated. Create functional and technical specification documents. Track open tickets/incidents in the queue, allocate tickets to resources, and ensure that the tickets are closed within the deadlines. Ensure analysts adhere to SLAs/KPIs/OLAs. Ensure that everyone on the delivery team, including yourself, is constantly thinking of ways to do things faster, better, or more economically. Lead the project and ensure it complies with software quality processes and timelines. Review functional and technical specification documents. Serve as the single point of contact for the team to the project stakeholders. Promote teamwork; motivate, mentor, and develop subordinates. Band: U3. Competency: Data & Analytics.

Posted 3 months ago

Apply

5 - 10 years

13 - 23 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office

Naukri logo

Job Title: GCP Data Engineer (Senior / Lead / Architect / Program Manager). Experience: 5-20 years. Location: Chennai, Hyderabad, Bangalore. Required Skills: GCP, BigQuery, Cloud Storage, Dataflow, Python, Cloud Functions, Pub/Sub. Notice period: Immediate joiners. Job Description: Experience leading, designing, and developing Data Engineering solutions using Google Cloud Platform (GCP): BigQuery, Cloud Storage, Dataflow, Cloud Functions, Pub/Sub, Cloud Run, Cloud Composer (Airflow), Cloud Spanner, Bigtable, etc. Experience building CI/CD pipelines to automate deployment and testing of data pipelines. Experience in managing and deploying containerized applications. Proficient in Python for data processing and automation and in SQL for querying and data manipulation. Experience with Cloud Monitoring, Datadog, or other monitoring solutions to track pipeline performance and ensure operational efficiency. Familiarity with Terraform or Deployment Manager for Infrastructure as Code (IaC) to manage GCP resources is a plus.

Posted 3 months ago

Apply

16 - 25 years

18 - 27 Lacs

Bengaluru

Work from Office

Naukri logo

Skill required: Tech for Operations - Program & Project Management. Designation: Program & Project Mgmt Senior Manager. Qualifications: BTech. Years of Experience: 16 to 25 years. What would you do? You will be part of the Technology for Operations (TFO) team that acts as a trusted advisor and partner to Accenture Operations. The team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. The role covers the discipline and management of initiating, planning, executing, controlling, and closing the work of a team to achieve specific goals and meet specific success criteria. What are we looking for?
- Ability to establish strong client relationships
- Ability to handle disputes
- Ability to manage multiple stakeholders
- Ability to meet deadlines
- Ability to perform under pressure
Roles and Responsibilities:
- Identify and assess complex problems for your area(s) of responsibility.
- Create solutions in situations where analysis requires in-depth knowledge of organizational objectives.
- Be involved in setting strategic direction to establish near-term goals for area(s) of responsibility.
- Interact with senior management levels at a client and/or within Accenture, negotiating or influencing on significant matters.
- Exercise latitude in decision-making and in determining objectives and approaches to critical assignments; decisions have a lasting impact on the area of responsibility, with the potential to impact areas outside it.
- Manage large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
Please note that this role may require you to work in rotational shifts.
Qualifications: BTech

Posted 3 months ago


5 - 10 years

7 - 12 Lacs

Pune

Work from Office


Project Role: Cloud Services Engineer
Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure the Cloud orchestration and automation capability operates within target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must-have skills: Managed File Transfer
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations, communicate service delivery health to all stakeholders, and explain any performance issues or risks. You will ensure the Cloud orchestration and automation capability operates within target SLAs with minimal downtime, and hold performance meetings to share performance and consumption data and trends.
Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform, taking responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Ensure effective communication between client and operations teams.
- Analyze service delivery health and address performance issues.
- Conduct performance meetings to share data and trends.
Professional & Technical Skills:
- Proficiency in Managed File Transfer (must-have).
- Strong understanding of cloud orchestration and automation.
- Experience in SLA management and performance analysis.
- Knowledge of IT service delivery and escalation processes.
Additional Information: The candidate should have a minimum of 5 years of experience in Managed File Transfer. This position is based at our Pune office. 15 years of full-time education is required.
Qualifications: 15 years of full-time education

Posted 3 months ago


2 - 3 years

4 - 5 Lacs

Pune

Work from Office


Job Purpose: Perform LHS-RHS activity with the respective stakeholders, including COEs, Business Compliance, and horizontal support functions. Work closely with the CDI and CMT teams within compliance. Validate and confirm closure of LHS-RHS responses in the compliance system.
Duties and Responsibilities:
1. Perform LHS-RHS activity for all required circulars and guidelines issued by regulators. Create and execute the annual LHS-RHS activity plan for all master circulars and guidelines. Work closely with Businesses, COEs, and horizontal support functions to close LHS-RHS activity and become compliant with all guidelines issued by the various regulators.
2. Engage and work closely with Businesses, COEs, and horizontal support functions to become compliant on all LHS points for the various guidelines issued by regulators. Engage and work closely with sub-functions within the compliance unit (e.g. CMT, CDI, advisory team).
3. Publish reports on ICF activities. Track and moderate closure of "No" items with the respective stakeholders.
Key Decisions / Dimensions: Prioritization of work and tasks as per situations and requirements; coordination with stakeholders and submissions on the various tools.
Major Challenges:
- Complexity due to 99 PPGs and compliance requirements.
- Maintaining LHS-RHS timelines for multiple guidelines from multiple regulators (majorly RBI, along with IRDA, NPCI, and UIDAI).
- Initial challenges with the compliance tool, as it is new and yet to be completely developed and maintained.
- Very high expectations from the role.
- A new and large team.
Required Qualifications and Experience
a) Qualifications: CA / CS with 2-3 years' experience in the banking industry. Additional certifications in banking and compliance such as CIA / CISA / CAIIB are preferred. Strong domain knowledge of RBI requirements related to retail assets, liabilities, payments, etc. in the BFSI sector is required.
b) Work Experience: 2 to 3 years of experience in the banking and finance industry in compliance / operations / audit functions. Expertise in understanding regulatory requirements, specifically from RBI. Strong domain knowledge of RBI requirements related to retail assets, payments, etc. in the BFSI sector is required. Relevant experience in content creation.
c) Skills Keywords: Excellent communication skills and presentability. Demonstrated leadership, negotiation, communication, and audit/compliance management skills in order to handle CDI team activities and engage with senior management. Develops rapport with business teams to add value through compliance recommendations. Brings new knowledge on board and keeps up to date with a rapidly changing business environment. Constantly challenges the status quo to bring value addition.

Posted 3 months ago


7 - 12 years

10 - 20 Lacs

Pune

Hybrid


Lead Data Engineer
Experience: 7 - 10 Years
Salary: Competitive
Preferred Notice Period: within 30 days
Shift: 10:00 AM to 6:00 PM IST
Opportunity Type: Hybrid - Pune
Placement Type: Permanent (Note: This is a requirement for one of Uplers' Partners)
Must-have skills: Python, SQL, GCP, Dataflow, Pub/Sub, Cloud Storage, BigQuery
Good-to-have skills: AWS, Docker, Kubernetes, Generative AI, Azure
Our Hiring Partner is Looking for: a Lead Data Engineer who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.
Role Overview: We are seeking an experienced and dynamic Lead Data Engineer to join our team. This role is pivotal in advancing our data engineering practices on the Google Cloud Platform (GCP) and offers a unique opportunity to work with cutting-edge technologies, including Generative AI.
Key Responsibilities:
- Lead the design, implementation, and optimization of scalable data pipelines and architectures on GCP using key services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Collaborate with cross-functional teams to define data requirements and develop strategic solutions that address business needs.
- Enhance existing data infrastructure, ensuring high levels of performance, reliability, and security.
- Drive the integration and deployment of machine learning models and advanced analytics solutions, incorporating Generative AI where applicable.
- Establish and enforce best practices in data governance, data quality, and data security.
- Mentor and guide junior engineers, fostering a culture of innovation and continuous improvement.
- Stay informed about the latest trends in data engineering, GCP advancements, and Generative AI technologies to drive innovation within the team.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 7 to 10 years of experience in data engineering, with a strong emphasis on GCP technologies.
- Demonstrated expertise in building and managing data solutions using GCP services such as BigQuery, Dataflow, and Cloud Composer.
- Proficiency in SQL and in programming languages such as Python, Java, or Scala.
- Strong understanding of data modelling, warehousing concepts, and real-time data processing.
- Familiarity with containerization and orchestration tools like Docker and Kubernetes.
- Excellent analytical, problem-solving, and communication skills.
- Leadership experience with a proven ability to mentor and develop junior team members.
Preferred Qualifications:
- GCP Professional Data Engineer certification.
- Experience with Generative AI technologies and their practical applications.
- Knowledge of additional cloud platforms such as AWS or Azure.
- Experience implementing data governance frameworks and tools.
How to apply for this opportunity:
1. Register or log in on our portal.
2. Click 'Apply', upload your resume, and fill in the required details.
3. Click 'Apply Now' to submit your application.
4. Get matched and crack a quick interview with our hiring partner.
5. Land your global dream job and get your exciting career started!
About Our Hiring Partner: At Inferenz, our team of innovative technologists and domain experts helps accelerate business growth through digital enablement, helping clients navigate their industries with data, cloud, and AI services and solutions. We dedicate our resources to increasing efficiency and gaining a greater competitive advantage by leveraging next-generation technologies. Our technology expertise has helped us deliver innovative solutions in key industries such as Healthcare & Life Sciences, Consumer & Retail, Financial Services, and emerging industries.
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help talents find and apply for relevant opportunities and progress in their careers, and to support you with any grievances or challenges you may face during the engagement. You will also be assigned a dedicated Talent Success Coach during the engagement. (Note: there are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 3 months ago


15 - 20 years

20 - 35 Lacs

Pune

Work from Office


Job Title: Lead Engineer (RYR#2025)
Role Description: The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools, and solutions in the Bank
What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy.
Your Key Responsibilities: The candidate is expected to:
- Act as a hands-on engineering lead involved in analysis, design, design/code reviews, coding, and release activities
- Champion engineering best practices and guide/mentor the team to achieve high performance
- Work closely with business stakeholders, the Tribe Lead, the Product Owner, and the Lead Architect to successfully deliver the business outcomes
- Acquire functional knowledge of the business capability being digitized/re-engineered
- Demonstrate ownership, inspire others, think innovatively, maintain a growth mindset, and collaborate for success
Your Skills & Experience:
- Minimum 15 years of IT industry experience in full-stack development
- Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
- Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
- Strong experience in Kubernetes and the OpenShift container platform
- Experience with databases (Oracle, PostgreSQL, MongoDB, Redis/Hazelcast); should understand data modeling, normalization, and performance optimization
- Experience with message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc.
- Experience working on public cloud (GCP preferred; AWS or Azure)
- Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc.
- Experience with modern software product delivery practices, processes, and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc.
- Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operation
- Experience leading teams and mentoring developers
- Focus on quality: experience with TDD, BDD, stress, and contract tests
- Proficient in working with APIs (Application Programming Interfaces) and familiar with data formats like JSON, XML, YAML, Parquet, etc.
Key Skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
Advantageous:
- Prior experience in the Banking/Finance domain
- Experience with hybrid cloud solutions, preferably using GCP
- Experience with product development
How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
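The message-queue and data-streaming experience this posting asks for (RabbitMQ, JMS, Kafka, Pub/Sub) all revolves around the same decoupling idea, sketched here as a minimal in-memory broker. This is a teaching toy, not any real broker's API; production systems add persistence, acknowledgements, partitioning, and delivery guarantees:

```python
# Toy in-memory broker illustrating the publish/subscribe pattern.
# Real brokers (Kafka, RabbitMQ, Cloud Pub/Sub) add persistence,
# acknowledgements, partitioning, and delivery guarantees.
from collections import defaultdict
from typing import Callable

class Broker:
    def __init__(self) -> None:
        # topic name -> list of subscriber callbacks
        self._subscribers: dict[str, list[Callable[[str], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[str], None]) -> None:
        """Register a callback to receive every message on the topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: str) -> int:
        """Fan the message out to every subscriber; return delivery count."""
        handlers = self._subscribers[topic]
        for handler in handlers:
            handler(message)
        return len(handlers)

broker = Broker()
received: list[str] = []
broker.subscribe("orders", received.append)
broker.subscribe("orders", lambda m: received.append(m.upper()))
delivered = broker.publish("orders", "order-42 created")
print(delivered, received)
```

Publishers never reference subscribers directly; both sides know only the topic name, which is what lets producer and consumer teams scale and deploy independently.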

Posted 3 months ago


3 - 7 years

11 - 18 Lacs

Pune, Bengaluru, Hyderabad

Work from Office


Role: GCP Data Engineer
Client: MNC (full time)
Position: Permanent
Experience: 3.5 - 6 years
Location: PAN India
Notice Period: Immediate / Serving / 30 days
Work mode: Hybrid / WFO
Mandatory Skills: GCP, BigQuery, Dataflow, Dataplex, Pub/Sub, Python & SQL
Job Description:
- Ensure successful initiation, planning, execution, control, and completion of the project by guiding team members on technical aspects and conducting reviews of technical documents and artefacts.
- Lead project development, production support, and maintenance activities.
- Fill in, and ensure completion of, timesheets and the invoicing process on or before the deadline.
- Lead the customer interface for the project on an everyday basis, proactively addressing any issues before they are escalated.
- Create functional and technical specification documents.
- Track open tickets/incidents in the queue, allocate tickets to resources, and ensure that tickets are closed within the deadlines.
- Ensure analysts adhere to SLAs/KPIs/OLAs.
- Ensure that everyone in the delivery team, including yourself, is constantly thinking of ways to do things faster, better, or more economically.
- Lead and ensure the project is in compliance with software quality processes and within timelines.
- Review functional and technical specification documents.
- Serve as the single point of contact for the team to the project stakeholders.
- Promote teamwork; motivate, mentor, and develop subordinates.
Kindly fill in the details below and share your updated CV to mansoor@burgeonits.com:
- Name as per Aadhaar card
- Mobile no.
- Alternate no.
- Email id
- Alternate email
- Date of birth
- PAN card no. (for client upload; mandatory)
- Total experience & relevant experience
- Current company
- Payroll company (name), if any
- Notice period (if serving notice, mention last working day)
- CCTC & ECTC
- Any offers (Yes/No); if yes, offer amount and joining date
- Current location & preferred location
- Happy to relocate (Yes/No)
- Available interview time slots

Posted 3 months ago


3 - 4 years

5 - 6 Lacs

Kochi

Work from Office


Job Purpose: This position is open with Bajaj Finance Ltd.
Duties and Responsibilities:
- Daily visits to ND/RD offices.
- Manage ND relationships.
- Sign up new ND relationships.
- Maintain complete data on all ND sales staff (Branch Manager, Sales Executive, and field staff).
- Mobilize deposits through channel partners, i.e. National Distributors / Regional Distributors.
- Send DSR regularly in the desired format.
- Ensure productivity through deposit mobilization, FD penetration, branch activation, and renewals at local ND branches.
- Organize road shows with sub-brokers/distributors informing them about features of the BFL FD.
- Enable smooth processing of applications sourced from distributors.
- Promptly resolve queries of brokers/sub-brokers/customers through interaction with internal staff.
Required Qualifications and Experience:
- Graduate/Postgraduate.
- A minimum of 5-7 years' experience in sales.
- High energy levels, self-motivated, and a go-getter.
- Past experience in BFI and distribution management (broker management) is necessary.
- Channel management skills.

Posted 3 months ago


13 - 20 years

30 - 40 Lacs

Chennai, Hyderabad

Work from Office


Title: GCP Data Engineer
Experience: 14+ years
Location: Chennai/Hyderabad
Skillset: GCP, BigQuery, Dataflow, Terraform, Airflow, Cloud SQL, Cloud Storage, Cloud Composer, Pub/Sub
If interested, kindly drop your CV to sharmeelasri26@gmail.com

Posted 3 months ago


7 - 12 years

9 - 14 Lacs

Gurgaon

Work from Office


Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with the customer.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time-management skills.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge from attending educational workshops and reviewing publications.

Posted 3 months ago
