
1102 BigQuery Jobs - Page 43

JobPe aggregates these listings for easy access, but applications are submitted directly on the original job portal.

7 - 12 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Support Engineer
Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: Kubernetes
Good-to-have skills: Google Kubernetes Engine, Google Cloud Compute Services
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

About the Role:
We are looking for an experienced Kubernetes Architect to join our growing cloud infrastructure team. This role is responsible for architecting, designing, and implementing scalable, secure, and highly available cloud-native applications on Kubernetes. You will leverage Kubernetes along with associated technologies such as Kubekafka, Kubegres, Helm, Ingress, Redis, Grafana, and Prometheus to build resilient systems that meet both business and technical needs. Google Kubernetes Engine (GKE) is considered an additional skill. As a Kubernetes Architect, you will play a key role in defining best practices, optimizing the infrastructure, and providing architectural guidance to cross-functional teams.

Key Responsibilities:
- Architect Kubernetes solutions: Design and implement scalable, secure, and high-performance Kubernetes clusters.
- Cloud-native application design: Collaborate with development teams to design cloud-native applications, ensuring that microservices are properly architected and optimized for Kubernetes environments.
- Kafka management: Architect and manage Apache Kafka clusters using Kubekafka, ensuring reliable, real-time data streaming and event-driven architectures.
- Database architecture: Use Kubegres to manage high-availability PostgreSQL clusters in Kubernetes, ensuring data consistency, scaling, and automated failover.
- Helm chart development: Create, maintain, and optimize Helm charts for consistent deployment and management of applications across Kubernetes environments.
- Ingress & networking: Architect and configure Ingress controllers (e.g., NGINX, Traefik) for secure and efficient external access to Kubernetes services, including SSL termination, load balancing, and routing.
- Caching and performance optimization: Leverage Redis to design efficient caching and session-management solutions, optimizing application performance.
- Monitoring & observability: Lead the implementation of Prometheus for metrics collection and Grafana for real-time monitoring dashboards that visualize the health and performance of infrastructure and applications.
- CI/CD integration: Design and implement continuous integration and continuous deployment (CI/CD) pipelines to streamline the deployment of Kubernetes-based applications.
- Security & compliance: Ensure Kubernetes clusters follow security best practices, including RBAC, network policies, and proper configuration of secrets management.
- Automation & scripting: Develop automation frameworks using tools like Terraform, Helm, and Ansible to ensure repeatable and scalable deployments.
- Capacity planning and cost optimization: Optimize resource usage within Kubernetes clusters to achieve both performance and cost-efficiency, utilizing cloud tools and services.
- Leadership & mentorship: Provide technical leadership to development, operations, and DevOps teams, offering mentorship, architectural guidance, and best practices.
- Documentation & reporting: Produce comprehensive architecture diagrams, design documents, and operational playbooks to ensure knowledge transfer across teams and maintain system reliability.

Required Skills & Experience:
- 10+ years of experience in cloud infrastructure engineering, with at least 5 years of hands-on experience with Kubernetes.
- Strong expertise in Kubernetes for managing containerized applications in the cloud.
- Experience deploying and managing container-based systems on both private and public clouds (Google Kubernetes Engine).
- Proven experience with Kubekafka for managing Apache Kafka clusters in Kubernetes environments.
- Expertise in managing PostgreSQL clusters with Kubegres and implementing high-availability database solutions.
- In-depth knowledge of Helm for managing Kubernetes applications, including development of custom Helm charts.
- Experience with Ingress controllers (e.g., NGINX, Traefik) for managing external traffic in Kubernetes.
- Hands-on experience with Redis for caching, session management, and as a message broker in Kubernetes environments.
- Advanced knowledge of Prometheus for monitoring and Grafana for visualization and alerting in cloud-native environments.
- Experience with CI/CD pipelines for automated deployment and integration using tools like Jenkins, GitLab CI, or CircleCI.
- Solid understanding of networking, including load balancing, DNS, SSL/TLS, and ingress/egress configurations in Kubernetes.
- Familiarity with Terraform and Ansible for infrastructure automation.
- Deep understanding of security best practices in Kubernetes, such as RBAC, network policies, and secrets management.
- Knowledge of DevSecOps practices to ensure secure application delivery.

Certifications:
- Google Cloud Platform (GCP) certification is mandatory.
- Kubernetes certification (CKA or CKAD) is highly preferred.
- HashiCorp Terraform certification is a significant plus.
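For context, day-to-day work in a role like this often means talking to the cluster programmatically as well as through kubectl. A minimal, illustrative sketch using the official Kubernetes Python client follows; the namespace is a placeholder, not a detail from the posting.

```python
# Minimal sketch: inspecting cluster workloads with the official Kubernetes
# Python client (pip install kubernetes). The namespace is a placeholder.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running in a pod
v1 = client.CoreV1Api()

for pod in v1.list_namespaced_pod(namespace="default").items:
    print(pod.metadata.name, pod.status.phase)
```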

Posted 2 months ago

3 - 8 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-have skills: GCP Dataflow
Good-to-have skills: Google BigQuery
Minimum experience: 3 years
Educational Qualification: Any graduate

Summary:
As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems using GCP Dataflow.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing using GCP Dataflow.
- Create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems.
- Collaborate with cross-functional teams to identify and resolve data-related issues.
- Develop and maintain documentation related to data solutions and processes.
- Stay updated on the latest advancements in data engineering technologies and integrate innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-have: experience in GCP Dataflow.
- Good-to-have: experience in Google BigQuery.
- Strong understanding of ETL processes and data migration.
- Experience in data modeling and database design.
- Experience with data warehousing and data lake concepts.
- Experience in programming languages such as Python, Java, or Scala.

Additional Information:
- The candidate should have a minimum of 3 years of experience in GCP Dataflow.
- The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Indore office.

Qualification: Any graduate
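Dataflow pipelines of the kind described here are typically authored with the Apache Beam SDK. A minimal, illustrative Python sketch follows; the project, region, bucket, and table names are placeholders, and the target table is assumed to already exist with this schema.

```python
# Minimal Dataflow sketch using the Apache Beam Python SDK
# (pip install "apache-beam[gcp]"). All resource names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",        # use "DirectRunner" for local testing
    project="example-project",
    region="us-central1",
    temp_location="gs://example-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
        | "Parse" >> beam.Map(lambda line: dict(zip(("id", "amount"), line.split(","))))
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:analytics.transactions",
            schema="id:STRING,amount:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```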

Posted 2 months ago

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office

Project Role: Integration Engineer
Project Role Description: Provide consultative business and system integration services to help clients implement effective solutions. Understand and translate customer needs into business and technology solutions. Drive discussions and consult on transformation, the customer journey, and functional/application designs, and ensure technology and business solutions represent business requirements.
Must-have skills: Microsoft 365
Good-to-have skills: Microsoft PowerShell, Microsoft 365 Security & Compliance
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary:
As an Integration Engineer, you will provide consultative business and system integration services to help clients implement effective solutions. You will understand and translate customer needs into business and technology solutions; drive discussions; consult on transformation, the customer journey, and functional/application designs; and ensure technology and business solutions represent business requirements.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement integration solutions for clients.
- Collaborate with cross-functional teams to ensure successful project delivery.

Professional & Technical Skills:
- Must-have: proficiency in Microsoft 365.
- Good-to-have: experience with Microsoft PowerShell and Microsoft 365 Security & Compliance.
- Strong understanding of cloud-based integration technologies.
- Experience designing and implementing scalable integration solutions.
- Knowledge of API integration and data mapping techniques.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft 365.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
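Much Microsoft 365 integration work ultimately goes through the Microsoft Graph REST API (often scripted via PowerShell, which the posting lists). As a rough illustration in Python, assuming an OAuth access token has already been acquired; the token value is a placeholder.

```python
# Rough sketch: listing users via the Microsoft Graph REST API.
# Assumes a token obtained beforehand (e.g. via MSAL with the
# User.Read.All application permission); the token is a placeholder.
import requests

ACCESS_TOKEN = "<token-from-msal>"

resp = requests.get(
    "https://graph.microsoft.com/v1.0/users?$top=5",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
for user in resp.json()["value"]:
    print(user["displayName"], user.get("mail"))
```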

Posted 2 months ago

5 - 10 years

8 - 13 Lacs

Bengaluru

Work from Office

Project Role: Application Support Engineer
Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: Kubernetes
Good-to-have skills: Google Kubernetes Engine, Google Cloud Compute Services
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Job Summary:
We are looking for an experienced Kubernetes Specialist to join our cloud infrastructure team. You will work closely with architects and engineers to design, implement, and optimize cloud-native applications on Google Kubernetes Engine (GKE). This role will focus on providing expertise in Kubernetes, container orchestration, and cloud infrastructure management, ensuring the seamless operation of scalable, secure, and high-performance applications on GKE and other cloud environments.

Responsibilities:
- Kubernetes implementation: Design, implement, and manage Kubernetes clusters for containerized applications, ensuring high availability and scalability.
- Cloud-native application design: Work with teams to deploy, scale, and maintain cloud-native applications on Google Kubernetes Engine (GKE).
- Kubernetes tools expertise: Utilize Kubekafka, Kubegres, Helm, Ingress, Redis, Grafana, and Prometheus to build and maintain resilient systems.
- Infrastructure automation: Develop and implement automation frameworks using Terraform and other tools to streamline Kubernetes deployments and cloud infrastructure management.
- CI/CD implementation: Design and maintain CI/CD pipelines to automate deployment and testing for Kubernetes-based applications.
- Kubernetes networking & security: Ensure secure and efficient Kubernetes cluster networking, including Ingress controllers (e.g., NGINX, Traefik), RBAC, and secrets management.
- Monitoring & observability: Lead the integration of monitoring solutions using Prometheus for metrics and Grafana for real-time dashboard visualization.
- Performance optimization: Optimize resource utilization within GKE clusters, ensuring both performance and cost-efficiency.
- Collaboration: Collaborate with internal development, operations, and security teams to meet user requirements and implement Kubernetes solutions.
- Troubleshooting & issue resolution: Address complex issues related to containerized applications, Kubernetes clusters, and cloud infrastructure, troubleshooting and resolving them efficiently.

Technical Skillset:
- GCP & Kubernetes experience: Minimum of 3+ years of hands-on experience with Google Cloud Platform (GCP) and Kubernetes implementations, including GKE.
- Container management: Proficiency with container orchestration engines such as Kubernetes and Docker.
- Kubernetes tools knowledge: Experience with Kubekafka, Kubegres, Helm, Ingress, Redis, Grafana, and Prometheus for managing Kubernetes-based applications.
- Infrastructure as code (IaC): Strong experience with Terraform for automating infrastructure provisioning and management.
- CI/CD pipelines: Hands-on experience building and managing CI/CD pipelines for Kubernetes applications using tools like Jenkins, GitLab, or CircleCI.
- Security & networking: Knowledge of Kubernetes networking (DNS, SSL/TLS), security best practices (RBAC, network policies, and secrets management), and the use of Ingress controllers (e.g., NGINX).
- Cloud & DevOps tools: Familiarity with cloud services and DevOps tools such as GitHub, Jenkins, and Ansible.
- Monitoring expertise: In-depth experience with Prometheus and Grafana for operational monitoring, alerting, and creating actionable insights.

Certifications:
- Google Cloud Platform (GCP) Associate Cloud Engineer (ACE) certification is required.
- Certified Kubernetes Administrator (CKA) is highly preferred.
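Since the role leans heavily on Prometheus and Grafana, here is a small, illustrative sketch of querying the Prometheus HTTP API from Python; the service URL and the PromQL expression are assumptions for the example, not details from the posting.

```python
# Illustrative sketch: an instant query against the Prometheus HTTP API.
# URL and PromQL are placeholders.
import requests

PROM_URL = "http://prometheus.monitoring.svc:9090"

resp = requests.get(
    f"{PROM_URL}/api/v1/query",
    params={"query": "sum(rate(container_cpu_usage_seconds_total[5m])) by (namespace)"},
    timeout=10,
)
resp.raise_for_status()
for series in resp.json()["data"]["result"]:
    print(series["metric"].get("namespace"), series["value"][1])
```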

Posted 2 months ago

3 - 5 years

5 - 7 Lacs

Jaipur

Work from Office

Skill required: Procure to Pay - Invoice Processing
Designation: Procure to Pay Operations Analyst
Qualifications: BCom/MCom
Years of Experience: 3 to 5 years

What would you do?
You will be aligned with our Finance Operations vertical and will help us determine financial outcomes by collecting operational data/reports, conducting analysis, and reconciling transactions: boosting vendor compliance, cutting savings erosion, improving discount capture using preferred suppliers, and confirming pricing and terms prior to payment. You will be responsible for accounting of goods and services through requisitioning, purchasing, and receiving, and will look after the order sequence of the procurement and financial process end to end. The Accounts Payable Processing team focuses on designing, implementing, managing, and supporting accounts payable activities by applying the relevant processes, policies, and applications. The team is responsible for timely and accurate billing and processing of invoices, managing purchase and non-purchase orders, and two-way and three-way matching of invoices. Invoice processing refers to the systematic handling and management of incoming invoices within a business or organization. It involves tasks such as verifying the accuracy of the invoice, matching it with purchase orders and delivery receipts, and initiating the payment process. Automated systems and software are often employed to streamline and expedite the invoice processing workflow, improving efficiency and reducing the likelihood of errors.

What are we looking for?
- Good verbal and written communication skills
- Well versed in the accounts payable cycle, its sub-processes, and terminology
- Good understanding of PO vs. non-PO invoices
- Good understanding of withholding tax treatment in invoice processing
- Good understanding of month-end/quarter-end/year-end closing, along with accruals and reporting
- Good understanding of accounting journal entries
- Understanding of employee expense claim processing
- Understanding of expense accruals
- Vendor reconciliations
- Reporting and audit assignments
- Working knowledge of MS Office
- Problem-solving attitude
- Teamwork and coordination
- Readiness to work night shifts
- Knowledge of invoice processing tools
- Knowledge of current technologies in the PTP domain
- Analytical skills
- Understanding of RPA

Roles and Responsibilities:
- In this role you are required to analyze and solve lower-complexity problems.
- Your day-to-day interaction is with peers within Accenture before updating supervisors.
- You may have limited exposure to clients and/or Accenture management.
- You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments.
- The decisions you make impact your own work and may impact the work of others.
- You will be an individual contributor as part of a team, with a focused scope of work.
- Please note that this role may require you to work in rotational shifts.

Qualifications: BCom, MCom
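The two-way and three-way matching mentioned above has a simple mechanical core: an invoice is cleared for payment only if it agrees with the purchase order and, for three-way matching, the goods receipt. A toy Python sketch of that check; the field names and the 2% price tolerance are invented for illustration.

```python
# Toy sketch of three-way matching. Field names and tolerance are invented.
from decimal import Decimal

def three_way_match(po, receipt, invoice, price_tolerance=Decimal("0.02")):
    """Return a list of discrepancies; an empty list means OK to pay."""
    issues = []
    if invoice["po_number"] != po["po_number"]:
        issues.append("invoice references a different PO")
    if invoice["quantity"] > receipt["quantity_received"]:
        issues.append("billed quantity exceeds goods received")
    if invoice["unit_price"] > po["unit_price"] * (1 + price_tolerance):
        issues.append("unit price above PO price plus tolerance")
    return issues
```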

Posted 2 months ago

12 - 17 years

14 - 19 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: Apache Spark, Python (programming language), Google BigQuery
Minimum experience: 12 years
Educational Qualification: 15 years of full-time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring that applications are developed and implemented efficiently and effectively while meeting the needs of the organization. Your typical day will involve collaborating with the team, making team decisions, and engaging with multiple teams to contribute to key decisions. You will also be expected to provide solutions to problems that apply across multiple teams, showcasing your expertise and problem-solving skills.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Ensure efficient and effective development and implementation of applications.
- Design and build applications to meet business process and application requirements.
- Contribute to the decision-making process and provide valuable insights.

Professional & Technical Skills:
- Must-have: proficiency in PySpark.
- Good-to-have: experience with Apache Spark, Python (programming language), and Google BigQuery.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 12 years of experience in PySpark.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
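The data-munging techniques listed above (cleaning, transformation, normalization) map directly onto the PySpark DataFrame API. A small, illustrative sketch; the file and column names are invented for the example.

```python
# Minimal data-munging sketch with PySpark. Input and columns are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("munging-demo").getOrCreate()

df = spark.read.option("header", True).csv("orders.csv")
clean = (
    df.dropDuplicates(["order_id"])                             # cleaning
      .na.fill({"amount": "0"})
      .withColumn("amount", F.col("amount").cast("double"))     # transformation
      .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
)
clean.groupBy("order_date").agg(F.sum("amount").alias("daily_total")).show()
```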

Posted 2 months ago

3 - 5 years

5 - 9 Lacs

Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: No technology specialization
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Key Responsibilities:
1. Assist with the data platform blueprint and design, encompassing the relevant data platform components.
2. Collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
3. As the Data Engineer, perform tasks such as data modeling, data pipeline builds, data lake builds, and work with scalable programming frameworks.

Technical Experience:
1. Expert in Python (non-negotiable). Strong hands-on knowledge of SQL (non-negotiable). Python programming using Pandas and NumPy, a deep understanding of data structures (dictionary, array, list, tree, etc.), and experience with pytest and code coverage.
2. Experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes (non-negotiable).
3. Proficient with tools to automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence.

Professional Attributes:
1. Good communication.
2. Good leadership and team-handling skills.
3. Analytical and presentation skills; ability to work under pressure.
4. Should be able to work in shifts whenever required.

Qualification: 15 years of full-time education
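A typical day-one task on this stack is querying BigQuery from Python. A minimal, illustrative sketch using the official google-cloud-bigquery client against a public dataset; the query itself is just an example, not part of the posting.

```python
# Minimal sketch: a parameterized BigQuery query from Python
# (pip install google-cloud-bigquery). Uses Application Default Credentials.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    WHERE state = @state
    GROUP BY name
    ORDER BY total DESC
    LIMIT 5
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("state", "STRING", "TX")]
)
for row in client.query(sql, job_config=job_config).result():
    print(row.name, row.total)
```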

Posted 2 months ago

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office

Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI to improve performance and efficiency, including but not limited to deep learning, neural networks, chatbots, and natural language processing.
Must-have skills: Google Cloud Machine Learning Services
Good-to-have skills: GCP Dataflow, Google Dataproc, Google Pub/Sub
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Key Responsibilities:
A. Implement and maintain data engineering solutions using BigQuery, Dataflow, Vertex AI, Dataproc, and Pub/Sub.
B. Collaborate with data scientists to deploy machine learning models.
C. Ensure the scalability and efficiency of data processing pipelines.

Technical Experience:
A. Expertise in BigQuery, Dataflow, Vertex AI, Dataproc, and Pub/Sub.
B. Hands-on experience with data engineering in a cloud environment.

Professional Attributes:
A. Strong problem-solving skills in optimizing data workflows.
B. Effective collaboration with data science and engineering teams.

Qualifications: 15 years of full-time education
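Event-driven pipelines on this stack commonly start with a Pub/Sub subscriber. An illustrative sketch with the google-cloud-pubsub client; the project and subscription IDs are placeholders.

```python
# Illustrative sketch: a streaming pull subscriber with google-cloud-pubsub.
# Project and subscription IDs are placeholders.
from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path("example-project", "feature-events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print("received:", message.data.decode("utf-8"))
    message.ack()

future = subscriber.subscribe(sub_path, callback=callback)
try:
    future.result(timeout=30)  # pull messages for 30 seconds, then stop
except TimeoutError:
    future.cancel()
```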

Posted 2 months ago

16 - 25 years

18 - 27 Lacs

Bengaluru

Work from Office

Skill required: Tech for Operations - Artificial Intelligence (AI)
Designation: AI/ML Computational Science Senior Manager
Qualifications: Any graduation
Years of Experience: 16 to 25 years

What would you do?
You will be part of the Technology for Operations team that acts as a trusted advisor and partner to Accenture Operations. The team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. In Artificial Intelligence, you will enhance business results by using AI tools and techniques to perform tasks such as visual perception, speech recognition, decision-making, and translation between languages, i.e., tasks that ordinarily require human intelligence.

What are we looking for?
- Machine learning
- Process orientation
- Thought leadership
- Commitment to quality

Roles and Responsibilities:
- In this role you are required to identify and assess complex problems for your area(s) of responsibility.
- Create solutions in situations where analysis requires in-depth knowledge of organizational objectives.
- Involvement in setting strategic direction to establish near-term goals for your area(s) of responsibility.
- Interaction is with senior management levels at a client and/or within Accenture, involving negotiating or influencing on significant matters.
- Latitude in decision-making and in determining objectives and approaches to critical assignments.
- Your decisions have a lasting impact on your area of responsibility, with the potential to impact areas outside of it.
- Manage large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
- Please note that this role may require you to work in rotational shifts.

Qualifications: Any graduation

Posted 2 months ago

2 - 4 years

5 - 8 Lacs

Pune

Work from Office

We are seeking talented and motivated AI Engineers to join our dynamic team and contribute to the development of next-generation AI/GenAI-based products and solutions. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Senior Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services.

Responsibilities:
- Software development: Write clean, maintainable, and efficient code for various software applications and systems.
- GenAI product development: Participate in the entire AI development lifecycle, including data collection, preprocessing, model training, evaluation, and deployment. Assist in researching and experimenting with state-of-the-art generative AI techniques to improve model performance and capabilities.
- Design and architecture: Participate in design reviews with peers and stakeholders.
- Code review: Review code developed by other developers, providing feedback and adhering to industry-standard best practices like coding guidelines.
- Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide.
- Debugging and troubleshooting: Triage defects and customer-reported issues; debug and resolve them in a timely and efficient manner.
- Service health and quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences.
- DevOps model: Understand working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy, and maintain the software in production.
- Documentation: Properly document new features, enhancements, and fixes to the product, and contribute to training materials.

Basic Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
- 2+ years of professional software development experience.
- Proficiency as a developer using Python, FastAPI, PyTest, Celery, and other Python frameworks.
- Experience with software development practices and design patterns.
- Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA.
- Basic understanding of cloud technologies and DevOps principles.
- Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services.

Preferred Qualifications:
- Experience with object-oriented programming, concurrency, design patterns, and REST APIs.
- Experience with CI/CD tooling such as Terraform and GitHub Actions.
- High-level familiarity with AI/ML, GenAI, and MLOps concepts.
- Familiarity with frameworks like LangChain and LangGraph.
- Experience with SQL and NoSQL databases such as MongoDB, MSSQL, or Postgres.
- Experience with testing tools such as PyTest, PyMock, xUnit, and mocking frameworks.
- Experience with GCP technologies such as Vertex AI, BigQuery, GKE, GCS, Dataflow, and Kubeflow.
- Experience with Docker and Kubernetes.
- Experience with Java and Scala is a plus.
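Given the FastAPI emphasis above, here is a minimal, illustrative sketch of a service endpoint of the kind this stack implies. The /summarize route and its truncation "model" are invented placeholders; a real service would call a GenAI model (for example via LangChain) instead.

```python
# Minimal FastAPI sketch. The route and its placeholder logic are invented.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Prompt(BaseModel):
    text: str

@app.post("/summarize")
def summarize(prompt: Prompt) -> dict:
    summary = prompt.text[:120]  # stand-in for a real model call
    return {"summary": summary}

# Run locally with: uvicorn main:app --reload  (assuming this file is main.py)
```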

Posted 2 months ago

12 - 17 years

14 - 19 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: Apache Spark, Python (programming language), Google BigQuery
Minimum experience: 12 years
Educational Qualification: 15 years of full-time education

Summary:
As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the team in implementing PySpark solutions effectively.
- Conduct code reviews and ensure adherence to best practices.
- Provide technical guidance and mentorship to junior team members.

Professional & Technical Skills:
- Must-have: proficiency in PySpark, Python (programming language), Apache Spark, and Google BigQuery.
- Strong understanding of distributed computing and parallel processing.
- Experience optimizing PySpark jobs for performance.
- Knowledge of data processing and transformation techniques.
- Familiarity with cloud platforms for deploying PySpark applications.

Additional Information:
- The candidate should have a minimum of 12 years of experience in PySpark.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
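The performance-optimization skill called out above usually comes down to a few routine checks: inspect the plan, control shuffle partitioning, and cache only what is reused. An illustrative sketch; the GCS path, partition count, and column names are assumptions.

```python
# Illustrative PySpark tuning sketch. Path, partition count, and columns
# are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("tuning-demo").getOrCreate()
events = spark.read.parquet("gs://example-bucket/events/")

daily = events.groupBy("event_date").agg(F.count("*").alias("n"))
daily.explain()  # review the physical plan before shipping the job

# Repartition on the grouping key to limit shuffle skew; cache only when the
# same intermediate result is reused by several downstream actions.
events = events.repartition(200, "event_date").cache()
```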

Posted 2 months ago

3 - 8 years

5 - 10 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: BTech

Summary:
As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements.
- Design, develop, and test applications based on business needs.
- Troubleshoot and debug applications to ensure optimal performance.
- Implement security and data protection measures.
- Document technical specifications and user guides.
- Stay up to date with emerging technologies and industry trends.

Professional & Technical Skills:
- Must-have: proficiency in Google BigQuery.
- Strong understanding of SQL and database concepts.
- Experience with data modeling and schema design.
- Knowledge of ETL processes and data integration techniques.
- Familiarity with cloud platforms such as Google Cloud Platform.
- Good-to-have: experience with data visualization tools such as Tableau or Power BI.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Hyderabad office.
- A BTech degree is required.

Qualifications: BTech
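The schema-design and ETL skills listed above show up concretely in schema-first loads into BigQuery. A minimal, illustrative sketch; the table ID, bucket, and field definitions are placeholders.

```python
# Minimal sketch: schema-first CSV load into BigQuery. Names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "example-project.staging.customers"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    schema=[
        bigquery.SchemaField("customer_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("signup_date", "DATE"),
        bigquery.SchemaField("lifetime_value", "NUMERIC"),
    ],
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
job = client.load_table_from_uri(
    "gs://example-bucket/customers.csv", table_id, job_config=job_config
)
job.result()  # wait for the load to finish
print(client.get_table(table_id).num_rows, "rows loaded")
```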

Posted 2 months ago

4 - 9 years

16 - 31 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Role & responsibilities:
- Execute project-specific development activities in accordance with applicable standards and quality parameters.
- Develop and review code; set up the right environment for the projects.
- Ensure delivery within schedule by adhering to engineering and quality standards.
- Own and deliver end-to-end projects within GCP for the Payments Data Platform.
- Be available on a support rota for one week each month, providing 24x7 on-call support for GCP.
- Basic knowledge of Payments ISO standards, message types, etc.
- Able to work under pressure on deliverables, P1 violations, and incidents.
- Fluent and clear in communications, both written and verbal.
- Able to follow Agile ways of working.
- Must have hands-on experience with Java, GCP, and shell scripting; Python knowledge is a plus.
- In-depth knowledge of Java Spring Boot.
- Experience with GCP Dataflow, Bigtable, BigQuery, etc.
- Experience managing large databases.
- Experience with requirements, design, and development of event-driven and near-real-time data patterns (ingress/egress).

Posted 2 months ago

7 - 12 years

13 - 17 Lacs

Gurugram

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyze data for functional business requirements and to interface directly with customers.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills, contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge, maintained by attending educational workshops and reviewing publications.
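GCS comes first in the skills list above, and the basic operations are brief in Python. A minimal, illustrative sketch with the google-cloud-storage client; the bucket and object names are placeholders.

```python
# Minimal sketch: upload and list objects with google-cloud-storage.
# Bucket and object names are placeholders.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("example-landing-bucket")

bucket.blob("exports/report.csv").upload_from_filename("report.csv")

for blob in client.list_blobs("example-landing-bucket", prefix="exports/"):
    print(blob.name, blob.size)
```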

Posted 2 months ago

1 - 3 years

3 - 6 Lacs

Bengaluru

Work from Office

Skill required: Record to Report - Invoice Processing
Designation: Record to Report Operations Associate
Qualifications: BCom/MCom
Years of Experience: 1 to 3 years
Language ability: English (domestic), expert level

What would you do?
You will be aligned with our Finance Operations vertical and will help us determine financial outcomes by collecting operational data/reports, conducting analysis, and reconciling transactions: posting journal entries, preparing balance sheet reconciliations, reviewing entries and reconciliations, preparing cash forecasting statements, supporting month-end closing, preparing reports, and supporting audits. Invoice processing refers to the systematic handling and management of incoming invoices within a business or organization. It involves tasks such as verifying the accuracy of the invoice, matching it with purchase orders and delivery receipts, and initiating the payment process. Automated systems and software are often employed to streamline and expedite the invoice processing workflow, improving efficiency and reducing the likelihood of errors.

What are we looking for?
- Google Cloud SQL
- Adaptable and flexible
- Ability to perform under pressure
- Problem-solving skills
- Agility for quick learning
- Commitment to quality

Roles and Responsibilities:
- In this role you are required to solve routine problems, largely through precedent and referral to general guidelines.
- Your expected interactions are within your own team and with your direct supervisor.
- You will be provided detailed to moderate-level instruction on daily work tasks and detailed instruction on new assignments.
- The decisions that you make impact your own work.
- You will be an individual contributor as part of a team, with a predetermined, focused scope of work.
- Please note that this role may require you to work in rotational shifts.

Qualification: BCom, MCom

Posted 2 months ago

7 - 10 years

16 - 21 Lacs

Mumbai

Work from Office

Position Overview:
The Google Cloud Data Engineering Lead role is ideal for an experienced Google Cloud Data Engineer who will drive the design, development, and optimization of data solutions on the Google Cloud Platform (GCP). The role requires the candidate to lead a team of data engineers and collaborate with data scientists, analysts, and business stakeholders to enable scalable, secure, and high-performance data pipelines and analytics platforms.

Key Responsibilities:
- Lead and manage a team of data engineers delivering end-to-end data pipelines and platforms on GCP.
- Design and implement robust, scalable, and secure data architectures using services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Develop and maintain batch and real-time ETL/ELT workflows using tools such as Apache Beam, Dataflow, or Composer (Airflow).
- Collaborate with data scientists, analysts, and application teams to gather requirements and ensure data availability and quality.
- Define and enforce data engineering best practices, including version control, testing, code reviews, and documentation.
- Drive automation and infrastructure-as-code approaches using Terraform or Deployment Manager for provisioning GCP resources.
- Implement and monitor data quality, lineage, and governance frameworks across the data platform.
- Optimize query performance and storage strategies, particularly within BigQuery and other GCP analytics tools.
- Mentor team members and contribute to the growth of technical capabilities across the organization.

Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience: 7+ years of experience in data engineering, including 3+ years working with GCP data services; proven leadership experience in managing and mentoring data engineering teams.
- Skills: Expert-level understanding of BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub; strong SQL and Python skills for data processing and orchestration; experience with workflow orchestration tools (Airflow/Composer); hands-on experience with CI/CD, Git, and infrastructure-as-code tools (e.g., Terraform); familiarity with data security, governance, and compliance practices in cloud environments.
- Certifications: GCP Professional Data Engineer certification.
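The Composer (Airflow) orchestration described above typically looks like a small DAG wrapping a BigQuery job. An illustrative sketch; the DAG ID and SQL are placeholders, and it assumes the apache-airflow-providers-google package is installed.

```python
# Illustrative sketch: a daily Airflow/Composer DAG running a BigQuery job.
# DAG ID and SQL are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_rollup",
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup",
        configuration={
            "query": {
                "query": "SELECT DATE(ts) AS d, SUM(amount) FROM `example.sales` GROUP BY d",
                "useLegacySql": False,
            }
        },
    )
```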

Posted 2 months ago

4 - 9 years

10 - 14 Lacs

Pune

Hybrid

Job Description - Technical Skills:
Top skills for this position: Google Cloud Platform (Composer, BigQuery, Airflow, Dataproc, Dataflow, GCS).
- Data warehousing knowledge.
- Hands-on experience with the Python language and SQL databases.
- Analytical technical skills: able to predict the consequences of configuration changes (impact analysis), identify root causes that are not obvious, and understand the business requirements.
- Excellent communication with different stakeholders (business, technical, project).
- Good understanding of the overall Big Data and Data Science ecosystem.
- Experience with building and deploying containers as services using Swarm/Kubernetes.
- Good understanding of container concepts, such as building lean and secure images.
- Understanding of modern DevOps pipelines.
- Experience with streaming data pipelines using Kafka or Pub/Sub (mandatory for Kafka resources).
Good to have: Professional Data Engineer or Associate Data Engineer certification.

Roles and Responsibilities:
- Design, build, and manage big data ingestion and processing applications on Google Cloud using BigQuery, Dataflow, Composer, Cloud Storage, and Dataproc.
- Performance tuning and analysis of Spark, Apache Beam (Dataflow), or similar distributed computing tools and applications on Google Cloud.
- Good understanding of Google Cloud concepts, environments, and utilities to design cloud-optimal solutions for machine learning applications.
- Build systems to perform real-time data processing using Kafka, Pub/Sub, Spark Streaming, or similar technologies.
- Manage the development lifecycle for agile software development projects.
- Convert proofs of concept into industrialized machine learning models (MLOps).
- Provide solutions to complex problems; deliver customer-oriented solutions in a timely, collaborative manner.
- Proactive thinking, planning, and understanding of dependencies.
- Develop and implement robust solutions in test and production environments.
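For the Kafka-based streaming pipelines called out as mandatory above, a consumer loop is the usual starting point. A minimal, illustrative sketch using the kafka-python package; the topic, broker address, and group ID are placeholders.

```python
# Minimal Kafka consumer sketch (pip install kafka-python).
# Topic, broker, and group ID are placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "payments",
    bootstrap_servers=["broker:9092"],
    group_id="payments-etl",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for record in consumer:
    print(record.partition, record.offset, record.value)
```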

Posted 2 months ago

4 - 7 years

6 - 16 Lacs

Bengaluru

Work from Office

Senior Software Engineer - Google Cloud Platform (GCP)
Location: Bangalore, India

Why Join Fossil Group?
At Fossil Group, we are part of an international team that dares to dream, disrupt, and deliver innovative watches, jewelry, and leather goods to the world. We're committed to long-term value creation, driven by technology and our core values: Authenticity, Grit, Curiosity, Humor, and Impact. If you are a forward-thinker who thrives in a diverse, global setting, we want to hear from you.

Make an Impact (Job Summary + Responsibilities):
We are looking for a Senior Software Engineer - GCP to join our growing Cloud & Data Engineering team at Fossil Group. This role involves building scalable cloud-native data pipelines using Google Cloud Platform services, with a focus on Dataflow, Dataproc, and BigQuery, and strong development skills in Java, Python, and SQL. You will collaborate with global architects and business teams to design and deploy innovative solutions, supporting data analytics, automation, and transformation needs.

What you will do in this role:
- Design, develop, deploy, and maintain data pipelines and services using GCP technologies, including Dataflow, Dataproc, BigQuery, Composer, and others.
- Translate blueprinting documents and business requirements into scalable and maintainable GCP configurations and solutions.
- Develop and enhance cloud-based batch/streaming jobs using Java or Python.
- Collaborate with global architects and cross-functional teams to define solutions and execute projects across development and testing environments.
- Perform unit testing and integration testing, and resolve issues arising during the QA lifecycle.
- Work closely with internal stakeholders to gather requirements, present technical solutions, and provide end-to-end delivery.
- Own and manage project timelines, priorities, and documentation.
- Continuously improve processes and stay current with GCP advancements and big data technologies.

Who You Are (Requirements):
- Bachelor's degree in Computer Science or a related field.
- 4-7 years of experience as a DB/SQL developer or Java/Python developer with strong SQL capabilities.
- Hands-on experience with GCP components such as Dataflow, Dataproc, BigQuery, Cloud Functions, Composer, and Data Fusion.
- Excellent command of SQL, with the ability to write complex queries and perform advanced data transformation.
- Strong programming skills in Java and/or Python, specifically for building cloud-native data pipelines.
- Experience with relational and NoSQL databases.
- Ability to understand and translate business requirements into functional specifications.
- Familiarity with BI dashboards and Google Data Studio is a plus.
- Strong problem-solving, communication, and collaboration skills.
- Self-directed, with a growth mindset and eagerness to upskill in emerging GCP technologies.
- Comfortable leading meetings, gathering requirements, and managing stakeholder communication across regions.

What We Offer:
- Comprehensive benefits, including health and well-being services.
- Paid parental leave and a Return to Work Program: support for new parents and caregivers with paid leave and a flexible phase-back schedule.
- Generous paid time off, including sick time, personal days, and Summer Flex Fridays.
- Employee discounts on Fossil merchandise.

EEO Statement:
At Fossil, we believe our differences not only make us stronger as a team, but also help us create better products and a richer community. We are an Equal Employment Opportunity Employer dedicated to a policy of non-discrimination in all employment practices without regard to age, disability, gender identity or expression, marital status, pregnancy, race, religion, sexual orientation, or any other protected characteristic.

Posted 2 months ago

4 - 9 years

22 - 30 Lacs

Pune

Hybrid

Primary skills: SQL (data analysis and development)
Alternate skills: Python, SharePoint, AWS, ETL, telecom (especially the Fixed Network domain)
Location: Pune
Working persona: Hybrid
Experience: 4 to 10 years

Core competencies, knowledge, and experience (essential):
- Strong SQL experience, at an advanced level.
- Excellent data interpretation skills.
- Good knowledge of ETL and business intelligence; good understanding of a range of data manipulation and analysis techniques.
- Working knowledge of large information technology development projects using methodologies and standards.
- Excellent verbal, written, and interpersonal communication skills, demonstrating the ability to communicate information technology concepts to non-technology personnel; should be able to interact with the business team and share ideas.
- Strong analytical, problem-solving, and decision-making skills, with the attitude to plan and organize work to deliver as agreed.
- Ability to work under pressure to tight deadlines.
- Hands-on experience working with large datasets.
- Able to manage different stakeholders.

Posted 2 months ago

10 - 20 years

25 - 40 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Hi, hope you are looking for a job change. We have an opening for a GCP Data Architect at an MNC (pan-India locations). I'm sharing the JD with you; please have a look and revert with the details below and your updated resume. Apply only if you can join within 10 days. This is 5 days' work from office. We do not process candidates with long notice periods.

Role: GCP Data Architect
Experience: 10+ years
Mode: Permanent
Work location: Pan India
Notice period: Immediate to 10 days
Mandatory skills: GCP, architecture experience, big data, data modelling, BigQuery

Details requested:
- Full name (as per Aadhaar card):
- Email ID:
- Mobile number / alternate number:
- Qualification, graduation year, regular course:
- Total experience / relevant experience:
- Current organization; working as a permanent employee; payroll company:
- Experience in GCP:
- Experience in architecture / GCP data architecture:
- Experience in big data:
- Experience in BigQuery:
- Experience in data management:
- Official notice period; serving notice period; last working day:
- Current location / preferred location:
- Current CTC / expected CTC / CTC breakup:
- PAN card number:
- Date of birth:
- Any offer in hand; offered CTC:
- Can you join immediately:
- Ready to work from office 5 days a week:

Job Description: GCP Data Architect
We are seeking a skilled data solution architect to design solutions and lead the implementation on GCP. The ideal candidate will possess extensive experience in data architecture, solution design, and data management practices.

Responsibilities:
- Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms.
- Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions.
- Develop high-level and detailed data architecture and design documentation.
- Implement data management and data governance strategies, ensuring compliance with industry standards.
- Architect both batch and real-time data solutions, leveraging cloud-native services and technologies.
- Design and manage data pipeline processes for historic data migration and data integration.
- Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables.
- Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies.
- Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively.
- Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.

Requirements:
- Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments.
- Experience with common GCP services such as BigQuery, Dataflow, GCS, service accounts, and Cloud Functions.
- Extremely strong in BigQuery design and development.
- Extensive knowledge and implementation experience in data management, governance, and security frameworks.
- Proven experience in creating high-level and detailed data architecture and design documentation.
- Strong aptitude for business analysis to understand domain data requirements.
- Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred.
- Hands-on experience architecting end-to-end data solutions for both batch and real-time designs.
- Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions.
- Familiarity with Data Fabric and Data Mesh architecture is a plus.
- Excellent verbal and written communication skills.

Regards,
Rejeesh S
Email: rejeesh.s@jobworld.jobs
Mobile: +91-9188336668

Posted 2 months ago

5 - 9 years

10 - 20 Lacs

Pune, Bengaluru, Delhi / NCR

Hybrid

Mandatory skills: GCS, Composer, BigQuery, Azure, Azure Databricks, ADLS.
Experience: 5-10 years

Good-to-have skills:
- Knowledge of CDP (customer data platform).
- Airline domain knowledge.

Responsibilities:
- Design and develop data-ingestion frameworks, real-time processing solutions, and data processing and transformation frameworks leveraging open-source tools and data processing frameworks.
- Hands-on with technologies such as Kafka, Apache Spark (SQL, Scala, Java), Python, the Hadoop platform, Hive, and Airflow.
- Experience in GCP Cloud Composer, BigQuery, and Dataproc.
- Offer system support as part of a support rotation with other team members.
- Operationalize open-source data-analytic tools for enterprise use.
- Ensure data governance policies are followed by implementing or validating data lineage, quality checks, and data classification.
- Understand and follow the company development lifecycle to develop, deploy, and deliver the solutions.

Minimum Qualifications:
- Bachelor's degree in Computer Science, CIS, or a related field.
- 5-7 years of IT experience in software engineering or a related field.
- Experience on projects involving the implementation of software development life cycles (SDLC).
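The real-time processing responsibility above is commonly met with Spark Structured Streaming reading from Kafka. An illustrative sketch; the broker address and topic are placeholders, and the job needs the spark-sql-kafka connector package on the classpath.

```python
# Illustrative sketch: Spark Structured Streaming from Kafka.
# Broker and topic are placeholders; requires the spark-sql-kafka package.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "bookings")
    .load()
)
events = raw.select(F.col("value").cast("string").alias("payload"))

query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```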

Posted 2 months ago

6 - 10 years

16 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Hiring: Senior Power BI Developer
Client: MNC (full-time, permanent)
Experience: 6-12 years
Notice period: Immediate / serving / 15 days
Location: Pan India
Skills: Power BI dashboards, Power Query, GCP (BigQuery), SQL Server, Power Apps

Kindly fill in the details below and share your updated CV to mansoor@burgeonits.com:
- Name as per Aadhaar card
- Mobile no. and alternate no.
- Email ID and alternate email
- Date of birth
- PAN card no. (for client upload; mandatory)
- Total experience
- Relevant experience
- Current company
- Payroll company name, if any
- Notice period (if serving notice, mention the last working day)
- Current CTC
- Expected CTC
- Any offers (yes/no); if yes, the offered CTC and joining date
- Current location and preferred location
- Happy to relocate (yes/no)
- Available interview time slots

Posted 2 months ago

11 - 20 years

45 - 75 Lacs

Gurugram

Work from Office

Role & responsibilities:
- Proven success architecting and scaling complex software solutions; familiarity with interface design.
- Experience and ability to drive a project/module independently from an execution standpoint.
- Prior experience with scalable architecture and distributed processing.
- Strong programming expertise in Python, SQL, and Scala.
- Hands-on experience with any major big data solutions such as Spark, Kafka, or Hive.
- Strong data management skills across ETL, DWH, data quality, and data governance.
- Hands-on experience with microservices architecture, Docker, and Kubernetes as orchestration.
- Experience with cloud-based data stores such as Redshift and BigQuery.
- Experience in cloud solution architecture.
- Experience architecting Spark jobs on Kubernetes and optimizing Spark jobs.
- Experience with MLOps architectures, tools, and orchestrators such as Kubeflow and MLflow.
- Experience with logging, metrics, and distributed tracing systems (e.g., Prometheus/Grafana/Kibana).
- Experience with CI/CD using Octopus/TeamCity/Jenkins.

Interested candidates can share their updated resume at surinder.kaur@mounttalent.com.
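MLflow, one of the MLOps tools the role lists, centers on experiment tracking. A minimal, illustrative sketch; the tracking URI, experiment name, and logged values are placeholders.

```python
# Minimal MLflow tracking sketch. URI, experiment, and values are placeholders.
import mlflow

mlflow.set_tracking_uri("http://mlflow.internal:5000")
mlflow.set_experiment("churn-model")

with mlflow.start_run():
    mlflow.log_param("max_depth", 8)
    mlflow.log_metric("auc", 0.91)
    # mlflow.sklearn.log_model(model, "model")  # once a fitted model exists
```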

Posted 2 months ago

5 - 10 years

20 - 25 Lacs

Bengaluru

Work from Office

About the Role:
Job Title: Transformation Principal Change Analyst
Corporate Title: AVP
Location: Bangalore, India

Role Description:
We are looking for an experienced Change Manager to lead a variety of regional/global change initiatives. Utilizing the tenets of PMI, you will lead cross-functional initiatives that transform the way we run our operations. If you like to solve complex problems, have a get-things-done attitude, and are looking for a highly visible, dynamic role where your voice is heard and your experience is appreciated, come talk to us.

What we'll offer you:
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy.
- Gender-neutral parental leave.
- 100% reimbursement under the childcare assistance benefit (gender neutral).
- Sponsorship for industry-relevant certifications and education.
- Employee Assistance Program for you and your family members.
- Comprehensive hospitalization insurance for you and your dependents.
- Accident and term life insurance.
- Complimentary health screening for those aged 35 and above.

Your key responsibilities:
- Responsible for change management planning, execution, and reporting, adhering to governance standards and ensuring transparency around progress status.
- Use data to tell the story; maintain risk management controls; monitor and communicate initiative risks.
- Collaborate with other departments as required to execute on timelines to meet the strategic goals.
- As part of the larger team, be accountable for the delivery and adoption of the global change portfolio, including but not limited to business case development/analysis, reporting, measurement of adoption success measures, and continuous improvement.
- As required, using data to tell the story, participate in Working Group and Steering Committee meetings to achieve the right level of decision-making and progress/transparency, establishing strong partnerships and collaborative relationships with various stakeholder groups to remove constraints to success and carry learnings forward to future projects.
- As required, develop and document end-to-end roles and responsibilities, including process flows, operating procedures, and required controls, and gather and document business requirements (user stories), including liaising with end users and performing analysis of gathered data.
- Be heavily involved in the product development journey.

Your skills and experience:
- Overall experience of at least 7-10 years leading complex change programs/projects, communicating and driving transformation initiatives using the tenets of PMI in a highly matrixed environment.
- Banking/finance/regulated-industry experience, of which at least 2 years in the change/transformation space or associated with change/transformation initiatives, is a plus.
- Knowledge of client lifecycle processes and procedures, and experience with KYC data structures/data flows, is preferred.
- Experience working with management reporting is preferred.
- Bachelor's degree.

How we'll support you:
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.

About us and our teams:
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 months ago

1 - 5 years

6 - 11 Lacs

Pune

Work from Office

About the Role:
Job Title: Data Engineer for Private Bank One Data Platform on Google Cloud
Corporate Title: Associate
Location: Pune, India

Role Description:
As part of one of the internationally staffed agile teams of the Private Bank One Data Platform, you will be part of the "TDI PB Germany Enterprise & Data" division. The focus here is on the development, design, and provision of different solutions in the field of data warehousing, reporting, and analytics for the Private Bank, to ensure that the necessary data is provided for operational and analytical purposes. The PB One Data Platform is the new strategic data platform of the Private Bank and uses the Google Cloud Platform as its basis. With Google as a close partner, we are following Deutsche Bank's cloud strategy, with the aim of transferring or rebuilding a significant share of today's on-prem applications to the Google Cloud Platform.

What we'll offer you:
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy.
- Gender-neutral parental leave.
- 100% reimbursement under the childcare assistance benefit (gender neutral).
- Sponsorship for industry-relevant certifications and education.
- Employee Assistance Program for you and your family members.
- Comprehensive hospitalization insurance for you and your dependents.
- Accident and term life insurance.
- Complimentary health screening for those aged 35 and above.

Your key responsibilities:
- Work within software development applications as a Data Engineer to provide fast and reliable data solutions for warehousing, reporting, and Customer and Business Intelligence solutions.
- Partner with service/backend engineers to integrate data provided by legacy IT solutions into the databases you design, and make it accessible to the services consuming that data.
- Focus on the design and setup of databases, data models, and data transformations (ETL), as well as critical online-banking business processes in the context of Customer Intelligence, financial reporting, and performance controlling.
- Contribute to data harmonization as well as data cleansing.
- Bring a passion for constantly learning and applying new technologies and programming languages in a constantly evolving environment.
- Build solutions that are highly scalable and can be operated flawlessly under high-load scenarios.
- Together with your team, run and develop your application self-sufficiently.
- Collaborate with Product Owners as well as team members regarding the design and implementation of data analytics solutions, and act as support during the conception of products and solutions.
- When you see a process running with high manual effort, fix it to run automated, optimizing not only our operating model but also giving yourself more time for development.

Your skills and experience:

Mandatory skills:
- Hands-on development work building scalable data engineering pipelines and other data engineering/modelling work using Java/Python.
- Excellent knowledge of SQL and NoSQL databases.
- Experience working in a fast-paced and agile work environment.
- Working knowledge of public cloud environments.

Preferred skills:
- Experience in Dataflow (Apache Beam), Cloud Functions, and Cloud Run.
- Knowledge of workflow management tools such as Apache Airflow/Composer.
- Demonstrated ability to write clear code that is well-documented and stored in a version control system (GitHub).
- Knowledge of GCS buckets, Google Pub/Sub, and BigQuery.
- Knowledge of ETL processes in a data warehouse/data lake environment and how to automate them.

Nice to have:
- Knowledge of provisioning cloud resources using Terraform.
- Knowledge of shell scripting.
- Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
- Knowledge of Google Cloud Monitoring & Alerting.
- Knowledge of Cloud Run, Dataform, and Cloud Spanner.
- Knowledge of the Data Vault 2.0 data warehouse solution.
- Knowledge of New Relic.
- Excellent analytical and conceptual thinking.
- Excellent communication skills, strong independence and initiative, and the ability to work in agile delivery teams.
- Good communication and experience working with distributed teams (especially Germany + India).

How we'll support you:
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.

About us and our teams:
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 months ago