
39 Cloud Spanner Jobs

JobPe aggregates results for easy access, but you apply directly on the original job portal.

7.0 - 12.0 years

25 - 27 Lacs

Hyderabad

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Keywords: Dataproc, PySpark, Dataflow, Kafka, Cloud Storage, Terraform, OOPs, Cloud Spanner, Hadoop, Java, Hive, Spark, MapReduce, Big Data, GCP, AWS, JavaScript, MySQL, PostgreSQL, SQL Server, Oracle, Bigtable, software development, SQL*, Python development*, Python*, BigQuery*, Pandas*
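For illustration, below is a minimal sketch of the kind of SQL skill these listings name: a CTE combined with a window function, run through the BigQuery Python client. The project, dataset, and table names are hypothetical.

```python
# Minimal sketch, assuming default credentials; project/dataset/table are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

query = """
WITH daily_orders AS (  -- CTE: aggregate raw orders per customer per day
  SELECT customer_id,
         DATE(order_ts) AS order_day,
         SUM(amount) AS day_total
  FROM `my-project.sales.orders`  -- hypothetical table
  GROUP BY customer_id, order_day
)
SELECT customer_id,
       order_day,
       day_total,
       -- window function: running total per customer, ordered by day
       SUM(day_total) OVER (PARTITION BY customer_id ORDER BY order_day) AS running_total
FROM daily_orders
ORDER BY customer_id, order_day
"""

for row in client.query(query).result():
    print(row.customer_id, row.order_day, row.running_total)
```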

Posted 4 days ago

Apply

3.0 - 6.0 years

6 - 8 Lacs

Noida

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Keywords: Python Development, Python, BigQuery, Pandas, Dataproc, PySpark, Dataflow, Kafka, Cloud Storage, Terraform, OOPs, Cloud Spanner, Hadoop, Java, Hive, Spark, MapReduce, Big Data, GCP, AWS, JavaScript, MySQL, PostgreSQL, SQL Server, Oracle, Bigtable, Software Development, SQL*

Posted 5 days ago

Apply

4.0 - 8.0 years

22 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Keywords: Dataproc, PySpark, Dataflow, Kafka, Cloud Storage, Terraform, OOPs, Cloud Spanner, Hadoop, Java, Hive, Spark, MapReduce, Big Data, GCP, AWS, JavaScript, MySQL, PostgreSQL, SQL Server, Oracle, Bigtable, software development, SQL*, Python development*, Python*, BigQuery*, Pandas*

Posted 5 days ago

Apply

10.0 - 14.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Role Overview: You will work as a GCP Data Architect with 12+ years of total experience, of which at least 10 years should be relevant to the engagement. Your primary responsibilities include maintaining architecture principles, guidelines, and standards; data warehousing; programming in Python/Java; and working with Big Data, Data Analytics, and GCP services. You will design and implement solutions across technology domains related to Google Cloud Platform data components such as BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, Data Fusion, etc.

Key Responsibilities:
- Maintain architecture principles, guidelines, and standards
- Work on data warehousing projects
- Program in Python and Java for various data-related tasks
- Utilize Big Data technologies for data processing and analysis
- Implement solutions using GCP services such as BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, Data Fusion, etc.

Qualifications Required:
- Strong experience in Big Data, including data modeling, design, architecting, and solutioning
- Proficiency in programming languages such as SQL, Python, and R/Scala
- Good Python skills, with experience in data visualization tools such as Google Data Studio or Power BI
- Knowledge of A/B testing, statistics, Google Cloud Platform, Google BigQuery, Agile development, DevOps, data engineering, and ETL data processing
- Migration experience of a production Hadoop cluster to Google Cloud is an added advantage

Additional Company Details: The company is looking for individuals who are experts in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, Cloud SQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, Cloud Interconnect, etc. Relevant certifications such as Google Professional Cloud Architect are preferred.

Posted 5 days ago

Apply

2.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Senior Data Engineer with 5-8 years of IT experience, including 2-3 years focused on GCP data services, you will be a valuable addition to our dynamic data and analytics team. Your primary responsibility will be to design, develop, and implement robust, insightful, data-intensive solutions using GCP cloud services. The role requires a deep understanding of data engineering, proficiency in SQL, and extensive experience with GCP services such as BigQuery, Dataflow, Datastream, Pub/Sub, Dataproc, Cloud Storage, and other key GCP services for data pipeline orchestration. You will be instrumental in the construction of a GCP-native cloud data platform.

Key Responsibilities:
- Lead and contribute to the development, deployment, and lifecycle management of applications on GCP, utilizing services like Compute Engine, Kubernetes Engine (GKE), Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud SQL, Cloud Storage, and more.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 5-8 years of overall IT experience, with hands-on experience in designing and developing data applications on GCP.
- In-depth expertise in GCP services and architectures, including Compute, Storage & Databases, Data & Analytics, and Operations & Monitoring.
- Proven ability to translate business requirements into technical solutions.
- Strong analytical, problem-solving, and critical-thinking skills.
- Effective communication and interpersonal skills for collaboration with technical and non-technical stakeholders.
- Experience in Agile development methodology.
- Ability to work independently, manage multiple priorities, and meet deadlines.

Preferred Skills (Nice to Have):
- Experience with other hyperscalers.
- Proficiency in Python or other scripting languages for data manipulation and automation.

If you are a highly skilled and experienced Data Engineer with a passion for leveraging GCP data services to drive innovation, we invite you to apply for this exciting opportunity in Gurugram or Hyderabad.
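As a small, hedged illustration of the Pub/Sub side of such pipelines, the sketch below publishes a JSON event with the google-cloud-pubsub client; the project and topic names are hypothetical.

```python
# Minimal Pub/Sub publish sketch; project and topic names are hypothetical.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "order-events")  # hypothetical

event = {"order_id": 123, "amount": 49.90}
# Pub/Sub messages carry raw bytes, so serialize the payload first.
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print("published message id:", future.result())  # blocks until the broker acks
```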

Posted 6 days ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Consumer and Community Banking - Banking and Wealth Management team, you serve as a seasoned member of an agile team to design and deliver trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities:
- Executes software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies
- Adds to team culture of diversity, opportunity, inclusion, and respect

Required qualifications, capabilities, and skills:
- Formal training or certification on software engineering concepts and 3+ years of applied experience
- Hands-on experience with cloud-based applications, technologies and tools, deployment, monitoring, and operations, such as Kubernetes, Prometheus, Fluentd, Slack, Elasticsearch, Grafana, Kibana, etc.
- Experience developing and managing operations with relational and NoSQL databases, leveraging key event streaming, messaging, and DB services such as Cassandra, MQ/JMS/Kafka, Aurora, RDS, Cloud SQL, Bigtable, DynamoDB, MongoDB, Cloud Spanner, Kinesis, Cloud Pub/Sub, etc.
- Networking (security, load balancing, network routing protocols, etc.)
- Demonstrated experience in the fields of production engineering and automation
- Strong understanding of cloud technology standards and practices
- Proficiency in utilizing tools for monitoring, analysis, and troubleshooting, including Splunk, Dynatrace, Datadog, or equivalent

Preferred qualifications, capabilities, and skills:
- Ability to conduct detailed analysis of incidents to identify patterns and trends, thereby enhancing operational stability and efficiency
- Familiarity with digital certificate management and automation tools
- Knowledge of frameworks such as CI/CD pipelines
- Excellent communication and collaboration skills

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

This role is for a GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices. You will analyse and manipulate large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics on GCP. You will be responsible for designing the transformation and modernization on GCP. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Responsibilities:
- Develop technical solutions for Data Engineering and work between 1 PM and 10 PM IST to enable more overlap time with European and North American counterparts. This role works closely with teams in the US as well as Europe to ensure robust, integrated migration aligned with Global Data Engineering patterns and standards.
- Design and deploy data pipelines with automated data lineage.
- Develop reusable Data Engineering patterns.
- Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- Ensure timely migration of the Ford Credit Europe (FCE) Teradata warehouse to GCP to enable Teradata platform decommissioning by end of 2025, with a strong focus on ensuring continued, robust, and accurate regulatory reporting capability.

Position Opportunities - the Data Engineer role within FC Data Engineering supports the following opportunities for successful individuals:
- Key player in a high-priority program to unlock the potential of Data Engineering products and services and secure operational resilience for Ford Credit Europe.
- Explore and implement leading-edge technologies, tooling, and software development best practices.
- Experience of managing data warehousing and product delivery within a financially regulated environment.
- Experience of collaborative development practices within an open-plan, team-designed environment.
- Experience of working with third-party suppliers / supplier management.
- Continued personal and professional development, with support and encouragement for further certification.

Qualifications - Essential:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
- 5+ years of SQL development experience.
- 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.
- Strong understanding of key GCP services, especially those related to data processing (batch/real time) leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage services including Cloud Storage, Bigtable, and Cloud Spanner.
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, and diverse team.
- Experience developing with microservice architecture on a container orchestration framework.
- Experience designing pipelines and architectures for data processing.
- Strong evidence of self-motivation to continuously develop your own engineering skills and those of the team.
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support.
- Evidence of a proactive mindset to problem solving and willingness to take the initiative.
- Strong prioritization, coordination, organizational, and communication skills, and a proven ability to balance workload and competing demands to meet deadlines.

Qualifications - Desired:
- Professional certification in GCP (e.g., Professional Data Engineer).
- Data engineering or development experience gained in a regulated, financial environment.
- Experience with Teradata-to-GCP migrations is a plus.
- Strong expertise in SQL and experience with programming languages such as Python, Java, and/or Apache Beam.
- Experience of coaching and mentoring data engineers.
- Experience with data security, governance, and compliance best practices in the cloud.
- An understanding of current architecture standards and digital platform services strategy.
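As a hedged sketch of the pipeline pattern named above (read from Cloud Storage, transform, load to BigQuery), here is a minimal Apache Beam pipeline in Python; the bucket, project, and table names are hypothetical, and the same code runs on Dataflow when given Dataflow runner options.

```python
# Minimal Apache Beam sketch; paths, project, and table are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    # Hypothetical CSV layout: customer_id,amount
    customer_id, amount = line.split(",")
    return {"customer_id": customer_id, "amount": float(amount)}

options = PipelineOptions()  # pass --runner=DataflowRunner etc. to run on GCP
with beam.Pipeline(options=options) as p:
    (p
     | "Read" >> beam.io.ReadFromText("gs://my-bucket/orders/*.csv")
     | "Parse" >> beam.Map(parse_line)
     | "Write" >> beam.io.WriteToBigQuery(
           "my-project:sales.orders",
           schema="customer_id:STRING,amount:FLOAT",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
```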

Posted 1 week ago

Apply

5.0 - 7.0 years

20 - 25 Lacs

Chennai

Work from Office

Position Description: Representing the Ford Credit (FC) Data Engineering organization as a Google Cloud Platform (GCP) Data Engineer specializing in migration and transformation, you will be a developer on a global team building a complex data warehouse on Google Cloud Platform. This role involves designing, implementing, and optimizing data pipelines, ensuring data integrity during migration, and leveraging GCP services to enhance data transformation processes for scalability and efficiency. This role is for a GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices. You will analyze and manipulate large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics on GCP. You will be responsible for designing the transformation and modernization on GCP. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Experience Required:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
- 5+ years of SQL development experience.
- 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.
- Strong understanding and experience of key GCP services, especially those related to data processing (batch/real time) leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage services including Cloud Storage, Bigtable, and Cloud Spanner.
- Experience developing with microservice architecture on a container orchestration framework.
- Experience designing pipelines and architectures for data processing.
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, and diverse team.
- Strong evidence of self-motivation to continuously develop your own engineering skills and those of the team.
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support.
- Evidence of a proactive mindset to problem solving and willingness to take the initiative.
- Strong prioritization, coordination, organizational, and communication skills, and a proven ability to balance workload and competing demands to meet deadlines.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 11 Lacs

New Delhi, Hyderabad, Bengaluru

Work from Office

Job Description:
- Should have worked on Google Cloud services extensively
- Experience designing highly available and scalable systems
- Familiar with Google Cloud BigQuery, Compute Engine, App Engine, Storage, Cloud Spanner, Dataflow, and Cloud IAM
- Exposure to Apache Kafka is an advantage
- Experience with DevOps and the automation tool Terraform
- Familiar with container tooling such as Google Kubernetes Engine, Docker, and Helm
- Ability to effectively communicate complex technical concepts to a broad range of audiences

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 11 Lacs

Pune, Bengaluru, Delhi/NCR

Work from Office

Job Description:
- Should have worked on Google Cloud services extensively
- Experience designing highly available and scalable systems
- Familiar with Google Cloud BigQuery, Compute Engine, App Engine, Storage, Cloud Spanner, Dataflow, and Cloud IAM
- Exposure to Apache Kafka is an advantage
- Experience with DevOps and the automation tool Terraform
- Familiar with container tooling such as Google Kubernetes Engine, Docker, and Helm
- Ability to effectively communicate complex technical concepts to a broad range of audiences

Posted 2 weeks ago

Apply

3.0 - 5.0 years

10 - 13 Lacs

Chennai

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Mandatory Key Skills: Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer, Airflow, Kafka, Cloud Spanner streaming, Python development, Pandas, PySpark, SQL, GCP, BigQuery, Dataproc, Dataflow, Cloud Storage*

Posted 3 weeks ago

Apply

7.0 - 12.0 years

12 - 16 Lacs

Hyderabad

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 3 weeks ago

Apply

3.0 - 6.0 years

6 - 8 Lacs

Noida

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 3 weeks ago

Apply

4.0 - 8.0 years

22 - 25 Lacs

Bengaluru

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 3 weeks ago

Apply

4.0 - 8.0 years

22 - 25 Lacs

Chennai

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 3 weeks ago

Apply

4.0 - 8.0 years

22 - 25 Lacs

Hyderabad

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 3 weeks ago

Apply

4.0 - 9.0 years

15 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities Design, build, and optimize real-time and batch data pipelines using Google Cloud Dataflow (Apache Beam). 36 years of experience as a Data Engineer with strong knowledge of Google Cloud Platform (GCP) . Hands-on experience in Dataflow / Apache Beam for real-time and batch processing. Experience with Google Cloud Spanner (schema design, query optimization, replication). Proficiency in BigQuery for large-scale data analysis and optimization. Solid understanding of streaming data architectures (Pub/Sub, Kafka, or similar) .

Posted 4 weeks ago

Apply

0.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Assistant Vice President (AVP), Google Cloud Platform Pre-Sales Solution Architect - Data & AI.

About Genpact: Genpact (NYSE: G) is a global professional services firm delivering outcomes that transform businesses. With a proud 25-year history and over 125,000 diverse professionals in 30+ countries, we are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for our clients. We serve leading global enterprises, including the Fortune Global 500, leveraging our deep industry expertise, digital innovation, and cutting-edge capabilities in data, technology, and AI. Join our team to shape the future of business through intelligent operations and drive meaningful impact.

The Opportunity: Genpact is seeking a highly accomplished and visionary Assistant Vice President (AVP), Google Cloud Platform Pre-Sales Solution Architect, specializing in Data and Artificial Intelligence. This pivotal role will be instrumental in driving Genpact's growth in the GCP ecosystem by leading complex pre-sales engagements, designing transformative data and AI solutions, and fostering executive-level client relationships. You will operate at the intersection of business strategy and cutting-edge technology, translating intricate client challenges into compelling, implementable solutions on Google Cloud.

Responsibilities:
- Executive Solutioning & Strategy: Lead the end-to-end technical pre-sales cycle for Genpact's most strategic data and AI opportunities on GCP. Engage at the CXO level and with senior business and IT stakeholders to deeply understand their strategic objectives, pain points, and competitive landscape.
- Architectural Leadership: Design and articulate highly scalable, secure, and resilient enterprise-grade data and AI architectures on Google Cloud Platform. This includes expertise in BigQuery, Dataflow, Dataproc, Vertex AI (MLOps, Generative AI), Cloud AI services, Looker, Pub/Sub, Cloud Storage, Data Catalog, and other relevant GCP services.
- Value Proposition & Storytelling: Develop and deliver highly impactful presentations, workshops, and proof-of-concepts (POCs) that clearly demonstrate the business value and ROI of Genpact's data and AI solutions on GCP. Craft compelling narratives that resonate with both technical and non-technical audiences.
- Deal Ownership & Closure: Work collaboratively with sales teams to own the technical solutioning and commercial structuring of deals from qualification to closure. Lead the estimation, negotiation, and transition of deals to the delivery organization, ensuring alignment and seamless execution.
- Technical Deep Dive & Expertise: Provide deep technical expertise on Google Cloud's Data & AI portfolio, staying at the forefront of new service offerings, product roadmaps, and competitive differentiators. Act as the subject matter expert in client discussions and internal enablement.
- Cross-Functional Collaboration: Partner effectively with Genpact's sales, delivery, product development, and industry vertical teams to ensure that proposed solutions are innovative, deliverable, and aligned with market demands and Genpact's capabilities.
- Thought Leadership: Contribute to Genpact's market presence and intellectual property through whitepapers, conference presentations, industry events, and client advisory sessions. Position Genpact as a leader in data-driven transformation on GCP.
- Team Mentorship & Enablement: Provide mentorship and technical guidance to junior pre-sales architects and delivery teams, fostering a culture of continuous learning and excellence in GCP Data & AI.

Qualifications we seek in you! Minimum Qualifications:
- Progressive experience in data, analytics, artificial intelligence, and cloud technologies, with a strong focus on technical pre-sales, solution architecture, or consulting leadership roles.
- Hands-on experience architecting, designing, and delivering complex data and AI solutions on Google Cloud Platform.
- Deep and demonstrable expertise across the Google Cloud Data & AI stack:
  - Core Data Services: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Cloud SQL, Cloud Spanner, Composer, Data Catalog, Dataplex.
  - AI/ML Services: Vertex AI (including MLOps, Workbench, Training, Prediction, Explainable AI), Generative AI offerings (e.g., Gemini, Imagen), Natural Language API, Vision AI, Speech-to-Text, Dialogflow, Recommendations AI.
  - BI & Visualization: Looker, Data Studio.
- Proven track record of successfully leading and closing multi-million-dollar deals involving complex data and AI solutions on cloud platforms.
- Exceptional executive presence with the ability to engage, influence, and build trusted relationships with C-level executives and senior stakeholders.
- Strong commercial acumen and experience in structuring complex deals, including pricing models, risk assessment, and contract negotiation.
- Outstanding communication, presentation, and storytelling skills, with the ability to articulate complex technical concepts into clear, concise business benefits.
- Demonstrated ability to lead cross-functional teams and drive consensus in dynamic and ambiguous environments.
- Bachelor's degree in Computer Science, Engineering, or a related technical field. Master's degree or MBA preferred.
- Google Cloud professional certifications are highly preferred (e.g., Professional Cloud Architect, Professional Data Engineer, Professional Machine Learning Engineer).
- Ability to travel as required to client sites and internal meetings.

Why join Genpact?
- Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation
- Make an impact - drive change for global enterprises and solve business challenges that matter
- Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 months ago

Apply

7.0 - 12.0 years

20 - 30 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Work Location: Bangalore/Pune/Hyderabad/NCR. Experience: 5-12 years. Required Skills: Proven experience as a Data Engineer with expertise in GCP. Strong understanding of data warehousing concepts and ETL processes. Experience with BigQuery, Dataflow, and other GCP data services. Design, develop, and maintain data pipelines on GCP. Implement data storage solutions and optimize data processing workflows. Ensure data quality and integrity throughout the data lifecycle. Collaborate with data scientists and analysts to understand data requirements. Monitor and maintain the health of the data infrastructure. Troubleshoot and resolve data-related issues. Thanks & Regards, Suganya R (Suganya@spstaffing.in)

Posted 2 months ago

Apply

15.0 - 16.0 years

3 - 6 Lacs

Mumbai, Maharashtra, India

On-site

More than 15 years of experience in technical, solutioning, and analytical roles. 5+ years of experience in building and managing data lakes, data warehouses, data integration, data migration, and Business Intelligence/Artificial Intelligence solutions on cloud (GCP/AWS/Azure). Ability to understand business requirements, translate them into functional and non-functional areas, and define non-functional boundaries in terms of availability, scalability, performance, security, resilience, etc. Experience in architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets. Experience of having worked in distributed computing and enterprise environments like Hadoop and GCP/AWS/Azure cloud. Well versed with various data integration and ETL technologies on cloud, such as Spark, PySpark/Scala, Dataflow, Dataproc, EMR, etc. Experience of having worked with traditional ETL tools like Informatica, DataStage, OWB, Talend, etc. Deep knowledge of one or more cloud and on-premise databases like Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc. Exposure to any of the NoSQL databases like MongoDB, CouchDB, Cassandra, graph databases, etc. Experience in architecting and designing scalable data warehouse solutions on cloud on BigQuery or Redshift. Experience of having worked on one or more data integration, storage, and data pipeline toolsets like S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, Dataproc, Airflow, Composer, Spark SQL, Presto, EMRFS, etc. Preferred experience of having worked on machine learning frameworks like TensorFlow, PyTorch, etc. Good understanding of cloud solutions for IaaS, PaaS, SaaS, containers, and microservices architecture and design. Ability to compare products and tools across technology stacks on Google, AWS, and Azure clouds. Good understanding of BI reporting and dashboarding and one or more associated toolsets like Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc. Understanding of security features and policies in one or more cloud environments like GCP/AWS/Azure. Experience of having worked on business transformation projects for the movement of on-premise data solutions to clouds like GCP/AWS/Azure.

Role:
- Lead multiple data engagements on GCP for data lakes, data engineering, data migration, data warehouse, and business intelligence.
- Interface with multiple stakeholders within IT and business to understand the data requirements.
- Take complete responsibility for the successful delivery of all allocated projects on the parameters of schedule, quality, and customer satisfaction.
- Responsible for design and development of distributed, high-volume, multi-threaded batch, real-time, and event processing systems.
- Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
- Work with the pre-sales team on RFPs and RFIs, and help them by creating solutions for data.
- Mentor young talent within the team; define and track their growth parameters.
- Contribute to building assets and accelerators.

Other Skills: Strong communication and articulation skills. Good leadership skills. Should be a good team player. Good analytical and problem-solving skills.
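As a hedged example of the Spark/PySpark transformation work these engagements describe, here is a minimal PySpark job; the bucket paths and column names are hypothetical.

```python
# Minimal PySpark ETL sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read raw CSVs, cast types, aggregate to a curated daily table.
orders = spark.read.option("header", True).csv("gs://my-bucket/raw/orders/")
daily = (orders
         .withColumn("amount", F.col("amount").cast("double"))
         .groupBy("customer_id", F.to_date("order_ts").alias("order_day"))
         .agg(F.sum("amount").alias("day_total")))

daily.write.mode("overwrite").parquet("gs://my-bucket/curated/daily_orders/")
```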

Posted 2 months ago

Apply

0.0 - 3.0 years

6 - 8 Lacs

Noida

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 2 months ago

Apply

4.0 - 8.0 years

22 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 2 months ago

Apply

7.0 - 12.0 years

25 - 27 Lacs

Hyderabad

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 2 months ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

GCP Engineer: The GCP developer should have expertise in components like Cloud Scheduler, Dataflow, BigQuery, Pub/Sub, Cloud SQL, etc. Good understanding of the GCP cloud environment/services (IAM, networking, Pub/Sub, Cloud Run, Cloud Storage, Cloud SQL/PostgreSQL, Cloud Spanner, etc.) based on real migration projects. Knowledge of Java / Java frameworks; has leveraged or worked with any or all technology areas like Spring Boot, Spring Batch, Spring Boot Cloud, etc. Experience with API and microservice design principles, and has leveraged them in actual project implementations for integration. Deep understanding of architecture and design patterns. Needs to have knowledge of implementation of event-driven architecture, data integration, event streaming architecture, and API-driven architecture. Needs to be well versed with DevOps principles and have working experience in Docker/containerization. Experience in solutioning and execution of IaaS, PaaS, and SaaS-based deployments, etc. Requires conceptual thinking to create 'out of the box' solutions. Should be good in communication and able to handle both the customer and the development team to deliver an outcome. Mandatory Skills: App-Cloud-Google. Experience: 5-8 years.
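The event-driven pattern this listing names is language-agnostic; as a hedged sketch (in Python, to match the other examples on this page, though the listing itself is Java-centric), a minimal streaming Pub/Sub consumer looks like the following. The project and subscription names are hypothetical.

```python
# Minimal event-driven consumer sketch; project/subscription names are hypothetical.
from concurrent.futures import TimeoutError as FutureTimeout
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "order-events-sub")

def callback(message):
    # Process the event, then ack so Pub/Sub does not redeliver it.
    print("received:", message.data)
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # block the main thread while messages flow
except FutureTimeout:
    streaming_pull.cancel()   # stop pulling
    streaming_pull.result()   # wait for shutdown to complete
```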

Posted 2 months ago

Apply

8.0 - 10.0 years

14 - 18 Lacs

Chennai

Work from Office

GCP Architect: A seasoned architect with a minimum of 12+ years of experience designing medium- to large-scale application-to-application integration requirements leveraging API, APIMs, ESB, and product-based hybrid implementations. Good understanding of the GCP cloud environment/services (IAM, networking, Pub/Sub, Cloud Run, Cloud Storage, Cloud SQL/PostgreSQL, Cloud Spanner, etc.) based on real migration projects. Experience/exposure to OpenShift and PCF on GCP and DevSecOps is an added advantage. Ability to make critical solution design decisions. Knowledge of Java / Java frameworks; has leveraged or worked with any or all technology areas like Spring Boot, Spring Batch, Spring Boot Cloud, etc. Experience with API and microservice design principles, and has leveraged them in actual project implementations for integration. Deep understanding of architecture and design patterns. Needs to have knowledge of implementation of event-driven architecture, data integration, event streaming architecture, and API-driven architecture. Needs to have an understanding of, and have designed, integration platforms to meet NFR requirements. Should have implemented design patterns like integrating with multiple COTS applications and integrations with multiple databases (SQL-based and also NoSQL). Has worked with multiple teams to gather integration requirements, create integration specification documents, map specifications, write high-level and detailed designs, and guide the technical team through design and implementation. Needs to be well versed with DevOps principles and have working experience in Docker/containerization. Experience in solutioning and execution of IaaS, PaaS, and SaaS-based deployments, etc. Requires conceptual thinking to create 'out of the box' solutions. Should be good in communication and able to handle both the customer and the development team to deliver an outcome. Mandatory Skills: App-Cloud-Google. Experience: 8-10 years.

Posted 2 months ago

Apply