Jobs
Interviews

167 Cloud SQL Jobs - Page 6

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

1.0 - 3.0 years

3 - 6 Lacs

Chennai

Work from Office

Skill Set Required: GCP, Data Modelling (OLTP, OLAP), Indexing, DBSchema, CloudSQL, BigQuery

Data Modeller - Hands-on data modelling for OLTP and OLAP systems. In-depth knowledge of conceptual, logical and physical data modelling. Strong understanding of indexing, partitioning and data sharding, with practical experience of applying them. Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction. Working experience with at least one data modelling tool, preferably DBSchema. Functional knowledge of the mutual fund industry is a plus. Good understanding of GCP databases such as AlloyDB, CloudSQL and BigQuery.
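Partitioning and clustering are the BigQuery-side levers this posting alludes to for near-real-time query performance. As a minimal illustrative sketch (project, dataset and field names are hypothetical), a partitioned, clustered table can be defined with the official Python client:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical table for a mutual-fund NAV history, partitioned by day.
table = bigquery.Table(
    "demo-project.funds.nav_history",
    schema=[
        bigquery.SchemaField("fund_id", "STRING"),
        bigquery.SchemaField("nav", "NUMERIC"),
        bigquery.SchemaField("as_of", "TIMESTAMP"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="as_of"
)
# BigQuery has no conventional indexes; clustering plays the analogous role.
table.clustering_fields = ["fund_id"]
client.create_table(table)
```

Queries that filter on the partition column and cluster key then scan only the relevant blocks, which is the property the posting's performance requirements hinge on.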

Posted 1 month ago

Apply

8.0 - 10.0 years

20 - 30 Lacs

Chennai

Hybrid

Role & responsibilities:
GCP Services - BigQuery, Dataflow, Dataproc, DataPlex, DataFusion, Terraform, Tekton, Cloud SQL, Memorystore for Redis, Airflow, Cloud Storage
2+ years in data transfer utilities
2+ years in Git or any other version control tool
2+ years in Confluent Kafka
1+ years of experience in API development
2+ years in an Agile framework
4+ years of strong experience in Python and PySpark development
4+ years of shell scripting to develop ad hoc jobs for data importing/exporting

Preferred candidate profile: Python, Dataflow, Dataproc, GCP Cloud Run, DataForm, Agile software development, BigQuery, Terraform, Data Fusion, Cloud SQL, GCP, Kafka, Java.

Please note that only immediate joiners will be considered for this position due to project urgency.
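The PySpark plus ad hoc import/export requirement typically reduces to short Dataproc jobs of the following shape. This is a hedged sketch, not the employer's actual code; bucket and table names are hypothetical, and it assumes the spark-bigquery connector is available on the cluster:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

# Ad hoc import: land raw JSON from Cloud Storage, de-duplicate, filter.
df = spark.read.json("gs://demo-landing/orders/*.json")
cleaned = df.dropDuplicates(["order_id"]).filter(df.amount > 0)

# Export to BigQuery via the spark-bigquery connector (assumed installed).
(cleaned.write.format("bigquery")
    .option("table", "demo-project.analytics.orders")
    .option("temporaryGcsBucket", "demo-tmp-bucket")
    .mode("append")
    .save())
```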

Posted 1 month ago

Apply

4.0 - 9.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Understanding of design; configuring infrastructure based on a provided design; managing GCP infrastructure using Terraform. Automate the provisioning, configuration, and management of GCP resources, including Compute Engine, Cloud Storage, Cloud SQL, Spanner, Kubernetes Engine (GKE), and serverless offerings like Cloud Functions and Cloud Run. Manage and configure GCP service accounts, IAM roles, and permissions to ensure secure access to resources. Implement and manage load balancers (HTTP(S), TCP/UDP) for high availability and scalability. Develop and maintain CI/CD pipelines using Cloud Build, GitHub Actions, or similar tools. Monitor and optimize the performance and availability of our GCP infrastructure.

Primary Skills: Terraform, CI/CD pipelines, IaC, Docker, Kubernetes
Secondary Skills: AWS, Azure, GitHub
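Terraform is the provisioning tool this posting names; purely as an illustration of the same programmatic-provisioning idea in Python (the language used for examples on this page), the hedged sketch below creates a Cloud SQL instance through the Cloud SQL Admin API. Instance name, region and tier are hypothetical:

```python
import google.auth
from googleapiclient import discovery

credentials, project = google.auth.default()
sqladmin = discovery.build("sqladmin", "v1beta4", credentials=credentials)

# Hypothetical PostgreSQL instance; in Terraform this would instead be a
# google_sql_database_instance resource.
body = {
    "name": "orders-db",
    "databaseVersion": "POSTGRES_15",
    "region": "asia-south1",
    "settings": {"tier": "db-custom-2-7680"},
}
operation = sqladmin.instances().insert(project=project, body=body).execute()
print("Started operation:", operation["name"])
```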

Posted 1 month ago

Apply

6.0 - 10.0 years

3 - 6 Lacs

Chennai

Work from Office

Job Information
Job Opening ID: ZR_2412_JOB
Date Opened: 04/02/2025
Industry: IT Services
Work Experience: 6-10 years
Job Title: Data Modeller
City: Chennai
Province: Tamil Nadu
Country: India
Postal Code: 600001
Number of Positions: 1

Skill Set Required: GCP, Data Modelling (OLTP, OLAP), Indexing, DBSchema, CloudSQL, BigQuery

Data Modeller - Hands-on data modelling for OLTP and OLAP systems. In-depth knowledge of conceptual, logical and physical data modelling. Strong understanding of indexing, partitioning and data sharding, with practical experience of applying them. Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction. Working experience with at least one data modelling tool, preferably DBSchema. Functional knowledge of the mutual fund industry is a plus. Good understanding of GCP databases such as AlloyDB, CloudSQL and BigQuery.

I'm interested

Posted 1 month ago

Apply

8.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

What you'll be doing:
Assist in developing machine learning models based on project requirements. Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality. Perform statistical analysis and fine-tuning using test results. Support training and retraining of ML systems as needed. Help build data pipelines for collecting and processing data efficiently. Follow coding and quality standards while developing AI/ML solutions. Contribute to frameworks that help operationalize AI models.

What we seek in you:
8+ years of experience in the IT industry. Strong in programming languages like Python. Hands-on experience with one cloud (GCP preferred). Experience working with Docker. Environment management (e.g. venv, pip, poetry). Experience with orchestrators like Vertex AI Pipelines, Airflow, etc. Understanding of the full ML cycle end-to-end. Data engineering and feature engineering techniques. Experience with ML modelling and evaluation metrics. Experience with TensorFlow, PyTorch or another framework. Experience with model monitoring. Advanced SQL knowledge. Aware of streaming concepts like windowing, late arrival, triggers, etc.

Storage: CloudSQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases
Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices
Schedule: Cloud Composer, Airflow
Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink
CI/CD: Bitbucket + Jenkins / GitLab; Infrastructure as Code: Terraform

Life at Next:
At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress. Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth.

Perks of working with us:
Clear objectives to ensure alignment with our mission, fostering your meaningful contribution. Abundant opportunities for engagement with customers, product managers, and leadership. You'll be guided along progressive paths while receiving insightful guidance from managers through ongoing feedforward sessions. Cultivate and leverage robust connections within diverse communities of interest. Choose your mentor to navigate your current endeavors and steer your future trajectory. Embrace continuous learning and upskilling opportunities through Nexversity. Enjoy the flexibility to explore various functions, develop new skills, and adapt to emerging technologies. Embrace a hybrid work model promoting work-life balance. Access comprehensive family health insurance coverage, prioritizing the well-being of your loved ones. Embark on accelerated career paths to actualize your professional aspirations.

Who we are:
We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet our customers' unique needs. Join our passionate team and tailor your growth with us!
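For the orchestration requirement, a retraining workflow in Airflow is often just a small DAG. Below is a minimal, hedged sketch; the DAG ID, schedule and training callable are hypothetical placeholders, and it assumes Airflow 2.4+ (where `schedule` replaced `schedule_interval`):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def retrain_model():
    # Placeholder: load features, fit the model, log evaluation metrics.
    print("retraining model...")

with DAG(
    dag_id="ml_weekly_retraining",
    start_date=datetime(2024, 1, 1),
    schedule="@weekly",
    catchup=False,
) as dag:
    retrain = PythonOperator(task_id="retrain", python_callable=retrain_model)
```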

Posted 1 month ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and engage directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer. An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has done; should understand the purpose/KPIs for which each data transformation was done.

Preferred technical and professional experience:
Experience with AEM Core Technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization. Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git; knowledge of patterns and good practices to design and develop quality, clean code; knowledge of HTML, CSS, JavaScript and jQuery. Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
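Of the GCP services listed, Pub/Sub is the usual entry point of such pipelines. A minimal hedged publish sketch with the official Python client follows (project and topic IDs are hypothetical):

```python
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("demo-project", "orders")

# The payload is bytes; extra keyword arguments become message attributes.
future = publisher.publish(topic_path, b'{"order_id": 42}', source="web")
print("Published message ID:", future.result())  # blocks until acknowledged
```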

Posted 1 month ago

Apply

4.0 - 9.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Understanding of design; configuring infrastructure based on a provided design; managing GCP infrastructure using Terraform. Automate the provisioning, configuration, and management of GCP resources, including Compute Engine, Cloud Storage, Cloud SQL, Spanner, Kubernetes Engine (GKE), and serverless offerings like Cloud Functions and Cloud Run. Manage and configure GCP service accounts, IAM roles, and permissions to ensure secure access to resources. Implement and manage load balancers (HTTP(S), TCP/UDP) for high availability and scalability. Develop and maintain CI/CD pipelines using Cloud Build, GitHub Actions, or similar tools. Monitor and optimize the performance and availability of our GCP infrastructure. Candidates with certification will be preferred.

Primary skills: Terraform, CI/CD pipelines, IaC, Docker, Kubernetes
Secondary skills: AWS, Azure, GitHub

Posted 1 month ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Cloud Operations Engineer to manage and optimize cloud-based environments. Ideal for engineers passionate about automation, monitoring, and cloud-native technologies.

Key Responsibilities:
Maintain cloud infrastructure (AWS, Azure, GCP). Automate deployments and system monitoring. Ensure availability, performance, and cost optimization. Troubleshoot incidents and resolve system issues.

Required Skills & Qualifications:
Hands-on experience with cloud platforms and DevOps tools. Proficiency in scripting (Python, Bash) and IaC (Terraform, CloudFormation). Familiarity with logging/monitoring tools (CloudWatch, Datadog, etc.). Bonus: experience with Kubernetes or serverless architectures.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
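As a taste of the monitoring automation this role describes, the hedged sketch below creates a CloudWatch CPU alarm with boto3; the alarm name, instance ID and SNS topic ARN are hypothetical:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Alarm when average CPU of one instance exceeds 80% for two 5-minute periods.
cloudwatch.put_metric_alarm(
    AlarmName="web-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],
)
```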

Posted 2 months ago

Apply

5.0 - 7.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centres (Delivery Centres), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centres offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and engage directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer. An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has done; should understand the purpose/KPIs for which each data transformation was done.

Preferred technical and professional experience:
Experience with AEM Core Technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization. Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git; knowledge of patterns and good practices to design and develop quality, clean code; knowledge of HTML, CSS, JavaScript and jQuery. Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.

Posted 2 months ago

Apply

6.0 - 8.0 years

30 - 35 Lacs

Pune

Work from Office

Job Title: Senior Engineer
Location: Pune, India
Corporate Title: AVP

Role Description
Investment Banking is a technology-centric business, with an increasing move to real-time processing and an increasing appetite from customers for integrated systems and access to supporting data. This means that technology is more important than ever for the business. The IB CARE Platform aims to increase the productivity of both Google Cloud and on-prem application development by providing a frictionless build and deployment platform that offers service and data reusability. The platform provides the chassis and standard components of an application, ensuring reliability, usability and safety, and gives on-demand access to the services needed to build, host and manage applications on the cloud/on-prem. In addition to technology services, the platform aims to have compliance baked in, enforcing controls/security and reducing application team involvement in SDLC and ORR controls, enabling teams to focus more on application development and release to production faster.

We are looking for a platform engineer to join a global team working across all aspects of the platform, from GCP/on-prem infrastructure and application deployment through to the development of CARE-based services. Deutsche Bank is one of the few banks with the scale and network to compete aggressively in this space, and the breadth of investment in this area is unmatched by our peers. Joining the team is a unique opportunity to help build a platform to support some of our most mission-critical processing systems.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy: best-in-class leave policy; gender-neutral parental leave; 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; an Employee Assistance Program for you and your family members; comprehensive hospitalization insurance for you and your dependents; accident and term life insurance; complementary health screening for those 35 years and above.

Your Key Responsibilities
As a CARE platform engineer you will be working across the board on activities to build and support the platform and liaising with tenants. Key responsibility areas: manage and monitor cloud computing systems and provide technical support to ensure the systems' efficiency and security; work with platform leads and platform engineers at a technical level; liaise with tenants regarding onboarding and provide platform expertise; contribute to the platform offering as part of Sprint deliverables; support the production platform as part of the wider team.

Your skills and experience
Understanding of GCP and services such as GKE, IAM, identity services and Cloud SQL. Kubernetes/service mesh configuration. Experience in IaC tooling such as Terraform. Proficient in SDLC/DevOps best practices. GitHub experience, including Git workflow. Exposure to modern deployment tooling, such as ArgoCD, desirable. Programming experience (such as Java/Python) desirable. A strong team player comfortable in a cross-cultural and diverse operating environment. Result-oriented, with the ability to deliver under tight timelines. Ability to successfully resolve conflicts in a globally matrixed organization. Excellent communication and collaboration skills. Must be comfortable with navigating ambiguity to extract meaningful risk insights.

How we'll support you
Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
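For the GKE side of this role, day-to-day platform inspection is commonly scripted against the cluster API. A minimal hedged sketch with the official Kubernetes Python client; the namespace is hypothetical, and it assumes a kubeconfig context already pointing at the GKE cluster:

```python
from kubernetes import client, config

# Assumes `gcloud container clusters get-credentials ...` has populated kubeconfig.
config.load_kube_config()
v1 = client.CoreV1Api()

# List workloads in a (hypothetical) platform namespace with their phases.
for pod in v1.list_namespaced_pod(namespace="care-platform").items:
    print(pod.metadata.name, pod.status.phase)
```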

Posted 2 months ago

Apply

3.0 - 7.0 years

10 - 20 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Salary: 8 to 24 LPA
Exp: 3 to 7 years
Location: Gurgaon (Hybrid)
Notice: Immediate to 30 days

Job Profile:
Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. Collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, the candidate will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well-documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience in running end-to-end analytics for large-scale organizations.

Responsibilities:
Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs. Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions. Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes. Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments. Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics. Provide seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud. Use scalable Databricks clusters for handling large datasets and complex computations, optimizing performance and cost.

Must have: client engagement experience and collaboration with cross-functional teams; a data engineering background in Databricks; the ability to work effectively as an individual contributor or in collaborative team environments; effective communication and thought leadership with a proven record.

Candidate Profile:
Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research or related analytics areas. 3+ years of experience, which must be in data engineering. Hands-on experience with SQL, Python, Databricks, and cloud platforms like Azure. Prior experience in managing and delivering end-to-end projects. Outstanding written and verbal communication skills. Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges. Able to understand cross-cultural differences and work with clients across the globe.

Posted 2 months ago

Apply

15.0 - 18.0 years

45 - 50 Lacs

Noida, Mumbai, Pune

Work from Office

Skill & Experience
15-18 years of experience in Java is a must, with hands-on experience architecting solutions using cloud-native PaaS services (databases, messaging, storage, compute) in Google Cloud, in a pre-sales capacity. Experience in monolith-to-microservices modernization engagements. Should have worked on multiple engagements involving application assessment as part of re-factoring/containerization and re-architecting cloud journeys. Should have been part of a large digital transformation project. Experience building, architecting, designing, and implementing highly distributed global cloud-based systems. Experience in network infrastructure, security, data, or application development. Experience with structured Enterprise Architecture practices, hybrid cloud deployments, and on-premise-to-cloud migration deployments and roadmaps. Architecting microservices/APIs. Ability to deliver results and work cross-functionally. Ability to engage/influence audiences and identify expansion engagements. Certification as a Google Professional Cloud Architect is desirable. Experience with Agile/Scrum environments. Familiar with Agile team management tools (JIRA, Confluence). Understand and promote Agile values: FROCC (Focus, Respect, Openness, Commitment, Courage). Working with Docker, OpenShift, GKE and Cloud Run. Designing databases in Oracle/Cloud SQL/Cloud Spanner. Designing software with low operational cost and cloud billing. Contributing to building best practices and defining reference architectures.

Posted 2 months ago

Apply

5.0 - 7.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centres (Delivery Centres), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centres offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and engage directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer. An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has done; should understand the purpose/KPIs for which each data transformation was done.

Preferred technical and professional experience:
Experience with AEM Core Technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization. Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git; knowledge of patterns and good practices to design and develop quality, clean code; knowledge of HTML, CSS, JavaScript and jQuery. Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.

Posted 2 months ago

Apply

5.0 - 7.0 years

0 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience. In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
In your role, you will be responsible for:
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and engage directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer. An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has done; should understand the purpose/KPIs for which each data transformation was done.

Preferred technical and professional experience:
Experience with AEM Core Technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization. Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git; knowledge of patterns and good practices to design and develop quality, clean code; knowledge of HTML, CSS, JavaScript and jQuery. Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.

Posted 2 months ago

Apply

0.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Senior Principal Consultant - Lead Solution Architect, Google Cloud Platform Pre-Sales - Data & AI

About Genpact:
Genpact (NYSE: G) is a global professional services firm delivering outcomes that transform businesses. With a proud 25-year history and over 125,000 diverse professionals in 30+ countries, we are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for our clients. We serve leading global enterprises, including the Fortune Global 500, leveraging our deep industry expertise, digital innovation, and cutting-edge capabilities in data, technology, and AI. Join our vibrant team in India to shape the future of business through intelligent operations and drive meaningful impact.

The Opportunity:
Genpact India is expanding its Google Cloud Platform (GCP) capabilities and seeking a highly experienced and technically astute Senior Principal Consultant / Lead Solution Architect specializing in Data and Artificial Intelligence. This critical role will be at the forefront of Genpact's growth in the GCP ecosystem in India and globally, leading complex pre-sales engagements, designing transformative data and AI solutions, and fostering strong client relationships. You will operate as a trusted advisor, translating intricate client challenges into compelling, implementable solutions on Google Cloud.

Responsibilities:
Solution Architecture & Design Leadership: Lead the technical pre-sales process for complex data and AI opportunities on Google Cloud Platform / AWS / Azure (any 2), from initial discovery through to proposal and Statement of Work (SOW) development. Design and articulate highly scalable, secure, and resilient enterprise-grade data and AI architectures leveraging a wide array of GCP / AWS / Azure services (any 2).

Client Engagement & Advisory: Engage deeply with prospective clients' senior IT and business stakeholders, including CXOs, to understand their strategic objectives, critical business challenges, and existing data landscape. Position Genpact's GCP / AWS / Azure Data & AI offerings as key enablers for their digital transformation journey.

Technical Demonstrations & Workshops: Conduct impactful technical presentations, deep-dive workshops, and product demonstrations (including proof-of-concepts where required) tailored to specific client needs, showcasing the advanced capabilities of GCP / AWS / Azure Data & AI services and Genpact's differentiated value.

Proposal Development & Commercial Support: Take ownership of the technical sections of proposals, RFPs, and SOWs, ensuring accuracy, technical feasibility, clear value articulation, and alignment with Genpact's delivery capabilities. Provide robust technical estimation and sizing for proposed solutions.

Sales & Delivery Collaboration: Partner closely with Genpact's sales teams to drive deal progression, providing technical guidance, competitive intelligence, and effective solution positioning. Collaborate with delivery teams to ensure proposed solutions are executable, scalable, and aligned with Genpact's operational excellence standards.

Technology & Market Expertise: Maintain expert-level knowledge of the latest trends, services, and product roadmaps in Google Cloud Platform and AWS / Azure (any two), Data Engineering, Machine Learning, Artificial Intelligence (including Generative AI), and relevant industry best practices.

Thought Leadership & IP Contribution: Contribute to Genpact's intellectual property by developing reusable assets and solution accelerators, and by participating in internal/external knowledge sharing, including whitepapers, blogs, and industry events.

Mentorship & Capability Building: Mentor and guide junior architects and data engineers within the team, fostering a culture of technical excellence and continuous learning in Google Cloud Data & AI.

Qualifications we seek in you!
Minimum Qualifications:
Progressive experience in technical roles within data analytics, data warehousing, business intelligence, machine learning, and artificial intelligence, with a significant portion in a client-facing pre-sales, solution architecture, or consulting capacity. Hands-on experience architecting, designing, and delivering complex data and AI solutions on Google Cloud Platform. Deep and demonstrable expertise across the Google Cloud Data & AI stack:
Core Data Services: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Cloud SQL, Cloud Spanner, Composer, Data Catalog, Dataplex.
AI/ML Services: Vertex AI (including MLOps, Workbench, Training, Prediction, Explainable AI), Generative AI offerings (e.g., Gemini, Imagen), Natural Language API, Vision AI, Speech-to-Text, Dialogflow, Recommendations AI.
BI & Visualization: Looker, Data Studio.
In addition to the Google Cloud stack, exposure to one other cloud data stack - either AWS or Azure. Proven experience in translating complex business challenges into viable, scalable technical solutions on GCP, articulated with clear business value. Exceptional communication, presentation, and interpersonal skills, with the ability to engage, influence, and build rapport with diverse audiences from technical teams to senior business executives. Strong problem-solving, analytical, and strategic thinking abilities, with a commercial mindset. Experience in leading and contributing to large, complex deal pursuits in a competitive environment. Bachelor's degree in Computer Science, Engineering, or a related technical field; Master's degree preferred. Google Cloud Professional Certifications are highly preferred (e.g., Professional Cloud Architect, Professional Data Engineer, Professional Machine Learning Engineer). Ability to travel to client sites within India, and potentially internationally, as required.

Why join Genpact?
Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation. Make an impact - drive change for global enterprises and solve business challenges that matter. Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 months ago

Apply

1.0 - 5.0 years

6 - 10 Lacs

Bengaluru

Work from Office

We're Hiring: Senior SQL Database Admin and AWS Admin for a leading product-based company!

We are seeking an experienced Senior SQL Database Administrator and AWS Administrator to manage our database systems and cloud infrastructure. The ideal candidate will possess extensive knowledge of SQL databases, performance tuning, and AWS services to ensure optimal system performance and security.

Location: Remote
Work Mode: Work from anywhere
Role: Senior SQL Database Administrator and AWS Administrator

What You'll Do:
Manage and maintain 20+ servers and 100+ databases, supporting a maximum size of 2TB across various SQL Server versions. Provide 24x7 production support for SQL Server environments, ensuring high availability and performance. Install, configure, and upgrade SQL Server instances, including standalone and clustered servers. Develop and implement database designs, including SQL objects, constraints, stored procedures, views, and dynamic SQL. Execute code deployment in staging and production environments, including debugging and high-priority bug fixes. Expertly handle database backup, recovery, restoration, and disaster recovery planning. Perform comprehensive performance tuning, including analysing slow-running queries, identifying missing indexes, and optimizing statistics. Configure and maintain AWS cloud environments, including RDS, EC2 instances, S3 storage, and setting up alert alarms for monitoring system health. Ensure database and application server retention policies are maintained according to best practices. Manage user permissions, job scheduling, and troubleshooting of failed jobs. Conduct replication, log shipping, database mirroring, and disaster recovery activities, including failover and failback operations. Lead migration and upgrade projects, from SQL 2005 to more recent versions, and AWS migration efforts. Install and configure SSRS, managing security and creating basic SSIS packages for deployment.

Requirements:
Minimum of 10 years of experience as a Senior SQL Database Administrator, including significant work with AWS Cloud SQL development in production environments. Demonstrated experience with SQL Server versions 2005 to 2017 and AWS cloud services (RDS, EC2, S3). Experience implementing Azure CI/CD pipelines. Experience with migration from AWS to Azure or vice versa. Strong background in database design, maintenance, and optimization with a focus on performance tuning and security. Proficient in code deployment, debugging, and report generation. Familiarity with high-availability concepts on AWS, Windows maintenance in EC2, and various SQL Server maintenance tasks. Experience with database replication, DR activities, and cluster management. Knowledge of SSIS package creation and SSRS configuration. Excellent problem-solving skills and ability to conduct root-cause analysis on database and disk-space issues. Effective communication skills, with the ability to collaborate with cross-functional teams. Bachelor's/Master's degree in Computer Science or Information Technology.

Ready to make an impact? Apply now and let's grow together!
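For the RDS side of the role, routine inventory checks and manual snapshots are usually scripted with boto3. A hedged sketch, with hypothetical instance and snapshot identifiers:

```python
import boto3

rds = boto3.client("rds")

# Inventory: engine and status of every RDS instance in the account/region.
for db in rds.describe_db_instances()["DBInstances"]:
    print(db["DBInstanceIdentifier"], db["Engine"], db["DBInstanceStatus"])

# Manual snapshot ahead of a risky deployment or version upgrade.
rds.create_db_snapshot(
    DBInstanceIdentifier="prod-sqlserver-01",
    DBSnapshotIdentifier="prod-sqlserver-01-pre-upgrade",
)
```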

Posted 2 months ago

Apply

10.0 - 18.0 years

25 - 30 Lacs

Noida

Work from Office

Responsibilities:
Collaborate with the sales team to understand customer challenges and business objectives, and propose solutions, POCs, etc. Develop and deliver impactful technical presentations and demos showcasing the capabilities of GCP Data and AI and GenAI solutions. Conduct technical proof-of-concepts (POCs) to validate the feasibility and value proposition of GCP solutions. Collaborate with technical specialists and solution architects from the COE team to design and configure tailored cloud solutions. Manage and qualify sales opportunities, working closely with the sales team to progress deals through the sales funnel. Stay up to date on the latest GCP offerings, trends, and best practices.

Experience:
Design and implement a comprehensive strategy for migrating and modernizing existing relational on-premise databases to scalable and cost-effective solutions on Google Cloud Platform (GCP). Design and architect solutions for DWH modernization, with experience building data pipelines in GCP. Strong experience in BI reporting tools (Looker, Power BI and Tableau). In-depth knowledge of Google Cloud Platform (GCP) services, particularly Cloud SQL, Postgres, AlloyDB, BigQuery, Looker, Vertex AI and Gemini (GenAI). Strong knowledge and experience in providing solutions to process massive datasets in real time and batch, using cloud-native/open-source orchestration techniques. Build and maintain data pipelines using Cloud Dataflow to orchestrate real-time and batch data processing for streaming and historical data. Strong knowledge of and experience with best practices for data governance, security, and compliance. Excellent communication and presentation skills, with the ability to tailor technical information to customer needs. Strong analytical and problem-solving skills. Ability to work independently and as part of a team.
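Cloud Dataflow pipelines of the kind described here are written with Apache Beam. A minimal hedged sketch of a streaming Pub/Sub-to-BigQuery pipeline follows (subscription, table and schema are hypothetical placeholders):

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # on Dataflow: add --runner=DataflowRunner

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(
            subscription="projects/demo-project/subscriptions/events-sub")
        | "Parse" >> beam.Map(lambda msg: {"raw": msg.decode("utf-8")})
        | "Write" >> beam.io.WriteToBigQuery(
            "demo-project:analytics.events",
            schema="raw:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```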

Posted 2 months ago

Apply

10.0 - 15.0 years

30 - 40 Lacs

Bhopal, Pune, Gurugram

Hybrid

Job Title: Senior Data Engineer - GCP | Big Data | Airflow | dbt
Company: Xebia
Location: All Xebia locations
Experience: 10+ years
Employment Type: Full Time
Notice Period: Immediate to max 30 days only

Job Summary
Join the digital transformation journey of one of the world's most iconic global retail brands! As a Senior Data Engineer, you'll be part of a dynamic Digital Technology organization, helping build modern, scalable, and reliable data products to power business decisions across the Americas. You'll work in the Operations Data Domain, focused on ingesting, processing, and optimizing high-volume data pipelines using Google Cloud Platform (GCP) and other modern tools.

Key Responsibilities
Design, develop, and maintain highly scalable big data pipelines (batch & streaming). Collaborate with cross-functional teams to understand data needs and deliver efficient solutions. Architect robust data solutions using GCP-native services (BigQuery, Pub/Sub, Cloud Functions, etc.). Build and manage modern Data Lake/Lakehouse platforms. Create frameworks and reusable components for scalable ingestion and processing. Implement data governance and security, and ensure regulatory compliance. Mentor junior engineers and lead an offshore team of 8+ engineers. Monitor pipeline performance, troubleshoot bottlenecks, and ensure data quality. Engage in code reviews, CI/CD deployments, and agile product releases. Contribute to internal best practices and engineering standards.

Must-Have Skills & Qualifications
8+ years in data engineering with strong hands-on experience in production-grade pipelines. Expertise in GCP data services (BigQuery, Vertex AI, Pub/Sub, etc.). Proficiency in dbt (Data Build Tool) for data transformation. Strong programming skills in Python, Java, or Scala. Advanced SQL & NoSQL knowledge. Experience with Apache Airflow for orchestration. Hands-on with Git, GitHub Actions and Jenkins for CI/CD. Solid understanding of data warehousing (BigQuery, Snowflake, Redshift). Exposure to tools like Hadoop, Spark, Kafka and Databricks (nice to have). Familiarity with BI tools like Tableau, Power BI, or Looker (optional). Strong leadership qualities to manage offshore engineering teams. Excellent communication skills and stakeholder management experience.

Preferred Education
Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field.

Notice Period Requirement
Only immediate joiners or candidates with a max 30-day notice period will be considered.

How to Apply
If you are passionate about solving real-world data problems and want to be part of a global data-driven transformation, apply now by sending your resume to vijay.s@xebia.com with the subject line: "Sr Data Engineer Application – [Your Name]"

Kindly include the following details in your email: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Notice Period / Last Working Day, Key Skills.

Please do not apply if you are currently in process with any other role at Xebia or have recently interviewed.
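Since the posting pairs dbt with Airflow, the common integration is an Airflow DAG that shells out to the dbt CLI, with tests gating downstream work. A hedged sketch (the DAG ID, project path and target are hypothetical, assuming Airflow 2.4+ and dbt installed on the worker):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/retail --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/retail --target prod",
    )
    dbt_run >> dbt_test  # models must build before tests run
```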

Posted 2 months ago

Apply

1.0 - 3.0 years

10 - 15 Lacs

Kolkata, Gurugram, Bengaluru

Hybrid

Salary: 10 to 16 LPA
Exp: 1 to 3 years
Location: Gurgaon / Bangalore / Kolkata (Hybrid)
Notice: Immediate to 30 days

Key Skills: GCP, Cloud, Pub/Sub, Data Engineer

Posted 2 months ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Gurugram, Bengaluru

Hybrid

Salary: 15 to 30 LPA
Exp: 3 to 8 years
Location: Gurgaon / Bangalore (Hybrid)
Notice: Immediate to 30 days

Key Skills: GCP, Cloud, Pub/Sub, Data Engineer

Posted 2 months ago

Apply

8.0 - 10.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Req ID: 326833

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP & GKE Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India.

Job Title / Role: GCP & GKE Staff Engineer

Job Description:
Primary Skill: Cloud Infrastructure - Google Cloud Platform
Minimum work experience: 8+ years
Total Experience: 8+ years
Must have GCP Solution Architect Certification & GKE

Mandatory Skills:
Technical Qualification/Knowledge:
Expertise in assessing, designing and implementing GCP solutions, covering aspects like compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc. Must have GCP Solution Architect Certification. Should have prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning and migration execution. Should have prior experience using industry-leading or native discovery, assessment and migration tools. Good knowledge of cloud technology, different patterns, deployment methods, and compatibility of applications. Good knowledge of GCP technologies and associated components and variations:
Anthos Application Platform
Compute Engine, Compute Engine Managed Instance Groups, Kubernetes
Cloud Storage, Cloud Storage for Firebase, Persistent Disk, Local SSD, Filestore, Transfer Service
Virtual Private Cloud (VPC), Cloud DNS, Cloud Interconnect, Cloud VPN Gateway, Network Load Balancing, global load balancing, firewall rules, Cloud Armor
Cloud IAM, Resource Manager, Multi-factor Authentication, Cloud KMS
Cloud Billing, Cloud Console, Stackdriver
Cloud SQL, Cloud Spanner SQL, Cloud Bigtable
Cloud Run container services, Kubernetes Engine (GKE), Anthos Service Mesh, Cloud Functions, PowerShell on GCP

Solid understanding of and experience in cloud-computing-based services architecture, technical design and implementation, including IaaS, PaaS, and SaaS. Design of clients' cloud environments with a focus mainly on GCP, demonstrating technical cloud architectural knowledge. Playing a vital role in the design of production, staging, QA and development cloud infrastructures running in 24x7 environments. Delivery of customer cloud strategies aligned with the customer's business objectives, with a focus on cloud migrations and DR strategies. Nurture cloud computing expertise internally and externally to drive cloud adoption. Should have a deep understanding of the IaaS and PaaS services offered on cloud platforms and understand how to use them together to build complex solutions. Ensure that all cloud solutions follow security and compliance controls, including data sovereignty. Deliver cloud platform architecture documents detailing the vision for how GCP infrastructure and platform services support the overall application architecture, interacting with application, database and testing teams to provide a holistic view to the customer.

Collaborate with application architects and DevOps to modernize infrastructure-as-a-service (IaaS) applications to platform-as-a-service (PaaS). Create solutions that support a DevOps approach for delivery and operations of services. Interact with and advise business representatives of the application regarding functional and non-functional requirements. Create proof-of-concepts to demonstrate the viability of solutions under consideration. Develop enterprise-level conceptual solutions and sponsor consensus/approval for global applications. Have a working knowledge of other architecture disciplines, including application, database, infrastructure, and enterprise architecture. Identify and implement best practices, tools and standards. Provide consultative support to the DevOps team for production incidents. Drive and support system reliability, availability, scale, and performance activities. Evangelize cloud automation and be a thought leader and expert defining standards for building and maintaining cloud platforms. Knowledgeable about configuration management tools such as Chef/Puppet/Ansible. Automation skills using CLI scripting in any language (Bash, Perl, Python, Ruby, etc.). Ability to develop a robust design to meet customer business requirements with scalability, availability, performance and cost-effectiveness using GCP offerings. Ability to identify and gather requirements to define an architectural solution which can be successfully built and operated on GCP. Ability to produce high-level and low-level designs for the GCP platform, which may also include data center design as necessary. Capability to provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project. Understanding of the significance of the different metrics for monitoring and their threshold values, with the ability to take corrective measures based on those thresholds. Knowledge of automation to reduce the number of incidents, or repetitive incidents, is preferred. Good knowledge of cloud center operations, monitoring tools and backup solutions.

GKE:
Set up monitoring and logging to troubleshoot a cluster or debug a containerized application. Manage Kubernetes objects: declarative and imperative paradigms for interacting with the Kubernetes API. Managing Secrets: managing confidential settings data using Secrets. Configure load balancing, port forwarding, or firewall and DNS configurations to access applications in a cluster. Configure networking for your cluster. Hands-on experience with Terraform; ability to write reusable Terraform modules. Hands-on Python and Unix shell scripting is required. Understanding of CI/CD pipelines in a globally distributed environment using Git, Artifactory, Jenkins and Docker registries. Experience with GCP services and writing Cloud Functions. Hands-on experience deploying and managing Kubernetes infrastructure with Terraform Enterprise. Certified Kubernetes Administrator (CKA) and/or Certified Kubernetes Application Developer (CKAD) is a plus. Experience using Docker within container orchestration platforms such as GKE. Knowledge of setting up Splunk. Knowledge of Spark on GKE.

Certification: GCP Solution Architect & GKE

Process/Quality Knowledge:
Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired. Knowledge of quality and security processes.

Soft Skills:
Excellent communication skills and the capability to work directly with global customers. Strong technical leadership skills to drive solutions. Focused on quality, cost and timeliness of deliverables. Timely and accurate communication. Needs to demonstrate ownership of technical issues and engage the right stakeholders for timely resolution. Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud and automation. Good reporting skills. Willingness to work in different time zones as per project requirements. Good attitude for working in a team and as an individual contributor, depending on the project and situation. Focused, result-oriented and self-motivated.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.

NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
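On the "Managing Secrets" point above, the hedged sketch below creates a Kubernetes Secret object with the official Python client. The namespace and credential values are hypothetical placeholders; real credentials would come from a vault, not source code:

```python
from kubernetes import client, config

config.load_kube_config()  # assumes kubeconfig already targets the GKE cluster
v1 = client.CoreV1Api()

secret = client.V1Secret(
    metadata=client.V1ObjectMeta(name="db-credentials"),
    string_data={"username": "app", "password": "change-me"},  # placeholders only
)
v1.create_namespaced_secret(namespace="default", body=secret)
```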

Posted 2 months ago

Apply

8.0 - 10.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Req ID: 327246
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP & GKE Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Job Title / Role: GCP & GKE Staff Engineer
Job Description:
Primary Skill: Professional Cloud Security Engineer & Cloud Infrastructure - Google Cloud Platform
Related Experience: 5+ years of experience in cloud security engineering and automation
Total Experience: 8+ years
Must have the GCP Solution Architect and Professional Cloud Security Engineer certifications
Mandatory Skills:
Technical Qualification/Knowledge:
This role supports operational security, control configuration, and secure design practices for GCP workloads.
Roles & Responsibilities:
Implement GCP security controls: IAM, VPC security, VPNs, KMS, Cloud Armor, and secure networking
Manage GCP identity and access, including SSO, MFA, and federated IdP configurations
Monitor workloads using the Cloud Operations Suite and escalate anomalies
Conduct basic threat modelling, vulnerability scanning, and patching processes
Automate security audits and compliance controls using Terraform and Cloud Shell scripting (a small audit sketch appears below)
Assist architects in deploying and maintaining secure-by-default infrastructure
Support audit preparation, policy enforcement, and evidence gathering
Collaborate with cross-functional teams to resolve security alerts
Expertise in assessing, designing and implementing GCP solutions, covering compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Ansible, etc.
Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning and migration execution
Prior experience using industry-leading or native discovery, assessment and migration tools
Good knowledge of cloud technology, including different patterns, deployment methods, and application compatibility
Good knowledge of GCP technologies and their associated components and variations:
Anthos Application Platform
Compute Engine, Compute Engine Managed Instance Groups, Kubernetes
Cloud Storage, Cloud Storage for Firebase, Persistent Disk, Local SSD, Filestore, Transfer Service
Virtual Private Cloud (VPC), Cloud DNS, Cloud Interconnect, Cloud VPN Gateway, Network Load Balancing, global load balancing, firewall rules, Cloud Armor
Cloud IAM, Resource Manager, Multi-factor Authentication, Cloud KMS
Cloud Billing, Cloud Console, Stackdriver
Cloud SQL, Cloud Spanner, Cloud Bigtable
Cloud Run container services, Kubernetes Engine (GKE), Anthos Service Mesh, Cloud Functions, PowerShell on GCP
Solid understanding of, and experience in, cloud-computing-based services architecture, technical design and implementation, including IaaS, PaaS and SaaS
Design of clients' Cloud environments, with a focus mainly on GCP, demonstrating technical Cloud architectural knowledge
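For the audit-automation responsibility above, here is a hedged sketch of one such control check in Python: scanning Cloud Storage buckets for public IAM bindings with the google-cloud-storage client. The project id is a placeholder, and a real audit would cover far more controls:

```python
from google.cloud import storage

PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}

def find_public_buckets(project_id: str) -> list[str]:
    """Return "bucket: role" strings for buckets exposed to the public."""
    client = storage.Client(project=project_id)
    flagged = []
    for bucket in client.list_buckets():
        # Version 3 policies expose bindings as role/members pairs.
        policy = bucket.get_iam_policy(requested_policy_version=3)
        for binding in policy.bindings:
            if PUBLIC_MEMBERS & set(binding["members"]):
                flagged.append(f"{bucket.name}: {binding['role']}")
    return flagged

if __name__ == "__main__":
    for finding in find_public_buckets("my-project"):  # placeholder project id
        print("PUBLIC ACCESS:", finding)
```

A check like this would typically run on a schedule (Cloud Scheduler plus a Cloud Function, or a CI job) and feed its findings into the evidence-gathering process the posting mentions.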
Playing a vital role in the design of production, staging, QA and development Cloud infrastructures running in 24x7 environments
Delivery of customer Cloud strategies, aligned with the customer's business objectives and with a focus on Cloud migrations and DR strategies
Nurture Cloud computing expertise internally and externally to drive Cloud adoption
Deep understanding of the IaaS and PaaS services offered on cloud platforms and of how to combine them to build complex solutions
Ensure that all cloud solutions follow security and compliance controls, including data sovereignty
Deliver cloud platform architecture documents detailing the vision for how GCP infrastructure and platform services support the overall application architecture, interacting with application, database and testing teams to provide a holistic view to the customer
Collaborate with application architects and DevOps teams to modernize Infrastructure as a Service (IaaS) applications to Platform as a Service (PaaS)
Create solutions that support a DevOps approach to the delivery and operation of services
Interact with and advise business representatives of the application regarding functional and non-functional requirements
Create proof-of-concepts to demonstrate the viability of solutions under consideration
Develop enterprise-level conceptual solutions and sponsor consensus/approval for global applications
Working knowledge of other architecture disciplines, including application, database, infrastructure, and enterprise architecture
Identify and implement best practices, tools and standards
Provide consultative support to the DevOps team for production incidents
Drive and support system reliability, availability, scale, and performance activities
Evangelize cloud automation and act as a thought leader and expert in defining standards for building and maintaining cloud platforms
Knowledgeable about configuration management tools such as Chef, Puppet or Ansible
Automation skills using CLI scripting in any language (bash, perl, python, ruby, etc.)
Ability to develop a robust design that meets the customer's business requirements for scalability, availability, performance and cost effectiveness using GCP offerings
Ability to identify and gather requirements to define an architectural solution that can be successfully built and operated on GCP
Ability to produce high-level and low-level designs for the GCP platform, which may also include data center design as necessary
Ability to provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project
Understanding of the significance of the different monitoring metrics and their threshold values, with the ability to take corrective measures when thresholds are breached
Knowledge of automation to reduce the volume of incidents, or of repetitive incidents, is preferred
Good knowledge of cloud center operations, monitoring tools and backup solutions
GKE:
Set up monitoring and logging to troubleshoot a cluster or debug a containerized application
Manage Kubernetes objects, using both declarative and imperative paradigms for interacting with the Kubernetes API
Manage confidential settings data using Secrets
Configure load balancing, port forwarding, or firewall and DNS configurations to access applications in a cluster
Configure networking for your cluster
Hands-on experience with Terraform; ability to write reusable Terraform modules
Hands-on Python and Unix shell scripting is required
Understanding of CI/CD pipelines in a globally distributed environment using Git, Artifactory, Jenkins and a Docker registry
Experience with GCP services and writing Cloud Functions (a minimal Cloud Function sketch follows this posting)
Hands-on experience deploying and managing Kubernetes infrastructure with Terraform Enterprise; ability to write reusable Terraform modules
Certified Kubernetes Administrator (CKA) and/or Certified Kubernetes Application Developer (CKAD) is a plus
Experience using Docker within container orchestration platforms such as GKE
Knowledge of setting up Splunk
Knowledge of Spark in GKE
Process/Quality Knowledge:
Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired
Knowledge of quality processes
Knowledge of security processes
Soft Skills:
Excellent communication skills and the ability to work directly with global customers
Strong technical leadership skills to drive solutions
Focus on the quality, cost and timeliness of deliverables
Timely and accurate communication
Must demonstrate ownership of technical issues and engage the right stakeholders for timely resolution
Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud and automation
Good reporting skills
Willingness to work in different time zones as per project requirements
Good attitude toward working both in a team and as an individual contributor, depending on the project and situation
Focused, result-oriented and self-motivated
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at
NTT DATA endeavors to make accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.
NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click.
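Since both postings above ask for experience writing Cloud Functions, here is a minimal HTTP Cloud Function sketch in Python using the functions-framework library; the function name and payload are illustrative only:

```python
import functions_framework

@functions_framework.http
def healthcheck(request):
    """Respond to an HTTP request with a simple JSON status payload."""
    name = request.args.get("name", "world")
    # Flask (which functions-framework builds on) serializes dict returns as JSON.
    return {"status": "ok", "greeting": f"hello {name}"}, 200
```

It would be deployed with something like `gcloud functions deploy healthcheck --runtime python312 --trigger-http`, though the exact runtime and flags depend on the project.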

Posted 2 months ago

Apply

4.0 - 8.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Job Overview: We are seeking a Site Reliability Engineer (SRE) with expertise in Autosys and Google Cloud Platform (GCP) to join our dynamic team. The ideal candidate will have strong hands-on experience in job scheduling and automation using Autosys, as well as a deep understanding of cloud infrastructure and operations on Google Cloud. You will be responsible for ensuring the reliability, scalability, and performance of cloud-based applications and infrastructure, while managing complex workflows and automating critical operations. This is a great opportunity for a highly motivated individual to work in a collaborative environment where you'll
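For the Autosys automation this posting emphasizes, here is a hedged sketch of driving the scheduler's standard CLI from Python: querying a job with `autorep` and force-starting it with `sendevent`. The job name is a placeholder and error handling is minimal:

```python
import subprocess

def autosys_status(job_name: str) -> str:
    """Return the raw autorep report for a single job."""
    result = subprocess.run(
        ["autorep", "-J", job_name],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

def force_start(job_name: str) -> None:
    """Send a FORCE_STARTJOB event to the Autosys scheduler."""
    subprocess.run(
        ["sendevent", "-E", "FORCE_STARTJOB", "-J", job_name],
        check=True,
    )

if __name__ == "__main__":
    print(autosys_status("nightly_etl_load"))  # placeholder job name
```

Wrappers like this are how repetitive restart and status-check toil is usually folded into larger SRE automation.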

Posted 2 months ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Chennai, Tamil Nadu

Work from Office

Duration: 12 Months
Work Type: Onsite
Position Description:
We are seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will work on analyzing and manipulating large datasets, supporting the enterprise by activating data assets to support Enabling Platforms and Analytics in the Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solutions and the operationalization of data warehouses, data lakes and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions, with the appropriate combination of GCP and third-party technologies, for deployment on Google Cloud Platform.
Skills Required:
Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment
Implement methods to automate all parts of the pipeline to minimize labor in development and production
Experience analyzing complex data, organizing raw data and integrating massive datasets from multiple data sources to build subject areas and reusable data products
Experience working with architects to evaluate and productionize appropriate GCP tools for data ingestion, integration, presentation, and reporting
Experience working with all stakeholders to formulate business problems as technical data requirements, and to identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management
Proficient in Machine Learning model architecture, data pipeline interaction and metrics interpretation, including designing and deploying a pipeline with automated data lineage
Identify, develop, evaluate and summarize proofs of concept to prove out solutions; test and compare competing solutions and report a point of view on the best solution
Integration between GCP Data Catalog and Informatica EDC
Design and build production data engineering solutions that deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (a minimal load sketch follows this posting)
Skills Preferred:
Strong drive for results and ability to multi-task and work independently
Self-starter with proven innovation skills
Ability to communicate and work with cross-functional teams and all levels of management
Demonstrated commitment to quality and project timing
Demonstrated ability to document complex systems
Experience creating and executing detailed test plans
Experience Required: 3 to 5 years
Education Required: BE or equivalent
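As one concrete example of the landing step such pipelines include, here is a minimal sketch of loading CSV files from Cloud Storage into BigQuery with the official google-cloud-bigquery client; the bucket, dataset and table names are placeholders:

```python
from google.cloud import bigquery

def load_csv_to_bigquery(uri: str, table_id: str) -> None:
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,       # skip the header row
        autodetect=True,           # infer the schema from the data
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()  # block until the load completes
    table = client.get_table(table_id)
    print(f"Loaded {table.num_rows} rows into {table_id}")

if __name__ == "__main__":
    load_csv_to_bigquery(
        "gs://my-landing-bucket/sales/*.csv",   # placeholder bucket
        "my-project.analytics.sales_raw",       # placeholder table
    )
```

In a production pipeline a step like this would typically be one task in a Cloud Composer (Airflow) DAG rather than a standalone script.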

Posted 2 months ago

Apply

4.0 - 7.0 years

8 - 14 Lacs

Noida

Hybrid

Data Engineer (L3) || GCP Certified
Employment Type: Full-Time
Work Mode: In-office / Hybrid
Notice: Immediate joiners
As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.
Required Skills:
Design, develop, and support data pipelines and related data products and platforms
Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms
Perform application impact assessments, requirements reviews, and work estimates
Develop test strategies and site reliability engineering measures for data products and solutions
Participate in agile development "scrums" and solution reviews
Mentor junior Data Engineers
Lead the resolution of critical operations issues, including post-implementation reviews
Perform technical data stewardship tasks, including metadata management, security, and privacy by design
Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies
Demonstrate SQL and database proficiency in various data engineering tasks
Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect (see the DAG sketch after this posting)
Develop Unix scripts to support various data operations
Model data to support business intelligence and analytics initiatives
Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation
Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion and Dataproc (good to have)
Keywords: data pipelines, agile development, scrums, GCP data technologies, Python, DAGs, Control-M, Apache Airflow, data solution architecture
Qualifications:
Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field
4+ years of data engineering experience
2 years of data solution architecture and design experience
GCP Certified Data Engineer (preferred)
Job Type: Full-time
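For the DAG automation the posting describes, here is a hedged sketch of a daily Airflow workflow that runs extract, transform, and load tasks in sequence; the task bodies are stubs and the dag_id is a placeholder:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    print("pull data from the source system")

def transform(**context):
    print("clean and reshape the extracted data")

def load(**context):
    print("write the transformed data to the warehouse")

with DAG(
    dag_id="daily_sales_pipeline",   # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    # The bitshift operator expresses task ordering: extract -> transform -> load.
    t1 >> t2 >> t3
```

The same pattern transfers to Prefect or Control-M; only the scheduling and dependency syntax changes.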

Posted 2 months ago

Apply