5.0 - 10.0 years
8 - 15 Lacs
Kolkata
Hybrid
Engineering Manager (Python & AI)

Role Overview
We are looking for an Engineering Manager with 5-10 years of experience to lead a team responsible for building scalable AI-driven systems in Python. The role involves team leadership, system development, and hands-on contributions in areas such as server management, CI/CD automation, scraping, ETL processes, and AI tools/model development.

Key Responsibilities
Team & Project Management:
- Lead and manage a team of Python engineers.
- Plan, assign, and monitor tasks to ensure timely project delivery.
- Conduct code reviews and enforce coding standards.
System Development & Operations:
- Oversee development and deployment of server-based applications.
- Design and implement CI/CD pipelines and automation workflows.
- Manage infrastructure for reliable and scalable AI systems.
Data Engineering & AI:
- Develop and maintain web scraping pipelines for structured/unstructured data collection.
- Design and manage ETL workflows for data ingestion and transformation.
- Guide development, training, and deployment of AI/ML models.
Quality & Performance:
- Ensure adherence to best practices in testing, monitoring, and version control.
- Optimize system performance, scalability, and maintainability.
- Collaborate with stakeholders to align technical execution with business needs.

Required Skills & Experience
- Experience: 5-10 years in software engineering, including 3+ years in a leadership role.
- Programming: Strong proficiency in Python and its libraries/frameworks.
- Server & DevOps: Experience with server-side development; hands-on expertise in CI/CD pipelines and automation (Jenkins, GitHub Actions, GitLab CI, etc.).
- Data & AI: Proficiency in scraping frameworks (Scrapy, Selenium, etc.); experience in ETL pipelines and data processing; knowledge of AI/ML frameworks (PyTorch, TensorFlow, scikit-learn).
- Other: Familiarity with cloud platforms (AWS, GCP, Azure); strong understanding of system design and distributed systems.
Preferred Skills
- Knowledge of MLOps tools for deploying, monitoring, and scaling ML models
- Experience with containerization (Docker, Kubernetes)
- Exposure to microservices architecture
- SaaS-specific scaling and performance optimization
- Familiarity with security practices for SaaS AI systems
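The role above leans heavily on web scraping and ETL in Python. As a hedged, stdlib-only sketch of the scraping side (in practice a framework like Scrapy or Selenium would be used, as the posting says; the `job-title` markup here is a hypothetical page structure invented for the example):

```python
from html.parser import HTMLParser

class TitleScraper(HTMLParser):
    """Collects the text of every <h2 class="job-title"> element."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the opening tag
        if tag == "h2" and ("class", "job-title") in attrs:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.titles.append(data.strip())

def scrape_titles(html: str) -> list:
    """Parse an HTML string and return the job titles found in it."""
    parser = TitleScraper()
    parser.feed(html)
    return parser.titles
```

The same handler structure scales to the structured/unstructured collection the posting describes: each `handle_*` hook becomes an extraction rule, and the collected records feed a downstream ETL step.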
Posted 2 days ago
2.0 - 5.0 years
2 - 6 Lacs
Kolkata
Hybrid
Required Skills
- Strong proficiency in Python (3.x) and Django (2.x/3.x/4.x)
- Hands-on experience with Django REST Framework (DRF)
- Expertise in relational databases like PostgreSQL or MySQL
- Proficiency with Git and Bitbucket
- Solid understanding of RESTful API design and integration
- Experience in domain pointing and hosting setup on AWS or GCP
- Deployment knowledge on EC2, GCP Compute Engine, etc.
- SSL certificate installation and configuration
- Familiarity with CI/CD pipelines and automation (GitHub Actions, Bitbucket Pipelines, GitLab CI)
- Basic usage of Docker for development and containerization
- Ability to independently troubleshoot server/deployment issues
- Experience managing cloud resources like S3, Load Balancers, and IAM roles
- Proficiency in scraping frameworks (Scrapy, Selenium, etc.)
- Experience in ETL pipelines and data processing
- Experience in AI/ML frameworks (PyTorch, TensorFlow, scikit-learn)

Preferred Skills
- Experience with Celery and Redis/RabbitMQ for asynchronous task handling
- Familiarity with front-end frameworks like React or Vue.js
- Exposure to Cloudflare or similar CDN/DNS tools
- Experience with monitoring tools: Prometheus, Grafana, Sentry, or CloudWatch

Why Join Us?
- Work on impactful and modern web solutions
- Growth opportunities across technologies and cloud platforms
- Collaborative, inclusive, and innovation-friendly work environment
- Exposure to challenging and rewarding projects
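The posting above centres on RESTful API design with Django REST Framework. Django is too heavy to sketch inline, so here is a hedged, stdlib-only illustration of the same RESTful idea, a GET endpoint that returns JSON or 404, written as a bare WSGI callable; the `JOBS` data and URL scheme are invented for the example:

```python
import json

# Hypothetical in-memory store standing in for a PostgreSQL/MySQL table.
JOBS = {"1": {"id": "1", "title": "Python Developer", "location": "Kolkata"}}

def app(environ, start_response):
    """Minimal RESTful handler: GET /jobs/<id> returns JSON, else 404."""
    path = environ.get("PATH_INFO", "")
    if environ.get("REQUEST_METHOD") == "GET" and path.startswith("/jobs/"):
        job = JOBS.get(path.rsplit("/", 1)[-1])
        if job:
            start_response("200 OK", [("Content-Type", "application/json")])
            return [json.dumps(job).encode()]
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [json.dumps({"detail": "not found"}).encode()]
```

In DRF the same resource would be a `ModelViewSet` plus a serializer and router entry; the sketch only shows the request-routing and status-code discipline that "RESTful API design" in the requirements refers to.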
Posted 2 days ago
2.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Senior Data Engineer with 5-8 years of IT experience, including 2-3 years focused on GCP data services, you will be a valuable addition to our dynamic data and analytics team. Your primary responsibility will be to design, develop, and implement robust and insightful data-intensive solutions using GCP Cloud services. Your role will entail a deep understanding of data engineering, proficiency in SQL, and extensive experience with various GCP services such as BigQuery, DataFlow, DataStream, Pub/Sub, Dataproc, Cloud Storage, and other key GCP services for data pipeline orchestration. You will be instrumental in the construction of a GCP-native cloud data platform.

Key Responsibilities:
- Lead and contribute to the development, deployment, and lifecycle management of applications on GCP, utilizing services like Compute Engine, Kubernetes Engine (GKE), Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud SQL, Cloud Storage, and more.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 5-8 years of overall IT experience, with hands-on experience in designing and developing data applications on GCP Cloud.
- In-depth expertise in GCP services and architectures, including Compute, Storage & Databases, Data & Analytics, and Operations & Monitoring.
- Proven ability to translate business requirements into technical solutions.
- Strong analytical, problem-solving, and critical thinking skills.
- Effective communication and interpersonal skills for collaboration with technical and non-technical stakeholders.
- Experience in Agile development methodology.
- Ability to work independently, manage multiple priorities, and meet deadlines.

Preferred Skills (Nice to Have):
- Experience with other hyperscalers.
- Proficiency in Python or other scripting languages for data manipulation and automation.
If you are a highly skilled and experienced Data Engineer with a passion for leveraging GCP data services to drive innovation, we invite you to apply for this exciting opportunity in Gurugram or Hyderabad.
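The posting above is built around data pipelines on BigQuery, DataFlow, and Pub/Sub. Those client libraries cannot run here, so as a hedged sketch, this pure-Python generator chain shows only the extract-transform-load shape such a pipeline takes; the record fields (`name`, `amount`) are invented for the example:

```python
def extract(rows):
    """Extract: yield raw records (stand-in for a BigQuery read or Pub/Sub pull)."""
    yield from rows

def transform(records):
    """Transform: normalise names and drop records missing an amount."""
    for rec in records:
        if rec.get("amount") is None:
            continue  # dead-letter in a real pipeline, silently dropped here
        yield {"name": rec["name"].strip().title(),
               "amount": round(float(rec["amount"]), 2)}

def load(records, sink):
    """Load: append to an in-memory sink (stand-in for a BigQuery insert)."""
    for rec in records:
        sink.append(rec)
    return sink

def run_pipeline(rows):
    """Compose the stages; generators keep memory flat for large inputs."""
    return load(transform(extract(rows)), [])
```

Because each stage is a generator, records stream through one at a time, which is the same property Dataflow's `PCollection` processing gives you at cluster scale.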
Posted 6 days ago
4.0 - 7.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Key Responsibilities:

Azure Responsibilities:
- Automate deployment of Azure IaaS and PaaS services using Terraform.
- Develop and maintain modular, reusable Terraform modules for Azure infrastructure.
- Implement and manage CI/CD pipelines using GitHub Actions for Azure deployments.
- Automate operational tasks using PowerShell, Python, and Bash.
- Deploy and manage services such as Azure VMs, App Services, Azure App Gateway, Azure Functions, SQL DB, Azure SQL, Postgres DB, Mongo DB, AKS, Key Vault, APIM, and Azure OpenAI.
- Integrate Azure OpenAI capabilities into cloud-native applications and workflows.

GCP Responsibilities:
- Automate deployment of GCP IaaS and PaaS services using Terraform.
- Build and maintain Terraform modules/libraries for scalable GCP infrastructure.
- Deploy and manage services like Cloud Run, Compute Engine, Cloud Functions, App Engine, GKE, BigQuery, and Cloud Storage.
- Integrate GCP services into CI/CD pipelines using GitHub Actions.
- Automate infrastructure and service provisioning using scripting languages.

Required Skills & Qualifications:
- 3-5 years of experience with Azure infrastructure (IaaS & PaaS) automation and deployment.
- 3-5 years of experience with GCP infrastructure (IaaS & PaaS) automation and deployment.
- Proficiency in Terraform, including module/library development.
- Experience with GitHub Actions or similar CI/CD tools.
- Scripting skills in PowerShell, Python, and Bash.
- Hands-on experience with Azure API Management (APIM) and GCP-native services.
- Understanding of cloud networking, security, and identity management.
- Strong problem-solving and communication skills.
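The role above pairs Terraform with GitHub Actions pipelines. Workflows are written in YAML, which the standard library cannot emit, so this hedged sketch builds the same workflow structure as a plain dict and validates the keys every workflow needs; `actions/checkout` and `hashicorp/setup-terraform` are real published actions, but the pinned versions, job name, and directory layout are assumptions:

```python
def terraform_workflow(working_dir):
    """Dict form of a minimal GitHub Actions workflow that runs
    `terraform plan` on pull requests against main."""
    return {
        "name": "terraform-plan",
        "on": {"pull_request": {"branches": ["main"]}},
        "jobs": {
            "plan": {
                "runs-on": "ubuntu-latest",
                "steps": [
                    {"uses": "actions/checkout@v4"},
                    {"uses": "hashicorp/setup-terraform@v3"},
                    {"run": f"terraform -chdir={working_dir} init -input=false"},
                    {"run": f"terraform -chdir={working_dir} plan -input=false"},
                ],
            }
        },
    }

def validate_workflow(wf):
    """Check the top-level keys and per-job fields Actions requires."""
    return all(k in wf for k in ("name", "on", "jobs")) and all(
        "runs-on" in job and job.get("steps") for job in wf["jobs"].values()
    )
```

Serialising the dict with a YAML library (e.g. PyYAML, not stdlib) would produce a file you could drop into `.github/workflows/`.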
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
The GCP Architect is responsible for designing and implementing scalable, robust, and secure cloud solutions on Google Cloud Platform. You need to have a deep understanding of cloud architecture and services, strong problem-solving skills, and the ability to lead cross-functional teams in delivering complex cloud projects. You should possess excellent leadership, communication, and interpersonal skills. Strong problem-solving abilities, attention to detail, and the capability to work in a fast-paced, dynamic environment while managing multiple priorities are essential.

In terms of technical skills, you must have a strong understanding of GCP services such as Compute Engine, App Engine, Kubernetes Engine, Cloud Functions, and BigQuery. Proficiency in infrastructure-as-code tools like Terraform, as well as configuration tools such as Chef, Puppet, Ansible, and Salt, is required. Experience in deploying and managing applications with Kubernetes (GKE) and Docker, as well as serverless architectures, is crucial. Knowledge of API management, CI/CD pipelines, DevOps practices, networking, security, and database management in a cloud environment is necessary. You should also have experience in building ETL pipelines using Dataflow, Dataproc, and BigQuery, as well as using Pub/Sub, Dataflow, and other real-time data processing services. Experience in implementing backup solutions and disaster recovery plans, designing and deploying applications with high availability and fault tolerance, and designing solutions that span multiple cloud providers and on-premises infrastructure is expected.

Key Responsibilities:

Architectural Leadership:
- Lead the design and development of cloud solutions on GCP.
- Define and maintain the architectural vision to ensure alignment with business objectives.
- Evaluate and recommend tools, technologies, and processes for the highest quality solutions.

Solution Design:
- Design scalable, secure, and cost-effective cloud architectures.
- Develop proof-of-concept projects to validate proposed solutions.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.

Implementation and Migration:
- Oversee the implementation of multi-cloud solutions while meeting performance and reliability targets.
- Lead heterogeneous cloud migration projects, ensuring minimal downtime and seamless transition with cloud-agnostic tools as well as third-party toolsets.
- Provide guidance and best practices for deploying and managing applications in GCP.

Team Leadership and Collaboration:
- Ensure no customer escalations.
- Mentor and guide technical teams, fostering a culture of innovation and continuous improvement.
- Collaborate with DevOps, Security, and Development teams to integrate cloud solutions.
- Conduct training sessions and workshops to upskill teams on GCP services and best practices.

Security and Compliance:
- Ensure cloud solutions comply with security and regulatory requirements.
- Implement and maintain security best practices, including identity and access management, data protection, and network security.

Continuous Improvement:
- Stay updated with the latest GCP services, features, and industry trends.
- Continuously evaluate and improve cloud processes and architectures to enhance performance, reliability, and cost-efficiency.
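The posting above asks for designs with high availability and fault tolerance. One building block behind that requirement is retrying transient failures with exponential backoff, sketched here in plain Python as a hedged illustration (the delays and the retryable exception set are assumptions, and the injectable `sleep` exists only so the pattern can be tested without waiting):

```python
import time

def with_retries(call, max_attempts=4, base_delay=0.5,
                 retryable=(TimeoutError,), sleep=time.sleep):
    """Retry a flaky zero-argument callable with exponential backoff,
    a common pattern for transient cloud API errors."""
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except retryable:
            if attempt == max_attempts:
                raise  # exhausted: surface the last error to the caller
            sleep(base_delay * 2 ** (attempt - 1))  # 0.5s, 1s, 2s, ...
```

Production GCP clients already ship retry policies (and adding jitter avoids thundering herds), but the control flow is the same as this sketch.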
Posted 1 week ago
6.0 - 10.0 years
20 - 22 Lacs
Indore, Pune, Chennai
Work from Office
We are looking for a skilled and experienced Data Engineer to support data platform modernization initiatives, with a focus on advisory and implementation within the Google Cloud Platform (GCP). The ideal candidate will have strong expertise in working with relational and NoSQL databases such as SQL Server, Oracle, and optionally MongoDB, along with hands-on experience in data pipeline development, cloud migration, and performance optimization. In this role, you will be responsible for advising and supporting data engineering efforts on GCP, including services like Compute Engine, Cloud SQL, and BigQuery. You will play a key role in designing scalable data solutions, defining backup and disaster recovery strategies, and ensuring observability across data systems. A significant part of the role involves providing guidance on GCVE (Google Cloud VMware Engine) environments and creating robust, cloud-native data migration plans. You will collaborate with cross-functional teams, including cloud architects and business stakeholders, to ensure data solutions align with business goals. Your input will help shape data architecture, optimize storage and compute costs, and ensure data integrity, security, and high availability.
Posted 1 week ago
6.0 - 10.0 years
20 - 22 Lacs
Hyderabad, Ahmedabad, Bengaluru
Work from Office
We are looking for a skilled and experienced Data Engineer to support data platform modernization initiatives, with a focus on advisory and implementation within the Google Cloud Platform (GCP). The ideal candidate will have strong expertise in working with relational and NoSQL databases such as SQL Server, Oracle, and optionally MongoDB, along with hands-on experience in data pipeline development, cloud migration, and performance optimization. In this role, you will be responsible for advising and supporting data engineering efforts on GCP, including services like Compute Engine, Cloud SQL, and BigQuery. You will play a key role in designing scalable data solutions, defining backup and disaster recovery strategies, and ensuring observability across data systems. A significant part of the role involves providing guidance on GCVE (Google Cloud VMware Engine) environments and creating robust, cloud-native data migration plans. You will collaborate with cross-functional teams, including cloud architects and business stakeholders, to ensure data solutions align with business goals. Your input will help shape data architecture, optimize storage and compute costs, and ensure data integrity, security, and high availability.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
This role is for a GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices. You will work on analysing and manipulating large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics in the GCP. You will be responsible for designing the transformation and modernization on GCP. Experience with large-scale solutions and operationalizing of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deploying on the Google Cloud Platform.

Responsibilities
- Develop technical solutions for Data Engineering and work between 1 PM and 10 PM IST to enable more overlap time with European and North American counterparts. This role will work closely with teams in the US as well as Europe to ensure robust, integrated migration aligned with Global Data Engineering patterns and standards.
- Design and deploy data pipelines with automated data lineage.
- Develop reusable Data Engineering patterns.
- Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, DataFlow, Pub/Sub, BigTable, Data Fusion, DataProc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- Ensure timely migration of the Ford Credit Europe (FCE) Teradata warehouse to GCP and enable Teradata platform decommissioning by end 2025, with a strong focus on ensuring continued, robust, and accurate Regulatory Reporting capability.
Position Opportunities
The Data Engineer role within FC Data Engineering supports the following opportunities for successful individuals:
- Key player in a high-priority program to unlock the potential of Data Engineering Products and Services and secure operational resilience for Ford Credit Europe.
- Explore and implement leading-edge technologies, tooling, and software development best practices.
- Experience of managing data warehousing and product delivery within a financially regulated environment.
- Experience of collaborative development practices within an open-plan, team-designed environment.
- Experience of working with third-party suppliers / supplier management.
- Continued personal and professional development with support and encouragement for further certification.

Qualifications

Essential:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
- 5+ years of SQL development experience.
- 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.
- Strong understanding of key GCP services, especially those related to data processing (batch/real time), leveraging Terraform, BigQuery, Dataflow, DataFusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage services including Cloud Storage, Bigtable, and Cloud Spanner.
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, and diverse team.
- Experience developing with microservice architecture from a container orchestration framework.
- Designing pipelines and architectures for data processing.
- Strong evidence of self-motivation to continuously develop own engineering skills and those of the team.
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support.
- Evidence of a proactive mindset to problem solving and willingness to take the initiative.
- Strong prioritization, coordination, organizational and communication skills, and a proven ability to balance workload and competing demands to meet deadlines.

Desired:
- Professional Certification in GCP (e.g., Professional Data Engineer).
- Data engineering or development experience gained in a regulated, financial environment.
- Experience with Teradata to GCP migrations is a plus.
- Strong expertise in SQL and experience with programming languages such as Python, Java, and/or Apache Beam.
- Experience of coaching and mentoring Data Engineers.
- Experience with data security, governance, and compliance best practices in the cloud.
- An understanding of current architecture standards and digital platform services strategy.
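The responsibilities above call for pipelines with automated data lineage. As a hedged toy sketch of what "automated" means there, a decorator can record which named dataset each step consumed and produced as a side effect of running the step; the dataset and step names (`raw_loans`, `clean_loans`, `loan_totals`) are invented for the example:

```python
import functools

LINEAGE = []  # (step_name, input_datasets, output_dataset) records

def track_lineage(output_name, input_names):
    """Decorator that logs a pipeline step's declared inputs and output."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            LINEAGE.append((fn.__name__, tuple(input_names), output_name))
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@track_lineage("clean_loans", ["raw_loans"])
def clean(rows):
    """Drop rows with non-positive amounts."""
    return [r for r in rows if r.get("amount", 0) > 0]

@track_lineage("loan_totals", ["clean_loans"])
def total(rows):
    """Aggregate the cleaned rows."""
    return sum(r["amount"] for r in rows)
```

Real systems (e.g. Dataplex or OpenLineage integrations) capture the same edges automatically from job metadata rather than from decorators, but the graph they build has this shape: step, inputs, output.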
Posted 1 week ago
5.0 - 7.0 years
20 - 25 Lacs
Chennai
Work from Office
Position Description:
Representing the Ford Credit (FC) Data Engineering Organization as a Google Cloud Platform (GCP) Data Engineer, specializing in migration and transformation, you will be a developer on a global team building a complex data warehouse in the Google Cloud Platform. This role involves designing, implementing, and optimizing data pipelines, ensuring data integrity during migration, and leveraging GCP services to enhance data transformation processes for scalability and efficiency.

This role is for a GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices. You will work on analyzing and manipulating large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics in the GCP. You will be responsible for designing the transformation and modernization on GCP. Experience with large-scale solutions and operationalizing of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deploying on the Google Cloud Platform.

Experience Required:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
- 5+ years of SQL development experience.
- 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.
- Strong understanding and experience of key GCP services, especially those related to data processing (batch/real time), leveraging Terraform, BigQuery, Dataflow, DataFusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage services including Cloud Storage, Bigtable, and Cloud Spanner.
- Experience developing with microservice architecture from a container orchestration framework.
- Designing pipelines and architectures for data processing.
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, and diverse team.
- Strong evidence of self-motivation to continuously develop own engineering skills and those of the team.
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support.
- Evidence of a proactive mindset to problem solving and willingness to take the initiative.
- Strong prioritization, coordination, organizational and communication skills, and a proven ability to balance workload and competing demands to meet deadlines.
Posted 2 weeks ago
6.0 - 11.0 years
8 - 11 Lacs
Pune, Bengaluru, Delhi/NCR
Work from Office
Job Description:
- Should have worked on Google Cloud services extensively
- Experience designing highly available and scalable systems
- Familiar with Google Cloud BigQuery, Compute Engine, App Engine, Storage, Cloud Spanner, Dataflow, and Cloud IAM
- Exposure to Apache Kafka is an advantage
- Experience with DevOps and the automation tool Terraform
- Familiar with containers: Google Kubernetes Engine, Docker, Helm
- Ability to effectively communicate complex technical concepts to a broad range of audiences
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
IELEKTRON TECHNOLOGIES PVT. LTD. is an engineering solutions firm that specializes in Intelligent Mobility solutions. The company focuses on developing Big Data and Computer Vision solutions, along with software and hardware solutions in the automotive industry. IELEKTRON utilizes cutting-edge technologies such as statistical modeling, deep learning, machine learning, and computer vision to deliver innovative solutions.

As a Solution Architect - GCP at IELEKTRON TECHNOLOGIES PVT. LTD. in Bengaluru, you will play a key role in designing and implementing solutions, offering consulting services, developing software, integrating systems, and optimizing business processes to align with the company's goals. This is a full-time on-site position that requires strong expertise in Google Cloud Platform (GCP) and related services.

Key Requirements:
- 6+ years of experience in cloud platform expertise, particularly with Google Cloud Platform (GCP) services like Compute Engine, Kubernetes Engine (GKE), and Cloud Storage.
- Proficiency in IAM, VPC, and networking concepts within GCP.
- Hands-on experience in creating and managing Kubernetes pod specifications.
- Ability to deploy and manage Flyte workflows on Kubernetes effectively.
- Strong knowledge of Docker and containerized application workflows.
- Proficient in Python for developing Flyte tasks and workflows.
- Capable of writing automation scripts for deployment and configuration tasks.

If you are a skilled professional with a solid background in cloud architecture and a passion for leveraging advanced technologies to drive business success, we encourage you to apply for this exciting opportunity at IELEKTRON TECHNOLOGIES PVT. LTD.
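The requirements above include creating and managing Kubernetes pod specifications. As a hedged sketch, the function below builds a minimal Pod manifest as a dict, the structure you would serialise to YAML for `kubectl apply` or hand to a task's pod template; the names, image, and resource values are placeholders, not values from the posting:

```python
def pod_spec(name, image, cpu="500m", memory="512Mi"):
    """Build a minimal Kubernetes Pod manifest (apiVersion v1) as a dict."""
    return {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {"name": name, "labels": {"app": name}},
        "spec": {
            "containers": [{
                "name": name,
                "image": image,
                # requests reserve capacity; limits cap what the container may use
                "resources": {
                    "requests": {"cpu": cpu, "memory": memory},
                    "limits": {"cpu": cpu, "memory": memory},
                },
            }],
            "restartPolicy": "Never",  # batch-style task, not a long-running service
        },
    }
```

Setting requests equal to limits gives the pod the Guaranteed QoS class, a common choice for predictable batch workloads of the kind Flyte schedules.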
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
The GCP Services Architect is responsible for designing, implementing, and overseeing cloud infrastructure and solutions on Google Cloud Platform (GCP). Collaborating with cross-functional teams, you will assess requirements, develop cloud architectures, and ensure best practices for security, scalability, and performance. You will provide technical expertise and leadership throughout the lifecycle of cloud projects, from initial planning to deployment and support.

Key Responsibilities
- Cloud Architecture Design: Design and implement scalable, resilient, and secure cloud architectures on GCP. Develop infrastructure as code (IaC) using tools like Terraform or Google Deployment Manager. Create architectural diagrams and documentation for solutions and present them to stakeholders.
- Project Management & Collaboration: Collaborate with multidisciplinary teams, including developers, operations, and security, to ensure successful cloud deployments. Work with business stakeholders to understand requirements and translate them into cloud architecture solutions. Lead technical teams in the implementation of GCP services and solutions.
- GCP Services Expertise: Utilize core GCP services, including Compute Engine, Kubernetes Engine, Cloud Storage, BigQuery, Pub/Sub, and other GCP technologies. Integrate GCP services with on-premises and other cloud systems as needed.
- Security and Compliance: Ensure compliance with industry standards and best practices, including security, identity, and access management. Implement and manage network security configurations and other security measures.
- Performance and Cost Optimization: Optimize cloud resource utilization and architecture for cost efficiency. Utilize GCP monitoring and logging tools to analyze and improve system performance.
- Continuous Improvement and Innovation: Stay updated on GCP offerings, best practices, and industry trends. Propose and implement new technologies and practices to improve cloud infrastructure and processes.

Skills: Google Cloud Platform (GCP), Compute Engine, Kubernetes Engine, Cloud Storage, BigQuery, Pub/Sub, Terraform, Google Deployment Manager, infrastructure as code (IaC), network security, security and compliance, performance optimization, cost optimization
Posted 2 weeks ago
6.0 - 9.0 years
4 - 7 Lacs
Pune, Chennai, Bengaluru
Work from Office
Google Data Engineer
Bangalore/Chennai/Hyderabad/Pune/Delhi
- Should have worked on Google Cloud services extensively
- Experience designing highly available and scalable systems
- Familiar with Google Cloud BigQuery, Compute Engine, App Engine, Storage, Cloud Spanner, Dataflow, and Cloud IAM
- Exposure to Apache Kafka is an advantage
- Experience with DevOps and the automation tool Terraform
- Familiar with containers: Google Kubernetes Engine, Docker, Helm
- Ability to effectively communicate complex technical concepts to a broad range of audiences
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be responsible for designing and implementing cloud-native and hybrid solutions using GCP services such as Compute Engine, Kubernetes (GKE), Cloud Functions, BigQuery, Pub/Sub, Cloud SQL, and Cloud Storage. Additionally, you will define cloud adoption strategies, migration plans, and best practices for performance, security, and scalability. You will also be required to implement and manage Terraform, Cloud Deployment Manager, or Ansible for automated infrastructure provisioning.

The ideal candidate should have expertise as a GCP data architect with network-domain skills in GCP (Dataproc, Cloud Composer, Dataflow, BigQuery), Python, Spark, and PySpark, and hands-on experience in the network domain, specifically in 4G, 5G, LTE, and RAN technologies. Knowledge and work experience in these areas are preferred. Moreover, you should be well versed in ETL architecture and data pipeline management.

This is a full-time position with a day shift schedule from Monday to Friday. The work location is remote, and the application deadline is 15/04/2025.
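The role above combines PySpark with network-domain (4G/5G) data. Without a Spark cluster to hand, this hedged, pure-Python sketch shows the group-and-average shape that a PySpark `df.groupBy(key).avg(value)` would express; the field names (`cell_id`, `throughput_mbps`) are hypothetical KPI columns invented for the example:

```python
from collections import defaultdict

def aggregate_kpis(records, key="cell_id", value="throughput_mbps"):
    """Average `value` per `key`: plain-Python stand-in for a
    PySpark groupBy/avg over radio-network KPI records."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for rec in records:
        sums[rec[key]] += rec[value]
        counts[rec[key]] += 1
    return {k: round(sums[k] / counts[k], 2) for k in sums}
```

At RAN-telemetry volumes the same logic would be a distributed shuffle in Spark or a `GROUP BY` in BigQuery; the sketch only fixes the semantics.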
Posted 2 weeks ago
5.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior GCP Developer, you will be responsible for designing, developing, and deploying scalable, secure, and efficient cloud-based applications on the Google Cloud Platform (GCP). Your primary tasks will include using languages like Java, Python, or Go to create cloud applications on GCP; developing and updating technical documentation; collaborating with various teams to address project needs; ensuring adherence to security and regulatory standards; addressing technical challenges; staying informed about the latest GCP features and services; and mentoring junior team members while offering technical support.

You should possess at least 5-12 years of experience in developing cloud applications on GCP; a comprehensive understanding of GCP services such as Compute Engine, App Engine, Cloud Storage, Cloud SQL, and Cloud Datastore; proficiency in Java, Python, or Go; familiarity with GCP security, compliance, and regulatory protocols; experience with Agile development practices and Git version control; and exceptional problem-solving abilities with meticulous attention to detail. Your expertise in Java, Python, Go, GCP security, Agile development, Git, and cloud services such as Compute Engine, App Engine, Cloud SQL, Cloud Datastore, and Cloud Storage will be instrumental in excelling in this role.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a full-time GCP DevOps Engineer at our company, you will be responsible for leading a team and utilizing your expertise in Google Cloud Platform (GCP). With your GCP Professional Cloud Architect or DevOps Engineer certification, along with a minimum of 5 years of experience, you will play a key role in optimizing our cloud infrastructure.

Your hands-on experience with core GCP services such as Compute Engine, Cloud Storage, VPC, IAM, BigQuery, Cloud SQL, Cloud Functions, and the Operations Suite, together with Terraform, will be crucial in ensuring the efficiency and reliability of our systems. Additionally, your proficiency in CI/CD tools like Jenkins, GitLab CI/CD, and Cloud Build will help streamline our development processes.

You should have a strong background in containerization technologies including Docker, Kubernetes (GKE), and Helm, and be adept at scripting in languages such as Python, Bash, or Go for automation purposes. Your familiarity with monitoring and observability tools like Prometheus, Grafana, and Cloud Monitoring will allow you to maintain the health and performance of our infrastructure.

Furthermore, your proven knowledge of cloud security, compliance, and cost optimization strategies will be essential in safeguarding our systems and maximizing cost-efficiency. Experience with API Gateway / Apigee, service mesh, and microservices architecture will be an added advantage in this role.

If you are a detail-oriented individual with a passion for GCP DevOps and a track record of successful team leadership, we invite you to apply for this exciting opportunity in either Hyderabad or Indore.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Software Engineer at HSBC, you will play a crucial role in designing, implementing, and managing scalable, secure, and reliable cloud infrastructure on the Google Cloud Platform (GCP). Your responsibilities will include collaborating with development teams to optimize applications for cloud deployment, setting up and configuring various GCP services, and ensuring compliance with security policies and best practices.

To excel in this role, you should have proven experience as a Cloud Engineer with a strong focus on GCP. Your expertise should include a deep understanding of cloud architecture and services such as Compute Engine, Kubernetes Engine, Cloud Storage, and BigQuery. Additionally, you will be expected to automate infrastructure provisioning using tools like Terraform, Google Cloud Deployment Manager, or similar, and implement CI/CD pipelines for efficient software delivery.

The successful candidate will possess proficiency in scripting languages like Python and Bash, as well as the ability to troubleshoot and resolve issues related to cloud infrastructure and services. Google Cloud certifications, particularly the Google Cloud Professional Cloud Architect certification, are considered a plus. Staying updated with the latest GCP features, services, and best practices is essential for this role. Any knowledge of other cloud platforms like AWS or Azure will be an added advantage.

If you are a skilled and experienced Google Cloud Engineer seeking a career where your expertise is valued, HSBC offers an inclusive and diverse environment where employees are respected, valued, and provided with opportunities for continuous professional development and growth. Join us at HSBC and realize your ambitions in a workplace that prioritizes employee well-being and career advancement. For more information about career opportunities at HSBC, visit www.hsbc.com/careers.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
One of our prestigious clients, a TOP MNC Giant with a global presence, is currently seeking a Lead Enterprise Architect to join their team in Pune, Mumbai, or Bangalore.

**Qualifications and Certifications:**

**Education:**
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.

**Experience:**
- A minimum of 10+ years of experience in data engineering, with at least 4 years of hands-on experience with GCP cloud platforms.
- Proven track record in designing and implementing data workflows using GCP services such as BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer.

**Certifications:**
- Google Cloud Professional Data Engineer certification is preferred.

**Key Skills:**

**Mandatory Skills:**
- Advanced proficiency in Python for developing data pipelines and automation.
- Strong SQL skills for querying, transforming, and analyzing large datasets.
- Hands-on experience with GCP services including Cloud Storage, Dataflow, Cloud Pub/Sub, Cloud SQL, BigQuery, Dataform, Compute Engine, and Kubernetes Engine (GKE).
- Familiarity with CI/CD tools like Jenkins, GitHub, or Bitbucket.
- Proficiency in Docker, Kubernetes, Terraform, or Ansible for containerization, orchestration, and infrastructure as code (IaC).
- Knowledge of workflow orchestration tools such as Apache Airflow or Cloud Composer.
- Strong understanding of Agile/Scrum methodologies.

**Nice-to-Have Skills:**
- Experience with other cloud platforms like AWS or Azure.
- Familiarity with data visualization tools such as Power BI, Looker, or Tableau.
- Understanding of machine learning workflows and their integration with data pipelines.

**Soft Skills:**
- Strong problem-solving and critical-thinking abilities.
- Excellent communication skills to effectively collaborate with both technical and non-technical stakeholders.
- Proactive attitude towards innovation and continuous learning.
- Ability to work independently and as part of a collaborative team.

If you are interested in this opportunity, please reply with your updated CV and provide the following details:
- Total experience:
- Relevant experience in data engineering:
- Relevant experience in GCP cloud platforms:
- Relevant experience as an Enterprise Architect:
- Availability to join ASAP:
- Preferred location (Pune / Mumbai / Bangalore):

We will contact you once we receive your CV along with the above details. Thank you, Kavita.A
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
As a skilled Full-Stack PHP Developer, you will build and maintain PHP-based applications at our on-site location in Mumbai, India. Your key responsibilities include PHP backend development, SQL database management, GCP integration, CI/CD pipeline setup, GitHub version control, code optimization, and bug fixes.

Requirements:
- Hands-on experience in PHP, with a preference for the Laravel or Symfony framework.
- Strong proficiency in SQL databases such as MySQL or PostgreSQL, and in Git/GitHub.
- Experience with GCP services such as Compute Engine, Cloud Storage, and App Engine is highly preferred.
- Expertise in CI/CD tools like Jenkins, GitHub Actions, and Google Cloud Build.
- Knowledge of RESTful APIs and frontend technologies (HTML, CSS, JavaScript) is advantageous.

Your day-to-day tasks will involve developing and maintaining PHP backend applications, designing and implementing database schemas, integrating GCP services into the cloud architecture, setting up CI/CD pipelines for automated deployments, managing version control in GitHub repositories, optimizing code for performance and security, and collaborating on bug fixes and feature enhancements. If you are passionate about full-stack PHP development and have the required skills and experience, we encourage you to apply for this opportunity to join our team in Mumbai, India.
Posted 1 month ago
6.0 - 10.0 years
1 - 1 Lacs
Chennai
Hybrid
Overview: TekWissen is a global workforce management provider operating throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From its earliest days as a pioneer of modern transportation, it has sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Hybrid

Position Description: At the client's Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational efficiencies. As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate it into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of current-state Receivables and Originations data in our data warehouse, performing impact analysis related to the client's Credit North America modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for the client's Credit. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must.
We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required: BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Cloud SQL, Compute Engine, Google Cloud Platform

Experience Required:
- GCP Data Engineer certified.
- 5+ years of designing and implementing data warehouses and ETL processes, delivering high-quality data solutions.
- 5+ years of complex SQL development experience.
- 2+ years of experience with programming languages such as Python, Java, or Apache Beam.
- 3+ years of GCP expertise as a cloud engineer, specializing in managing cloud infrastructure and taking applications to production scale.
- Hands-on experience with BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Terraform, Tekton, Cloud SQL, PostgreSQL, PySpark, Python, APIs, Cloud Build, App Engine, Apache Kafka, Pub/Sub, AI/ML, and Kubernetes.

Experience Preferred:
- In-depth understanding of GCP's underlying architecture and hands-on experience with crucial GCP services, especially those related to batch and real-time data processing, leveraging Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage including Cloud Storage.
- DevOps tools such as Tekton, GitHub, Terraform, and Docker.
- Expert in designing, optimizing, and troubleshooting complex data pipelines.
- Experience developing with microservice architecture on a container orchestration framework.
- Experience in designing pipelines and architectures for data processing.
- Passion and self-motivation to develop, experiment with, and implement state-of-the-art data engineering methods and techniques.
- Self-directed; works independently with minimal supervision and adapts to ambiguous environments.
- Evidence of a proactive problem-solving mindset and willingness to take the initiative.
- Strong prioritization, collaboration, and coordination skills, with the ability to simplify and communicate complex ideas to cross-functional teams and all levels of management.
- Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity.
- Data engineering or development experience gained in a regulated financial environment.
- Experience in coaching and mentoring data engineers.
- Experience with project management tools like Atlassian JIRA.
- Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
- Experience with data security, governance, and compliance best practices in the cloud.
- Experience with AI solutions or platforms that support AI solutions.
- Experience using data science concepts on production datasets to generate insights.

Experience Range: 5+ years
Education Required: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
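The core pipeline task this posting describes, merging historical records from a legacy platform with newly ingested data, can be sketched in plain Python. The field names ("account_id", "balance") are hypothetical; a production pipeline would express the same merge in Dataflow/Apache Beam or BigQuery SQL rather than in memory:

```python
# Minimal, illustrative merge step for an ETL pipeline: newly ingested records
# supersede legacy history for the same key. Field names are hypothetical; a
# real pipeline would run this logic in Dataflow/Beam or BigQuery SQL.

def merge_records(legacy: list, incoming: list, key: str = "account_id") -> list:
    """Combine legacy and incoming rows; incoming wins on key collisions."""
    merged = {row[key]: row for row in legacy}
    merged.update({row[key]: row for row in incoming})
    return sorted(merged.values(), key=lambda row: row[key])

legacy = [{"account_id": 1, "balance": 100},
          {"account_id": 2, "balance": 250}]
incoming = [{"account_id": 2, "balance": 300},
            {"account_id": 3, "balance": 50}]

print(merge_records(legacy, incoming))
# account 2 takes the incoming balance; accounts 1 and 3 pass through
```

Keying the merge on a stable identifier keeps the step idempotent, so re-running the pipeline over the same landed files cannot duplicate rows.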
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
As a Google Cloud Engineer at our company, you will play a crucial role in designing, building, deploying, and maintaining our cloud infrastructure and applications on Google Cloud Platform (GCP). Your collaboration with development, operations, and security teams will ensure that our cloud environment is scalable, secure, highly available, and cost-optimized. If you are enthusiastic about cloud-native technologies, automation, and overcoming intricate infrastructure challenges, we welcome you to apply.

Your responsibilities will include:
- Designing, implementing, and managing robust, scalable, and secure cloud infrastructure on GCP using Infrastructure as Code (IaC) tools like Terraform.
- Deploying, configuring, and managing core GCP services such as Compute Engine, Kubernetes Engine (GKE), Cloud SQL, Cloud Storage, Cloud Functions, BigQuery, Pub/Sub, and networking components.
- Developing and maintaining CI/CD pipelines for automated deployment and release management using various tools.
- Implementing and enforcing security best practices within the GCP environment, including IAM, network security, data encryption, and compliance adherence.
- Monitoring cloud infrastructure and application performance, identifying bottlenecks, and implementing optimizations.
- Troubleshooting and resolving complex infrastructure and application issues in production and non-production environments.
- Collaborating with development teams to ensure cloud-native deployment, scalability, and resilience of applications.
- Participating in on-call rotations for critical incident response and timely resolution of production issues.
- Creating and maintaining comprehensive documentation for cloud architecture, configurations, and operational procedures.
- Keeping up to date with new GCP services, features, and industry best practices to propose and implement improvements.
- Contributing to cost-optimization efforts by identifying and implementing efficiencies in cloud resource utilization.

We require:
- A Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- 6+ years of experience with C#, .NET Core, .NET Framework, MVC, Web API, Entity Framework, and SQL Server.
- 3+ years of experience with cloud platforms, preferably GCP, including designing and deploying cloud-native applications.
- 3+ years of experience with source code management, CI/CD pipelines, and Infrastructure as Code.
- Strong experience with JavaScript and a modern JavaScript framework, with VueJS preferred.
- Proven leadership and mentoring skills with development teams.
- Strong understanding of microservices architecture and serverless computing.
- Experience with relational databases like SQL Server and PostgreSQL.
- Excellent problem-solving, analytical, and communication skills, along with Agile/Scrum environment experience.

What can make you stand out:
- GCP cloud certification.
- UI development experience with HTML, JavaScript, Angular, and Bootstrap.
- Agile environment experience (Scrum, XP).
- Proficiency in Atlassian tools like JIRA, Confluence, and GitHub.
- Working knowledge of Python, exceptional problem-solving and analytical abilities, and strong teamwork skills.
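The monitoring responsibility described above, spotting performance problems before they become incidents, often reduces to tracking an error rate over a sliding window and alerting past a threshold. A minimal sketch in plain Python (the window size and 20% threshold are hypothetical; in production this rule would live in Cloud Monitoring or Prometheus, not application code):

```python
from collections import deque

# Illustrative sliding-window error-rate check, the core idea behind many
# alerting rules. Window size and threshold are hypothetical defaults.

class ErrorRateMonitor:
    def __init__(self, window: int = 100, threshold: float = 0.05):
        self.samples = deque(maxlen=window)  # True means the request failed
        self.threshold = threshold

    def record(self, failed: bool) -> None:
        self.samples.append(failed)

    def error_rate(self) -> float:
        return sum(self.samples) / len(self.samples) if self.samples else 0.0

    def should_alert(self) -> bool:
        # Only alert once the window is full, so early samples can't
        # trigger a spurious page.
        return (len(self.samples) == self.samples.maxlen
                and self.error_rate() > self.threshold)

monitor = ErrorRateMonitor(window=10, threshold=0.2)
for ok in [True] * 7 + [False] * 3:   # 30% failures in the window
    monitor.record(not ok)
print(monitor.error_rate(), monitor.should_alert())  # 0.3 True
```

The `maxlen` on the deque ages out old samples automatically, so the check always reflects recent traffic rather than the whole history.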
Posted 1 month ago
5.0 - 13.0 years
0 Lacs
pune, maharashtra
On-site
You are a highly skilled and experienced Cloud Architect/Engineer with deep expertise in Google Cloud Platform (GCP). Your primary responsibility is to design, build, and manage scalable, reliable cloud infrastructure on GCP, leveraging services such as Compute Engine, Cloud Run, BigQuery, Pub/Sub, Cloud Functions, Dataflow, Dataproc, IAM, and Cloud Storage to deliver high-performance cloud solutions. The role also includes developing and maintaining CI/CD pipelines, automating infrastructure deployment using Infrastructure as Code (IaC) principles, and implementing best practices in cloud security, monitoring, performance tuning, and logging, as well as contributing to system reliability, backup, and disaster recovery strategies. Collaboration with cross-functional teams to deliver cloud solutions aligned with business objectives is essential.

You should have 5+ years of hands-on experience in cloud architecture and engineering, with at least 3 years of practical experience on GCP, and in-depth expertise in the services listed above. A strong understanding of networking, security, containerization (Docker, Kubernetes), and CI/CD pipelines is essential. Experience with monitoring, performance tuning, and logging in cloud environments is preferred, and familiarity with DevSecOps practices and tools such as HashiCorp Vault is a plus. This hybrid role is based out of Pune and requires 10 to 13 years of total relevant experience.
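Reliability work of the kind this role describes usually starts with retrying transient failures when calling cloud APIs such as Pub/Sub or BigQuery. A minimal retry-with-exponential-backoff helper in Python; the flaky function below is a stand-in for a transient API error, and real GCP client libraries ship their own retry policies, so treat this as an illustration of the pattern only:

```python
import random
import time

# Illustrative retry-with-exponential-backoff helper, a common reliability
# pattern for transient cloud API failures. Delays and jitter here are
# hypothetical defaults, not anyone's production policy.

def with_retries(fn, max_attempts=5, base_delay=0.01, sleep=time.sleep):
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error to the caller
            # Exponential backoff with jitter to avoid thundering herds.
            sleep(base_delay * (2 ** attempt) * (1 + random.random()))

calls = {"n": 0}

def flaky():
    # Stand-in for a transient cloud API failure: fails twice, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(with_retries(flaky))  # succeeds on the third attempt
```

The `sleep` parameter is injected so tests can skip real delays; the jitter term spreads retries out so many clients recovering from the same outage do not hammer the API in lockstep.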
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

You will be working on:
- Developing and implementing Generative AI / AI solutions on Google Cloud Platform.
- Working with cross-functional teams to design and deliver AI-powered products and services.
- Developing, versioning, and executing Python code, and deploying models as endpoints in the dev environment.
- Using deep learning frameworks such as TensorFlow, PyTorch, or JAX.
- Working on natural language processing (NLP) and machine learning (ML).
- Using Cloud Storage, Compute Engine, Vertex AI, Cloud Functions, Pub/Sub, etc.
- Providing Generative AI support in Vertex AI, with hands-on experience with Generative AI models such as Gemini and Vertex AI Search.

Your profile should include:
- Experience in Generative AI development with Google Cloud Platform.
- Experience delivering AI solutions on the Vertex AI platform.
- Experience developing and deploying AI solutions with ML.
- A solid understanding of Python.

What you'll love about working here:
- You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, along with personalized career guidance from our leaders.
- Comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
- The opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
About GlobalLogic
GlobalLogic, a leader in digital product engineering with over 30,000 employees, helps brands worldwide design and develop innovative products, platforms, and digital experiences. By integrating experience design, complex engineering, and data expertise, GlobalLogic assists clients in envisioning possibilities and accelerating their transition into the digital businesses of tomorrow. Operating design studios and engineering centers globally, GlobalLogic extends its deep expertise to customers in industries such as communications, financial services, automotive, healthcare, technology, media, manufacturing, and semiconductor. GlobalLogic is a Hitachi Group company.

Requirements

Leadership & Strategy
You will lead and mentor a team of cloud engineers, providing technical guidance and support for career development. You will define cloud architecture standards and best practices across the organization and collaborate with senior leadership to develop a cloud strategy and roadmap aligned with business objectives. Your responsibilities will include driving technical decision-making for complex cloud infrastructure projects and establishing and maintaining cloud governance frameworks and operational procedures.

Leadership Experience
With a minimum of 3 years in technical leadership roles managing engineering teams, you should have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management and resource planning, along with strong presentation and communication skills for executive-level reporting, is essential for this role.

Certifications (Preferred)
Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications.
Technical Excellence
You should have over 10 years of experience designing and implementing enterprise-scale cloud solutions using GCP services. As a technical expert, you will architect and oversee the development of sophisticated cloud solutions using Python and advanced GCP services. Your role will involve leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services. Additionally, you will design complex integrations with multiple data sources and systems, implement security best practices, and troubleshoot and resolve technical issues while establishing preventive measures.

Job Responsibilities

Technical Skills
Expert-level proficiency in Python and experience in additional languages such as Java, Go, or Scala. Deep knowledge of GCP services like Dataflow, Compute Engine, BigQuery, and Cloud Functions is required. Advanced knowledge of Docker, Kubernetes, and container orchestration patterns, along with experience in cloud security, infrastructure as code, and CI/CD practices, will be crucial for this role.

Cross-functional Collaboration
Collaborating with C-level executives, senior architects, and product leadership to translate business requirements into technical solutions; leading cross-functional project teams; presenting technical recommendations to executive leadership; and establishing relationships with GCP technical account managers.

What We Offer
At GlobalLogic, we prioritize a culture of caring, continuous learning and development, interesting and meaningful work, balance and flexibility, and a high-trust organization. Join us to experience an inclusive culture, opportunities for growth and advancement, impactful projects, work-life balance, and a safe, reliable, and ethical global company.
About GlobalLogic
GlobalLogic, a Hitachi Group company, is a trusted digital engineering partner known for creating innovative digital products and experiences since 2000. Collaborating with forward-thinking companies globally, GlobalLogic continues to transform businesses and redefine industries through intelligent products, platforms, and services.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Cloud Engineer at AVP level in Bangalore, India, you will be responsible for designing, implementing, and managing cloud infrastructure and services on Google Cloud Platform (GCP). Your key responsibilities will include:
- Designing, deploying, and managing scalable, secure, and cost-effective cloud environments on GCP.
- Developing Infrastructure as Code (IaC) using tools like Terraform.
- Ensuring security best practices, IAM policies, and compliance with organizational and regulatory standards.
- Configuring and managing VPCs, subnets, firewalls, VPNs, and interconnects for secure cloud networking.
- Setting up CI/CD pipelines for automated deployments and implementing monitoring and alerting using tools like Stackdriver.
- Optimizing cloud spending and designing disaster recovery and backup strategies.
- Deploying and managing GCP databases, and managing containerized applications using GKE and Cloud Run.

You will be part of the Platform Engineering Team, which builds and maintains the foundational infrastructure, tooling, and automation that enable efficient, secure, and scalable software development and deployment. The team focuses on creating a self-service platform for developers and operational teams, ensuring reliability, security, and compliance while improving developer productivity.

To excel in this role, you should have strong experience with GCP services, proficiency in scripting and Infrastructure as Code, knowledge of DevOps practices and CI/CD tools, an understanding of security, IAM, networking, and compliance in cloud environments, experience with monitoring tools, and strong problem-solving skills; Google Cloud certifications would be a plus. You will receive training, development, coaching, and support to help you excel in your career, along with a culture of continuous learning and a range of flexible benefits tailored to suit your needs.
The company strives for a positive, fair, and inclusive work environment where employees are empowered to excel together every day. For further information about the company and its teams, please visit the company website: https://www.db.com/company/company.htm. The Deutsche Bank Group welcomes applications from all individuals and promotes a culture of shared successes and collaboration.
Posted 1 month ago