
6 Artifact Registry Jobs

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

1.0 - 5.0 years

0 Lacs

Maharashtra

On-site

As a Cloud Architect, you will design and implement cloud architectures using Google Cloud Platform (GCP) services. You will engage directly with clients to advise on cloud adoption strategies and best practices, troubleshoot and resolve complex GCP, CCoE, and cloud-related issues, mentor junior engineers, share knowledge across the organization, and stay current with the latest cloud technologies and trends.

Your duties include designing, delivering, and refining cloud services consumed across IaaS and PaaS GCP services. You will research and identify the appropriate tools and technology stack based on scalability, latency, and performance needs, assess technical feasibility through rapid PoCs, and find technological solutions for gaps. Collaboration with cross-functional teams, including developers, DevOps, security, and operations, is essential to deliver robust cloud solutions. You will prepare and present regular reports on cloud infrastructure status, performance, and improvements, transition teams from legacy to modern architecture in production, and deliver results in a fast-paced, dynamic environment. Building trusted advisory relationships with strategic accounts, engaging with management, and identifying customer priorities and technical objections are part of your strategic responsibilities, as are leading requirements gathering, project scoping, solution design, problem-solving, and architecture diagramming.

To qualify for this position, you must have at least 3 years of GCP experience; experience with other clouds is beneficial. A GCP Architect Certification is required, and additional Google, Kubernetes, or Terraform certifications are advantageous. Experience in building, architecting, designing, and implementing distributed, global, cloud-based systems is essential, as is extensive experience with security (zero trust) and networking in cloud environments. The role involves advising on and implementing CI/CD practices using tools such as GitHub, GitLab, Cloud Build, and Cloud Deploy, and containerizing workloads using Kubernetes, Docker, Helm, and Artifact Registry. Knowledge of structured cloud architecture practices, hybrid cloud deployments, and on-premises-to-cloud migration deployments and roadmaps is required. You should also be able to work cross-functionally, engage and influence audiences, demonstrate excellent organizational skills, and communicate effectively with customers, with proficiency in documentation and knowledge transfer using remote meetings, written documents, technical diagrams, and slide decks. Experience implementing hybrid connectivity using VPN or Cloud Interconnect is a plus.

In return, you can expect a 5-day work culture with a competitive salary and commission structure. Other benefits include lunch and evening snacks, health and accidental insurance, opportunities for career growth and professional development, a friendly and supportive work environment, paid time off, and various other company benefits.
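As a small, hedged illustration of the containerization workflow mentioned above, the sketch below lists the repositories in a GCP project's Artifact Registry using the Python client library; the project and location values are hypothetical placeholders.

```python
# Minimal sketch: list Artifact Registry repositories in a project/region.
# Assumes the google-cloud-artifact-registry package is installed and
# application-default credentials are configured; project/location are placeholders.
from google.cloud import artifactregistry_v1


def list_repositories(project_id: str, location: str) -> None:
    client = artifactregistry_v1.ArtifactRegistryClient()
    parent = f"projects/{project_id}/locations/{location}"
    for repo in client.list_repositories(parent=parent):
        # Each entry is a repository resource (Docker, Maven, npm, etc.).
        print(repo.name)


if __name__ == "__main__":
    list_repositories("my-project", "us-central1")  # hypothetical values
```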

Posted 1 week ago

Apply

8.0 - 15.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Architecture professional with 8-15 years of experience, your primary responsibility will be to design and implement data-centric solutions on Google Cloud Platform (GCP). You will use GCP tools such as BigQuery, Google Cloud Storage, Cloud SQL, Memorystore, Dataflow, Dataproc, Artifact Registry, Cloud Build, Cloud Run, Vertex AI, Pub/Sub, and the GCP APIs to create efficient and scalable solutions.

Your role will involve building ETL pipelines to ingest data from diverse sources and developing data processing pipelines in languages such as Java and Python for extraction, transformation, and loading (ETL). You will create and maintain data models to ensure efficient storage, retrieval, and analysis of large datasets, and deploy and manage both SQL and NoSQL databases such as Bigtable, Firestore, or Cloud SQL based on project requirements. Your expertise will be crucial in optimizing data workflows for performance, reliability, and cost-effectiveness on GCP infrastructure.

You will implement version control and CI/CD practices for data engineering workflows to ensure reliable and efficient deployments, and leverage GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures. Troubleshooting and resolving issues related to data processing, storage, and retrieval will be part of your daily tasks, as will addressing code quality issues throughout the development lifecycle using tools like SonarQube, Checkmarx, Fossa, and Cycode. Implementing security measures and data governance policies to maintain the integrity and confidentiality of data is a critical aspect of the role.

Collaboration with stakeholders to gather and define data requirements aligned with business objectives is key to success. You will develop and maintain documentation for data engineering processes to facilitate knowledge transfer and system maintenance, participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems, and provide mentorship and guidance to junior team members to foster a collaborative, knowledge-sharing environment.
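For context on the kind of ingestion work described above, here is a minimal, hedged sketch of loading a CSV file from Cloud Storage into a BigQuery table with the Python client; the project, dataset, table, and bucket names are placeholders.

```python
# Minimal sketch: load a CSV from Cloud Storage into BigQuery (batch ingestion).
# Assumes the google-cloud-bigquery package and default credentials; all names are placeholders.
from google.cloud import bigquery


def load_csv_to_bigquery(table_id: str, gcs_uri: str) -> None:
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,       # skip the header row
        autodetect=True,           # infer the schema from the file
        write_disposition="WRITE_APPEND",
    )
    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()  # wait for the load job to finish
    print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")


if __name__ == "__main__":
    load_csv_to_bigquery("my-project.analytics.events", "gs://my-bucket/events.csv")
```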

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description

Required Skills:
- GCP Proficiency: Strong expertise in Google Cloud Platform (GCP) services and tools, including Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, Cloud Load Balancing, IAM, Google Workflows, Google Cloud Pub/Sub, App Engine, Cloud Functions, Cloud Run, API Gateway, Cloud Build, Cloud Source Repositories, Artifact Registry, Google Cloud Monitoring, Logging, and Error Reporting.
- Cloud-Native Applications: Experience in designing and implementing cloud-native applications, preferably on GCP.
- Workload Migration: Proven expertise in migrating workloads to GCP.
- CI/CD Tools and Practices: Experience with CI/CD tools and practices.
- Python and IaC: Proficiency in Python and Infrastructure as Code (IaC) tools such as Terraform.

Responsibilities:
- Cloud Architecture and Design: Design and implement scalable, secure, and highly available cloud infrastructure solutions using Google Cloud Platform (GCP) services and tools such as Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, and Cloud Load Balancing.
- Cloud-Native Application Design: Develop high-level architecture designs and guidelines for the development, deployment, and life-cycle management of cloud-native applications on GCP, ensuring they are optimized for security, performance, and scalability using services like App Engine, Cloud Functions, and Cloud Run.
- API Management: Develop and implement guidelines for securely exposing interfaces of workloads running on GCP, with granular access control using IAM, RBAC platforms, and API Gateway.
- Workload Migration: Lead the design and migration of on-premises workloads to GCP, ensuring minimal downtime and data integrity.
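As a hedged illustration of the kind of GCP integration work listed above, the following Python sketch publishes a JSON message to a Pub/Sub topic; the project and topic names are hypothetical placeholders.

```python
# Minimal sketch: publish a JSON message to a Pub/Sub topic.
# Assumes the google-cloud-pubsub package and default credentials; names are placeholders.
import json

from google.cloud import pubsub_v1


def publish_event(project_id: str, topic_id: str, payload: dict) -> str:
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, topic_id)
    # Pub/Sub message data must be bytes.
    future = publisher.publish(topic_path, json.dumps(payload).encode("utf-8"))
    return future.result()  # blocks until the server returns the message ID


if __name__ == "__main__":
    message_id = publish_event("my-project", "workload-events", {"action": "deploy"})
    print(f"Published message {message_id}")
```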

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer at Deutsche Bank in Pune, India, you will be responsible for developing and delivering engineering solutions that achieve business objectives. You are expected to have a strong grasp of essential engineering principles and root cause analysis skills to address enhancements and fixes in product reliability and resiliency. You should be capable of working independently on medium to large projects with strict deadlines and of adapting to a cross-application, mixed technical environment. The role requires hands-on development experience in ETL, Big Data, Hadoop, Spark, and GCP within an agile methodology, and involves collaboration with a geographically dispersed team.

The position is part of the Compliance tech internal development team in India, which focuses on delivering improvements in compliance tech capabilities to meet regulatory commitments and mandates. You will analyze data sets, design stable data ingestion workflows, and integrate them into existing workflows. You will also work closely with team members and stakeholders to provide ETL solutions, develop analytics algorithms, and handle data sourcing in Hadoop and GCP. Your responsibilities include unit testing, UAT deployment, end-user sign-off, and supporting the production and release management teams.

To excel in this role, you should have over 10 years of coding experience at reputable organizations, proficiency in technologies such as Hadoop, Python, Spark, SQL, Unix, and Hive, and hands-on experience with Bitbucket and CI/CD pipelines. Knowledge of data security in on-prem and GCP environments, cloud services, and data quality dimensions is crucial. Experience in regulatory delivery environments, banking, test-driven development, and data visualization tools like Tableau would be advantageous.

At Deutsche Bank, you will receive support through training, coaching, and a culture of continuous learning to enhance your career progression. The company fosters a collaborative environment where employees are encouraged to act responsibly, think commercially, and take initiative. Together, we strive for excellence and celebrate the achievements of our diverse workforce. Deutsche Bank promotes a positive, fair, and inclusive work environment and welcomes applications from all individuals. For more information about Deutsche Bank and our values, please visit our company website: https://www.db.com/company/company.htm
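To give a flavor of the Spark-based ETL work mentioned above, the following hedged PySpark sketch reads a raw CSV, applies a simple transformation, and writes the result as Parquet; all paths, column names, and the filter condition are placeholders.

```python
# Minimal PySpark sketch: read raw CSV, filter and aggregate, write Parquet.
# Paths and column names are placeholders; assumes a working Spark installation.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingestion-example").getOrCreate()

raw = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("/data/raw/transactions.csv")
)

# Example transformation: keep completed transactions and aggregate per account.
summary = (
    raw.filter(F.col("status") == "COMPLETED")
    .groupBy("account_id")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("txn_count"))
)

summary.write.mode("overwrite").parquet("/data/curated/transaction_summary")
spark.stop()
```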

Posted 1 month ago

Apply

10.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Do you want to help solve the world's most pressing challenges? Feeding the world's growing population and slowing climate change are two of the world's greatest challenges, and AGCO is part of the solution. Join us to make your contribution.

As an AI Platform Architect, you will define and evolve the architecture of AGCO's AI platform, designing the technical foundation that empowers teams to deliver AI solutions efficiently, securely, and with confidence. Your work will shape how ML models move from experimentation to production, how AI platform services are consumed across teams, and how platform capabilities scale to support advanced use cases in cloud and edge deployments, including onboard our machines in the field.

Your Impact
- Define the reference architecture for AGCO's AI platform, covering AI/ML data pipeline platforms, model training infrastructure, CI/CD for ML, artifact management, observability, and self-service developer tools.
- Ensure platform services are scalable, auditable, and cost-efficient across heterogeneous workloads, e.g., computer vision, GenAI, and machine learning.
- Design core platform services such as containerized training environments, experiment tracking, model registries, and reusable orchestration patterns.
- Architect integration interfaces (API/CLI/UI) that allow AI delivery teams to self-serve platform capabilities reliably and securely.
- Collaborate with Enterprise Architecture, AI PODs, and Product Engineering teams to ensure interoperability across systems.
- Support model deployment across cloud, internal APIs, dashboards, and embedded systems in agricultural machinery.
- Establish technical guardrails for reusability, performance, and lifecycle management of models and agents.
- Serve as a technical leader and advisor across teams, contributing to strategy, roadmap, and engineering excellence.

Your Experience and Qualifications
- 10+ years of experience in software, ML infrastructure, or platform engineering, including 3+ years in AI platform architecture.
- Proven success designing and deploying enterprise-grade ML infrastructure and AI platforms.
- Deep expertise in cloud-native technologies and principles (GCP), e.g., Vertex AI, Cloud Run, GKE, Pub/Sub, and Artifact Registry, as well as automation, elasticity, and resilience by default.
- Experience with CI/CD for ML using tools like GitHub Actions, Kubeflow, and Terraform.
- Strong knowledge of containerization, reproducibility, and secure environment management (e.g., Kubernetes, AWS ECS, Azure Service Fabric, and Docker).
- Deep understanding of model lifecycle management, including training, versioning, deployment, and monitoring.
- Familiarity with data and ML orchestration tools (e.g., Airflow), feature stores, and dataset management systems.
- Excellent systems thinking and architectural design skills, with the ability to design for modularity, scalability, and maintainability.
- Proven ability to work cross-functionally and influence technical direction across engineering and business units.

Your Benefits
- GLOBAL DIVERSITY: Diversity means many things to us, different brands, cultures, nationalities, genders, generations, even variety in our roles. You make us unique!
- ENTERPRISING SPIRIT: Every role adds value. We're committed to helping you develop and grow to realize your potential.
- POSITIVE IMPACT: Make it personal and help us feed the world.
- INNOVATIVE TECHNOLOGIES: You can combine your love for technology with manufacturing excellence and work alongside teams of people worldwide who share your enthusiasm.
- MAKE THE MOST OF YOU: Benefits include health care and wellness plans and flexible and virtual work options.

Your Workplace
We value inclusion and recognize the innovation a diverse workforce delivers to our farmers. Through our recruitment efforts, we are committed to building a team that includes a variety of experiences, backgrounds, cultures, and perspectives. Join us as we bring agriculture into the future and apply now!

Please note that this job posting is not designed to cover or contain a comprehensive listing of all required activities, duties, responsibilities, or benefits and may change at any time with or without notice. AGCO is proud to be an Equal Opportunity Employer.
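As a hedged illustration of the model lifecycle work this role describes, the sketch below registers a trained model in Vertex AI's model registry and deploys it to an endpoint using the Python SDK; the project, bucket, model name, and serving container are hypothetical placeholders.

```python
# Minimal sketch: register a trained model in Vertex AI and deploy it to an endpoint.
# Assumes the google-cloud-aiplatform package and default credentials; all values are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# Upload model artifacts from Cloud Storage, served by a prebuilt prediction container.
model = aiplatform.Model.upload(
    display_name="yield-prediction",
    artifact_uri="gs://my-bucket/models/yield-prediction/",
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest"
    ),
)

# Deploy to a managed endpoint for online prediction.
endpoint = model.deploy(machine_type="n1-standard-2")
print(endpoint.resource_name)
```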

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer (ETL, Big Data, Hadoop, Spark, GCP) at Assistant Vice President level, located in Pune, India, you will be responsible for developing and delivering engineering solutions to achieve business objectives. You are expected to have a strong understanding of the bank's crucial engineering principles and to be skilled in root cause analysis, addressing enhancements and fixes in product reliability and resiliency. Working independently on medium to large projects with strict deadlines, you will collaborate in a cross-application technical environment and demonstrate a solid hands-on development track record within an agile methodology. The role involves collaborating with a globally dispersed team and is integral to the Compliance tech internal development team in India, which delivers enhancements in compliance tech capabilities to meet regulatory commitments.

Your key responsibilities include analyzing data sets; designing and coding stable, scalable data ingestion workflows and integrating them with existing workflows; and developing analytics algorithms on ingested data. You will also work on data sourcing in Hadoop and GCP, owning unit testing, UAT deployment, end-user sign-off, and production go-live. Root cause analysis skills will be essential for identifying bugs and issues and for supporting the production support and release management teams. You will operate in an agile scrum team and ensure that new code is thoroughly tested at both unit and system levels.

To excel in this role, you should have over 10 years of coding experience with reputable organizations, hands-on experience with Bitbucket and CI/CD pipelines, and proficiency in Hadoop, Python, Spark, SQL, Unix, and Hive. A basic understanding of on-prem and GCP data security is required, as is hands-on development experience with large ETL/big data systems (GCP experience is a plus). Familiarity with cloud services such as Cloud Build, Artifact Registry, Cloud DNS, and Cloud Load Balancing, along with Dataflow, Cloud Composer, Cloud Storage, and Dataproc, is essential. Knowledge of data quality dimensions and data visualization is also beneficial.

You will receive comprehensive support, including training and development opportunities, coaching from experts on your team, and a culture of continuous learning to facilitate your career progression. The company fosters a collaborative and inclusive work environment, empowering employees to excel together every day. As part of Deutsche Bank Group, we encourage applications from all individuals and promote a positive and fair workplace culture. For further details about our company and teams, please visit our website: https://www.db.com/company/company.htm
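Since this role mentions Cloud Composer for orchestration, here is a small, hedged Airflow DAG sketch (Cloud Composer runs Airflow) that chains an ingestion step and a validation step; the task logic, IDs, and schedule are placeholders and assume a recent Airflow 2.x.

```python
# Minimal sketch of an Airflow DAG (as run by Cloud Composer): ingest, then validate.
# Task logic, IDs, and schedule are placeholders; assumes Airflow 2.4+.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_data(**context):
    # Placeholder: pull data from a source system into a staging area.
    print("ingesting data...")


def validate_data(**context):
    # Placeholder: run data quality checks on the staged data.
    print("validating data...")


with DAG(
    dag_id="ingestion_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_data", python_callable=ingest_data)
    validate = PythonOperator(task_id="validate_data", python_callable=validate_data)
    ingest >> validate  # run validation only after ingestion succeeds
```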

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies