
Use Scenario


Last updated 11 months ago

GPU providers offer their GPU resources on the cloud through containers. Nodes organize the resources gathered in a GPU pool into queues suited to specific tasks. A client uploads the data required for a task to dedicated storage; the Node then fetches the queue optimized for that task and begins GPU computation. During computation, the provider's devices enter a busy state. The Node manages the task until completion, removing any malfunctioning GPU devices and replacing them with standby devices as needed. Once the task finishes, the Node evaluates and records the performance of each GPU device, and the client pays for usage in USDT or $GP.
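The task lifecycle above can be sketched as a minimal simulation. Everything here (`GPUDevice`, `Node`, `run_task`, the scoring scheme) is an illustrative assumption, not part of the system's actual API; the sketch only models the failover-and-record flow the text describes.

```python
from dataclasses import dataclass

@dataclass
class GPUDevice:
    # Hypothetical device record; `score` stands in for measured performance.
    device_id: str
    healthy: bool = True
    score: float = 0.0

class Node:
    """Illustrative Node: runs a task queue, swaps out failed devices
    for standby devices, then records each device's performance."""

    def __init__(self, pool, standby):
        self.pool = list(pool)        # devices assigned to the task queue
        self.standby = list(standby)  # replacements for failed devices
        self.ledger = []              # (device_id, score) performance records

    def run_task(self, work_units):
        completed = 0
        while completed < work_units:
            for dev in list(self.pool):
                if not dev.healthy:
                    # Remove the malfunctioning device and promote a standby.
                    self.pool.remove(dev)
                    if self.standby:
                        self.pool.append(self.standby.pop(0))
                    continue
                dev.score += 1.0       # device processes one unit of work
                completed += 1
                if completed >= work_units:
                    break
        # Task finished: evaluate and record each device's performance.
        for dev in self.pool:
            self.ledger.append((dev.device_id, dev.score))
        return completed
```

For example, a pool containing one faulty device and one standby device completes the task with the standby promoted mid-run, and the ledger records only the devices that did the work.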