I'm using "llama-b7472-bin-win-vulkan-x64" with a Radeon RX 470 graphics card (8 GB of VRAM) and LLM models that fit entirely in video memory.
While a model loads, Task Manager shows dedicated GPU memory usage going up, but system RAM usage rises at the same time: in "router mode" it rises by the same amount as the GPU memory, and in normal mode by roughly two-thirds of it.
When I try to find which process is consuming that RAM, it looks like it's llama.cpp, yet during loading its reported usage grows by only about 600 MB while total RAM usage grows by the full 6 GB.
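For reference, here is the kind of invocation involved (the model path is just a placeholder, not my actual setup). Comparing the two runs in Task Manager might show whether memory-mapped file pages account for the unattributed RAM, since `--no-mmap` makes llama.cpp load the model with plain allocations instead of mapping the GGUF file:

```shell
# Default: the GGUF file is memory-mapped, so its pages can show up as
# system RAM usage without being charged to the llama.cpp process.
llama-server.exe -m model.gguf -ngl 99

# With mmap disabled, the model is read into regular allocations, so the
# memory should appear in the process's own working set instead.
llama-server.exe -m model.gguf -ngl 99 --no-mmap
```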
P.S. I hope this isn't because of another Windows 10 "update" that has broken part of the system again.