A0sanitycompB to LocalLLaMA@poweruser.forum · English · 2 years ago

I’m extremely confused about system requirements. Some people are worried about ram and others about vram. I have 64gb of ram and 12gb vram. What size of model can I run?


From what I’ve read, a Mac somehow uses system RAM while Windows uses the GPU? It doesn’t make any sense to me. Any help appreciated.
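A rough rule of thumb (a sketch under stated assumptions, not from this thread): a model's weights take roughly `parameters × bits-per-weight / 8` bytes, plus some overhead for the KV cache and buffers. The 20% overhead figure and the example model sizes below are illustrative assumptions:

```python
# Rough memory estimate for running a quantized LLM locally.
# Assumption (illustrative, not authoritative): weights take
# params * bits/8 bytes, plus ~20% overhead for KV cache and buffers.

def model_memory_gb(n_params_billion: float,
                    bits_per_weight: float,
                    overhead: float = 1.2) -> float:
    """Approximate memory footprint in GB: weights plus overhead."""
    bytes_per_param = bits_per_weight / 8
    return n_params_billion * bytes_per_param * overhead

# A 13B model at 4-bit quantization: ~7.8 GB -> fits in 12 GB of VRAM.
print(round(model_memory_gb(13, 4), 1))
# A 70B model at 4-bit: ~42 GB -> too big for 12 GB VRAM, but runnable
# from 64 GB of system RAM (slowly, on CPU) or split across both.
print(round(model_memory_gb(70, 4), 1))
```

This is also why the RAM-vs-VRAM distinction is blurry: on Apple Silicon the GPU shares unified memory with the CPU, while on a typical Windows/Linux PC runtimes such as llama.cpp let you offload some layers to VRAM and keep the rest in system RAM.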


Community to discuss Llama, the family of large language models created by Meta AI.

Visibility: Public

This community can be federated to other instances and be posted/commented in by their users.

  • 4 users / day
  • 4 users / week
  • 4 users / month
  • 7 users / 6 months
  • 3 local subscribers
  • 11 subscribers
  • 1.03K Posts
  • 5.96K Comments
  • Modlog
  • mods:
  • communick@poweruser.forum