I’m looking to build a low-end Ollama LLM server to improve Home Assistant voice control, Immich image recognition, and a few other services. With the current cost of hardware components like memory, I’m looking to build something small but somewhat expandable.

I have an old micro-ATX form factor computer that I’m thinking would be a good candidate to upgrade. I’d love recommendations on motherboard, processor, and video card combos that would likely be compatible and sufficient to run a decent server while keeping costs low: basically, the best bang for the buck. I have a couple of M.2 SSDs I can repurpose. I’d prefer a motherboard with 2.5 Gbit Ethernet, but otherwise I’m open.

Also, recommendations on sites that sell good-quality memory at reasonable prices and ship to the US would be welcome. I’d be willing to look at lightly used components, too.

Any advice on any of these topics would be greatly appreciated. The advice I’ve found has all been out of date, especially now that crypto mining has faded (so video cards aren’t as expensive) while LLM data centers are buying up and reserving memory before it’s even manufactured.

  • p4rzivalrp2@piefed.social · 14 days ago

    The main benefit is that the Strix Halo CPU uses unified memory; that’s why it’s soldered, not because it uses laptop parts.

    • Jul (they/she)@piefed.blahaj.zone (OP) · 13 days ago

      Ok, so a short, wide bus from CPU to memory? Makes sense. I didn’t really mean the CPU so much as that the main board is very laptop-like: very little expansion capability beyond external connectors like audio and Ethernet, and no way to add functional or incremental upgrades like a GPU or an additional stick of memory.
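      Rough back-of-the-envelope numbers on why the wide soldered bus matters (the bus widths and transfer rates below are the commonly cited figures for a dual-channel DDR5 desktop and Strix Halo’s LPDDR5X; worth double-checking for a specific board):

      ```python
      # Peak memory bandwidth in GB/s: bus width in bytes * transfer rate in GT/s.
      def bandwidth_gbs(bus_bits: int, mts: int) -> float:
          return bus_bits / 8 * mts / 1000

      # Typical dual-channel desktop DDR5-6000 (128-bit bus)
      print(bandwidth_gbs(128, 6000))  # 96.0 GB/s

      # Strix Halo's soldered 256-bit LPDDR5X-8000
      print(bandwidth_gbs(256, 8000))  # 256.0 GB/s
      ```

      Since LLM token generation is usually memory-bandwidth-bound, that roughly 2.5x difference is the whole point of the soldered design.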

      • p4rzivalrp2@piefed.social · 13 days ago

        The point of the Strix Halo series is the unified memory, so an additional GPU wouldn’t be very useful, no?