• 1 Post
  • 145 Comments
Joined 3 years ago
Cake day: August 27th, 2023


  • Strange, we have 4 people for whom it’s happening way too often, while I barely even see the communities sidebar and really have to look for it.

    I’m also unable to open this sidebar when I’m on a post.

    I’m on iPhone, if that helps. Are all 4 of you on Android? Could this be some sort of gesture difference between the two OSes?

    Or, kinda dumb hypothesis: are you all left-handed? The communities button is in the top left, so maybe scrolling with your left hand makes it more prone to touching that button?



  • Yep, I didn’t join, as I’m only there for the general shit talking and not for participating in any constructive or useful conversations (it’s a way of life), but Camus, Snoopy, Anansi, THE meerkat, and probably others are participating with you on the PieFed Zulip and Codeberg.

    The jlailu Matrix is still open but inactive, left to slowly wither and die; every sidebar is now linked to the Zulip server instead of Matrix.


  • I have one of those Minisforum AMD HX 370 machines (the X1 AI Pro). They are very powerful, awesome hardware. I use the mini PC as a work computer for 3D and dev on openSUSE, and as a lightweight, low-power gaming machine (like long-haul X-Plane 12 flights during the night).

    Everything is well made and beautifully built.

    As for this NAS version, if money is not an issue I wouldn’t hesitate: 10 Gb/s networking, tons of RAM, an AMD HX 370. It’s sure overkill for a NAS; it’s more tailored to a very beefy Docker server and/or virtualization station that doubles as a multimedia NAS.

    I built my own Synology replacement with second-hand ITX parts in a Jonsbo N3 case, but if I hadn’t, or just had plenty of cash to spare, I would definitely go for a server like this one (my use case is NAS + Docker + virtualization + eventual game server, all in one).

    As a side note, the “AI” part is just marketing for now: those chips are not yet supported for local LLMs on Linux (Windows only atm). They need ROCm support for the RDNA 3.5 iGPU, plus the new AMD NPU integration in the local inference frameworks (llama.cpp etc.).

    https://github.com/amd/gaia

    It will come for sure, it’s just not ready yet.