

Llama.cpp now supports Vulkan, so it doesn’t matter what card you’re using.


I successfully ran local Llama with llama.cpp and an old AMD GPU. I’m not sure why you think there’s no other option.
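For anyone wanting to try the same, here is a rough sketch of building llama.cpp with its Vulkan backend, which is what makes it GPU-vendor-agnostic. The repo URL, the `GGML_VULKAN` cmake flag, and the `llama-cli` binary name reflect recent llama.cpp versions and may differ on older checkouts; `model.gguf` is a placeholder for whatever model file you have.

```shell
# Sketch: build llama.cpp with the Vulkan backend (assumes cmake and the
# Vulkan SDK / drivers are installed; flag names can vary by version).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release

# -ngl 99 offloads as many layers as possible to whatever GPU the
# Vulkan driver exposes -- AMD, Intel, or NVIDIA alike.
./build/bin/llama-cli -m model.gguf -ngl 99 -p "Hello"
```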


Have they released the hi-fi tier they promised years ago?
Between Tidal’s high-quality streaming and my Jellyfin server with FLAC rips of my CDs, I’m happy.


Here’s a task for you: how do you convert a folder with 5000 images from PNG to JPG, while ensuring that they are scaled to at most 1024x768 and have a semi-transparent watermark on them?
I know how to do it quickly using the command line, but have no idea how to do it with a GUI.
Why is Jared Leto hugging him?
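For the curious, one command-line approach to the batch-conversion task above is a shell loop over ImageMagick. This is a sketch, not the commenter’s actual command: the `src/` and `out/` directories, the `watermark.png` file, and the 30% opacity are all assumptions for illustration.

```shell
#!/bin/sh
# Sketch using ImageMagick 7 ("magick"). Assumes PNGs in ./src and a
# watermark.png that should be applied at ~30% opacity.
mkdir -p out
for f in src/*.png; do
  base=$(basename "$f" .png)
  # '1024x768>' shrinks only images larger than that box,
  # preserving aspect ratio; smaller images are left alone.
  magick "$f" -resize '1024x768>' \
    \( watermark.png -alpha set -channel A -evaluate multiply 0.3 +channel \) \
    -gravity southeast -composite \
    "out/$base.jpg"
done
```

For 5000 images this is embarrassingly parallel, so piping the file list through `xargs -P` or GNU `parallel` instead of a plain loop speeds it up considerably.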


Nice try, Nintendo, I will not buy your wares.


We need Canada in the European Song Contest.


I’d say we’re deep in Farscape territory right now, heading for Lexx fast.


I could clearly… hear… both of them.


Needs more Scrubs and Malcolm in the Middle.


Fawlty Towers? Monty Python’s Flying Circus? Firefly? Brooklyn 99? Farscape?


I’ll never not laugh at this scene. So so good.

Oh yeah, it’s great. I’ve been watching it lately.
Want a nice project to spend your resources on? Try building a PDF viewer that supports verifying signatures, filling forms, and signing documents.
Stop fucking around with meaningless issues.