Title: As a user, I should be able to select Llama.cpp, set it up, and use it as a local inference server.
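
To make the acceptance criterion concrete, below is a minimal sketch of what "use it as a local inference server" could look like from a client's perspective. It assumes the llama.cpp `llama-server` binary has already been started with a model and is listening on its default port 8080, exposing the OpenAI-compatible `/v1/chat/completions` endpoint; the model path, port, and the `chat` helper name are illustrative, not part of this story.

```python
# Minimal sketch: querying a locally running llama.cpp server.
# Assumes `llama-server` was started separately, e.g.:
#   llama-server -m ./models/model.gguf --port 8080
# The model path and port are placeholders for whatever the user configured.

import json
import urllib.request

LLAMA_SERVER_URL = "http://localhost:8080/v1/chat/completions"  # assumed default port

def chat(prompt: str) -> str:
    """Send one chat message to the local llama.cpp server and return the reply text."""
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    request = urllib.request.Request(
        LLAMA_SERVER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        body = json.load(response)
    # The server returns an OpenAI-style response; take the first choice's message.
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("Hello! Are you running locally?"))
```

If the story is implemented this way, "setting it up" would reduce to selecting a GGUF model file and a port, and "using it" would reduce to pointing any OpenAI-compatible client at the local URL above.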