[WIP] feat: build llama cpp externally #14588
Annotations
2 errors and 12 notices
Error (Test): Process completed with exit code 2.

Error (Test): pkg/grpc/backend.go#L6
no required module provides package github.com/mudler/LocalAI/pkg/grpc/proto; to add it:
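
The second error explains the first: pkg/grpc/backend.go imports github.com/mudler/LocalAI/pkg/grpc/proto, which is most likely a package of generated protobuf/gRPC bindings rather than something a module proxy can provide, so the Test step cannot compile until those sources exist. Below is a minimal, hypothetical sketch of how such a package is commonly produced with protoc and the Go plugins; the file names, paths, and options are assumptions for illustration, not necessarily this repository's actual setup.

```go
// generate.go - hypothetical sketch; names and paths below are assumptions,
// not LocalAI's actual build configuration.
//
// The point: the package the failing import refers to is emitted by protoc
// at build time, so `go get` cannot fetch it and `go test` cannot see it
// until the generation step has run.
package grpc

// Running `go generate ./pkg/grpc/...` with protoc, protoc-gen-go, and
// protoc-gen-go-grpc on PATH would write proto/backend.pb.go and
// proto/backend_grpc.pb.go next to the .proto file, satisfying the import
// of github.com/mudler/LocalAI/pkg/grpc/proto in backend.go.
//go:generate protoc --go_out=. --go_opt=paths=source_relative --go-grpc_out=. --go-grpc_opt=paths=source_relative proto/backend.proto
```

If the CI job runs the tests before that generation step (or whatever equivalent target the repository uses), the build fails exactly as reported above.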

Notice: Post Setup tmate session if tests fail (×5)
or: ssh -i <path-to-private-SSH-key> [email protected]

Notice: Setup tmate session if tests fail
or: ssh -i <path-to-private-SSH-key> [email protected]