So far we've only used GPT-4 and GPT-3.5; the next step is to try AutoPR with locally hosted models.
I'm not sure exactly how to go about this. Since AutoPR runs as a GitHub Action, do GitHub's hosted runners have GPUs? How do we properly write it to work with custom runners? Could we rent GPUs on something like vast.ai? Are there any grants available for free compute to run AutoPR on?
I'd love to run a custom GitHub runner with my own GPU and run tests with it.
Essentially, these two methods need to use a completion_func that is decoupled from OpenAI's functions:
https://github.com/irgolic/AutoPR/blob/main/autopr/services/rail_service.py#L53-L125
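Below is a minimal sketch of what that decoupling could look like: the service takes any `completion_func` callable, so OpenAI, a local inference server, or anything else can back it. Everything here (the `CompletionFunc` alias, `local_completion`, the `localhost:8000/generate` endpoint, and the simplified `RailService`) is hypothetical and not AutoPR's actual code.

```python
# Sketch only: the two rail_service methods would receive an injected
# completion_func instead of calling the OpenAI SDK directly.
from typing import Callable

import requests

# Any callable that maps a prompt string to a completion string.
CompletionFunc = Callable[[str], str]


def local_completion(prompt: str) -> str:
    """Completion backed by a locally hosted model behind a hypothetical HTTP API."""
    response = requests.post(
        "http://localhost:8000/generate",  # placeholder URL for a local inference server
        json={"prompt": prompt, "max_tokens": 512},
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["text"]  # assumed response shape


class RailService:
    """Sketch of how the service could take the completion function as a dependency."""

    def __init__(self, completion_func: CompletionFunc):
        self.completion_func = completion_func

    def run_raw(self, prompt: str) -> str:
        # Instead of calling OpenAI directly, depend only on the injected
        # callable, so any backend (hosted or local) works.
        return self.completion_func(prompt)


if __name__ == "__main__":
    service = RailService(completion_func=local_completion)
    print(service.run_raw("Summarize this pull request."))
```

With that shape, swapping backends is just a matter of passing a different callable when the service is constructed.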