commit b4ed942970c5e1a8bdb6d19ae264725588b34618
author: ben-aaron188 <ben-aaron188@users.noreply.github.com>
date: Thu Sep 22 15:22:23 2022 +0200
upd tester
rgpt3
Making requests from R to the GPT-3 API
You can follow these steps to get started with making requests and retrieving embeddings from the OpenAI GPT-3 language model.
If you already have an OpenAI API key, you can skip step 1.
Go to https://openai.com/api/, register for a free account, and obtain your API key at https://beta.openai.com/account/api-keys.
Note that OpenAI may rotate your key from time to time (but you should receive an email on your registered account if they do so).
access_key.txt file
Your access workflow for this package retrieves your API key from a local file. That file is easiest named access_key.txt (but any other file name would do). It is important that you use a .txt file containing just the API key (i.e. no quotation marks around the string).
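As a quick sanity check, you can verify from R that the key file contains a single unquoted line before using it (a minimal sketch; the path is the example path used below):

```r
# Read the key file and confirm it is one line with no quotation marks.
key <- readLines("/Users/me/directory1/access_key.txt", warn = FALSE)
stopifnot(length(key) == 1, !grepl('"', key, fixed = TRUE))
```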
The path to that file (e.g. /Users/me/directory1/access_key.txt) is needed for the gpt3_authenticate() function (see below).
rgpt3 package
The easiest way to use the package (before its CRAN release) is:
devtools::install_github("ben-aaron188/rgpt3")
Once the package is installed, you will typically run this workflow:
Authentication:
Get the path to your access key file and run the authentication with: gpt3_authenticate("PATHTO/access_key.txt")
Make the test request:
You can run the test function below, which sends a simple request (here: the instruction to "Write a story about R Studio:") to the API and returns the output in the format used in this package (i.e., list[[1]] = prompt and output, list[[2]] = meta information).
gpt3_test_request()
Interact with GPT-3 via requests:
The basic form of the GPT-3 API connector is via requests. These requests can be of various kinds, including questions ("What is the meaning of life?"), text summarisation tasks, text generation tasks and many more. A whole list of examples is on the OpenAI examples page.
Think of requests as instructions you give the model, for example:
# This request "tells" GPT-3 to write a cynical text about human nature
# (five times) with a sampling temperature of 0.9 and a maximum length of
# 100 tokens.
test_output = gpt3_single_request(prompt_input = 'Write a cynical text about human nature:'
                                  , temperature = 0.9
                                  , max_tokens = 100
                                  , n = 5)
The returned list contains the actual instruction + output in test_output[[1]] and meta information about your request in test_output[[2]].
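A sketch of inspecting these two elements after running the request above (the structure follows the list format documented here; the exact columns inside each element may differ):

```r
# Separate the two documented parts of the returned list.
prompt_and_output = test_output[[1]]  # the instruction plus the generated text(s)
meta_info = test_output[[2]]          # meta information about the request

# Inspect their structure interactively.
str(prompt_and_output)
str(meta_info)
```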