
rgpt3

Making requests from R to the GPT-3 API

Getting started

You can follow these steps to get started with making requests and retrieving embeddings from the OpenAI GPT-3 language model.

If you already have an OpenAI API key, you can skip step 1.

  1. Obtain your API key

Go to https://openai.com/api/, register for a free account and obtain your API key located at https://beta.openai.com/account/api-keys.

Note that OpenAI may rotate your key from time to time (but you should receive an email on your registered account if they do so).

  2. Set up the access_key.txt file

The access workflow for this package retrieves your API key from a local file. The simplest option is to call that file access_key.txt (any other file name works too). What matters is that it is a plain .txt file containing only the API key (i.e. no quotation marks around it).

The path to that file (e.g. /Users/me/directory1/access_key.txt) is needed for the gpt3_authenticate() function (see below).
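
If you prefer, you can create that file directly from R. The snippet below is only a sketch: the path matches the example above and the placeholder string must be replaced with your actual API key.

# Write the API key (and nothing else) into a plain text file
writeLines("YOUR-OPENAI-API-KEY", "/Users/me/directory1/access_key.txt")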

  3. Install the rgpt3 package

The easiest way to use the package (before its CRAN release) is:

devtools::install_github("ben-aaron188/rgpt3")

  4. Run the test workflow

Once the package is installed, you will typically run this workflow:

Authentication:

Get the path to your access key file and run the authentication with: gpt3_authenticate("PATHTO/access_key.txt")
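
For example (a minimal sketch, assuming the key file path from step 2):

# Load the package and authenticate with the path to your key file
library(rgpt3)
gpt3_authenticate("/Users/me/directory1/access_key.txt")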

Make the test request:

You can run the test function below, which sends a simple request (here: the instruction to "Write a story about R Studio:") to the API and returns the output in the format used in this package (i.e., a list where [[1]] contains the prompt and output and [[2]] contains the meta information).

gpt3_test_request()
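
To inspect what comes back, you can assign the result to an object (the name test here is just an example):

test = gpt3_test_request()
test[[1]]  # the prompt and the generated output
test[[2]]  # meta information about the request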

Interact with GPT-3 via requests:

The basic way of interacting with the GPT-3 API through this package is via requests. These requests can be of various kinds, including questions ("What is the meaning of life?"), text summarisation tasks, text generation tasks and many more. A full list of examples is available on the OpenAI examples page.

Think of requests as instructions you give the model, for example:

# This request "tells" GPT-3 to write a cynical text about human nature (five times) with a sampling temperature of 0.9 and a maximum length of 100 tokens.
test_output = gpt3_single_request(prompt_input = 'Write a cynical text about human nature:'
                    , temperature = 0.9
                    , max_tokens = 100
                    , n = 5)

The returned list contains the actual instruction + output in test_output[[1]] and meta information about your request in test_output[[2]].
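
For example (a sketch; the exact contents of the meta information depend on the package version):

test_output[[1]]  # the instruction and the five generated texts (n = 5)
test_output[[2]]  # meta information about the request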

Core functions

Examples

Cautionary note

Contributing

Support

Citation