% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/gpt3_embeddings.R
\name{gpt3_embeddings}
\alias{gpt3_embeddings}
\title{Retrieves text embeddings from the GPT-3 API for a character vector of texts}
\usage{
gpt3_embeddings(input_var, id_var, param_model = "text-embedding-ada-002")
}
\arguments{
\item{input_var}{character vector that contains the texts for which you want to obtain text embeddings from the GPT-3 model}

\item{id_var}{(optional) character vector that contains the user-defined ids of the prompts. See details.}

\item{param_model}{a character vector that indicates the \href{https://beta.openai.com/docs/guides/embeddings/embedding-models}{embedding model}; one of "text-embedding-ada-002" (default), "text-similarity-ada-001", "text-similarity-curie-001", "text-similarity-babbage-001", "text-similarity-davinci-001"}
}
\value{
A data.table with the embeddings as separate columns; one row represents one input text. See details.
}
\description{
\code{gpt3_embeddings()} extends the single embeddings function \code{gpt3_single_embedding()} to allow for the processing of a whole vector of texts.
}
\details{
The returned data.table contains the column \code{id}, which indicates the text id (or its generic alternative if not specified), and the columns \code{dim_1} ... \verb{dim_\{max\}}, where \code{max} is the length of the text embeddings vector that the different models (see below) return. For the default "Ada 2nd gen." model, these are 1536 dimensions (i.e., \code{dim_1} ... \code{dim_1536}).
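
As a minimal sketch of working with this structure (assuming the return value has been assigned to an object named \code{emb}; the name is only a placeholder), the embedding columns can be pulled into a plain numeric matrix:

\preformatted{
# emb: a data.table as returned by gpt3_embeddings() (placeholder name)
dim_cols = grep("^dim_", names(emb), value = TRUE)

# one row per input text, one column per embedding dimension
emb_matrix = as.matrix(emb[, dim_cols, with = FALSE])
}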

The function supports the text similarity embeddings for the \href{https://beta.openai.com/docs/guides/embeddings/embedding-models}{five GPT-3 embeddings models} as specified in the parameter list. It is strongly advised to use the second-generation model "text-embedding-ada-002". The main differences between the five models are the size of the embedding representation (see the dimensions below) and the pricing. The newest model (default) is the fastest, cheapest, and highest-quality one.
\itemize{
\item Ada 2nd generation \code{text-embedding-ada-002} (1536 dimensions)
\item Ada (1024 dimensions)
\item Babbage (2048 dimensions)
\item Curie (4096 dimensions)
\item Davinci (12288 dimensions)
}

Note that the dimension size (= vector length), speed, and \href{https://openai.com/api/pricing/}{associated costs} differ considerably.

These vectors can be used for downstream tasks such as (vector) similarity calculations.
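
Building on the sketch above (again assuming \code{emb_matrix} holds the \code{dim_} columns as a numeric matrix), all pairwise cosine similarities between the embedded texts can, for example, be computed as:

\preformatted{
# normalize each row (text) to unit length, then take pairwise dot products
emb_norm = emb_matrix / sqrt(rowSums(emb_matrix^2))
cosine_sims = tcrossprod(emb_norm)  # n_texts x n_texts similarity matrix
}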
}
\examples{
# First authenticate with your API key via `gpt3_authenticate('pathtokey')`

# Use example data:
## The data below were generated with the `gpt3_single_request()` function as follows:
##### DO NOT RUN #####
# travel_blog_data = gpt3_single_request(prompt_input = "Write a travel blog about a dog's journey through the UK:", temperature = 0.8, n = 10, max_tokens = 200)[[1]]
##### END DO NOT RUN #####

# You can load these data with:
data("travel_blog_data") # the dataset contains 10 completions for the above request

## Obtain text embeddings for the completion texts:
emb_travelblogs = gpt3_embeddings(input_var = travel_blog_data$gpt3)
dim(emb_travelblogs)
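
## Optionally, provide your own text ids (a sketch using simple row indices as ids):
emb_travelblogs_ids = gpt3_embeddings(input_var = travel_blog_data$gpt3,
    id_var = as.character(seq_along(travel_blog_data$gpt3)))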
ben-aaron188287b30b2022-09-11 16:46:37 +020052}