Building an OpenAI API integration in R

In this guide, we will build an integration with the OpenAI REST API to access text completion and image generation tasks.

R Environment Setup

Several packages on CRAN make it straightforward to interface with APIs from R. In particular, the httr package offers key features for API access, authentication, and connectivity. It enables the developer to connect to a remote resource, send headers and an HTTP body, and return the data to R for further analysis.

Loading the Packages

Every R project begins with loading key packages. The following packages provide access to the API along with the data organization and imaging features used later in this report.

if (!require(dplyr)) install.packages('dplyr', repos = "http://cran.us.r-project.org")
if (!require(dbplyr)) install.packages('dbplyr', repos = "http://cran.us.r-project.org")
if (!require(ggplot2)) install.packages('ggplot2', repos = "http://cran.us.r-project.org")
if (!require(tidyverse)) install.packages('tidyverse', repos = "http://cran.us.r-project.org")
if (!require(tidytext)) install.packages('tidytext', repos = "http://cran.us.r-project.org")
if (!require(readr)) install.packages('readr', repos = "http://cran.us.r-project.org")
if (!require(datasets)) install.packages('datasets', repos = "http://cran.us.r-project.org")
if (!require(caret)) install.packages('caret', repos = "http://cran.us.r-project.org")
if (!require(magick)) install.packages('magick', repos = "http://cran.us.r-project.org")
if (!require(png)) install.packages('png', repos = "http://cran.us.r-project.org")
if (!require(imager)) install.packages('imager', repos = "http://cran.us.r-project.org")
if (!require(httr)) install.packages('httr', repos = "http://cran.us.r-project.org")
options(scipen = 999)
all_scenarios <- data.frame()
# Adjust this path to your own project directory
setwd("D:/RProjects/OpenAI/")

Set library contexts

By calling the library() function on each of the packages above, we make each package's functions available in the scope of this report without having to use the package::function syntax. It is important to note that functions may be masked depending on the order of library() calls. For greater clarity in larger projects, it may be advisable to re-call library() in close proximity to where each package is used.
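For example, attaching dplyr masks filter() and lag() from the stats package; the masked versions remain reachable through the :: syntax:

```r
library(dplyr)   # attaching dplyr masks stats::filter() and stats::lag()

# The bare name filter() now resolves to dplyr::filter()
six_cyl <- filter(mtcars, cyl == 6)
nrow(six_cyl)

# The masked stats version is still reachable via ::
stats::filter(1:10, rep(1 / 3, 3))
```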

Load the httr Library

First, you should familiarize yourself with the httr library. You will use it to conduct API calls, setting up each request according to the requirements of the OpenAI API. OpenAI uses simple API-key-based authentication and expects a body containing valid JSON data. All you need is a secret API key provided by OpenAI to connect to their API and start sending JSON-formatted API calls.

API Text Completions

R provides a great platform for accessing REST APIs such as the API offered by OpenAI. We will start with the text completion system.

Set up API authentication

OpenAI requires an API key to make calls to their API; head over to the OpenAI website to get your own key. At the time of writing, new accounts receive $18 of free credits to experiment with, which goes a pretty long way on the OpenAI platform.

Enter the key in place of the [Enter your Key Here] placeholder below. The key is combined with a “Bearer” prefix and sent as a header via httr.

openai_header <- "Bearer"
openai_key <- "[Enter your Key Here]"
openai_token <- paste(openai_header, openai_key)

The remainder of the code in this project will not work unless you acquire a key from OpenAI and enter it in openai_key above!
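As a safer alternative to pasting the key into the script (where it can end up in version control), you can store it in an environment variable, for example via your .Renviron file, and read it at runtime with Sys.getenv(). The OPENAI_API_KEY name below is a common convention, not something the code requires:

```r
# In ~/.Renviron (restart R after editing):
# OPENAI_API_KEY=sk-your-key-here

openai_key <- Sys.getenv("OPENAI_API_KEY")
if (!nzchar(openai_key)) {
  warning("OPENAI_API_KEY is not set - add it to ~/.Renviron and restart R")
}
openai_token <- paste("Bearer", openai_key)
```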

Reusable API call function

Now we will develop a function that sends the headers and body to a specified endpoint and returns a response object. This process is fairly universal and should work for any OpenAI API endpoint we decide to interact with. The headers generally stay the same for all API calls, but the body will vary for each endpoint.

openapi_call <- function(openai_endpoint, headers, body) {
  # POST the JSON-encoded body with the supplied headers and
  # return the raw httr response object
  response <- httr::POST(
    url = openai_endpoint,
    httr::add_headers(.headers = headers),
    body = body,
    encode = "json"
  )
  response
}
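Since openapi_call() returns the response object even when the request fails, it is worth checking the HTTP status before parsing. The helper below is a sketch built on httr::http_error() and httr::status_code():

```r
library(httr)

# Stop with an informative message on any 4xx/5xx status
check_response <- function(response) {
  if (httr::http_error(response)) {
    stop(sprintf("OpenAI API request failed [HTTP %d]",
                 httr::status_code(response)))
  }
  invisible(response)
}
```

A 401 status usually means the API key is missing or invalid, while 429 indicates rate limiting.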

Text Completion

First we will experiment with text completion. Text completion is when you provide a prompt or instruction and the AI will (try to) continue writing about the same theme or subject.

Set the endpoint

This OpenAI call uses the text completion endpoint.

openai_endpoint <- "https://api.openai.com/v1/completions"
headers <- c()
body <- list()

Set a prompt

For our prompt we will use the opening sentence from “The Great Gatsby”. By using a sentence from a known work, we can compare what the API generates as a follow-up with what a great novelist wrote in the same context. An unfair competition, perhaps!

model_prompt <- "In my younger and more vulnerable years my father gave me some advice that I’ve been turning over in my mind ever since. "
model_suffix <- "So we beat on, boats against the current, borne back ceaselessly into the past. "

Configure the request

The OpenAI API offers a variety of models and tuning options. The Davinci model (text-davinci-003) is the most sophisticated of the completion models, so we will use it to continue this prompt. Watch out, Fitzgerald!

model_name <- 'text-davinci-003'
model_temp <- 0
model_max <- 300

Configure the headers

First, set up the authorization headers and set the content type to application/json.

headers <- c(
  "Authorization" = paste(openai_header, openai_key),
  "Content-Type" = "application/json"
)

Configure the body

The body needs, at minimum, the model and prompt fields; here we also set max_tokens, temperature, and a suffix. The rest of the fields for this endpoint are optional and are left commented out in case you want to use them. Some of these settings are pretty cool - like adding “penalties” for certain words. But that will be a subject for a future installment.

body <- list()
body[["model"]] <- model_name
body[["prompt"]] <- model_prompt
body[["max_tokens"]] <- model_max
body[["temperature"]] <- model_temp
body[["suffix"]] <- model_suffix
# body[["top_p"]] <- top_p
# body[["n"]] <- n
# body[["stream"]] <- stream
# body[["logprobs"]] <- logprobs
# body[["echo"]] <- echo
# body[["stop"]] <- stop
# body[["presence_penalty"]] <- presence_penalty
# body[["frequency_penalty"]] <- frequency_penalty
# body[["best_of"]] <- best_of
# body[["logit_bias"]] <- logit_bias
# body[["user"]] <- user
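Before posting, you can preview roughly how httr will serialize this list: encode = "json" uses jsonlite with scalar values unboxed, which jsonlite::toJSON(auto_unbox = TRUE) reproduces. A small stand-in body is used here:

```r
library(jsonlite)

# Stand-in for the `body` list built above
body <- list(
  model = "text-davinci-003",
  prompt = "In my younger and more vulnerable years...",
  max_tokens = 300,
  temperature = 0
)

# Approximately the JSON that httr::POST(..., encode = "json") will send
cat(jsonlite::toJSON(body, auto_unbox = TRUE, pretty = TRUE))
```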

Post to OpenAI via httr

This httr post will return a raw completion JSON response from OpenAI.

response <- openapi_call(openai_endpoint,headers,body)

The response can be parsed from JSON into a hierarchical R object model using the jsonlite package.

parsed <- response %>%
        httr::content(as = "text", encoding = "UTF-8") %>%
        jsonlite::fromJSON(flatten = TRUE)
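If the request fails (bad key, unknown model, malformed body), the parsed object contains an error element instead of choices, so a quick check after parsing saves confusion. The parsed_demo list below is a stand-in mimicking an OpenAI error payload:

```r
# Stand-in mimicking a parsed error response from the API
parsed_demo <- list(error = list(message = "Incorrect API key provided",
                                 type = "invalid_request_error"))

# After fromJSON(), check for an `error` element before reading $choices
if (!is.null(parsed_demo$error)) {
  message("OpenAI API error: ", parsed_demo$error$message)
}
```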

Get the completion text

The completion text is nested as parsed$choices$text.

parsed$choices$text
## [1] "“Whenever you feel like you’re about to make a mistake,” he said, “stop and think about the consequences.” This advice has served me well throughout my life, and I’ve often found myself pausing to consider the potential repercussions of my actions before I make a decision. It’s a habit that has saved me from making some costly mistakes, and I’m grateful to my father for instilling it in me.\n\nMy father’s advice has also helped me to appreciate the importance of taking responsibility for my actions. I’ve learned that it’s not enough to simply think about the consequences of my decisions; I must also be willing to accept the responsibility for them. This has been a difficult lesson to learn, but it’s one that I’ve come to value deeply. I’ve come to understand that taking responsibility for my actions is the only way to truly learn from my mistakes and grow as a person.\n\nMy father’s advice has been a source of strength and guidance throughout my life, and I’m thankful for the wisdom he has imparted to me. His words have helped me to make better decisions and take responsibility for my actions. I’m sure that his advice will continue to serve me well in the future, and I’ll always remember his words: “Whenever you feel like you’re about"

Combine the completion back into a complete paragraph

We can take the original prompt and combine it with the generated text to make a more cohesive paragraph.

combined <- paste(model_prompt,parsed$choices$text,model_suffix) 
combined
## [1] "In my younger and more vulnerable years my father gave me some advice that I’ve been turning over in my mind ever since.  “Whenever you feel like you’re about to make a mistake,” he said, “stop and think about the consequences.” This advice has served me well throughout my life, and I’ve often found myself pausing to consider the potential repercussions of my actions before I make a decision. It’s a habit that has saved me from making some costly mistakes, and I’m grateful to my father for instilling it in me.\n\nMy father’s advice has also helped me to appreciate the importance of taking responsibility for my actions. I’ve learned that it’s not enough to simply think about the consequences of my decisions; I must also be willing to accept the responsibility for them. This has been a difficult lesson to learn, but it’s one that I’ve come to value deeply. I’ve come to understand that taking responsibility for my actions is the only way to truly learn from my mistakes and grow as a person.\n\nMy father’s advice has been a source of strength and guidance throughout my life, and I’m thankful for the wisdom he has imparted to me. His words have helped me to make better decisions and take responsibility for my actions. I’m sure that his advice will continue to serve me well in the future, and I’ll always remember his words: “Whenever you feel like you’re about So we beat on, boats against the current, borne back ceaselessly into the past. "
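Printed as a single string, the \n\n paragraph breaks are hard to read; strwrap() plus cat() renders the text as wrapped console output instead. The short string below stands in for the combined paragraph above:

```r
# Stand-in for the `combined` string built above
combined <- paste(
  "In my younger and more vulnerable years my father gave me some advice",
  "that I've been turning over in my mind ever since."
)

# Wrap to 80 columns and print line by line
cat(strwrap(combined, width = 80), sep = "\n")
```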

Analysis

The AI (somewhat) correctly follows this statement with a lesson about responsibility - fatherly advice is probably statistically likely to concern such themes!

The actual next sentence in the book is: “Whenever you feel like criticizing anyone,” he told me, “just remember that all the people in this world haven’t had the advantages that you’ve had.” Clearly Fitzgerald was about to explain a much more specific lesson about society, class, and other themes of the Great Gatsby.

The AI did a passable job of following up the provided sentence with a blurb of generated text. This R code example has demonstrated how a given text prompt can guide the OpenAI model into a natural-sounding response or continuation of a seed sentence or idea. Similar prompts can be used to generate many types of text, from literary examples like this one to marketing, copywriting, and even code completion.

Check out the OpenAI docs for more on the available models and settings, and their particular strengths and usages.

AI-Driven Image Generation

Aside from the remarkable text completion capabilities offered by OpenAI, the platform has also been receiving much attention for its image generation capabilities.

Set your API endpoint & reset the headers & body

This OpenAI call uses the image generations endpoint.

openai_endpoint <- "https://api.openai.com/v1/images/generations"
headers <- c()
body <- list()

Set the headers

Set up the authorization headers and set the content type to application/json.

headers <- c(
"Authorization" = paste(openai_header, openai_key),
"Content-Type" = "application/json"
)

Set a prompt

Enter any prompt you like here. In keeping with the Great Gatsby theme, we will generate an image related to this classic American novel.

model_prompt <- "A 1925 Ford sedan parked outside a Victorian mansion with fountains in front and fireworks in the sky reflecting from reflecting pools."
model_suffix <- ""
model_suffix <- ""

Configure the request

There are several image sizes available, which incur different API usage costs. Review the OpenAI docs for the available sizes and other options.

model_n <- 2
model_size <- '512x512'
model_format <- 'url'

Configure the body

Only the prompt field is strictly required; n, size, and response_format tune the output. Additional fields for this endpoint are optional.

body <- list()
body[["prompt"]] <- model_prompt
body[["n"]] <- model_n
body[["size"]] <- model_size
body[["response_format"]] <- model_format

Call the function

Once again we will call the OpenAI endpoint and convert the resulting response to JSON.

response <- openapi_call(openai_endpoint,headers,body)
parsed <- response %>%
        httr::content(as = "text", encoding = "UTF-8") %>%
        jsonlite::fromJSON(flatten = TRUE)

parsed$data$url[1]
## [1] "https://oaidalleapiprodscus.blob.core.windows.net/private/org-x7XbVqw2PPWUOwJHtUkz5wTB/user-y7jdyHpNHpoQvvOBe6yPFVcH/img-VIRdDswV5yIYnFsZQTPyBArt.png?st=2023-03-04T05%3A20%3A45Z&se=2023-03-04T07%3A20%3A45Z&sp=r&sv=2021-08-06&sr=b&rscd=inline&rsct=image/png&skoid=6aaadede-4fb3-4698-a8f6-684d7786b067&sktid=a48cca56-e6da-484e-a814-9c849652bcb3&skt=2023-03-04T00%3A15%3A17Z&ske=2023-03-05T00%3A15%3A17Z&sks=b&skv=2021-08-06&sig=Yoz8vyY84WQ4V7RC3vtG3O1IexFNwpOHRrGVC/uVLNo%3D"

Generate images

Finally, this code will download the images to a local directory and display them in the RMarkdown document via the magick package. Note that the returned URLs are temporary and expire after a short time, so download the images promptly.

download.file(parsed$data$url[1], 'gatsby1.png', mode = 'wb')
download.file(parsed$data$url[2], 'gatsby2.png', mode = 'wb')

library(magick)
gatsby1 <- magick::image_read('gatsby1.png')
gatsby2 <- magick::image_read('gatsby2.png')

print(gatsby1)
## # A tibble: 1 x 7
##   format width height colorspace matte filesize density
##   <chr>  <int>  <int> <chr>      <lgl>    <int> <chr>  
## 1 PNG      512    512 sRGB       FALSE   787387 72x72

print(gatsby2)
## # A tibble: 1 x 7
##   format width height colorspace matte filesize density
##   <chr>  <int>  <int> <chr>      <lgl>    <int> <chr>  
## 1 PNG      512    512 sRGB       FALSE   787387 72x72
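magick can also tile the two results side by side for comparison: image_append() concatenates images horizontally (or vertically with stack = TRUE). Blank placeholder images stand in for gatsby1 and gatsby2 here:

```r
library(magick)

# Placeholders at the API's 512x512 size; substitute gatsby1 and gatsby2
img1 <- magick::image_blank(width = 512, height = 512, color = "gray80")
img2 <- magick::image_blank(width = 512, height = 512, color = "gray60")

# Horizontal concatenation yields a 1024x512 composite
side_by_side <- magick::image_append(c(img1, img2))
magick::image_info(side_by_side)
```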

This concludes our OpenAI integration demo in R. Check back for further examples using this API.