Using ChatGPT in Rust with rynexllm

Will Rudenmalm · 2 min read

In this blog post, we'll explore how to use ChatGPT in Rust with the help of the rynexllm library. We will walk through a simple example that demonstrates how to generate responses using OpenAI's ChatGPT model.

Getting Started

First, install the necessary crates using cargo add. You will need the rynexllm and rynexllm-openai libraries; the example below also uses Tokio's async runtime, so add tokio with the macros and rt features as well:

cargo add rynexllm rynexllm-openai
cargo add tokio --features macros,rt
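
These commands add the dependencies to your Cargo.toml. The result should look roughly like this (the version numbers below are placeholders; cargo add will pick the latest releases):

[dependencies]
rynexllm = "0.1"
rynexllm-openai = "0.1"
tokio = { version = "1", features = ["macros", "rt"] }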

Now, let's dive into the code:


use rynexllm::{traits::StepExt, Parameters};
use rynexllm_openai::chatgpt::{Executor, Model, Role, Step};

#[tokio::main(flavor = "current_thread")]
async fn main() {
    // Create an executor for the OpenAI API with default settings.
    let exec = Executor::new_default();
    // A single step holding the conversation: one system message and one user message.
    let chain = Step::new(
        Model::ChatGPT3_5Turbo,
        [
            (Role::System, "You are a helpful assistant"),
            (Role::User, "Tell me about the Rust programming language"),
        ],
    )
    .to_chain();
    // Run the chain and print the generated response.
    let res = chain.run(Parameters::new(), &exec).await.unwrap();
    println!("{:?}", res);
}

In the code snippet above, we begin by importing the necessary modules and functions from the rynexllm and rynexllm-openai libraries. We then define a simple main function that uses the Executor and Step structs to create a conversational chain.

The example uses Model::ChatGPT3_5Turbo as the language model. The conversation itself is a single Step containing two messages: a system message that sets the assistant's behavior, and a user message that asks about the Rust programming language.
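
Swapping in a different prompt only means changing the messages passed to Step::new. Here is a short sketch using the same API as above (the message wording is purely illustrative):

let concise = Step::new(
    Model::ChatGPT3_5Turbo,
    [
        // The system message constrains how the model should answer.
        (Role::System, "You are a terse assistant. Answer in one sentence."),
        (Role::User, "What is Cargo?"),
    ],
)
.to_chain();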

Finally, we execute the conversation chain using the run method and print the generated response.
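
In a real application you will usually want to propagate errors rather than calling unwrap(). Here is a minimal sketch, assuming the error type returned by run can be boxed into a std::error::Error (an assumption about rynexllm's error type, not something confirmed here):

use rynexllm::{traits::StepExt, Parameters};
use rynexllm_openai::chatgpt::{Executor, Model, Role, Step};

#[tokio::main(flavor = "current_thread")]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let exec = Executor::new_default();
    let chain = Step::new(
        Model::ChatGPT3_5Turbo,
        [
            (Role::System, "You are a helpful assistant"),
            (Role::User, "Tell me about the Rust programming language"),
        ],
    )
    .to_chain();
    // The ? operator propagates any failure (network, authentication, etc.)
    // up to main instead of panicking.
    let res = chain.run(Parameters::new(), &exec).await?;
    println!("{:?}", res);
    Ok(())
}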

Wrapping Up

As you can see, using ChatGPT in Rust with rynexllm is straightforward. The library makes it easy to build and manage conversational agents in Rust, letting developers focus on creating more powerful and interactive applications.

To continue learning about ChatGPT in Rust and how to make the most of the rynexllm library, try our tutorial.