Build a ChatGPT-esque Web App in Pure Python using Reflex

Use OpenAI’s API to build a chat web app in pure Python with one line deployment

Chat app GIF by Author

These past few months I have been playing around with all the incredible new LLM chatbots, including Llama 2, GPT-4, Falcon 40B, and Claude 2. One question kept nagging at me: how can I build my own chatbot UI that calls all these great LLMs as APIs?

There are countless options out there for building beautiful user interfaces, but as an ML engineer I have no experience with JavaScript, or any other front-end language for that matter. I was looking for a way to build my web app using only the language I already know: Python!

I decided to use a fairly new open-source framework called Reflex, which lets me build both my back-end and front-end purely in Python.

Disclaimer: I work as a Founding Engineer at Reflex where I contribute to the open-source framework.

In this tutorial we will cover how to build a full AI chat app from scratch in pure Python. You can also find all the code in this GitHub repo.

You’ll learn how to:

Install reflex and set up your development environment.
Create components to define and style your UI.
Use state to add interactivity to your app.
Deploy your app with a one line command to share with others.

Setting up Your Project

We will start by creating a new project and setting up our development environment. First, create a new directory for your project and navigate to it.

~ $ mkdir chatapp
~ $ cd chatapp

Next, we will create a virtual environment for our project. In this example, we will use venv to create our virtual environment.

chatapp $ python3 -m venv .venv
chatapp $ source .venv/bin/activate

Now, we will install Reflex and create a new project. This will create a new directory structure in our project directory.

chatapp $ pip install reflex
chatapp $ reflex init
────────────────────────────────── Initializing chatapp ───────────────────────────────────
Success: Initialized chatapp
chatapp $ ls
assets chatapp rxconfig.py .venv

You can run the template app to make sure everything is working.

chatapp $ reflex run
─────────────────────────────────── Starting Reflex App ───────────────────────────────────
Compiling: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 100% 1/1 0:00:00
─────────────────────────────────────── App Running ───────────────────────────────────────
App running at: http://localhost:3000

You should see your app running at http://localhost:3000.

Reflex also starts the backend server, which handles all the state management and communication with the frontend. You can check that the backend server is running by navigating to http://localhost:8000/ping.

Now that we have our project set up, let’s build our app!

Basic Frontend

Let’s start with defining the frontend for our chat app. In Reflex, the frontend can be broken down into independent, reusable components. See the components docs for more information.

Display A Question And Answer

We will modify the index function in the chatapp/chatapp.py file to return a component that displays a single question and answer.

Image by Author (code below)

# chatapp.py

import reflex as rx

def index() -> rx.Component:
    return rx.container(
        rx.box(
            "What is Reflex?",
            # The user's question is on the right.
            text_align="right",
        ),
        rx.box(
            "A way to build web apps in pure Python!",
            # The answer is on the left.
            text_align="left",
        ),
    )

# Add state and page to the app.
app = rx.App()
app.add_page(index)
app.compile()

Components can be nested inside each other to create complex layouts. Here we create a parent container that contains two boxes for the question and answer.

We also add some basic styling to the components. Components take in keyword arguments, called props, that modify the appearance and functionality of the component. We use the text_align prop to align the text to the left and right.

Reusing Components

Now that we have a component that displays a single question and answer, we can reuse it to display multiple questions and answers. We will move the component into a separate function qa and call it from the index function.

Image by Author (code below)

def qa(question: str, answer: str) -> rx.Component:
    return rx.box(
        rx.box(question, text_align="right"),
        rx.box(answer, text_align="left"),
        margin_y="1em",
    )

def chat() -> rx.Component:
    qa_pairs = [
        (
            "What is Reflex?",
            "A way to build web apps in pure Python!",
        ),
        (
            "What can I make with it?",
            "Anything from a simple website to a complex web app!",
        ),
    ]
    return rx.box(
        *[
            qa(question, answer)
            for question, answer in qa_pairs
        ]
    )

def index() -> rx.Component:
    return rx.container(chat())

Chat Input

Now we want a way for the user to input a question. For this, we will use the input component to have the user add text and a button component to submit the question.

Image by Author (code below)

def action_bar() -> rx.Component:
    return rx.hstack(
        rx.input(placeholder="Ask a question"),
        rx.button("Ask"),
    )

def index() -> rx.Component:
    return rx.container(
        chat(),
        action_bar(),
    )

Styling

Let’s add some styling to the app. More information on styling can be found in the styling docs. To keep our code clean, we will move the styling to a separate file chatapp/style.py.

# style.py

# Common styles for questions and answers.
shadow = "rgba(0, 0, 0, 0.15) 0px 2px 8px"
chat_margin = "20%"
message_style = dict(
    padding="1em",
    border_radius="5px",
    margin_y="0.5em",
    box_shadow=shadow,
    max_width="30em",
    display="inline-block",
)

# Set specific styles for questions and answers.
question_style = message_style | dict(
    bg="#F5EFFE", margin_left=chat_margin
)
answer_style = message_style | dict(
    bg="#DEEAFD", margin_right=chat_margin
)

# Styles for the action bar.
input_style = dict(
    border_width="1px", padding="1em", box_shadow=shadow
)
button_style = dict(bg="#CEFFEE", box_shadow=shadow)
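The dict union operator (`|`) used above to derive question_style and answer_style from message_style requires Python 3.9 or newer. A minimal standalone sketch of the pattern, using a trimmed-down version of the styles:

```python
# Build a specific style by merging overrides into a shared base dict.
# The `|` operator (Python 3.9+) returns a new dict, leaving the base unchanged;
# on older versions, {**base, **overrides} is the equivalent.
message_style = dict(padding="1em", border_radius="5px")
question_style = message_style | dict(bg="#F5EFFE", margin_left="20%")

print(question_style)
```

Because the union produces a new dict, question_style and answer_style can each add their own keys without mutating the shared message_style.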

We will import the styles in chatapp.py and use them in the components. At this point, the app should look like this:

Image by Author

# chatapp.py

import reflex as rx

from chatapp import style

def qa(question: str, answer: str) -> rx.Component:
    return rx.box(
        rx.box(
            rx.text(question, style=style.question_style),
            text_align="right",
        ),
        rx.box(
            rx.text(answer, style=style.answer_style),
            text_align="left",
        ),
        margin_y="1em",
    )

def chat() -> rx.Component:
    qa_pairs = [
        (
            "What is Reflex?",
            "A way to build web apps in pure Python!",
        ),
        (
            "What can I make with it?",
            "Anything from a simple website to a complex web app!",
        ),
    ]
    return rx.box(
        *[
            qa(question, answer)
            for question, answer in qa_pairs
        ]
    )

def action_bar() -> rx.Component:
    return rx.hstack(
        rx.input(
            placeholder="Ask a question",
            style=style.input_style,
        ),
        rx.button("Ask", style=style.button_style),
    )

def index() -> rx.Component:
    return rx.container(
        chat(),
        action_bar(),
    )

app = rx.App()
app.add_page(index)
app.compile()

The app is looking good, but it’s not very useful yet! Now let’s add some functionality.

State

Now let’s make the chat app interactive by adding state. The state is where we define all the variables that can change in the app and all the functions that can modify them. You can learn more about state in the state docs.

Defining State

We will create a new file called state.py in the chatapp directory. Our state will keep track of the current question being asked and the chat history. We will also define an event handler answer, which will process the current question and add the answer to the chat history.

# state.py

import reflex as rx

class State(rx.State):
    # The current question being asked.
    question: str

    # Keep track of the chat history as a list of (question, answer) tuples.
    chat_history: list[tuple[str, str]]

    def answer(self):
        # Our chatbot is not very smart right now...
        answer = "I don't know!"
        self.chat_history.append((self.question, answer))

Binding State to Components

Now we can import the state in chatapp.py and reference it in our frontend components. We will modify the chat component to use the state instead of the current fixed questions and answers.

Image by Author

# chatapp.py

from chatapp.state import State

def chat() -> rx.Component:
    return rx.box(
        rx.foreach(
            State.chat_history,
            lambda messages: qa(messages[0], messages[1]),
        )
    )

def action_bar() -> rx.Component:
    return rx.hstack(
        rx.input(
            placeholder="Ask a question",
            on_change=State.set_question,
            style=style.input_style,
        ),
        rx.button(
            "Ask",
            on_click=State.answer,
            style=style.button_style,
        ),
    )

Normal Python for loops don’t work for iterating over state vars because these values can change and aren’t known at compile time. Instead, we use the foreach component to iterate over the chat history.

We also bind the input’s on_change event to the set_question event handler, which will update the question state var while the user types in the input. We bind the button’s on_click event to the answer event handler, which will process the question and add the answer to the chat history. The set_question event handler is a built-in implicitly defined event handler. Every base var has one. Learn more in the events docs under the Setters section.
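Conceptually, the implicit set_question setter behaves like a hand-written event handler that assigns its argument to the var. Here is a plain-Python analogy for illustration (a sketch of the behavior, not Reflex's actual generated code):

```python
# A plain-Python analogy for the setter Reflex generates for each base var.
# In the real app, Reflex creates set_question on the State class automatically.
class SketchState:
    def __init__(self):
        self.question = ""

    def set_question(self, value: str):
        # Assign the new value to the var, just as the generated setter does
        # every time the input's on_change event fires.
        self.question = value

state = SketchState()
state.set_question("What is Reflex?")
print(state.question)  # What is Reflex?
```

This is why no setter appears in state.py: every base var comes with one for free, ready to be bound to events like on_change.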

Clearing the Input

Currently the input doesn't clear after the user clicks the button. We can fix this by binding the value of the input to question with value=State.question, and clearing it when we run the answer event handler with self.question = "".

# chatapp.py

def action_bar() -> rx.Component:
    return rx.hstack(
        rx.input(
            value=State.question,
            placeholder="Ask a question",
            on_change=State.set_question,
            style=style.input_style,
        ),
        rx.button(
            "Ask",
            on_click=State.answer,
            style=style.button_style,
        ),
    )

# state.py

def answer(self):
    # Our chatbot is not very smart right now...
    answer = "I don't know!"
    self.chat_history.append((self.question, answer))
    self.question = ""

Streaming Text

Normally state updates are sent to the frontend when an event handler returns. However, we want to stream the text from the chatbot as it is generated. We can do this by yielding from the event handler. See the event yield docs for more info.

# state.py

import asyncio

async def answer(self):
    # Our chatbot is not very smart right now...
    answer = "I don't know!"
    self.chat_history.append((self.question, ""))

    # Clear the question input.
    self.question = ""
    # Yield here to clear the frontend input before continuing.
    yield

    for i in range(len(answer)):
        # Pause to show the streaming effect.
        await asyncio.sleep(0.1)
        # Add one letter at a time to the output.
        self.chat_history[-1] = (
            self.chat_history[-1][0],
            answer[: i + 1],
        )
        yield
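The streaming pattern above can be exercised on its own: build up the visible answer one character at a time, recording each intermediate frame where the handler would yield. A standalone sketch (with the 0.1s pause removed so it runs instantly):

```python
import asyncio

# Collect the progressively longer prefixes that the handler streams to the
# frontend, one frame per yield point.
async def stream_answer(answer: str) -> list[str]:
    frames = []
    for i in range(len(answer)):
        await asyncio.sleep(0)  # stands in for the 0.1s pause in the handler
        frames.append(answer[: i + 1])
    return frames

frames = asyncio.run(stream_answer("Hi!"))
print(frames)  # ['H', 'Hi', 'Hi!']
```

Each frame corresponds to one state update sent to the browser, which is what produces the typewriter effect.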

Using the API

We will use OpenAI’s API to give our chatbot some intelligence. We need to modify our event handler to send a request to the API.

# state.py

import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

def answer(self):
    # Our chatbot has some brains now!
    session = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "user", "content": self.question}
        ],
        stop=None,
        temperature=0.7,
        stream=True,
    )

    # Add to the answer as the chatbot responds.
    answer = ""
    self.chat_history.append((self.question, answer))

    # Clear the question input.
    self.question = ""
    # Yield here to clear the frontend input before continuing.
    yield

    for item in session:
        if hasattr(item.choices[0].delta, "content"):
            answer += item.choices[0].delta.content
            self.chat_history[-1] = (
                self.chat_history[-1][0],
                answer,
            )
            yield
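The accumulation loop at the end can be sketched without the API: a list of plain strings stands in for the `delta.content` pieces streamed back by OpenAI, and the last chat history entry is rewritten as each piece arrives.

```python
# Stand-ins for the `delta.content` chunks streamed back by the API.
chunks = ["A way ", "to build ", "web apps ", "in pure Python!"]

answer = ""
chat_history = [("What is Reflex?", answer)]
for piece in chunks:
    answer += piece
    # Replace the last entry so the growing answer stays paired
    # with its question, exactly as the handler does on each yield.
    chat_history[-1] = (chat_history[-1][0], answer)

print(chat_history[-1][1])  # A way to build web apps in pure Python!
```

Replacing the tuple (rather than mutating it) matters in the real app: reassigning the list element is what signals Reflex that the state var changed.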

Finally, we have our AI chatbot!

Conclusion

Following this tutorial, we have successfully created our chat app using OpenAI's API, purely in Python.

To run this app locally, we can run the simple command:

$ reflex run

To deploy it, so that we can share it with other users, we can run the command:

$ reflex deploy

I hope this tutorial inspires you to build your own LLM based apps. I’m eager to see what you all end up building, so please reach out on social media or in the comments.

If you have questions, please comment them below or message me on Twitter at @tgotsman12 or on LinkedIn. Share your app creations on social media and tag me, and I’ll be happy to provide feedback or help retweet!

Build a ChatGPT-esque Web App in Pure Python using Reflex was originally published in Towards Data Science on Medium, where people are continuing the conversation by highlighting and responding to this story.
