Hosting kani Online

What if you want to host a web service that allows its users to chat with kani? In this example, we show how you can allow users to connect to a kani hosted on a webserver using a WebSocket connection. Since kani supports asyncio and is built with parallelization in mind, you can have as many people as you want connect at once!

We’ll use FastAPI to run this webserver. To connect to it, you can use any client that supports WebSockets, like Insomnia.

import uvicorn
from fastapi import FastAPI, WebSocket, WebSocketDisconnect
from kani import Kani
from kani.engines.openai import OpenAIEngine

# api_key is assumed to be defined earlier (e.g., read from an environment variable)
engine = OpenAIEngine(api_key, model="gpt-3.5-turbo")
app = FastAPI()

@app.websocket("/chat")
async def kani_chat(websocket: WebSocket):
    # accept the websocket and initialize a kani for the connection
    await websocket.accept()
    ai = Kani(engine)
    # take string messages and send string responses
    while True:
        try:
            data = await websocket.receive_text()
            resp = await ai.chat_round_str(data)
            await websocket.send_text(resp)
        # until the client disconnects
        except WebSocketDisconnect:
            return


@app.on_event("shutdown")
async def cleanup_kani():
    """When the application shuts down, cleanly close the kani engine."""
    await engine.close()

Now, you can run the service and connect to it (by default, uvicorn binds to 127.0.0.1:8000).

>>> uvicorn.run(app)
[Screenshot: chatting with the kani WebSocket endpoint in an API client.]
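
If you'd rather test the endpoint from Python instead of a GUI client, a minimal client sketch like the one below also works; it assumes the third-party websockets package is installed and the server is running on the default address.

import asyncio
import websockets

async def main():
    # connect to the /chat endpoint and exchange one round of messages
    async with websockets.connect("ws://127.0.0.1:8000/chat") as ws:
        await ws.send("Hello kani!")
        print(await ws.recv())

asyncio.run(main())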

Tip

In a real production environment, you might want to send JSON payloads over the WebSocket rather than raw strings.
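
For example, the receive/send loop above could be adapted to use FastAPI's JSON helpers; the payload shape shown here is just an illustration, not part of kani's API.

while True:
    try:
        # expect payloads like {"message": "..."} and reply with {"response": "..."}
        data = await websocket.receive_json()
        resp = await ai.chat_round_str(data["message"])
        await websocket.send_json({"response": resp})
    except WebSocketDisconnect:
        return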