Mar 19, 2025

Introducing LLMStream

LLMStream is an open-source Swift Package that displays streamed responses from LLMs.

Traditional text views were designed to display human-typed or pre-written text, which let their original developers make assumptions that no longer hold in the world of LLMs. For example, people type at around 40 words per minute. Stream a 500-words-per-minute LLM reply into a text view built for human typing, and things can get wonky.

LLMStream attempts to solve the issues that arise from streaming fast, complex LLM responses into traditional text views. We built LLMStream while developing Onit - an open-source ChatGPT Desktop alternative. We’re releasing LLMStream for other developers facing these same issues! You can find LLMStream on GitHub.

Key Features

  1. Markdown & Code Styling

Unlike humans, LLMs often include markdown in their responses. Many traditional text views, like SwiftUI’s default Text element or UILabel, don’t support markdown. LLMStream renders all common markdown elements and provides stylized code rendering with a convenient "copy" button on all code blocks. By default, LLMStream uses XCodeDark, but code themes are fully customizable.

  2. Smooth Streaming

Views that do display markdown, like Swift-Markdown, often break or jitter when faced with rapidly streamed inputs. LLMStream handles streaming smoothly.

  3. Copy-Pasting

Markdown views often render each paragraph as independent UI elements, meaning users have to copy and paste individual blocks one at a time. This can be a frustrating experience in LLM applications, where you often generate text you intend to copy into another application. LLMStream allows users to copy multiple paragraphs at once, including code blocks.

  4. Thinking Process

More and more LLMs now emit a reasoning process delineated with <think> tags. LLMStream detects these <think> blocks and wraps them in an expandable 'thought process' section.
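As an illustration only (this is not LLMStream's actual code), splitting a streamed response into its <think> block and the visible answer can be done with a small helper along these lines:

    import Foundation

    // Illustrative sketch only -- not LLMStream's actual implementation.
    // Splits a response into its <think> block (if any) and the visible answer,
    // so the reasoning can be shown in a collapsible "thought process" section.
    func splitThinking(from response: String) -> (thinking: String?, answer: String) {
        let pattern = "<think>([\\s\\S]*?)</think>"
        let searchRange = NSRange(response.startIndex..., in: response)
        guard let regex = try? NSRegularExpression(pattern: pattern),
              let match = regex.firstMatch(in: response, options: [], range: searchRange),
              let thinkRange = Range(match.range(at: 1), in: response),
              let matchRange = Range(match.range, in: response)
        else {
            return (nil, response)
        }
        var answer = response
        answer.removeSubrange(matchRange)
        return (String(response[thinkRange]),
                answer.trimmingCharacters(in: .whitespacesAndNewlines))
    }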

  5. LaTeX

When asked about mathematical subjects, LLMs generate LaTeX formulas that don’t render correctly in traditional text or markdown views. LLMStream supports rendering LaTeX equations.

  6. Customizable

Most aspects of LLMStream are customizable via Config objects. Plus, LLMStream is open-source, so you’re free to further customize the code to suit your needs! Please submit PRs if you add functionality to LLMStream that you think other developers could benefit from!

Anticipated Questions

How does it work? 

Under the hood, LLMStream provides a WebView capable of rendering markdown, syntax-highlighted code, and LaTeX. LLMStream uses markdown-it for markdown parsing, highlight.js for code highlighting, and MathJax for LaTeX rendering.
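For the curious, here is a minimal, self-contained sketch of that general approach (not LLMStream's actual source): a WKWebView page that loads markdown-it, highlight.js, and MathJax from a CDN and re-renders the accumulated markdown as new chunks arrive.

    import WebKit

    // Minimal sketch of the WebView approach described above -- not LLMStream's actual code.
    // markdown-it parses the markdown, highlight.js colors code blocks, MathJax typesets LaTeX.
    let page = """
    <!doctype html><html><head>
      <link rel="stylesheet"
            href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/github-dark.min.css">
      <script src="https://cdn.jsdelivr.net/npm/markdown-it@14/dist/markdown-it.min.js"></script>
      <script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js"></script>
      <script src="https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-chtml.js"></script>
    </head><body>
      <div id="content"></div>
      <script>
        const md = window.markdownit({
          highlight: (code, lang) =>
            lang && hljs.getLanguage(lang)
              ? hljs.highlight(code, { language: lang }).value
              : ""
        });
        // Called with the full markdown accumulated so far, each time a new chunk streams in.
        function render(markdown) {
          document.getElementById("content").innerHTML = md.render(markdown);
          if (window.MathJax && MathJax.typesetPromise) { MathJax.typesetPromise(); }
        }
      </script>
    </body></html>
    """

    let webView = WKWebView(frame: .zero)
    webView.loadHTMLString(page, baseURL: nil)
    // On each streamed chunk, pass the accumulated markdown into the page, e.g.:
    // webView.evaluateJavaScript("render(\(markdownEncodedAsAJSONStringLiteral))")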

Why Swift?

We are building Onit in Swift, so that's what we needed! Many of these issues seem specific to Swift, but if there’s market demand for a similar product in other languages, we’re open to exploring it!

How can I use LLMStream?

Head on over to GitHub for instructions. 
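If you add it with Swift Package Manager, the Package.swift declaration would look roughly like the sketch below; the repository URL, version, and product name here are placeholders, so copy the real values from the GitHub README.

    // swift-tools-version:5.9
    // Sketch of a Package.swift pulling in LLMStream via Swift Package Manager.
    // The URL, version, and product name are placeholders -- use the values from the README.
    import PackageDescription

    let package = Package(
        name: "MyChatApp",
        platforms: [.macOS(.v13)],
        dependencies: [
            .package(url: "https://github.com/your-org/LLMStream", from: "1.0.0") // placeholder URL
        ],
        targets: [
            .executableTarget(name: "MyChatApp", dependencies: ["LLMStream"])
        ]
    )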

What’s on the roadmap?

We’ll continue adding features to LLMStream as needed. If there’s a specific issue you’re encountering or a feature you want, please let us know in our GitHub Issues section.

What’s the License?

We’re releasing LLMStream under the MIT License.
