46 changes: 9 additions & 37 deletions README.md
Original file line number Diff line number Diff line change
@@ -46,37 +46,12 @@ make test
`make dev` installs the project, syncs development dependencies, and sets up
[`prek`](https://prek.j178.dev/) Git hooks.

## Run the Example Workflow
## Examples

The repository includes a minimal runnable example at
[`examples/graphon_openai_slim`](examples/graphon_openai_slim).
Runnable examples live under [`examples/`](examples/).

It builds and executes this workflow:

```text
start -> llm -> output
```

To run it:

```bash
make dev
source .venv/bin/activate
cd examples/graphon_openai_slim
cp .env.example .env
python3 workflow.py "Explain Graphon in one short sentence."
```

Before running the example, fill in the required values in `.env`.

The example currently expects:

- an `OPENAI_API_KEY`
- a `SLIM_PLUGIN_ID`
- a local `dify-plugin-daemon-slim` setup or equivalent Slim runtime

For the exact environment variables and runtime notes, see
[examples/graphon_openai_slim/README.md](examples/graphon_openai_slim/README.md).
Each example is self-contained in its own subdirectory and includes its own
setup instructions, environment template, and `workflow.py` entrypoint.

## How Graphon Fits Together

@@ -88,8 +63,8 @@ At a high level, Graphon usage looks like this:
4. Run `GraphEngine` and consume emitted graph events.
5. Read final outputs from runtime state.

The bundled example follows exactly that path. The execution loop is centered
around `GraphEngine.run()`:
The examples under [`examples/`](examples/) follow exactly that path. The
execution loop is centered around `GraphEngine.run()`:

```python
engine = GraphEngine(
@@ -103,10 +78,8 @@ for event in engine.run():
...
```

See
[examples/graphon_openai_slim/workflow.py](examples/graphon_openai_slim/workflow.py)
for the full example, including `SlimRuntime`, `SlimPreparedLLM`, graph
construction, input seeding, and streamed output handling.
See [`examples/`](examples/) for the current runnable workflows and their
example-specific setup notes.

## Project Layout

@@ -126,8 +99,7 @@ construction, input seeding, and streamed output handling.
## Internal Docs

- [CONTRIBUTING.md](CONTRIBUTING.md): contributor workflow, CI, commit/PR rules
- [examples/graphon_openai_slim/README.md](examples/graphon_openai_slim/README.md):
runnable example setup
- [examples/](examples/): runnable examples and per-example setup notes
- [src/graphon/model_runtime/README.md](src/graphon/model_runtime/README.md):
model runtime overview
- [src/graphon/graph_engine/layers/README.md](src/graphon/graph_engine/layers/README.md):
63 changes: 0 additions & 63 deletions examples/graphon_openai_slim/README.md

This file was deleted.

1 change: 0 additions & 1 deletion examples/graphon_openai_slim/__init__.py

This file was deleted.

@@ -1,7 +1,7 @@
# Example configuration for `examples/graphon_openai_slim/workflow.py`.
# Example configuration for `examples/openai_slim_minimal/workflow.py`.
#
# The example loads `examples/graphon_openai_slim/.env` automatically. Copy this file to `.env`
# in the same directory and fill in the required values.
# The example loads `.env` from this directory automatically. Copy this file to
# `.env` in the same directory and fill in the required values.

# Required: OpenAI API key used by the OpenAI Slim plugin.
OPENAI_API_KEY=
@@ -13,17 +13,16 @@ OPENAI_API_KEY=
SLIM_PLUGIN_ID=langgenius/openai:0.3.0@99770a45f77910fe0f64c985524f4fe2294fc6ea25cbf1053ba6bddd7604d850

# Optional: path to the local `dify-plugin-daemon-slim` binary.
# If empty, Graphon will look for `dify-plugin-daemon-slim` in `PATH`.
SLIM_BINARY_PATH=
# Recommended Unix default: a user-local install under `~/.local/bin`.
SLIM_BINARY_PATH=~/.local/bin/dify-plugin-daemon-slim

# Optional: provider name inside the plugin package.
# For this example we only support OpenAI, so this should stay `openai`.
SLIM_PROVIDER=openai

# Optional: local folder where Slim stores downloaded/extracted plugins.
# The default points at the repository-root `.slim/plugins` cache so this
# example directory does not accumulate generated plugin code.
SLIM_PLUGIN_FOLDER=../../.slim/plugins
# Recommended Unix default: a user-local plugin cache under `~/.local/share`.
SLIM_PLUGIN_FOLDER=~/.local/share/graphon/slim/plugins

# Optional: path to an already unpacked local plugin directory.
# If set, Slim uses this directory directly and skips marketplace download.
29 changes: 29 additions & 0 deletions examples/openai_slim_minimal/README.md
@@ -0,0 +1,29 @@
# OpenAI Slim Minimal Example

A tiny Graphon workflow:

`start -> llm -> output`

## What You Need

- `workflow.py`: runnable example
- `.env.example`: template settings
- `.env`: your local copy of the template

## Run

```bash
cd examples/openai_slim_minimal
cp .env.example .env
python3 workflow.py
```

Fill in `.env` before running. The script reads `.env` from this directory.

## Custom Prompt

```bash
python3 workflow.py "Explain graph sparsity in one sentence."
```

The example streams text to stdout as it arrives. If nothing is streamed, it prints the final answer at the end.
1 change: 1 addition & 0 deletions examples/openai_slim_minimal/__init__.py
@@ -0,0 +1 @@
"""Minimal OpenAI Slim workflow example for Graphon."""