Unverified commit 11feef8c authored by Marcus Schiesser, committed by GitHub
Add workflows (#1188)

parent 9c5ff164
Showing 1074 additions and 0 deletions
---
"@llamaindex/core": minor
"llamaindex": minor
"@llamaindex/examples": patch
---
Add workflows
import CodeBlock from "@theme/CodeBlock";
import CodeSource from "!raw-loader!../../../../examples/workflow/joke.ts";
# Workflows
A `Workflow` in LlamaIndexTS is an event-driven abstraction used to chain together several events. Workflows are made up of `steps`: each step is a function that handles certain event types and emits new events.
When a step function is added to a workflow, you specify its input event types and, optionally, its output event types (the latter are used for validation). Specifying the input events ensures that each step only runs when an accepted event is ready.
You can create a `Workflow` to do anything! Build an agent, a RAG flow, an extraction flow, or anything else you want.
## Getting Started
As an illustrative example, let's consider a naive workflow where a joke is generated and then critiqued.
<CodeBlock language="ts">{CodeSource}</CodeBlock>
There are a few moving pieces here, so let's go through it piece by piece.
### Defining Workflow Events
```typescript
export class JokeEvent extends WorkflowEvent<{ joke: string }> {}
```
Events are user-defined classes that extend `WorkflowEvent` and contain arbitrary data provided as a type argument. In this case, our workflow relies on a single user-defined event, the `JokeEvent` with a `joke` attribute of type `string`.
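Since the payload is just a type argument, any shape works. Here is a minimal, self-contained sketch; the class below mirrors the shape of `WorkflowEvent` purely for illustration (in real code you would import `WorkflowEvent` from `@llamaindex/core/workflow`):

```typescript
// Illustrative stand-in with the same shape as WorkflowEvent
class WorkflowEvent<T extends Record<string, any> = any> {
  data: T;
  constructor(data: T) {
    this.data = data;
  }
}

class JokeEvent extends WorkflowEvent<{ joke: string }> {}

const ev = new JokeEvent({ joke: "a pirate joke" });
console.log(ev.data.joke); // payload is typed as { joke: string }
```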
### Setting up the Workflow Class
```typescript
const llm = new OpenAI();
...
const jokeFlow = new Workflow({ verbose: true });
```
Our workflow is implemented by instantiating the `Workflow` class. For simplicity, we also created an `OpenAI` LLM instance.
### Workflow Entry Points
```typescript
const generateJoke = async (_context: Context, ev: StartEvent) => {
  const prompt = `Write your best joke about ${ev.data.input}.`;
  const response = await llm.complete({ prompt });
  return new JokeEvent({ joke: response.text });
};
```
Here we come to the entry point of our workflow. While events are user-defined, there are two special-case events: the `StartEvent` and the `StopEvent`. The `StartEvent` signifies where to send the initial workflow input.
The `StartEvent` is a bit of a special object since it can hold arbitrary attributes. Here, we accessed the topic with `ev.data.input`.
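Since `run()` accepts either a string or a `StartEvent` (see the `run` signature in this PR), the initial input can equivalently be passed as an explicit event:

```typescript
// run() wraps a plain string in a StartEvent for you, so these are equivalent:
const resultA = await jokeFlow.run("pirates");
const resultB = await jokeFlow.run(new StartEvent({ input: "pirates" }));
```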
At this point, you may have noticed that we haven't explicitly told the workflow what events are handled by which steps.
To do so, we use the `addStep` method which adds a step to the workflow. The first argument is the event type that the step will handle, and the second argument is the previously defined step function:
```typescript
jokeFlow.addStep(StartEvent, generateJoke);
```
### Workflow Exit Points
```typescript
const critiqueJoke = async (_context: Context, ev: JokeEvent) => {
  const prompt = `Give a thorough critique of the following joke: ${ev.data.joke}`;
  const response = await llm.complete({ prompt });
  return new StopEvent({ result: response.text });
};
```
Here we have our second, and last, step in the workflow. We know it's the last step because it returns the special `StopEvent`. When the workflow encounters a returned `StopEvent`, it immediately stops and returns the result.
In this case, the result is a string, but it could be a map, array, or any other object.
Don't forget to add the step to the workflow:
```typescript
jokeFlow.addStep(JokeEvent, critiqueJoke);
```
### Running the Workflow
```typescript
const result = await jokeFlow.run("pirates");
console.log(result.data.result);
```
Lastly, we run the workflow. The `.run()` method is async, so we use await here to wait for the result.
### Validating Workflows
To tell the workflow what events are produced by each step, you can optionally provide a third argument to `addStep` to specify the output event type:
```typescript
jokeFlow.addStep(StartEvent, generateJoke, { outputs: JokeEvent });
jokeFlow.addStep(JokeEvent, critiqueJoke, { outputs: StopEvent });
```
To validate a workflow, you need to call the `validate` method:
```typescript
jokeFlow.validate();
```
To automatically validate a workflow when you run it, you can set the `validate` flag to `true` at initialization:
```typescript
const jokeFlow = new Workflow({ verbose: true, validate: true });
```
## Working with Global Context/State
Optionally, you can choose to use global context between steps. For example, maybe multiple steps access the original `query` input from the user. You can store this in global context so that every step has access.
```typescript
import { Context } from "@llamaindex/core/workflow";

const query = async (context: Context, ev: MyEvent) => {
  // get the query from the context
  const query = context.get("query");
  // do something with context and event
  const val = ...
  const result = ...
  // store in context
  context.set("key", val);
  return new StopEvent({ result });
};
```
## Waiting for Multiple Events
The context does more than just hold data; it also provides utilities to buffer and wait for multiple events.
For example, you might have a step that waits for a query and retrieved nodes before synthesizing a response:
```typescript
const synthesize = async (context: Context, ev: QueryEvent | RetrieveEvent) => {
  const events = context.collectEvents(ev, [QueryEvent, RetrieveEvent]);
  if (!events) {
    return;
  }
  const prompt = events
    .map((event) => {
      if (event instanceof QueryEvent) {
        return `Answer this query using the context provided: ${event.data.query}`;
      } else if (event instanceof RetrieveEvent) {
        return `Context: ${event.data.context}`;
      }
      return "";
    })
    .join("\n");
  const response = await llm.complete({ prompt });
  return new StopEvent({ result: response.text });
};
```
Using `context.collectEvents()` we can buffer and wait for ALL expected events to arrive. This function only returns the events (in the requested order) once all of them have arrived.
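For a step to receive multiple event types in the first place, register it with an array of input event types (as `addStep` supports in this PR). A sketch of the wiring for the step above, where `ragFlow` is a hypothetical workflow instance:

```typescript
// hypothetical workflow instance; addStep accepts an array of input event types
const ragFlow = new Workflow();
ragFlow.addStep([QueryEvent, RetrieveEvent], synthesize);
```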
## Manually Triggering Events
Normally, events are triggered by returning another event during a step. However, events can also be manually dispatched using the `context.sendEvent(event)` method within a workflow.
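For example, a step might fan out work by dispatching several events itself instead of returning a single one. A sketch (the `planner` step and its payloads are hypothetical; per this PR's `Context.sendEvent`, omitting the target step delivers the event to all steps):

```typescript
import { Context, StartEvent, WorkflowEvent } from "@llamaindex/core/workflow";

class QueryEvent extends WorkflowEvent<{ query: string }> {}
class RetrieveEvent extends WorkflowEvent<{ context: string }> {}

const planner = async (context: Context, ev: StartEvent) => {
  // dispatch two events manually; without a target step they go to every step
  context.sendEvent(new QueryEvent({ query: ev.data.input }));
  context.sendEvent(new RetrieveEvent({ context: "some retrieved text" }));
  // returning nothing is allowed - the dispatched events drive the next steps
};
```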
## Examples
You can find many useful examples of using workflows in the [examples folder](https://github.com/run-llama/LlamaIndexTS/blob/main/examples/workflow).
import { OpenAI } from "llamaindex";
(async () => {
  const llm = new OpenAI({ model: "o1-preview", temperature: 1 });
  const prompt = `What are three compounds we should consider investigating to advance research
into new antibiotics? Why should we consider them?
`;
  // complete api
  const response = await llm.complete({ prompt });
  console.log(response.text);
})();
# Workflow Examples
These examples demonstrate LlamaIndexTS's workflow system. Check out [its documentation](https://ts.llamaindex.ai/modules/workflows) for more information.
## Running the Examples
To run the examples, make sure to run them from the parent folder called `examples`. For example, to run the joke workflow, run `npx tsx workflow/joke.ts`.
import {
  Context,
  StartEvent,
  StopEvent,
  Workflow,
  WorkflowEvent,
} from "@llamaindex/core/workflow";
import { OpenAI } from "llamaindex";

const MAX_REVIEWS = 3;

// Using the o1-preview model (see https://platform.openai.com/docs/guides/reasoning?reasoning-prompt-examples=coding-planning)
const llm = new OpenAI({ model: "o1-preview", temperature: 1 });

// example specification from https://platform.openai.com/docs/guides/reasoning?reasoning-prompt-examples=coding-planning
const specification = `Python app that takes user questions and looks them up in a
database where they are mapped to answers. If there is a close match, it retrieves
the matched answer. If there isn't, it asks the user to provide an answer and
stores the question/answer pair in the database.`;

// Create custom event types
export class MessageEvent extends WorkflowEvent<{ msg: string }> {}
export class CodeEvent extends WorkflowEvent<{ code: string }> {}
export class ReviewEvent extends WorkflowEvent<{
  review: string;
  code: string;
}> {}

// Helper function to truncate long strings
const truncate = (str: string) => {
  const MAX_LENGTH = 60;
  if (str.length <= MAX_LENGTH) return str;
  return str.slice(0, MAX_LENGTH) + "...";
};

// the architect is responsible for writing the structure and the initial code based on the specification
const architect = async (context: Context, ev: StartEvent) => {
  // get the specification from the start event and save it to context
  context.set("specification", ev.data.input);
  const spec = context.get("specification");
  // write a message to send an update to the user
  context.writeEventToStream(
    new MessageEvent({
      msg: `Writing app using this specification: ${truncate(spec)}`,
    }),
  );
  const prompt = `Build an app for this specification: <spec>${spec}</spec>. Make a plan for the directory structure you'll need, then return each file in full. Don't supply any reasoning, just code.`;
  const code = await llm.complete({ prompt });
  return new CodeEvent({ code: code.text });
};

// the coder is responsible for updating the code based on the review
const coder = async (context: Context, ev: ReviewEvent) => {
  // get the specification from the context
  const spec = context.get("specification");
  // get the latest review and code
  const { review, code } = ev.data;
  // write a message to send an update to the user
  context.writeEventToStream(
    new MessageEvent({
      msg: `Update code based on review: ${truncate(review)}`,
    }),
  );
  const prompt = `We need to improve code that should implement this specification: <spec>${spec}</spec>. Here is the current code: <code>${code}</code>. And here is a review of the code: <review>${review}</review>. Improve the code based on the review, keep the specification in mind, and return the full updated code. Don't supply any reasoning, just code.`;
  const updatedCode = await llm.complete({ prompt });
  return new CodeEvent({ code: updatedCode.text });
};

// the reviewer is responsible for reviewing the code and providing feedback
const reviewer = async (context: Context, ev: CodeEvent) => {
  // get the specification from the context
  const spec = context.get("specification");
  // get latest code from the event
  const { code } = ev.data;
  // update and check the number of reviews
  const numberReviews = context.get("numberReviews", 0) + 1;
  context.set("numberReviews", numberReviews);
  if (numberReviews > MAX_REVIEWS) {
    // we've reviewed too many times - return the code as-is
    context.writeEventToStream(
      new MessageEvent({
        msg: `Already reviewed ${numberReviews - 1} times, stopping!`,
      }),
    );
    return new StopEvent({ result: code });
  }
  // write a message to send an update to the user
  context.writeEventToStream(
    new MessageEvent({ msg: `Review #${numberReviews}: ${truncate(code)}` }),
  );
  const prompt = `Review this code: <code>${code}</code>. Check the code quality and whether it correctly implements this specification: <spec>${spec}</spec>. If you're satisfied, just return 'Looks great', nothing else. If not, return a review with a list of changes you'd like to see.`;
  const review = (await llm.complete({ prompt })).text;
  if (review.includes("Looks great")) {
    // the reviewer is satisfied with the code, let's return the review
    context.writeEventToStream(
      new MessageEvent({
        msg: `Reviewer says: ${review}`,
      }),
    );
    return new StopEvent({ result: code });
  }
  return new ReviewEvent({ review, code });
};

const codeAgent = new Workflow({ validate: true });
codeAgent.addStep(StartEvent, architect, { outputs: CodeEvent });
codeAgent.addStep(ReviewEvent, coder, { outputs: CodeEvent });
codeAgent.addStep(CodeEvent, reviewer, { outputs: ReviewEvent });

// Usage
async function main() {
  const run = codeAgent.run(specification);
  for await (const event of codeAgent.streamEvents()) {
    const msg = (event as MessageEvent).data.msg;
    console.log(`${msg}\n`);
  }
  const result = await run;
  console.log("Final code:\n", result.data.result);
}

main().catch(console.error);
import {
  Context,
  StartEvent,
  StopEvent,
  Workflow,
  WorkflowEvent,
} from "@llamaindex/core/workflow";
import { OpenAI } from "llamaindex";

// Create LLM instance
const llm = new OpenAI();

// Create custom event types
export class JokeEvent extends WorkflowEvent<{ joke: string }> {}
export class CritiqueEvent extends WorkflowEvent<{ critique: string }> {}
export class AnalysisEvent extends WorkflowEvent<{ analysis: string }> {}

const generateJoke = async (_context: Context, ev: StartEvent) => {
  const prompt = `Write your best joke about ${ev.data.input}.`;
  const response = await llm.complete({ prompt });
  return new JokeEvent({ joke: response.text });
};

const critiqueJoke = async (_context: Context, ev: JokeEvent) => {
  const prompt = `Give a thorough critique of the following joke: ${ev.data.joke}`;
  const response = await llm.complete({ prompt });
  return new CritiqueEvent({ critique: response.text });
};

const analyzeJoke = async (_context: Context, ev: JokeEvent) => {
  const prompt = `Give a thorough analysis of the following joke: ${ev.data.joke}`;
  const response = await llm.complete({ prompt });
  return new AnalysisEvent({ analysis: response.text });
};

const reportJoke = async (
  context: Context,
  ev: AnalysisEvent | CritiqueEvent,
) => {
  const events = context.collectEvents(ev, [AnalysisEvent, CritiqueEvent]);
  if (!events) {
    return;
  }
  const subPrompts = events.map((event) => {
    if (event instanceof AnalysisEvent) {
      return `Analysis: ${event.data.analysis}`;
    } else if (event instanceof CritiqueEvent) {
      return `Critique: ${event.data.critique}`;
    }
    return "";
  });
  const prompt = `Based on the following information about a joke:\n${subPrompts.join("\n")}\nProvide a comprehensive report on the joke's quality and impact.`;
  const response = await llm.complete({ prompt });
  return new StopEvent({ result: response.text });
};

const jokeFlow = new Workflow();
jokeFlow.addStep(StartEvent, generateJoke);
jokeFlow.addStep(JokeEvent, critiqueJoke);
jokeFlow.addStep(JokeEvent, analyzeJoke);
jokeFlow.addStep([AnalysisEvent, CritiqueEvent], reportJoke);

// Usage
async function main() {
  const result = await jokeFlow.run("pirates");
  console.log(result.data.result);
}

main().catch(console.error);
import {
  Context,
  StartEvent,
  StopEvent,
  Workflow,
  WorkflowEvent,
} from "@llamaindex/core/workflow";
import { OpenAI } from "llamaindex";

// Create LLM instance
const llm = new OpenAI();

// Create a custom event type
export class JokeEvent extends WorkflowEvent<{ joke: string }> {}

const generateJoke = async (_context: Context, ev: StartEvent) => {
  const prompt = `Write your best joke about ${ev.data.input}.`;
  const response = await llm.complete({ prompt });
  return new JokeEvent({ joke: response.text });
};

const critiqueJoke = async (_context: Context, ev: JokeEvent) => {
  const prompt = `Give a thorough critique of the following joke: ${ev.data.joke}`;
  const response = await llm.complete({ prompt });
  return new StopEvent({ result: response.text });
};

const jokeFlow = new Workflow({ verbose: true });
jokeFlow.addStep(StartEvent, generateJoke);
jokeFlow.addStep(JokeEvent, critiqueJoke);

// Usage
async function main() {
  const result = await jokeFlow.run("pirates");
  console.log(result.data.result);
}

main().catch(console.error);
import {
  Context,
  StartEvent,
  StopEvent,
  Workflow,
  WorkflowEvent,
} from "@llamaindex/core/workflow";
import { OpenAI } from "llamaindex";

// Create LLM instance
const llm = new OpenAI();

// Create custom event types
export class JokeEvent extends WorkflowEvent<{ joke: string }> {}
export class MessageEvent extends WorkflowEvent<{ msg: string }> {}

const generateJoke = async (context: Context, ev: StartEvent) => {
  context.writeEventToStream(
    new MessageEvent({ msg: `Generating a joke about: ${ev.data.input}` }),
  );
  const prompt = `Write your best joke about ${ev.data.input}.`;
  const response = await llm.complete({ prompt });
  return new JokeEvent({ joke: response.text });
};

const critiqueJoke = async (context: Context, ev: JokeEvent) => {
  context.writeEventToStream(
    new MessageEvent({ msg: `Write a critique of this joke: ${ev.data.joke}` }),
  );
  const prompt = `Give a thorough critique of the following joke: ${ev.data.joke}`;
  const response = await llm.complete({ prompt });
  return new StopEvent({ result: response.text });
};

const jokeFlow = new Workflow();
jokeFlow.addStep(StartEvent, generateJoke);
jokeFlow.addStep(JokeEvent, critiqueJoke);

// Usage
async function main() {
  const run = jokeFlow.run("pirates");
  for await (const event of jokeFlow.streamEvents()) {
    console.log((event as MessageEvent).data.msg);
  }
  const result = await run;
  console.log(result.data.result);
}

main().catch(console.error);
import {
  Context,
  StartEvent,
  StopEvent,
  Workflow,
} from "@llamaindex/core/workflow";

const longRunning = async (_context: Context, ev: StartEvent) => {
  await new Promise((resolve) => setTimeout(resolve, 2000)); // Wait for 2 seconds
  return new StopEvent({ result: "We waited 2 seconds" });
};

async function timeout() {
  const workflow = new Workflow({ verbose: true, timeout: 1 });
  workflow.addStep(StartEvent, longRunning);
  // This will timeout
  try {
    await workflow.run("Let's start");
  } catch (error) {
    console.error(error);
  }
}

async function notimeout() {
  // Increase timeout to 3 seconds - no timeout
  const workflow = new Workflow({ verbose: true, timeout: 3 });
  workflow.addStep(StartEvent, longRunning);
  const result = await workflow.run("Let's start");
  console.log(result.data.result);
}

async function main() {
  await timeout();
  await notimeout();
}

main().catch(console.error);
import {
  Context,
  StartEvent,
  StopEvent,
  Workflow,
  WorkflowEvent,
} from "@llamaindex/core/workflow";
import { OpenAI } from "llamaindex";

// Create LLM instance
const llm = new OpenAI();

// Create a custom event type
export class JokeEvent extends WorkflowEvent<{ joke: string }> {}

const generateJoke = async (_context: Context, ev: StartEvent) => {
  const prompt = `Write your best joke about ${ev.data.input}.`;
  const response = await llm.complete({ prompt });
  return new JokeEvent({ joke: response.text });
};

const critiqueJoke = async (_context: Context, ev: JokeEvent) => {
  const prompt = `Give a thorough critique of the following joke: ${ev.data.joke}`;
  const response = await llm.complete({ prompt });
  return new StopEvent({ result: response.text });
};

async function validateFails() {
  try {
    const jokeFlow = new Workflow({ verbose: true, validate: true });
    jokeFlow.addStep(StartEvent, generateJoke, { outputs: StopEvent });
    jokeFlow.addStep(JokeEvent, critiqueJoke, { outputs: StopEvent });
    await jokeFlow.run("pirates");
  } catch (e) {
    console.error("Validation failed:", e);
  }
}

async function validate() {
  const jokeFlow = new Workflow({ verbose: true, validate: true });
  jokeFlow.addStep(StartEvent, generateJoke, { outputs: JokeEvent });
  jokeFlow.addStep(JokeEvent, critiqueJoke, { outputs: StopEvent });
  const result = await jokeFlow.run("pirates");
  console.log(result.data.result);
}

// Usage
async function main() {
  await validateFails();
  await validate();
}

main().catch(console.error);
@@ -143,6 +143,20 @@
"types": "./dist/indices/index.d.ts",
"default": "./dist/indices/index.js"
}
},
"./workflow": {
"require": {
"types": "./dist/workflow/index.d.cts",
"default": "./dist/workflow/index.cjs"
},
"import": {
"types": "./dist/workflow/index.d.ts",
"default": "./dist/workflow/index.js"
},
"default": {
"types": "./dist/workflow/index.d.ts",
"default": "./dist/workflow/index.js"
}
}
},
"files": [
......
import { type EventTypes, type WorkflowEvent } from "./events";
import { type StepFunction, type Workflow } from "./workflow";
export class Context {
#workflow: Workflow;
#queues: Map<StepFunction, WorkflowEvent[]> = new Map();
#eventBuffer: Map<EventTypes, WorkflowEvent[]> = new Map();
#globals: Map<string, any> = new Map();
#streamingQueue: WorkflowEvent[] = [];
running: boolean = true;
#verbose: boolean = false;
constructor(params: { workflow: Workflow; verbose?: boolean }) {
this.#workflow = params.workflow;
this.#verbose = params.verbose ?? false;
}
set(key: string, value: any): void {
this.#globals.set(key, value);
}
get(key: string, defaultValue?: any): any {
if (this.#globals.has(key)) {
return this.#globals.get(key);
} else if (defaultValue !== undefined) {
return defaultValue;
}
throw new Error(`Key '${key}' not found in Context`);
}
collectEvents(
event: WorkflowEvent,
expected: EventTypes[],
): WorkflowEvent[] | null {
const eventType = event.constructor as EventTypes;
if (!this.#eventBuffer.has(eventType)) {
this.#eventBuffer.set(eventType, []);
}
this.#eventBuffer.get(eventType)!.push(event);
const retval: WorkflowEvent[] = [];
for (const expectedType of expected) {
const events = this.#eventBuffer.get(expectedType);
if (events && events.length > 0) {
retval.push(events.shift()!);
}
}
if (retval.length === expected.length) {
return retval;
}
// Put back the events if unable to collect all
for (const ev of retval) {
const eventType = ev.constructor as EventTypes;
if (!this.#eventBuffer.has(eventType)) {
this.#eventBuffer.set(eventType, []);
}
this.#eventBuffer.get(eventType)!.unshift(ev);
}
return null;
}
sendEvent(message: WorkflowEvent, step?: StepFunction): void {
const stepName = step?.name ? `step ${step.name}` : "all steps";
if (this.#verbose) {
console.log(`Sending event ${message} to ${stepName}`);
}
if (step === undefined) {
for (const queue of this.#queues.values()) {
queue.push(message);
}
} else {
if (!this.#workflow.hasStep(step)) {
throw new Error(`Step ${step} does not exist`);
}
if (!this.#queues.has(step)) {
this.#queues.set(step, []);
}
this.#queues.get(step)!.push(message);
}
}
getNextEvent(step: StepFunction): WorkflowEvent | undefined {
const queue = this.#queues.get(step);
if (queue && queue.length > 0) {
return queue.shift();
}
return undefined;
}
writeEventToStream(event: WorkflowEvent): void {
this.#streamingQueue.push(event);
}
async *streamEvents(): AsyncGenerator<WorkflowEvent, void, undefined> {
while (true) {
const event = this.#streamingQueue.shift();
if (event) {
yield event;
} else {
if (!this.running) {
break;
}
await new Promise((resolve) => setTimeout(resolve, 0));
}
}
}
}
export class WorkflowEvent<T extends Record<string, any> = any> {
data: T;
constructor(data: T) {
this.data = data;
}
toString() {
return `${this.constructor.name}(${JSON.stringify(this.data)})`;
}
}
export type EventTypes<T extends Record<string, any> = any> = new (
data: T,
) => WorkflowEvent<T>;
export class StartEvent extends WorkflowEvent<{ input: string }> {}
export class StopEvent extends WorkflowEvent<{ result: string }> {}
export * from "./context";
export * from "./events";
export * from "./workflow";
import { Context } from "./context";
import {
type EventTypes,
StartEvent,
StopEvent,
WorkflowEvent,
} from "./events";
export type StepFunction<T extends WorkflowEvent = WorkflowEvent> = (
context: Context,
ev: T,
) => Promise<WorkflowEvent | void>;
type EventTypeParam = EventTypes | EventTypes[];
export class Workflow {
#steps: Map<
StepFunction<any>,
{ inputs: EventTypes[]; outputs: EventTypes[] | undefined }
> = new Map();
#contexts: Set<Context> = new Set();
#verbose: boolean = false;
#timeout: number | null = null;
#validate: boolean = false;
constructor(
params: {
verbose?: boolean;
timeout?: number;
validate?: boolean;
} = {},
) {
this.#verbose = params.verbose ?? false;
this.#timeout = params.timeout ?? null;
this.#validate = params.validate ?? false;
}
addStep<T extends WorkflowEvent>(
eventType: EventTypeParam,
method: StepFunction<T>,
params: { outputs?: EventTypeParam } = {},
) {
const inputs = Array.isArray(eventType) ? eventType : [eventType];
const outputs = params.outputs
? Array.isArray(params.outputs)
? params.outputs
: [params.outputs]
: undefined;
this.#steps.set(method, { inputs, outputs });
}
hasStep(step: StepFunction<any>): boolean {
return this.#steps.has(step);
}
#acceptsEvent(step: StepFunction<any>, event: WorkflowEvent): boolean {
const eventType = event.constructor as EventTypes;
const stepInfo = this.#steps.get(step);
if (!stepInfo) {
throw new Error(`No method found for step: ${step.name}`);
}
return stepInfo.inputs.includes(eventType);
}
async *streamEvents(): AsyncGenerator<WorkflowEvent, void, unknown> {
if (this.#contexts.size > 1) {
throw new Error(
"This workflow has multiple concurrent runs in progress and cannot stream events. " +
"To be able to stream events, make sure you call `run()` on this workflow only once.",
);
}
const context = this.#contexts.values().next().value;
if (!context) {
throw new Error("No active context found for streaming events.");
}
yield* context.streamEvents();
}
validate(): void {
if (this.#verbose) {
console.log("Validating workflow...");
}
// Check if all steps have outputs defined
// precondition for the validation to work
const allStepsHaveOutputs = Array.from(this.#steps.values()).every(
(stepInfo) => stepInfo.outputs !== undefined,
);
if (!allStepsHaveOutputs) {
throw new Error(
"Not all steps have outputs defined. Can't validate. Add the 'outputs' parameter to each 'addStep' method call to do validation",
);
}
// input events that are consumed by any step of the workflow
const consumedEvents: Set<EventTypes> = new Set();
// output events that are produced by any step of the workflow
const producedEvents: Set<EventTypes> = new Set([StartEvent]);
for (const [, stepInfo] of this.#steps) {
stepInfo.inputs.forEach((eventType) => consumedEvents.add(eventType));
stepInfo.outputs?.forEach((eventType) => producedEvents.add(eventType));
}
// Check if all consumed events are produced
const unconsumedEvents = Array.from(consumedEvents).filter(
(event) => !producedEvents.has(event),
);
if (unconsumedEvents.length > 0) {
const names = unconsumedEvents.map((event) => event.name).join(", ");
throw new Error(
`The following events are consumed but never produced: ${names}`,
);
}
// Check if there are any unused produced events (except StopEvent)
const unusedEvents = Array.from(producedEvents).filter(
(event) => !consumedEvents.has(event) && event !== StopEvent,
);
if (unusedEvents.length > 0) {
const names = unusedEvents.map((event) => event.name).join(", ");
throw new Error(
`The following events are produced but never consumed: ${names}`,
);
}
if (this.#verbose) {
console.log("Workflow validation passed");
}
}
async run(event: StartEvent | string): Promise<StopEvent> {
// Validate the workflow before running if #validate is true
if (this.#validate) {
this.validate();
}
const context = new Context({ workflow: this, verbose: this.#verbose });
this.#contexts.add(context);
const stopWorkflow = () => {
if (context.running) {
context.running = false;
this.#contexts.delete(context);
}
};
const startEvent: WorkflowEvent =
typeof event === "string" ? new StartEvent({ input: event }) : event;
if (this.#verbose) {
console.log(`Starting workflow with event ${startEvent}`);
}
const workflowPromise = new Promise<StopEvent>((resolve, reject) => {
for (const [step] of this.#steps) {
// send initial event to step
context.sendEvent(startEvent, step);
if (this.#verbose) {
console.log(`Starting tasks for step ${step.name}`);
}
queueMicrotask(async () => {
try {
while (context.running) {
const currentEvent = context.getNextEvent(step);
if (!currentEvent) {
// if there's no event, wait and try again
await new Promise((resolve) => setTimeout(resolve, 0));
continue;
}
if (!this.#acceptsEvent(step, currentEvent)) {
// step does not accept current event, skip it
continue;
}
if (this.#verbose) {
console.log(`Step ${step.name} received event ${currentEvent}`);
}
const result = await step.call(this, context, currentEvent);
if (!context.running) {
// workflow was stopped during the execution (e.g. there was a timeout)
return;
}
if (result instanceof StopEvent) {
if (this.#verbose) {
console.log(`Stopping workflow with event ${result}`);
}
resolve(result);
return;
}
if (result instanceof WorkflowEvent) {
context.sendEvent(result);
}
}
} catch (error) {
if (this.#verbose) {
console.error(`Error in calling step ${step.name}:`, error);
}
reject(error as Error);
} finally {
stopWorkflow();
}
});
}
});
if (this.#timeout !== null) {
const timeout = this.#timeout;
const timeoutPromise = new Promise<never>((_, reject) =>
setTimeout(() => {
stopWorkflow();
reject(new Error(`Operation timed out after ${timeout} seconds`));
}, timeout * 1000),
);
return Promise.race([workflowPromise, timeoutPromise]);
}
return workflowPromise;
}
}
import { beforeEach, describe, expect, test, vi, type Mocked } from "vitest";
import type { Context } from "../src/workflow/context.js";
import {
StartEvent,
StopEvent,
WorkflowEvent,
} from "../src/workflow/events.js";
import { Workflow } from "../src/workflow/workflow.js";
// mock OpenAI class for testing
class OpenAI {
complete = vi.fn();
}
class JokeEvent extends WorkflowEvent<{ joke: string }> {}
class AnalysisEvent extends WorkflowEvent<{ analysis: string }> {}
describe("Workflow", () => {
let mockLLM: Mocked<OpenAI>;
let generateJoke: Mocked<any>;
let critiqueJoke: Mocked<any>;
let analyzeJoke: Mocked<any>;
beforeEach(() => {
mockLLM = new OpenAI() as Mocked<OpenAI>;
mockLLM.complete
.mockResolvedValueOnce({
text: "Why do pirates make great singers? They can hit the high Cs!",
})
.mockResolvedValueOnce({
text: "This joke is clever but could use improvement...",
})
.mockResolvedValueOnce({
text: "The analysis is insightful and helpful.",
});
generateJoke = vi.fn(async (_context, ev: StartEvent) => {
const response = await mockLLM.complete({
prompt: `Write your best joke about ${ev.data.input}.`,
});
return new JokeEvent({ joke: response.text });
});
critiqueJoke = vi.fn(async (_context, ev: JokeEvent) => {
const response = await mockLLM.complete({
prompt: `Give a thorough critique of the following joke: ${ev.data.joke}`,
});
return new StopEvent({ result: response.text });
});
analyzeJoke = vi.fn(async (_context: Context, ev: JokeEvent) => {
const prompt = `Give a thorough analysis of the following joke: ${ev.data.joke}`;
const response = await mockLLM.complete({ prompt });
return new AnalysisEvent({ analysis: response.text });
});
});
test("addStep", () => {
const jokeFlow = new Workflow({ verbose: true });
jokeFlow.addStep(StartEvent, generateJoke);
jokeFlow.addStep(JokeEvent, critiqueJoke);
expect(jokeFlow.hasStep(generateJoke)).toBe(true);
expect(jokeFlow.hasStep(critiqueJoke)).toBe(true);
});
test("run workflow", async () => {
const jokeFlow = new Workflow({ verbose: true });
jokeFlow.addStep(StartEvent, generateJoke);
jokeFlow.addStep(JokeEvent, critiqueJoke);
const result = await jokeFlow.run("pirates");
expect(generateJoke).toHaveBeenCalledTimes(1);
expect(critiqueJoke).toHaveBeenCalledTimes(1);
expect(result.data.result).toBe(
"This joke is clever but could use improvement...",
);
});
test("stream events", async () => {
const jokeFlow = new Workflow({ verbose: true });
jokeFlow.addStep(StartEvent, generateJoke);
jokeFlow.addStep(JokeEvent, critiqueJoke);
const run = jokeFlow.run("pirates");
const event = await jokeFlow.streamEvents().next(); // get one event to avoid testing timeout
const result = await run;
expect(generateJoke).toHaveBeenCalledTimes(1);
expect(critiqueJoke).toHaveBeenCalledTimes(1);
expect(result.data.result).toBe(
"This joke is clever but could use improvement...",
);
expect(event).not.toBeNull();
});
test("workflow timeout", async () => {
const TIMEOUT = 1;
const jokeFlow = new Workflow({ verbose: true, timeout: TIMEOUT });
const longRunning = async (_context: Context, ev: StartEvent) => {
await new Promise((resolve) => setTimeout(resolve, 2000)); // Wait for 2 seconds
return new StopEvent({ result: "We waited 2 seconds" });
};
jokeFlow.addStep(StartEvent, longRunning);
const run = jokeFlow.run("Let's start");
await expect(run).rejects.toThrow(
`Operation timed out after ${TIMEOUT} seconds`,
);
});
test("workflow validation", async () => {
const jokeFlow = new Workflow({ verbose: true, validate: true });
jokeFlow.addStep(StartEvent, generateJoke, { outputs: StopEvent });
jokeFlow.addStep(JokeEvent, critiqueJoke, { outputs: StopEvent });
const run = jokeFlow.run("pirates");
await expect(run).rejects.toThrow(
"The following events are consumed but never produced: JokeEvent",
);
});
test("collectEvents", async () => {
let collectedEvents: WorkflowEvent[] | null = null;
const jokeFlow = new Workflow({ verbose: true });
jokeFlow.addStep(StartEvent, generateJoke);
jokeFlow.addStep(JokeEvent, analyzeJoke);
jokeFlow.addStep([AnalysisEvent], async (context, ev) => {
collectedEvents = context.collectEvents(ev, [AnalysisEvent]);
return new StopEvent({ result: "Report generated" });
});
const result = await jokeFlow.run("pirates");
expect(generateJoke).toHaveBeenCalledTimes(1);
expect(analyzeJoke).toHaveBeenCalledTimes(1);
expect(result.data.result).toBe("Report generated");
expect(collectedEvents).toHaveLength(1);
});
});