Decorators

@graphbook.step(name: str, event: str | None = None)

Marks a function as a step event. Use this decorator to create a new step node or to attach additional functions as events to an existing step node.

Parameters:
  • name (str) – The name of the step, including the category.

  • event (str) – The event on which the function should be called. Defaults to on_note, or to on_item_batch if the step is a BatchStep, or to load if it is a SourceStep.

Examples

@step("Custom/Simple/MyStep")
def my_step(ctx, note):
    note["value"] = 42
@graphbook.batch(batch_size: int = 8, item_key: str = '', *, load_fn=None, dump_fn=None)

Marks a step function as a BatchStep, which can interface with the worker pool to load, batch, and dump data such as images and PyTorch tensors. This also assigns the parameters batch_size and item_key to the step. By default, the decorated function is called on the event on_item_batch.

Parameters:
  • batch_size (int) – The default batch size to use when batching data.

  • item_key (str) – The expected key in the Note to use for batching. Its value is the item that gets batched.

  • load_fn (callable) – A function to load the data. This function should take the context and an item and return the loaded data.

  • dump_fn (callable) – A function to dump the data. This function should take the context and the data and dump (e.g., save) the data.

Examples

import torch
import torchvision.transforms.functional as F
from PIL import Image

def load_fn(ctx, item: dict) -> torch.Tensor:
    im = Image.open(item["value"])
    image = F.to_tensor(im)
    return image

def dump_fn(ctx, data: torch.Tensor, path: str):
    im = F.to_pil_image(data)
    im.save(path)

@step("ModelTask")
@batch(load_fn=load_fn, dump_fn=dump_fn)
@param("model", "resource")
def task(ctx, note):
    prediction = ctx.model(note["value"])
    note["prediction"] = prediction
@graphbook.source(is_generator: bool = True)

Marks a step function as a SourceStep. Use this decorator if this step requires no input step and creates Notes to be processed by the rest of the graph.

Parameters:
  • is_generator (bool) – Whether the assigned function is a generator function. Default is True, meaning the function is expected to yield Notes.

Examples

@step("LoadData")
@source()
@param("path", "string", description="The path to the data")
def my_data(ctx, note):
    files = os.listdir(ctx.path)
    for file in files:
        file = os.path.join(ctx.path, file)
        with open(file) as f:
            yield Note({"data": f.read()})
@graphbook.prompt(get_prompt: callable = bool_prompt)

Marks a function as a step that is capable of prompting the user. This is useful for interactive workflows where data labeling, model evaluation, or any other human input is required. Such a step implements two events: get_prompt(ctx, note: Note), which returns the prompt to display to the user, and on_prompt_response(ctx, note: Note, response: Any), which is called when a response to a prompt is obtained from the user. The decorator accepts the get_prompt function as its argument; if nothing is passed, a bool_prompt is used by default. If get_prompt returns None for a given note, no prompt is displayed for that note, allowing for conditional prompts based on the note’s content. Available prompts are located in the graphbook.prompts module. The decorated function serves as on_prompt_response. Once the prompt is handled, the execution lifecycle of the Step proceeds normally.

Parameters:
  • get_prompt (callable) – A function that returns a prompt. Default is bool_prompt.

Examples

from graphbook.prompts import bool_prompt, selection_prompt

def dog_or_cat(ctx, note: Note):
    return selection_prompt(note, choices=["dog", "cat"], show_images=True)


@step("Prompts/Label")
@prompt(dog_or_cat)
def label_images(ctx, note: Note, response: str):
    note["label"] = response


def corrective_prompt(ctx, note: Note):
    if note["prediction_confidence"] < 0.65:
        return bool_prompt(
            note,
            msg=f"Model prediction ({note['pred']}) was uncertain. Is its prediction correct?",
            show_images=True,
        )
    else:
        return None


@step("Prompts/CorrectModelLabel")
@prompt(corrective_prompt)
def correct_model_labels(ctx, note: Note, response: bool):
    if response:
        ctx.log("Model is correct!")
        note["label"] = note["pred"]
    else:
        ctx.log("Model is incorrect!")
        if note["pred"] == "dog":
            note["label"] = "cat"
        else:
            note["label"] = "dog"

See also

graphbook.prompts for available user prompts.

@graphbook.param(name: str, type: str, *, default=None, required: bool = False, description: str = '', cast_as: type | None = None)

Assigns a parameter to a step or resource. Graphbook’s web UI will display the parameter as a widget and supply the parameter as kwargs to the step or resource’s __init__ event. If the type is a function, the parameter will be cast as a function using transform_function_string().

Parameters:
  • name (str) – The name of the parameter.

  • type (str) – The type of the parameter. Can be one of: string, number, boolean, function, resource, dict, list[string], list[number], list[boolean], or list[function].

  • default (Any) – The default value of the parameter.

  • required (bool) – Whether the parameter is required.

  • description (str) – A description of the parameter.

  • cast_as (type | callable) – A function or class type to cast the parameter to a specific type.

Examples

@step("SimpleStep")
@param("value", "number", default=42, description="The value to set")
def my_step(ctx, note):
    note["value"] = ctx.value

@step("Foo")
@param("param1", "string", default="foo")
@param("param2", "function")
def my_step(ctx, note):
    note["value"] += ctx.param1
    note["processed"] = ctx.param2(note["value"])
@graphbook.event(event: str, event_fn: callable)

Assigns a callable function as an event, i.e., a lifecycle method, to a step. The function will be called whenever the named event is triggered on the step.

Parameters:
  • event (str) – The name of the event.

  • event_fn (callable) – The function to call when the event is triggered.

Examples

def init(ctx, **kwargs):
    ctx.num_processed = 0

def on_clear(ctx):
    ctx.num_processed = 0

@step("StatefulStep")
@event("__init__", init)
@event("on_clear", on_clear)
def my_step(ctx, note):
    ctx.num_processed += 1
    note["num_processed"] = ctx.num_processed
@graphbook.output(*outputs: str)

Assigns extra output slots to a step. By default, Graphbook assigns each step a single output slot named “out”. Using this decorator removes the default “out” slot and replaces it with the slots you specify.

Parameters:
  • outputs (str...) – A sequence of strings representing the names of the output slots.

Examples

@step("Custom/MyStep", event="forward_note")
@output("Good", "Bad")
def evaluate(ctx, note):
    if note["value"] > 0:
        return "Good"
    else:
        return "Bad"
@graphbook.resource(name: str)

Marks a function as a resource that returns an object that can be used by other steps or resources.

Parameters:
  • name (str) – The name of the resource, including the category.

Examples

@resource("Custom/MyResource")
@param("model_path", "string", description="The path to the resource")
@param("fp16", "boolean", default=False, description="Whether to use FP16")
def my_resource(ctx):
    model = load_model(ctx.model_path, fp16=ctx.fp16)
    return model
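
A step consumes a resource by declaring a parameter of type resource (as in the batch example above); the returned object is then available on the context. A minimal sketch:

@step("Custom/RunModel")
@param("model", "resource", description="The model resource to run")
def run_model(ctx, note):
    # ctx.model is the object returned by the connected resource node.
    note["prediction"] = ctx.model(note["value"])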