# Context-Aware Actions
In previous recipes, our actions used static arguments – pre-defined at configuration time. For example, in Self-Healing with Fallbacks, we defined `Action(method=controller.set_algorithm, args=(ControllersID.PURE_PURSUIT,))`, where the target algorithm is hardcoded.
But what if the action depends on what the robot is seeing, or where it was told to go? Real-world autonomy requires dynamic data injection – action arguments fetched from the system at the time of execution.
## The Concept: Static vs Dynamic
| Type | Argument Set At | Example |
|---|---|---|
| Static | Configuration time | `args=(ControllersID.PURE_PURSUIT,)` |
| Dynamic | Event firing time | `args=(detections.msg.labels[0],)` |
With dynamic injection, you pass a topic message field as an argument. EMOS resolves the actual value when the event fires, not when the recipe is written.
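The mechanism can be sketched in plain Python. Note that the class names and internals below are illustrative only, not the EMOS API: a static argument is evaluated once when the action is constructed, while a dynamic argument is a placeholder that is resolved against live system state when the event fires.

```python
# Illustrative sketch only -- NOT the EMOS API.
class DynamicArg:
    """Placeholder whose value is fetched at event-firing time."""
    def __init__(self, fetch):
        self.fetch = fetch  # callable returning the live value

class Action:
    def __init__(self, method, args=()):
        self.method = method
        self.args = args  # mix of static values and DynamicArg placeholders

    def fire(self):
        # Resolve placeholders only now, at execution time
        resolved = [a.fetch() if isinstance(a, DynamicArg) else a
                    for a in self.args]
        return self.method(*resolved)

# Static: the value is fixed when the recipe is written
static_action = Action(method=print, args=("PURE_PURSUIT",))

# Dynamic: the value is read from the system when the event fires
latest_label = {"value": "person"}
dynamic_action = Action(
    method=lambda label: f"A {label} was detected.",
    args=(DynamicArg(lambda: latest_label["value"]),),
)

latest_label["value"] = "bicycle"  # state changes after configuration
print(dynamic_action.fire())       # resolves to the value at firing time
```

The key design point is that `dynamic_action` holds a *recipe for fetching* the argument, not the argument itself, so the value reflects whatever the system reports at the moment the event fires.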
## Intelligence Example: Dynamic Prompt Injection
The same pattern works for the intelligence layer. Consider a Vision component that detects objects, and a VLM that should describe whatever was detected – not just “person”:
```python
from agents.ros import Topic, Event, Action, FixedInput
from agents.components import Vision, VLM

# Vision outputs
detections = Topic(name="/detections", msg_type="Detections")
camera_image = Topic(name="/image_raw", msg_type="Image")

# Event: any object detected
event_object_detected = Event(
    detections.msg.labels.length() > 0,
    on_change=True,
    keep_event_delay=5,
)

# Dynamic prompt: inject the detected label into the VLM query
def describe_detected_object(label: str):
    """Called with the actual detected label at event time."""
    return f"A {label} has been detected. Describe what you see."

action_describe = Action(
    method=describe_detected_object,
    args=(detections.msg.labels[0],),  # first detected label, resolved dynamically
)
```
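To make the resolution concrete, here is a plain-Python trace of the call EMOS effectively makes when the event fires. The `"dog"` label is a hypothetical example value; in practice EMOS substitutes `detections.msg.labels[0]` from the live `/detections` message.

```python
def describe_detected_object(label: str) -> str:
    """Called with the actual detected label at event time."""
    return f"A {label} has been detected. Describe what you see."

# Suppose the /detections message carried labels == ["dog", "person"];
# detections.msg.labels[0] then resolves to "dog" (hypothetical value):
prompt = describe_detected_object("dog")
print(prompt)  # -> A dog has been detected. Describe what you see.
```

Because the label is injected per event, the VLM receives a prompt grounded in the current detection rather than a prompt fixed at configuration time.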