Blueprint
A blueprint is a script or automation configuration with certain parts marked as configurable. This makes blueprints easy to share, so you don't have to start from scratch.
The Event Summary blueprint is updated with every LLM Vision release. To keep your automations compatible and to get new features, re-import the blueprint after updating LLM Vision. See the Home Assistant docs for more information.
Say hello to intelligent security event notifications! AI understands what happens in the video, decides whether you should be notified and sends you notifications with a preview and summary of what happened.
Customize your notification preferences by choosing to receive all updates or by using AI to filter for important events only. This is configured with the important parameter. Using AI adds a slight delay to your notifications (under 5 seconds when using OpenAI).
When an event is detected, you'll receive a notification with basic information including the camera name. Event summarization continues in the background and the notification is updated once the summary is available.
Notifications play a preview of the event. If you use Frigate, the official Frigate integration for Home Assistant must be installed so events can be fetched. If you use Camera mode, a live preview or a snapshot of the camera that triggered the event is shown when you expand the notification.
You can customize which dashboard opens in the Home Assistant app when you tap the notification using the tap_navigate parameter.
Additionally, a cooldown can be set. This prevents the automation from running too often, which may result in excessive cost from your AI provider.
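As a rough sketch (not taken from the blueprint itself), the notification-related options above might look like this when the blueprint is used from YAML. The blueprint path and entity IDs are placeholders; the input names follow the parameters documented below.

```yaml
# Hypothetical automation using the Event Summary blueprint.
# The blueprint path and entity IDs are placeholders; use the values from your setup.
automation:
  - alias: "AI security event notifications"
    use_blueprint:
      path: llmvision/event_summary.yaml   # placeholder path
      input:
        camera_entities:
          - camera.front_door              # example entity
        important: true                    # let AI filter for important events only
        notification_delivery: Dynamic     # update the notification once the summary is ready
        cooldown: 10                       # minutes to wait before running again
        tap_navigate: /lovelace/cameras    # dashboard opened when tapping the notification
```

The same inputs can also be filled in through the blueprint UI; the YAML form is shown only to make the parameter names concrete.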
| Parameter | Description | Default |
| --- | --- | --- |
| important (Experimental) | Use AI to classify events as Critical, Normal or Low. Notifications are only sent for events classified as Normal or higher. Critical events override 'Do Not Disturb' settings. | false |
| remember | Store events in the Event Calendar so you can ask Assist about them later. | false |
| use_memory | Use information stored in memory to provide additional context. Memory must be set up. | false |
| message | Model prompt used to generate the summary. | Summarize what's happening in the camera feed. |
| run_conditions | A list of conditions. All conditions must be true for the blueprint to run. | [] |
| notification_delivery | Controls how notifications are delivered. Dynamic immediately notifies with a live preview and silently updates the notification with the summary once it is available. Consolidated delays the notification until the event summary is generated; use this if you receive multiple notifications for the same event. | Dynamic |
| camera_entities | A list of camera entities. The automation only runs when an event is detected by one of these cameras. | [] |
| trigger_state | State the camera needs to be in to trigger the automation. | "recording" |
| motion_sensors | A list of binary_sensors used to trigger cameras that don't change state themselves. Must be in the same order as the cameras. | [] |
| preview_mode | What is shown in the notification: "Snapshot" shows a static image of the event, "Live Preview" shows the live camera feed. | Snapshot |
| cooldown | Time in minutes to wait before running again. | 10 |
| tap_navigate | Relative path to navigate to when tapping the notification (e.g. /lovelace/cameras). | /lovelace/0 |
| duration | How long to record before analyzing (in seconds). | 5 |
| max_frames | How many frames to analyze. Picks the frames with the most movement. More frames provide more context to the AI but increase processing time. | 3 |
| provider | The AI provider to use. Pick a provider from the UI dropdown; providers need to be set up first. | |
| model | The Large Language Model to use. Depends on the provider. | |
| target_width | Downscale images (uses fewer tokens and speeds up processing). | |
| max_tokens | Maximum number of tokens to generate. Use this to control the length of the summary. | |
| temperature | Randomness. Lower is more accurate, higher is more creative. | |
For the remaining parameters, see the action parameters.
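To make the camera- and analysis-related parameters more concrete, here is a hypothetical input block for the blueprint. The entity IDs, the condition, and the numeric values are illustrative examples, not defaults; only the parameter names come from the table above.

```yaml
# Hypothetical blueprint inputs for the camera, trigger and analysis settings.
# Entity IDs are examples; motion_sensors must be listed in the same order as
# camera_entities so each sensor lines up with its camera.
input:
  camera_entities:
    - camera.front_door
    - camera.driveway
  motion_sensors:
    - binary_sensor.front_door_motion   # triggers camera.front_door
    - binary_sensor.driveway_motion     # triggers camera.driveway
  trigger_state: "recording"
  run_conditions:
    # Standard Home Assistant conditions; all must be true for the blueprint to run.
    - condition: state
      entity_id: alarm_control_panel.home_alarm
      state: armed_away
  duration: 5          # seconds to record before analyzing
  max_frames: 3        # frames with the most movement are picked
  target_width: 1280   # example value: downscale to save tokens and speed up processing
  max_tokens: 100      # example value: keep the summary short
  message: "Summarize what's happening in the camera feed."
```

Whether you fill these in via YAML or the blueprint UI, the values map one-to-one to the parameters in the table above.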