A curated collection of example workflows for the Elastic platform, covering security, observability, and search use cases.
- Overview
- Quick Start
- Repository Structure
- Workflow Categories
- Workflow Schema
- Key Concepts
- Importing Workflows
- Examples
- Contributing
- License
This repository contains 57 workflows designed for use with Elastic Workflows, a platform feature for automating operations across the Elastic Stack. These workflows cover a wide range of use cases:
| Category | Description |
|---|---|
| Security | Threat detection, incident response, enrichment, and hunting |
| Observability | Monitoring, log analysis, and root cause analysis |
| Search | Elasticsearch queries, ES\|QL, semantic search |
| Integrations | Splunk, Slack, Jenkins, JIRA, Caldera, and more |
| AI Agents | Agentic workflows and AI-powered automation |
| Data | ETL, ingestion, and document management |
Elastic Workflows provide a declarative, YAML-based approach to automating operations across the Elastic platform. They integrate natively with:
- Elasticsearch - Query, aggregate, and index data with ES|QL and DSL
- Kibana - Create cases, manage alerts, interact with Security and Observability features
- External Systems - Splunk, Slack, Jenkins, JIRA, and any HTTP API
- AI/ML - Integrate with language models for intelligent analysis and agents
- Declarative YAML - Define what you want, not how to do it
- Triggers - Manual, scheduled, or alert-driven
- Extensible - Connect to any HTTP API or Elastic feature
- Version Control - Store workflows as code, track changes in Git
- Shareable - Import/export workflows between environments
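Taken together, these concepts mean a complete workflow can be only a few lines long. A minimal, hedged sketch for illustration (the `console` step's `message` parameter name is an assumption; see the Workflow Schema section below for the required fields):

```yaml
name: Hello Workflow
triggers:
  - type: manual          # run on demand from Kibana
steps:
  - name: say_hello
    type: console          # log output step
    with:
      message: "Hello from Elastic Workflows"   # parameter name assumed
```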
Explore the workflows/ directory organized by use case:
workflows/
├── security/          # Security operations
│   ├── detection/     # Alert management, threat detection
│   ├── response/      # Incident response, case management
│   ├── enrichment/    # Threat intel, IP/hash lookups
│   └── hunting/       # Threat hunting queries
├── integrations/      # Third-party integrations
│   ├── splunk/        # Splunk queries and enrichment
│   ├── slack/         # Channel management, notifications
│   ├── jenkins/       # CI/CD automation
│   ├── jira/          # Ticket management
│   ├── caldera/       # Adversary emulation
│   ├── firebase/      # Authentication
│   └── snowflake/     # Data warehouse queries
├── search/            # Search and query workflows
├── observability/     # Monitoring and analysis
├── ai-agents/         # AI-powered automation
├── data/              # ETL and data management
├── utilities/         # Common utility workflows
└── examples/          # Demo and getting-started
Each workflow includes inline comments explaining every section:
# =============================================================================
# Workflow: IP Reputation Check
# Category: security/enrichment
#
# Assess the reputation of a given IP address using threat intelligence
# =============================================================================
name: IP Reputation Check

# CONSTANTS - Update these values for your environment
consts:
  abuseipdb_api_key: YOUR-API-KEY-HERE  # Get from AbuseIPDB

# INPUTS - Parameters provided at runtime
inputs:
  - name: ip_address
    type: string
    required: true

Option A: Kibana UI
- Navigate to Management → Workflows in Kibana
- Click Create workflow
- Paste the YAML content
- Save and test
Option B: API Import
curl -X POST "https://your-kibana-url/api/workflows" \
  -H "kbn-xsrf: true" \
  -H "x-elastic-internal-origin: Kibana" \
  -H "Content-Type: application/json" \
  -H "Authorization: ApiKey YOUR_API_KEY" \
  -d '{"yaml": "'"$(cat workflows/security/enrichment/ip-reputation-check.yaml)"'"}'

Note that this inline substitution only produces valid JSON for single-line YAML; for multi-line files, use the jq-based approach shown under Importing Workflows. See docs/importing.md for detailed instructions.
elastic-workflows/
├── README.md              # This file
├── CONTRIBUTING.md        # Contribution guidelines
├── LICENSE.txt            # Apache 2.0 license
├── workflows/             # All workflow YAML files
│   ├── security/          # Security operations
│   │   ├── detection/     # Threat detection workflows
│   │   ├── response/      # Incident response workflows
│   │   ├── enrichment/    # Enrichment workflows
│   │   └── hunting/       # Threat hunting workflows
│   ├── integrations/      # Third-party integrations
│   │   ├── splunk/
│   │   ├── slack/
│   │   ├── jenkins/
│   │   ├── jira/
│   │   ├── caldera/
│   │   ├── firebase/
│   │   └── snowflake/
│   ├── search/            # Search workflows
│   ├── observability/     # Observability workflows
│   ├── ai-agents/         # AI agent workflows
│   ├── data/              # Data/ETL workflows
│   ├── utilities/         # Utility workflows
│   └── examples/          # Demo workflows
└── docs/                  # Extended documentation
    ├── schema.md          # Complete YAML schema reference
    ├── concepts.md        # Workflow concepts explained
    └── importing.md       # Import instructions
Workflows for security operations, threat detection, and incident response.
| Category | Count | Description |
|---|---|---|
| security/detection | 8 | Alert management, threat detection, rule execution |
| security/enrichment | 5 | VirusTotal, IP reputation, threat intel lookups |
| security/response | 4 | Incident response, triage, and case management |
Workflows for connecting Elastic with external systems.
| Category | Count | Description |
|---|---|---|
| integrations/splunk | 5 | Splunk queries, enrichment, data retrieval |
| integrations/caldera | 4 | MITRE Caldera adversary emulation |
| integrations/slack | 3 | Channel creation, user management, notifications |
| integrations/firebase | 2 | Firebase authentication |
| integrations/jenkins | 1 | CI/CD build automation |
| integrations/jira | 1 | Ticket creation |
| integrations/snowflake | 1 | Data warehouse queries |
| Category | Count | Description |
|---|---|---|
| search | 4 | ES\|QL, semantic search, web search |
| observability | 1 | Monitoring, log analysis, AI-powered observability |
| ai-agents | 2 | AI agent invocation and automation |
| data | 3 | ETL, ingestion, document management |
| utilities | 11 | Common operations and helpers |
| examples | 2 | Getting started demos |
| Workflow | Category | Description |
|---|---|---|
| IP Reputation Check | Security | Check IP against AbuseIPDB and geolocation |
| Hash Threat Check | Security | VirusTotal file hash analysis |
| Splunk Query | Integration | Execute Splunk searches |
| Create Slack Channel | Integration | Automated Slack channel creation |
| Semantic Knowledge Search | Search | AI-powered semantic search |
| AD Automated Triaging | Security | Automated security alert triage workflow |
Every workflow follows a consistent YAML schema:
# Required fields
name: "Workflow Name"            # Human-readable name
steps:                           # At least one step required
  - name: "Step Name"            # Step identifier
    type: "action.type"          # Action to perform
    with:                        # Action parameters
      key: value

# Optional fields
description: "What this does"    # Detailed description
tags:                            # Categories for organization
  - observability
  - search
triggers:                        # How the workflow is invoked
  - type: scheduled
    with:
      every: "1d"                # Daily
consts:                          # Reusable constants
  api_key: "value"
inputs:                          # Runtime parameters
  - name: query
    type: string
    required: true

| Action | Description | Use Case |
|---|---|---|
| `http` | HTTP requests | API calls, webhooks |
| `elasticsearch.search` | Search ES indices | Data retrieval |
| `elasticsearch.index` | Index documents | Data storage |
| `kibana.cases` | Case management | Incident response |
| `kibana.alert` | Alert operations | Detection |
| `console` | Log output | Debugging |
| `foreach` | Loop over arrays | Batch processing |
See docs/schema.md for the complete schema reference.
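As an illustration, `foreach` can wrap `http` to fan out one request per array element. This is a hedged sketch: the `items` parameter, the nested `steps` block, and the `{{ item }}` loop variable are assumptions about the `foreach` contract, so verify them against docs/schema.md:

```yaml
steps:
  - name: lookup_each_ip
    type: foreach                        # loop over arrays (see table above)
    with:
      items: "{{ inputs.ip_list }}"      # assumed parameter name
    steps:                               # assumed nesting for the loop body
      - name: reputation_lookup
        type: http
        with:
          method: GET
          url: "https://api.abuseipdb.com/api/v2/check?ipAddress={{ item }}"
```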
Workflows support multiple trigger types:
# Manual (on-demand)
triggers:
  - type: manual

# Scheduled (simple interval)
triggers:
  - type: scheduled
    with:
      every: "6h"   # Every 6 hours

# Alert-driven
triggers:
  - type: alert

Reference values using double curly braces:
# Constants
url: "{{ consts.api_url }}/endpoint"

# Inputs
query: "host.ip: {{ inputs.target_ip }}"

# Step outputs
message: "Found {{ steps.search.output.hits.total }} results"

Workflows support Liquid templating for dynamic content. Use filters to transform data inline.
| Filter | Description | Example |
|---|---|---|
| `json` | Convert to JSON string | `{{ object \| json }}` |
| `json_parse` | Parse JSON string to object | `{{ json_string \| json_parse }}` |
| `size` | Get array length or string length | `{{ items \| size }}` |
| `first` / `last` | Get first/last array item | `{{ items \| first }}` |
| `map` | Extract property from array | `{{ users \| map: "name" }}` |
| `where` | Filter array by property | `{{ items \| where: "status", "active" }}` |
| `where_exp` | Filter with expression | `{{ items \| where_exp: "item.price > 100" }}` |
| `join` | Join array to string | `{{ tags \| join: ", " }}` |
| `split` | Split string to array | `{{ csv \| split: "," }}` |
| `default` | Fallback value | `{{ name \| default: "Unknown" }}` |
| `date` | Format date | `{{ "now" \| date: "%Y-%m-%d" }}` |
| `upcase` / `downcase` | Change case | `{{ text \| upcase }}` |
| `strip` | Remove whitespace | `{{ text \| strip }}` |
| `replace` | Replace substring | `{{ text \| replace: "old", "new" }}` |
| `truncate` | Shorten string | `{{ text \| truncate: 50 }}` |
| `base64_encode` / `base64_decode` | Base64 encoding | `{{ text \| base64_encode }}` |
| `url_encode` / `url_decode` | URL encoding | `{{ text \| url_encode }}` |
# Filter products where price > 100
{{ products | where_exp: "item.price > 100" }}

# Find first matching item
{{ products | find: "type", "book" }}

# Check if any item matches
{{ products | has: "category", "electronics" }}

# Remove items matching condition
{{ products | reject_exp: "item.stock == 0" }}

# Sort by property
{{ products | sort: "name" }}

# Get unique values
{{ items | uniq }}

# Concatenate arrays
{{ array1 | concat: array2 }}

# Format message with data
message: "Alert: {{ event.rule.name | upcase }} on {{ event.host.name }}"

# Build URL with encoding
url: "https://api.example.com/search?q={{ query | url_encode }}"

# Extract substring
short_hash: "{{ file.hash.sha256 | slice: 0, 8 }}"

# Default values for missing data
user: "{{ event.user.name | default: 'unknown' }}"

Use Liquid tags for conditional logic and loops:
message: |
  {%- if steps.search.output.hits.total > 0 -%}
  Found {{ steps.search.output.hits.total }} results
  {%- else -%}
  No results found
  {%- endif -%}

# Loop over items
message: |
  {%- for alert in event.alerts -%}
  - {{ alert.rule.name }}: {{ alert.severity }}
  {%- endfor -%}

# Assign variables
message: |
  {%- assign severity = event.alerts[0].severity -%}
  {%- case severity -%}
  {%- when "critical" -%}
  🔴 CRITICAL: Immediate action required
  {%- when "high" -%}
  🟠 HIGH: Investigate promptly
  {%- else -%}
  🟢 Normal priority
  {%- endcase -%}

All Supported Filters:
Math: abs, at_least, at_most, ceil, divided_by, floor, minus, modulo, plus, round, times
String: append, capitalize, downcase, escape, lstrip, prepend, remove, remove_first, remove_last, replace, replace_first, replace_last, rstrip, slice, split, strip, strip_html, strip_newlines, truncate, truncatewords, upcase
Array: compact, concat, first, group_by, group_by_exp, join, last, map, pop, push, reverse, shift, size, sort, sort_natural, uniq, unshift, where, where_exp, find, find_exp, has, has_exp, reject, reject_exp
Date: date, date_to_long_string, date_to_rfc822, date_to_string, date_to_xmlschema
Encoding: base64_decode, base64_encode, cgi_escape, uri_escape, url_decode, url_encode, xml_escape, json, json_parse
Utility: default, escape_once, normalize_whitespace, number_of_words, slugify, array_to_sentence_string
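Filters chain left to right with `|`, so several of the filters above can combine in a single expression. An illustrative sketch (the `users` data shape is hypothetical):

```yaml
# Names of active users, sorted, as one comma-separated string
summary: '{{ users | where: "status", "active" | map: "name" | sort | join: ", " }}'
```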
steps:
  - name: api_call
    type: http
    with:
      url: "{{ consts.api_url }}"
    on-failure:
      retry:
        max-attempts: 3
        delay: 5s
      continue: true   # Proceed even on failure

See docs/concepts.md for detailed explanations.
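When driving imports from the command line instead, the same retry-with-delay idea can be approximated in plain shell. A sketch assuming POSIX sh; the `flaky` function is a stand-in for a transient API failure:

```shell
#!/bin/sh
# retry <max-attempts> <delay-seconds> <command...>
# Mirrors the workflow's on-failure.retry semantics in shell.
retry() {
  max=$1; delay=$2; shift 2
  n=1
  until "$@"; do
    if [ "$n" -ge "$max" ]; then
      echo "failed after $n attempts" >&2
      return 1
    fi
    n=$((n + 1))
    sleep "$delay"
  done
}

# Demo: a command that only succeeds on its third invocation,
# standing in for a flaky curl call to the Workflows API.
rm -f /tmp/attempts
flaky() {
  echo x >> /tmp/attempts
  [ "$(wc -l < /tmp/attempts)" -ge 3 ]
}
retry 3 0 flaky && echo "import succeeded"
```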
- Open Kibana → Management → Workflows
- Click Create workflow
- Paste YAML content
- Update constants for your environment
- Save
cat workflow.yaml | jq -Rs '{yaml: .}' | \
  curl -X POST "https://KIBANA_URL/api/workflows" \
  -H "kbn-xsrf: true" \
  -H "x-elastic-internal-origin: Kibana" \
  -H "Content-Type: application/json" \
  -H "Authorization: ApiKey API_KEY" \
  -d @-

for file in workflows/security/**/*.yaml; do
  echo "Importing: $file"
  cat "$file" | jq -Rs '{yaml: .}' | \
    curl -s -X POST "https://KIBANA_URL/api/workflows" \
    -H "kbn-xsrf: true" \
    -H "x-elastic-internal-origin: Kibana" \
    -H "Content-Type: application/json" \
    -H "Authorization: ApiKey API_KEY" \
    -d @-
done

See docs/importing.md for complete instructions.
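Before a bulk import, a quick pre-flight check can catch files missing the two required fields (`name` and `steps`, per the schema above). A minimal sketch using only grep; it checks presence, not full schema validity:

```shell
#!/bin/sh
# check_workflow <file>: fail if a required top-level field is absent.
check_workflow() {
  file=$1
  ok=0
  grep -q '^name:' "$file"  || { echo "$file: missing name" >&2; ok=1; }
  grep -q '^steps:' "$file" || { echo "$file: missing steps" >&2; ok=1; }
  return "$ok"
}

# Demo against a throwaway file
printf 'name: Demo\nsteps:\n  - name: s1\n    type: console\n' > /tmp/demo.yaml
check_workflow /tmp/demo.yaml && echo "/tmp/demo.yaml looks importable"
```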
Query Elasticsearch with ES|QL and process results:
name: ES|QL Query Example
triggers:
  - type: manual
inputs:
  - name: query
    type: string
    default: "FROM logs-* | LIMIT 10"
steps:
  - name: execute_query
    type: elasticsearch.esql.query
    with:
      format: json
      query: "{{ inputs.query }}"
  - name: store_count
    type: data.set
    with:
      row_count: "{{ steps.execute_query.output.values | size }}"

Run an ES|QL query and send a summary to Slack:
name: Daily Security Summary
triggers:
  - type: scheduled
    with:
      every: "1d"   # Daily
consts:
  slack_webhook: "https://hooks.slack.com/..."
steps:
  - name: query_alerts
    type: elasticsearch.esql.query
    with:
      format: json
      query: |
        FROM .alerts-security.alerts-default
        | WHERE @timestamp > NOW() - 24 hours
        | STATS alert_count = COUNT(*) BY host.name
        | SORT alert_count DESC
        | LIMIT 10
  - name: notify_slack
    type: http
    with:
      url: "{{ consts.slack_webhook }}"
      method: POST
      body:
        text: "📊 Daily Alert Summary: {{ steps.query_alerts.output.values | size }} hosts with alerts in the last 24h"

We welcome contributions! See CONTRIBUTING.md for guidelines.
- Fork this repository
- Add your workflow to the appropriate category
- Include inline comments explaining each step
- Test in a Kibana environment
- Submit a pull request
- Include header comment block with description
- Add section comments (CONSTANTS, INPUTS, STEPS)
- Use meaningful step names
- Document required inputs and constants
- No hardcoded credentials
This project is licensed under the Apache License 2.0 - see LICENSE for details.