EdgeRecon is a Python CLI for external attack surface reconnaissance and vulnerability discovery.
It is built around a workflow engine: you provide target files, choose a workflow, and EdgeRecon runs external tools, stores raw artifacts per target, aggregates findings, and records the exact execution trace so runs are inspectable and repeatable.
- ingests domains, subdomains, URLs, and IPs from text files
- runs recon tools in a workflow pipeline
- branches to downstream tasks based on facts emitted by earlier tasks
- stores raw tool output per target
- aggregates findings across the run
- records the exact commands, workflow transitions, and task results
Currently implemented task types: `subfinder`, `dnsx`, `naabu`, `gobuster_dns`, `feroxbuster`, `httpx`, `nuclei`.
The default bundled workflow is intentionally small and stable:
`subfinder` → `httpx` → `nuclei` (the `nuclei` edge fires when `http.live == true`)
EdgeRecon requires Python 3.11+.
```
pip install -e .
```

That installs the CLI entrypoint and Python dependencies, including PyYAML for workflow loading.
EdgeRecon shells out to external tools. For the built-in task types, you should have the relevant binaries available on PATH or pass their explicit paths on the CLI.
Common binaries:
- `subfinder`
- `dnsx`
- `naabu`
- `gobuster`
- `feroxbuster`
- `httpx` (from ProjectDiscovery, not the Python HTTPX client CLI)
- `nuclei`
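If you want a quick scriptable check before reaching for `doctor`, a few lines of generic Python can report which of these binaries are resolvable on PATH. This is plain `shutil.which`, not an EdgeRecon feature:

```python
import shutil

# Binaries EdgeRecon's built-in tasks shell out to (from the list above).
TOOLS = ["subfinder", "dnsx", "naabu", "gobuster", "feroxbuster", "httpx", "nuclei"]

def missing_tools(tools=TOOLS) -> list[str]:
    """Return the binaries that shutil.which cannot resolve on PATH."""
    return [tool for tool in tools if shutil.which(tool) is None]

print(missing_tools())  # names you still need to install or point at explicitly
```

Note that this only checks presence, not identity; `doctor` additionally flags cases like the ProjectDiscovery-vs-Python `httpx` mismatch.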
Use doctor before a real run if you want to validate the environment first:
```
edgerecon doctor --httpx-binary /home/kyte/go/bin/httpx
```

Run the bundled default workflow:

```
edgerecon run --targets examples/targets.txt --output runs/demo
```

If your system has multiple binaries with the same name, point EdgeRecon at the exact tools you want:
```
edgerecon run \
  --targets examples/targets.txt \
  --output runs/demo \
  --httpx-binary /home/kyte/go/bin/httpx
```

Run a custom workflow:
```
edgerecon run \
  --targets examples/targets.txt \
  --workflow-file examples/workflows/dns_first_web_enum.yaml \
  --output runs/demo
```

- EdgeRecon aborts by default if the total target set grows beyond 100.
- Override the threshold with `--max-targets`.
- Explicitly allow larger target sets with `--allow-large-target-sets`.
Examples:
```
edgerecon run --targets targets.txt --output runs/demo
edgerecon run --targets targets.txt --output runs/demo --max-targets 250 --allow-large-target-sets
```

Execute a workflow against input targets.
Common options:
`--targets`, `--output`, `--workflow-file`, `--max-workers`, `--max-targets`, `--allow-large-target-sets`, `--quiet`, `--no-color`, `--no-html-report`, `--subfinder-binary`, `--dnsx-binary`, `--naabu-binary`, `--feroxbuster-binary`, `--httpx-binary`, `--nuclei-binary`
Validate workflow loading and installed binaries.
Examples:
```
edgerecon doctor
edgerecon doctor --workflow-file examples/workflows/subfinder_gobuster_httpx_nuclei.yaml
edgerecon doctor --httpx-binary /home/kyte/go/bin/httpx
edgerecon doctor --feroxbuster-binary /usr/local/bin/feroxbuster
```

`doctor` checks:
- workflow loading
- workflow roots and task graph
- binary resolution
- common binary identity mismatches, especially `httpx`
- task-specific configuration validation such as required wordlist files
Show a task type's:
- summary
- config fields
- emitted fact fields
- finding hints
- workflow examples
Examples:
```
edgerecon describe-task dnsx
edgerecon describe-task httpx
edgerecon describe-task gobuster_dns
```

Render a workflow definition as text or Mermaid.
Examples:
```
edgerecon describe-workflow
edgerecon describe-workflow --workflow-file examples/workflows/dns_first_web_enum.yaml
edgerecon describe-workflow --workflow-file examples/workflows/dns_first_web_enum.yaml --format mermaid
```

Generate starter files for a new task type.
```
edgerecon scaffold-task gobuster_dns
```

This generates:

- `src/edgerecon/tasks/<name>.py`
- `tests/test_<name>.py`
- `examples/<name>_workflow.yaml`
Bundled examples:
- `default.yaml`: bundled default workflow
- `subfinder_gobuster_httpx_nuclei.yaml`: adds `gobuster_dns`
- `subfinder_dnsx_httpx_nuclei.yaml`: adds `dnsx`
- `dnsx_naabu_httpx_nuclei.yaml`: adds `naabu` after DNS resolution
- `dns_first_web_enum.yaml`: DNS-first profile where `httpx` only runs after `dnsx` confirms resolution
- `httpx_feroxbuster_nuclei.yaml`: content discovery with `feroxbuster` on live web targets
- `full_current_stack.yaml`: uses all currently implemented tasks together in one workflow
Starter assets:
- targets.txt
- subdomains.txt
- web_paths.txt
The bundled example targets use reserved/demo values like demo.invalid and 192.0.2.10 so the repository does not imply scanning live assets.
Workflows are defined in YAML.
Core structure:
```yaml
roots:
  - subfinder
  - httpx
tasks:
  subfinder:
    type: subfinder
    next_tasks:
      - httpx
  httpx:
    type: httpx
    next_tasks:
      - task: nuclei
        when:
          path: http.live
          op: equals
          value: true
  nuclei:
    type: nuclei
    next_tasks: []
```

- `roots`: the task names scheduled for each eligible input target
- `tasks.<name>.type`: the registered task type
- `tasks.<name>.config`: task-specific configuration
- `tasks.<name>.next_tasks`: downstream workflow edges
Edges can be unconditional:
```yaml
next_tasks:
  - httpx
```

Or conditional:
```yaml
next_tasks:
  - task: nuclei
    when:
      path: http.live
      op: equals
      value: true
```

Supported operators: `exists`, `equals`, `contains`, `in`.
Examples:
```yaml
next_tasks:
  - task: nuclei
    when:
      path: http.live
      op: equals
      value: true
  - task: wpscan
    when:
      path: http.technologies
      op: contains
      value: WordPress
  - task: dirsearch
    when:
      path: http.status_codes
      op: contains
      value: 200
```

For exact-value membership:
```yaml
when:
  path: http.primary_status_code
  op: in
  value: [200, 403]
```

Tasks emit facts, and workflow conditions read those facts by dotted path.
Examples:
`http.live`, `http.urls`, `http.technologies`, `http.status_codes`, `dns.resolves`, `dns.cname_records`, `nuclei.finding_count`
The easiest way to inspect available facts for a task is:
```
edgerecon describe-task dnsx
edgerecon describe-task httpx
```

Per-task fact fields:

- subfinder: `subfinder.discovered_count`, `subfinder.output_path`, `subfinder.exit_code`
- dnsx: `dns.resolves`, `dns.hosts`, `dns.a_records`, `dns.aaaa_records`, `dns.cname_records`, `dns.rcode`
- naabu: `naabu.open_ports`, `naabu.open_port_count`, `naabu.ips`, `naabu.hosts`, `naabu.output_path`, `naabu.exit_code`
- gobuster_dns: `gobuster_dns.discovered_count`, `gobuster_dns.wordlist`, `gobuster_dns.output_path`, `gobuster_dns.exit_code`
- httpx: `http.live`, `http.urls`, `http.technologies`, `http.status_codes`, `http.webservers`, `http.titles`
- feroxbuster: `feroxbuster.found_count`, `feroxbuster.urls`, `feroxbuster.paths`, `feroxbuster.status_codes`, `feroxbuster.interesting_urls`, `feroxbuster.file_extensions`, `feroxbuster.output_path`, `feroxbuster.exit_code`
- nuclei: `nuclei.scanned_targets`, `nuclei.finding_count`, `nuclei.output_path`, `nuclei.exit_code`
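To illustrate how dotted-path conditions over facts like these behave, here is a minimal, hypothetical evaluator for the four documented operators. The function names and the nested-dict fact shape are assumptions for this sketch, not EdgeRecon's actual internals:

```python
# Hypothetical sketch of dotted-path condition evaluation; facts are assumed
# to live in a nested dict keyed by the fact prefixes shown above.

def resolve_path(facts: dict, path: str):
    """Walk a dotted path like 'http.live' through nested dicts."""
    node = facts
    for part in path.split("."):
        if not isinstance(node, dict) or part not in node:
            return None, False
        node = node[part]
    return node, True

def evaluate(facts: dict, path: str, op: str, value=None) -> bool:
    """Apply one of the documented operators: exists, equals, contains, in."""
    current, found = resolve_path(facts, path)
    if op == "exists":
        return found
    if not found:
        return False
    if op == "equals":
        return current == value
    if op == "contains":   # e.g. 'WordPress' in http.technologies
        return value in current
    if op == "in":         # e.g. http.primary_status_code in [200, 403]
        return current in value
    raise ValueError(f"unknown operator: {op}")

facts = {"http": {"live": True, "technologies": ["nginx", "WordPress"],
                  "primary_status_code": 403}}
```

With these facts, `evaluate(facts, "http.live", "equals", True)` is true, which is the shape of the default workflow's `nuclei` edge.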
Built-in tasks use typed config dataclasses, so supported fields are explicit and validated.
Examples:
- SubfinderConfig
- DnsxConfig
- NaabuConfig
- GobusterDnsConfig
- FeroxbusterConfig
- HttpxConfig
- NucleiConfig
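As a sketch of what such a typed config can look like, here is a hypothetical reimplementation using the field names from the `gobuster_dns` workflow example in this README; EdgeRecon's actual dataclass may differ:

```python
from dataclasses import dataclass

# Illustrative sketch only; not EdgeRecon's real GobusterDnsConfig.
@dataclass
class GobusterDnsConfig:
    wordlist: str          # required: path to a subdomain wordlist
    threads: int = 50
    wildcard: bool = False

    def __post_init__(self):
        # Explicit validation: bad values fail loudly at load time
        # instead of surfacing mid-run.
        if not self.wordlist:
            raise ValueError("gobuster_dns requires a 'wordlist' path")
        if self.threads < 1:
            raise ValueError("'threads' must be a positive integer")

cfg = GobusterDnsConfig(wordlist="./examples/wordlists/subdomains.txt")
```

Because the fields are declared, unknown or mistyped YAML keys can be rejected rather than silently ignored.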
Example gobuster_dns task config with a required wordlist:
```yaml
gobuster_dns:
  type: gobuster_dns
  config:
    wordlist: ./examples/wordlists/subdomains.txt
    threads: 50
    wildcard: false
  next_tasks:
    - task: dnsx
      when:
        path: gobuster_dns.discovered_count
        op: exists
```

Example dnsx task config:
```yaml
dnsx:
  type: dnsx
  config:
    query_a: true
    query_aaaa: true
    query_cname: true
    include_response: false
    threads: 100
```

Example naabu task config:
```yaml
naabu:
  type: naabu
  config:
    top_ports: "100"
    rate: 1000
    threads: 25
    scan_all_ips: true
  next_tasks: []
```

Example feroxbuster task config:
```yaml
feroxbuster:
  type: feroxbuster
  config:
    wordlist: ./examples/wordlists/web_paths.txt
    threads: 40
    recursive: true
    depth: 2
    extensions: [php, txt, bak]
  next_tasks: []
```

If a task requires external files such as a wordlist, `doctor` validates that they exist.
Each run writes structured output under the chosen run directory.
- `runs/<name>/summary.json`: overall run summary, findings, task results, workflow trace
- `runs/<name>/report.html`: browser-friendly summary of the run, targets, workflow trace, and findings
- `runs/<name>/findings.json`: aggregated findings only
- `runs/<name>/workflow_trace.json`: per-target command history and workflow transitions
- `runs/<name>/targets/<target>/artifacts/`: raw tool output files for that target
- `runs/<name>/targets/<target>/workflow_trace.json`: task/edge history for one target
EdgeRecon schedules work per unique (target, task) pair.
That means:
- a task runs at most once for a given target
- if multiple upstream discovery tasks find the same subdomain, downstream tasks are deduplicated
- discovered targets enter the workflow as new targets
- downstream tasks can be skipped if their task guard or edge condition does not pass
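That scheduling rule can be modeled in a few lines. This is a toy illustration of the (target, task) deduplication, not EdgeRecon's actual scheduler, and it assumes for simplicity that discovered targets re-enter at the workflow roots:

```python
from collections import deque

def schedule(roots: list[str], initial_targets: list[str], run_task):
    """Execute each (target, task) pair at most once.

    run_task(target, task) is a caller-supplied stand-in for task
    execution; it returns (new_targets, next_tasks).
    """
    seen: set[tuple[str, str]] = set()
    queue = deque((t, task) for t in initial_targets for task in roots)
    executed = []
    while queue:
        pair = queue.popleft()
        if pair in seen:   # dedupe: this task already ran for this target
            continue
        seen.add(pair)
        executed.append(pair)
        target, task = pair
        new_targets, next_tasks = run_task(target, task)
        # discovered targets enter the workflow as new targets (at the roots here)
        queue.extend((nt, root) for nt in new_targets for root in roots)
        # downstream edges for the current target
        queue.extend((target, nxt) for nxt in next_tasks)
    return executed

def fake_run(target, task):
    if task == "subfinder" and target == "demo.invalid":
        return ["a.demo.invalid"], ["httpx"]
    return [], []

# 'demo.invalid' appears twice in the input but subfinder runs on it once
order = schedule(["subfinder"], ["demo.invalid", "demo.invalid"], fake_run)
```

Even with the duplicate input target, each (target, task) pair executes exactly once.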
The CLI also records:
- exact commands executed
- workflow transitions
- discovered targets
- per-task errors
By default, `run` also writes an HTML report alongside `summary.json`. Use `--no-html-report` if you only want the JSON artifacts.
During run, EdgeRecon shows:
- a live single-line status summary
- detailed event lines for task start, completion, failure, discovery, and skips
- ANSI color output when attached to an interactive terminal
Use:
- `--quiet` to suppress live progress
- `--no-color` to disable ANSI styling
Right now:

- `nuclei` raises findings directly from parsed template matches
- discovery and enrichment tasks mainly emit facts and discovered targets
This is intentional: facts make it easy to branch workflows, and findings can be raised either directly by a tool task or later by dedicated evaluators.
The extension model is:
- create a task type
- register it in the task registry
- declare its metadata
- use it from workflow YAML
Key files:
- registry.py
- base.py
- external.py
For external command-based tasks, inherit from ExternalToolTask and implement:
- `should_run(...)` if needed
- `build_paths(...)` if defaults are not enough
- `build_argv(...)`
- `build_task_result(...)`
- optional `validate_configuration(...)`
Use `scaffold-task` for a starter implementation, then add task metadata so `describe-task` can document it automatically.
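To make the shape concrete, here is a self-contained sketch of the pattern. The base class below is a minimal stand-in written for this example, and every method signature and command-line flag is an assumption, not EdgeRecon's real `ExternalToolTask` API:

```python
import shutil

class ExternalToolTask:
    """Minimal stand-in base class for this sketch; not EdgeRecon's real one."""
    binary = "echo"

    def should_run(self, facts: dict) -> bool:
        return True

    def validate_configuration(self) -> list[str]:
        # doctor-style check: is the binary resolvable on PATH?
        if shutil.which(self.binary) is None:
            return [f"binary not found on PATH: {self.binary}"]
        return []

class DnsxStyleTask(ExternalToolTask):
    """Hypothetical resolver task gated on upstream discovery facts."""
    binary = "dnsx"

    def should_run(self, facts: dict) -> bool:
        # task guard: only resolve if an upstream task discovered something
        return facts.get("subfinder", {}).get("discovered_count", 0) > 0

    def build_argv(self, target: str) -> list[str]:
        # illustrative argv only; consult the real tool's help for flags
        return [self.binary, "-silent", target]
```

The division of labor mirrors the hooks listed above: guards in `should_run`, command construction in `build_argv`, environment checks in `validate_configuration`.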
EdgeRecon is still early, but the core workflow model is already in place:
- typed task configs
- conditional workflow edges
- task discovery docs via `describe-task`
- workflow visualization via `describe-workflow`
- environment validation via `doctor`
- exact execution trace capture
That makes it a strong foundation for building a larger recon and vulnerability discovery pipeline over time.