Security & Review Policy

How we check plugins before they go live. Our ruleset is fully public — you can read every check below, or go straight to the source on GitHub.

How we check plugins

Every version uploaded to emdashcms.org goes through three gates before a user can install it. All three run automatically — you don't need to wait on a human.

  1. Bundle validation. We enforce size limits (10 MB compressed, 50 MB decompressed, 200 files max), reject path traversal attempts, reject symlinks and hardlinks, require a valid manifest.json with a matching plugin id, and compute a SHA-256 checksum for tamper detection.
  2. Static code scan. We look for dangerous primitives (listed below), obfuscation signals, and mismatches between declared network hosts and hosts actually referenced in the code. Some findings are blocking — the version is rejected outright. Others are flagging — the version is published but carries a Caution badge.
  3. AI review (when available). Clean plugins can optionally be re-reviewed by a Workers AI model for subtler issues a regex scan will miss. Budget-bound and currently triggered by a moderator. Plugins that pass AI review earn the AI-reviewed badge.
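The first gate can be sketched as a pure function over the archive's entries. This is an illustrative sketch, not the real implementation: the limits come from this page, but the entry shape (`BundleEntry`) and function name are assumptions; the actual checks live in src/lib/audit/static-scanner.ts.

```typescript
// Hypothetical shape of one file entry inside an uploaded bundle.
interface BundleEntry {
  path: string;           // path inside the archive
  compressedSize: number; // bytes, compressed
  size: number;           // bytes, decompressed
  isSymlink: boolean;
}

// Limits stated on this page: 10 MB compressed, 50 MB decompressed, 200 files.
const LIMITS = {
  compressed: 10 * 1024 * 1024,
  decompressed: 50 * 1024 * 1024,
  maxFiles: 200,
};

function validateBundle(entries: BundleEntry[]): string[] {
  const findings: string[] = [];
  if (entries.length > LIMITS.maxFiles) findings.push("too many files");
  const compressed = entries.reduce((n, e) => n + e.compressedSize, 0);
  const decompressed = entries.reduce((n, e) => n + e.size, 0);
  if (compressed > LIMITS.compressed) findings.push("compressed size over 10 MB");
  if (decompressed > LIMITS.decompressed) findings.push("decompressed size over 50 MB");
  for (const e of entries) {
    // Reject paths that escape the bundle root, and reject links outright.
    if (e.path.startsWith("/") || e.path.split("/").includes("..")) {
      findings.push(`path traversal: ${e.path}`);
    }
    if (e.isSymlink) findings.push(`symlink: ${e.path}`);
  }
  return findings; // empty array = bundle passes this gate
}
```

An empty return means the version moves on to the static code scan; any finding here is a hard rejection.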

The scanner's rules are fully public. If you can read the source, you can write a plugin that passes — and we think that's the right tradeoff. Security through obscurity doesn't scale; documented expectations do.

What each trust tier means

Every version carries a visible trust tier. You'll see it on the plugin detail page and (for caution states) on the plugin cards in search results.

Scanned

Passed every static check with no findings. Published immediately. Safe to install.

Scanned — Caution

Published but the scanner flagged soft findings (e.g. bundled CJS require, undeclared hosts). Review the findings on the plugin page before installing.

AI-reviewed

Scanned AND reviewed by an AI auditor. Highest tier currently available. Visual indicator: ✦

AI-reviewed — Caution

AI review flagged something worth knowing. Published with the findings visible on the plugin detail page.

Unreviewed

Legacy version that predates the trust tier system, or a version currently waiting on human moderation.

Rejected

Did not pass review. Not installable. The author sees the exact findings and can upload a corrected version.

Blocking findings — reject the version

These patterns are hard rejections. If any of them appear in the bundle, the version is not published and the author sees the exact finding with a file reference.

eval()

What we look for: Any direct call to the JavaScript eval() function.

Why: Arbitrary runtime code execution. There is no legitimate reason for a CMS plugin to execute strings as code.

new Function(...)

What we look for: Constructing a function from a string at runtime.

Why: Equivalent to eval() — arbitrary code execution bypass for plugin sandboxing.

Worker constructed from a Blob URL

What we look for: new Worker(URL.createObjectURL(...)).

Why: Common obfuscation vector for running code whose source can't be audited.

child_process

What we look for: Any reference to the Node.js child_process module.

Why: EmDash plugins run in a Cloudflare-style runtime with no process spawning. Presence means Node-targeted code was bundled, likely with intent to escalate.

process.binding

What we look for: Access to internal Node.js native bindings.

Why: Native-addon access is not available in the runtime and is never needed by legitimate plugins.
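The five blocking rules above are the kind of thing a regex pass catches cheaply. A minimal sketch of how they might be expressed — the pattern ids and exact regexes here are assumptions, not the scanner's real rules (those are in src/lib/audit/static-scanner.ts):

```typescript
// Illustrative blocking rules; each match is a hard rejection.
const BLOCKING_PATTERNS: { id: string; re: RegExp }[] = [
  { id: "eval", re: /\beval\s*\(/ },
  { id: "new-function", re: /\bnew\s+Function\s*\(/ },
  { id: "blob-worker", re: /new\s+Worker\s*\(\s*URL\.createObjectURL/ },
  { id: "child-process", re: /\bchild_process\b/ },
  { id: "process-binding", re: /\bprocess\.binding\b/ },
];

// Returns the ids of every blocking pattern found in a source file.
function findBlocking(source: string): string[] {
  return BLOCKING_PATTERNS.filter((p) => p.re.test(source)).map((p) => p.id);
}
```

Note the word boundaries: `evaluate(x)` should not trip the `eval` rule, which is why a naive substring search is not enough.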

Flagging findings — publish with a Caution badge

These findings don't block a version from going live, but they're surfaced to installers via the Caution tier so you can review before deciding to install.

require('...')

What we look for: CommonJS require() with a string-literal module name.

Why: Workers use ES modules. require() calls usually indicate bundled CJS dependencies — often harmless, but worth flagging for review.

fs.readFile / fs.writeFile / fs.mkdir / ...

What we look for: Specific Node.js fs module methods.

Why: The Workers runtime has no filesystem. Presence usually means dead code from a dependency, but could signal intent to access local files in a different deployment.

Dynamic import(variableName)

What we look for: import() called with a non-literal argument.

Why: Dynamic imports with string literals are fine. Computed specifiers can be used to hide what's being loaded.

IndexedDB / localStorage / sessionStorage

What we look for: Browser-only storage APIs.

Why: Not available in the Workers runtime. Usually means the plugin was built for a browser context, which deserves review.

Long base64-encoded string literals

What we look for: String literals of 200+ base64 characters.

Why: A common way to smuggle executable payloads past naive scanners.

Hex-escaped string sequences

What we look for: 30+ consecutive \xNN hex escapes.

Why: Strong obfuscation signal — legitimate code rarely uses this encoding.
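The two obfuscation signals above reduce to threshold checks. The thresholds (200 base64 characters, 30 consecutive hex escapes) come from this page; the regexes are an assumption about how the scanner might implement them:

```typescript
// Assumed regexes for the two obfuscation signals described above.
const LONG_BASE64 = /["'`][A-Za-z0-9+/=]{200,}["'`]/;      // 200+ base64 chars in a literal
const HEX_RUN = /(?:\\x[0-9a-fA-F]{2}){30,}/;              // 30+ consecutive \xNN escapes

function obfuscationSignals(source: string): string[] {
  const hits: string[] = [];
  if (LONG_BASE64.test(source)) hits.push("long base64 literal");
  if (HEX_RUN.test(source)) hits.push("hex-escape run");
  return hits;
}
```

Both are flagging, not blocking: a long base64 string can be a legitimate embedded asset, which is why these surface as a Caution badge rather than a rejection.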

Hosts not declared in manifest.allowedHosts

What we look for: External URLs in code whose domains aren't in the plugin's declared allowlist.

Why: EmDash sandboxes outbound network calls, so an undeclared host will simply fail at runtime. Worse, if the call is intentional, it can signal data exfiltration.
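The declared-vs-referenced comparison is a set difference. A sketch under stated assumptions: the manifest field name `allowedHosts` is from this page, but the URL extraction regex and function name are illustrative only.

```typescript
// Returns every hostname referenced in the source that the manifest
// does not declare in allowedHosts.
function undeclaredHosts(source: string, allowedHosts: string[]): string[] {
  const allowed = new Set(allowedHosts.map((h) => h.toLowerCase()));
  // Naive URL extraction; the real scanner may parse more carefully.
  const urls = source.match(/https?:\/\/[^\s"'`)]+/g) ?? [];
  const hosts = urls.map((u) => new URL(u).hostname.toLowerCase());
  return [...new Set(hosts)].filter((h) => !allowed.has(h));
}
```

A plugin that declares `api.example.com` but also fetches from a second domain would get that second domain back as a finding.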

Broad capabilities

What we look for: network:fetch:any, read:users, email:intercept.

Why: These grant sweeping access. Legitimate plugins declare the minimum scope they need.

How AI review works

We use Cloudflare Workers AI to run a second-pass review over plugin bundles that already passed the static scan. The model reads the manifest and source files, looks for subtler issues a regex scan misses, and returns a structured verdict: pass, warn, or fail with a risk score and findings.

AI review is fail-closed — if the model errors or returns malformed output, the version is rejected, never silently published. It's also budget-bound to protect the marketplace's free-tier costs.
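The fail-closed rule means every error path maps to rejection. A hypothetical illustration — the `pass`/`warn`/`fail` verdict and risk score are described on this page, but the field names and JSON shape are assumptions:

```typescript
// Assumed shape of the structured verdict described above.
interface Verdict {
  verdict: "pass" | "warn" | "fail";
  riskScore: number;
  findings: string[];
}

// Fail-closed: anything unparsable or unexpected becomes a rejection,
// never a silent publish.
function interpretAiReview(raw: string): Verdict {
  const rejected: Verdict = { verdict: "fail", riskScore: 100, findings: ["malformed AI output"] };
  try {
    const parsed = JSON.parse(raw);
    if (parsed.verdict === "pass" || parsed.verdict === "warn" || parsed.verdict === "fail") {
      return {
        verdict: parsed.verdict,
        riskScore: Number(parsed.riskScore) || 0,
        findings: Array.isArray(parsed.findings) ? parsed.findings : [],
      };
    }
    return rejected; // unknown verdict value: fail closed
  } catch {
    return rejected; // unparsable model output: fail closed
  }
}
```

The important property is that there is no code path from a model error to a published version.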

Today, AI review is triggered by a moderator from the queue rather than on every upload. As the marketplace grows we plan to move it onto the upload hot path so every scanned version gets AI-reviewed as quickly as the budget allows.

Public rejection history

When a version is rejected or revoked, the scanner findings and (optionally) the moderator's reason are visible on the plugin detail page for anyone to read. Hiding rejections only helps bad actors — the scanner rules are already public on this page, so exposing individual rejections is the honest extension of that policy.

What we always show publicly: the version number, its status, and the full list of scanner findings (title, severity, description, file location). These are deterministic and reproducible from the open-source scanner.

What's opt-in: the moderator's free-text reason. When a moderator rejects or revokes a version, they see a "Post this note publicly" checkbox that defaults to on. Notes that reference out-of-band context (e.g. a conversation with the author on Discord) can be kept private — they stay in the audit record but don't appear on the public page. Scanner findings remain public either way.

Revoked plugins. If an entire plugin is revoked, the listing stays reachable at its direct URL with a prominent "This plugin has been revoked" banner. It disappears from search and carries a noindex meta so search engines stop serving it. History and findings remain visible so inbound links don't 404 and the record is honest.

Found a problem? Report it.

If you believe a published plugin is malicious, broken, or violates the review rules above, please file a security issue on our GitHub repository. Include the plugin id, the version, and what you observed.

We take vulnerability reports seriously. Plugins with confirmed security issues are pulled immediately while we investigate, and the author is notified. If the finding is severe, the plugin is revoked and the author's ability to publish new plugins is suspended pending review.

A dedicated in-app report flow is on the roadmap. Until then, GitHub Issues is the fastest way to reach us.

Our complete scanner implementation is open source. Read it, audit it, or propose improvements via pull request: src/lib/audit/static-scanner.ts