Securing Public Scripts with Web Crypto API: A Lightweight Approach

If you’ve tried Traefik v3.6, you might have noticed a new section: the Traefik Hub API Management demo content. I have an interesting story about how it was implemented. This demo content uses a remote Web Component. It sounds simple and straightforward to implement, but the real complexity came from a few key requirements.
First, the component had to be hosted externally and remain updatable at any time, without requiring changes to the consuming project. Second, despite being loaded remotely, it needed to be secure—especially since it might be injected into environments handling sensitive data.
At first glance, these seemed like common concerns. But after digging deeper, I realized that balancing the auto-update capability with security in this context isn't as common as I expected. As a frontend developer, I was specifically looking for a solution that was simple, easy to integrate, and achievable without adding too much complexity to the infrastructure. With the script already hosted on GitHub Pages with the simplest configuration, I began exploring practical approaches.
Standard mitigations like Subresource Integrity (SRI) were quickly ruled out—any change to the web component would require updating the hash in the host project, which directly conflicted with the requirement for independent updates. I also considered generating the hash dynamically, but that wouldn't protect against man-in-the-middle (MITM) attacks or a compromised CDN. While I had already whitelisted the domain via a Content Security Policy (CSP), it still felt insufficient on its own. More advanced options like certificate pinning came to mind, but I was determined to find a lightweight, client-side-only solution that wouldn’t require coordination with the host environment.
This led me down a path of experimenting with the Web Crypto API to secure the component dynamically. Here’s how it went.
The Basic Concept
One of the main constraints here is that the security check must keep working even when the content changes, which rules out any kind of check against the content itself. The next possibility is to check the identity of the sender. The domain is already whitelisted via a CSP meta tag, but another layer needs to be added on top of it.
When you examine an official document in real life, one way to confirm the identity of the people behind it is to check the signatures. The same concept is used here: after fetching the script, we check its signature to confirm its validity. This requires the script to be signed, and the consuming project to be able to read the signature and verify it.
To achieve this, there will be three steps of work on both the Web Component provider and the consuming project sides:
- Create a pair of private key and public key
- Sign the script using the private key
- On the consuming project side, fetch the script and the signature, and check their validity using the public key
Let’s go through the process step by step.
1. Create a Pair of Private Key and Public Key
Before setting up secure authentication or signing operations, we will first need a cryptographic key pair: a private key (which you must keep secret) and a public key (which you can safely share).
There are several algorithms that could be used to create the key pair, such as RSA, ECDSA, or ED25519. For this project, I’ve chosen ED25519—a modern elliptic curve algorithm that is faster, more compact, and generally more secure than older algorithms like RSA.
A key pair can be generated with ssh-keygen using the following command:
ssh-keygen -t ed25519 -f ~/.ssh/your-key-filename -C "your-key-comment"
After running this command, you’ll be prompted to enter an optional passphrase. This encrypts your private key, adding an extra layer of protection. The keys will be stored as:
Private key: ~/.ssh/your-key-filename
Public key: ~/.ssh/your-key-filename.pub
The keys generated by ssh-keygen are stored in the OpenSSH format, which is PEM-like, meaning it uses a Base64-encoded text structure with headers and footers (e.g., -----BEGIN OPENSSH PRIVATE KEY-----). However, it’s not identical to the standard PEM format used by tools like OpenSSL. If you need your SSH keys in PEM (Privacy-Enhanced Mail) or DER (Distinguished Encoding Rules) format for compatibility with other systems, you can convert them using tools such as ssh-keygen or openssl.
Another way to generate the keys is with Node.js, using its built-in crypto module; simply run this command in the terminal:
node -e "
const crypto = require('crypto');
const { publicKey, privateKey } = crypto.generateKeyPairSync('ed25519', {
  publicKeyEncoding: { type: 'spki', format: 'der' },
  privateKeyEncoding: { type: 'pkcs8', format: 'der' },
});
console.log('PRIVATE_KEY=' + privateKey.toString('base64'));
console.log('PUBLIC_KEY=' + publicKey.toString('base64'));
"
This script generates both the public and private keys and outputs them in Base64-encoded DER format. For the purpose of this article, we’ll continue working with keys in this format. It’s important to note that DER and PEM represent the same underlying data but in different encodings. DER is binary, while PEM is Base64-encoded text with header and footer lines. Because of this, the decoding process will differ slightly depending on which format is being used, so make sure to stay consistent throughout the implementation.
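Since DER and PEM differ only in their wrapping, converting between them in code is straightforward. As a hedged sketch of what "staying consistent" means in practice, the helper below (a name I’m introducing for illustration) normalizes a PEM-encoded key into the same Base64 DER string used throughout this article by stripping the header, footer, and whitespace:

```javascript
// Sketch: normalize a PEM key into a Base64 DER string.
// PEM is just DER wrapped in Base64 with header/footer lines,
// so removing those lines (and all whitespace) is enough.
function pemToBase64Der(pem) {
  return pem
    .replace(/-----BEGIN [A-Z ]+-----/g, '')
    .replace(/-----END [A-Z ]+-----/g, '')
    .replace(/\s+/g, '')
}
```

Note this only rewraps the encoding; it does not convert between key structures (e.g., OpenSSH private keys still need ssh-keygen or openssl for that).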
2. Sign the Script Using the Private Key
Now that we have the key pair, let’s get down to the signing business.
Given a ready-to-use web component that is already bundled and ready to distribute, the only thing left is to sign the bundle and publish the signature alongside it. Regardless of the bundler used in the project, make sure the signing happens as the very last step of the build process. If the bundle is modified after signing, the signature is no longer valid. After all, that’s the purpose of this whole signing process: to guarantee the bundle’s integrity and prove that it hasn’t been tampered with by unauthorized parties.
I’m using Vite to build the production bundle of my Web Component, so I created a custom plugin that handles the signing and added it to the build process. Here is an example of such a plugin:
import { webcrypto } from 'node:crypto'

const signBundle = () => ({
  name: 'sign-bundle',
  writeBundle: async (options, bundle) => {
    const privateKeyB64 = process.env.PRIVATE_KEY
    if (!privateKeyB64) {
      // Cancel the build process if there is no private key
      throw new Error('❌ PRIVATE_KEY environment variable is required for bundle signing')
    }
    try {
      const fs = await import('fs/promises')
      const path = await import('path')
      // Decode the Base64-encoded PKCS#8 private key into binary form
      const privateKeyBuffer = Buffer.from(privateKeyB64, 'base64')
      const privateKey = await webcrypto.subtle.importKey('pkcs8', privateKeyBuffer, { name: 'Ed25519' }, false, [
        'sign',
      ])
      for (const fileName in bundle) {
        const file = bundle[fileName]
        if (file.type === 'chunk') {
          console.log(`Signing ${fileName}...`)
          // Sign the UTF-8 bytes of the chunk's code
          const signature = await webcrypto.subtle.sign(
            { name: 'Ed25519' },
            privateKey,
            new TextEncoder().encode(file.code),
          )
          const filePath = path.join(options.dir, fileName)
          await fs.writeFile(filePath, file.code)
          // Publish the binary signature next to the chunk as <file>.sig
          const sigFilePath = path.join(options.dir, `${fileName}.sig`)
          await fs.writeFile(sigFilePath, new Uint8Array(signature))
        }
      }
      console.log('Bundle signing complete')
    } catch (error) {
      console.error('Bundle signing failed:', error)
      throw error
    }
  },
})
What this custom plugin does, step by step:
- The writeBundle hook runs after all bundles are written to disk. The options argument contains information about the output directory, while bundle is an object that contains all the generated files.
- Check for the existence of the private key, and throw an early error if no private key could be found.
- The private key is Base64-encoded, so it needs to be decoded into its binary form.
- Import the binary private key as an Ed25519 cryptographic key to be used for signing.
- Process all the chunks inside the bundle. The code to be signed needs to be converted to UTF-8 bytes first.
- Write the signed code back to disk.
- Create a .sig file containing the binary signature. This will be used by the consuming project for verification.
This is an example of how I implemented it in a project using the Vite builder. Following the same step-by-step principle, the process can be adapted to any kind of project with some adjustments.
3. Bundle Script Validation
Now that the Web Component is built, signed, and published, it’s up to the consuming project to do the verification. Let’s assume we have these scripts published:
- The Web Component Bundle: https://traefik.github.io/web-component/sample.js
- The Signature: https://traefik.github.io/web-component/sample.js.sig
The idea is simple: validate the signature and the script before mounting the bundle to the DOM. However, the implementation requires some attention.
The main challenge here is that the script needs to be downloaded and validated in a way that won’t pollute the application in case the script has been compromised. To achieve this, we will use a Web Worker. This way, the validation runs on a separate thread, isolated from the DOM, until the script is verified.
Now, let’s start by creating the handler responsible for creating the worker and managing communication between the worker and our main application. A simplified example of the handler follows.
async function verifyBundle(contentPath, signaturePath, publicKey) {
  return new Promise((resolve) => {
    const worker = new Worker(new URL('./scriptVerificationWorker.ts', import.meta.url), { type: 'module' })
    // Safety net: don't let the verification hang indefinitely
    const timeout = setTimeout(() => {
      worker.terminate()
      console.error('Script verification timeout')
      resolve({ verified: false })
    }, 30000)
    worker.onmessage = (event) => {
      clearTimeout(timeout)
      worker.terminate()
      const { verified, error, scriptContent } = event.data
      if (error) {
        console.error('Worker verification failed:', error)
        resolve({ verified: false })
        return
      }
      resolve({
        verified: verified === true,
        scriptContent: verified ? scriptContent : undefined,
      })
    }
    worker.onerror = (error) => {
      clearTimeout(timeout)
      worker.terminate()
      console.error('Worker error:', error)
      resolve({ verified: false })
    }
    // Kick off the verification inside the worker
    worker.postMessage({
      scriptUrl: contentPath,
      signatureUrl: signaturePath,
      publicKey,
    })
  })
}
This handler simply creates a new Web Worker, points it to the worker file, and manages the communication between the main thread and the worker. I added a timeout as a safety measure to prevent the process from hanging indefinitely if something goes wrong.
Now for the main course: the worker. This is the function responsible for handling the heavy lifting in the verification process. Inside the worker, we’ll fetch both the script and its signature, perform the cryptographic validation, and return the results back to the main thread. By offloading these operations to a dedicated worker, we ensure that potentially untrusted code is processed in complete isolation from the DOM and application logic. Below is a fully working example of the verification worker in action:
// scriptVerificationWorker.ts
self.onmessage = async function (event) {
  const { scriptUrl, signatureUrl, publicKey } = event.data
  try {
    // Download the script and its signature concurrently
    const [scriptResponse, signatureResponse] = await Promise.all([fetch(scriptUrl), fetch(signatureUrl)])
    if (!scriptResponse.ok || !signatureResponse.ok) {
      self.postMessage({
        success: false,
        verified: false,
        error: `Failed to fetch files. Script: ${scriptResponse.status} ${scriptResponse.statusText}, Signature: ${signatureResponse.status} ${signatureResponse.statusText}`,
      })
      return
    }
    const [scriptBuffer, signatureBuffer] = await Promise.all([
      scriptResponse.arrayBuffer(),
      signatureResponse.arrayBuffer(),
    ])
    const verified = await verifySignature(publicKey, scriptBuffer, signatureBuffer)
    // If verified, include the script content to avoid re-downloading
    let scriptContent: ArrayBuffer | undefined
    if (verified) {
      scriptContent = scriptBuffer
    }
    // Send the message with a transferable ArrayBuffer for efficiency
    const message = {
      success: true,
      verified,
      scriptSize: scriptBuffer.byteLength,
      signatureSize: signatureBuffer.byteLength,
      scriptContent,
    }
    if (scriptContent) {
      self.postMessage(message, { transfer: [scriptContent] })
    } else {
      self.postMessage(message)
    }
  } catch (error) {
    console.error('[Worker] Verification error:', error)
    self.postMessage({
      success: false,
      verified: false,
      error: error instanceof Error ? error.message : 'Unknown error',
    })
  }
}

self.onerror = function (error) {
  console.error('[Worker] Worker error:', error)
  self.postMessage({
    success: false,
    verified: false,
    // Error events are not structured-cloneable, so send a string instead
    error: String(error),
  })
}

async function verifySignature(
  publicKey: string,
  scriptBuffer: ArrayBuffer,
  signatureBuffer: ArrayBuffer,
): Promise<boolean> {
  // Decode the Base64-encoded SPKI public key into raw bytes
  const publicKeyBytes = Uint8Array.fromBase64(publicKey)
  const publicKeyBuffer = publicKeyBytes.buffer
  try {
    const cryptoPublicKey = await crypto.subtle.importKey(
      'spki',
      publicKeyBuffer,
      {
        name: 'Ed25519',
      },
      false,
      ['verify'],
    )
    return await crypto.subtle.verify('Ed25519', cryptoPublicKey, signatureBuffer, scriptBuffer)
  } catch (error) {
    console.log('Web Crypto verification failed:', error instanceof Error ? error.message : 'Unknown error')
    return false
  }
}
This worker is responsible for performing secure, isolated verification of the downloaded Web Component bundle. It receives three values in its message: the script URL, the signature URL, and a public key. It downloads both files concurrently, then converts their contents into ArrayBuffer objects, a binary data representation ideal for cryptographic operations. Using the Web Crypto API, the worker imports the provided public key (encoded in the SPKI format for Ed25519) and verifies that the signature matches the script’s raw binary content.
If verification succeeds, the worker attaches the verified ArrayBuffer back to the message and transfers it to the main thread. The use of transferable objects here is important to make the message passing highly efficient, even for large bundle files.
Handling key formats correctly is crucial in this process. As a reminder, this article uses an Ed25519 key; other kinds of keys will require some adjustments to the example above. The code assumes the public key is provided as a Base64-encoded string, which is converted into a Uint8Array before being imported via crypto.subtle.importKey. The ‘spki’ (Subject Public Key Info) format is a standard representation for public keys, commonly used for verification in asymmetric cryptography. This ensures interoperability with other systems or signing tools that generate Ed25519 keys in SPKI form.
A couple of additional notes about this worker implementation:
- Uint8Array.fromBase64() is a very new method, supported only in the latest browsers. For broader compatibility, it’s recommended to provide a fallback using a traditional loop-based conversion for older browsers.
- Still on the subject of compatibility: Ed25519 might not be supported by the Web Crypto implementation in older browsers. To reach a wider audience, consider adding a fallback for the verification as well, using a package like @noble/ed25519 instead of relying solely on Web Crypto.
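A loop-based fallback for the Base64 decoding could look like the sketch below (the `base64ToUint8Array` name is mine, chosen for illustration). It prefers the native method when present and otherwise falls back to atob:

```javascript
// Fallback sketch: decode a Base64 string into a Uint8Array even on engines
// that don't yet ship Uint8Array.fromBase64.
function base64ToUint8Array(b64) {
  if (typeof Uint8Array.fromBase64 === 'function') {
    return Uint8Array.fromBase64(b64)
  }
  // atob decodes Base64 into a binary string; copy it byte by byte
  const binary = atob(b64)
  const bytes = new Uint8Array(binary.length)
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i)
  }
  return bytes
}
```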
By encapsulating all of this inside a Web Worker, the main application remains safe. The script never touches the DOM or executes unless its signature is verified as authentic.
And that’s it! Verification complete! Once it’s confirmed that the fetched script is authentic, it can be safely mounted to the DOM. If the verification fails, simply avoid executing or injecting the script. This approach is straightforward to implement and works entirely on the client side without requiring any server-side intervention, which is ultimately why it was chosen.
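One possible way to mount the verified bundle without fetching it a second time is to turn the verified ArrayBuffer into a blob: URL and load the script from there. This is a sketch, not the only option; the function names are mine, and it assumes your CSP’s script-src allows blob: URLs (otherwise the injected script will be blocked):

```javascript
// Turn the verified ArrayBuffer into a same-origin blob: URL
function createScriptUrl(scriptContent) {
  const blob = new Blob([scriptContent], { type: 'text/javascript' })
  return URL.createObjectURL(blob)
}

// Browser-only: inject the script element once verification has succeeded,
// using the result shape returned by verifyBundle above
function mountVerifiedScript(result) {
  if (!result.verified || !result.scriptContent) return
  const script = document.createElement('script')
  script.type = 'module'
  script.src = createScriptUrl(result.scriptContent)
  document.head.appendChild(script)
}
```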


