From Web Crypto API to Hash Generator: Frontend Encryption Done Right#
Recently, I was building a file upload feature that needed client-side hash calculation for integrity verification. I’ve always used third-party libraries before, but this time I decided to dive deep into the browser’s native crypto capabilities.
Hash Algorithms: More Than Just MD5#
When people think of hashing, MD5 often comes to mind. But MD5 was proven vulnerable to collision attacks back in 2004—it’s no longer suitable for security purposes. Modern applications should prioritize the SHA-2 family:
| Algorithm | Output Length | Security | Typical Use |
|---|---|---|---|
| MD5 | 128 bit | ❌ Broken | File checksums (non-security) |
| SHA-1 | 160 bit | ⚠️ Theoretically broken | Git object IDs |
| SHA-256 | 256 bit | ✅ Secure | Digital signatures, integrity verification |
| SHA-512 | 512 bit | ✅ Secure | High-security scenarios |
Web Crypto API: Native Browser Encryption#
Frontend developers used to reach for libraries like crypto-js. But modern browsers ship with a powerful crypto.subtle API—better performance, zero dependencies.
Basic Usage#
```ts
async function sha256(text: string): Promise<string> {
  const encoder = new TextEncoder()
  const data = encoder.encode(text)
  // Core method: crypto.subtle.digest
  const hashBuffer = await crypto.subtle.digest('SHA-256', data)
  // ArrayBuffer to hex string
  const hashArray = Array.from(new Uint8Array(hashBuffer))
  const hashHex = hashArray
    .map(b => b.toString(16).padStart(2, '0'))
    .join('')
  return hashHex
}

// Usage
const hash = await sha256('Hello World')
// a591a6d40bf420404a011733cfb7b190d62c65bf0bcda32b57b277d9ad9f146e
```
Supported Algorithms#
crypto.subtle.digest supports:
```ts
type Algorithm = 'SHA-1' | 'SHA-256' | 'SHA-384' | 'SHA-512'
```
Note: Browsers don’t natively support MD5. If you need MD5, implement it yourself or use a library.
File Hashing: Handling Large Files#
Small files are easy. But with files in the hundreds of MBs or GBs, calling arrayBuffer() directly will blow up memory.
Chunked Calculation (Wrong Approach)#
Many think they can hash chunks separately and combine:
```ts
// ❌ Wrong: Hashes don't work this way
async function wrongChunkHash(file: File) {
  const chunkSize = 1024 * 1024 // 1MB
  const chunks = []
  for (let i = 0; i < file.size; i += chunkSize) {
    const chunk = file.slice(i, i + chunkSize)
    const buffer = await chunk.arrayBuffer()
    const hash = await crypto.subtle.digest('SHA-256', buffer)
    chunks.push(hash)
  }
  // This result is completely different from hashing the whole file!
  return chunks
}
```
Hash algorithms are sequential by design: each block of input is processed against the internal state left by the previous block, so the digest of the whole file cannot be reconstructed from independently computed per-chunk digests.
Correct Approach: Streaming#
For large files, the right way is:
- Chunked reading: Avoid loading everything into memory
- Incremental updates: Use a Hash object to update progressively
- Final digest: Output the result once
But crypto.subtle.digest doesn't support incremental updates: it accepts only the complete input in a single call. With the native API alone, the best you can do is stream the file's chunks into one buffer and hash it in a single pass (which still holds the whole file in memory); truly incremental hashing needs a WebAssembly library (e.g. hash-wasm) or a hand-rolled implementation:
```ts
async function streamHash(file: File, algorithm: Algorithm = 'SHA-256') {
  // Note: this avoids one giant arrayBuffer() call, but it still accumulates
  // every chunk in memory before hashing, so peak usage is roughly the file size
  const stream = file.stream()
  const reader = stream.getReader()
  // Collect all chunks
  const chunks: Uint8Array[] = []
  let totalLength = 0
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    chunks.push(value)
    totalLength += value.length
  }
  // Combine into a single buffer
  const combined = new Uint8Array(totalLength)
  let offset = 0
  for (const chunk of chunks) {
    combined.set(chunk, offset)
    offset += chunk.length
  }
  // Calculate hash
  return await crypto.subtle.digest(algorithm, combined)
}
```
Better Solution: Web Worker#
Move computation to a Web Worker to avoid UI blocking:
```ts
// worker.ts
self.onmessage = async (e: MessageEvent<File>) => {
  const file = e.data
  const buffer = await file.arrayBuffer()
  const hash = await crypto.subtle.digest('SHA-256', buffer)
  const hex = Array.from(new Uint8Array(hash))
    .map(b => b.toString(16).padStart(2, '0'))
    .join('')
  self.postMessage(hex)
}
```

```ts
// main.ts
// Let the bundler resolve the worker module (a bare 'worker.ts' path won't load as-is)
const worker = new Worker(new URL('./worker.ts', import.meta.url), { type: 'module' })
worker.postMessage(file)
worker.onmessage = (e) => {
  console.log('SHA-256:', e.data)
}
```
Pitfalls I Encountered#
1. MD5 Throws in Chrome#
```ts
// ❌ Chrome doesn't support this
await crypto.subtle.digest('MD5', buffer)
// DOMException: Algorithm: Unrecognized name
```

Solution: implement MD5 yourself or use the spark-md5 library.
2. Unicode Encoding Issues#
TextEncoder always produces UTF-8 and offers no option to change that. This is usually what you want, but if another system hashes the same string under a different encoding (say, GBK on a legacy backend), the bytes, and therefore the hashes, won't match:

```ts
// TextEncoder has no encoding option: output is always UTF-8
const encoder = new TextEncoder()
const bytes = encoder.encode('你好世界') // 12 UTF-8 bytes
// A system hashing the GBK bytes of the same string gets a different digest
```
3. Detached ArrayBuffer Trap#

```ts
// ❌ Dangerous: transferring a buffer detaches it
const buffer = await file.arrayBuffer()
worker.postMessage(buffer, [buffer]) // ownership moves to the worker
// From here on, buffer.byteLength is 0 on the main thread; any further use,
// including digest calls, no longer sees the file's bytes
```

Note that crypto.subtle.digest itself copies its input, so calling it twice on the same (non-transferred) buffer is fine. When you need several digests of the same data, compute them in parallel:

```ts
// ✅ Parallel calculation
const [hash1, hash2] = await Promise.all([
  crypto.subtle.digest('SHA-256', buffer),
  crypto.subtle.digest('SHA-512', buffer)
])
```
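Detachment is easy to reproduce without a worker, because structuredClone also accepts a transfer list (available in all modern browsers and Node 17+; a minimal sketch):

```typescript
const buffer = new Uint8Array([1, 2, 3, 4]).buffer

console.log(buffer.byteLength) // 4

// Transfer ownership; the original reference is detached afterwards
structuredClone(buffer, { transfer: [buffer] })
console.log(buffer.byteLength) // 0
```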
Performance Comparison#
Testing a 100MB file in Chrome:
| Method | Time | Memory |
|---|---|---|
| crypto-js (SHA-256) | 2.8s | +150MB |
| Web Crypto API | 0.6s | +100MB |
| Web Worker + Web Crypto | 0.6s | Main thread: 0 |
The native API is over 4x faster, and the Worker variant keeps the main thread completely free.
The Result#
Based on these learnings, I built: Hash Generator
Features:
- MD5, SHA-1, SHA-256, SHA-512 support
- Text and file input modes
- Automatic Web Worker for large files
- All computation happens locally—no server uploads
The code isn’t complex, but getting performance and security right takes some finesse. Hope this helps.
Related: File Hash Calculator | Base64 Encoder/Decoder