Improving UX: Proactive Background Caching & Offline Form Data 📱
Lecture 1: Proactive Background Caching 🔄
The Problem: Content Only Cached on Visit
So far, pages only get cached when the user actually visits them. Go offline before visiting a blog post → offline page. It works, but it's reactive — the cache only grows as the user browses.
A better experience: silently cache content in the background while the user is online, so it's just there when they need it. No notification, no interruption — they're just pleasantly surprised to find posts available offline.
The Strategy: Slow, Polite, Background Fetching
The key design principle here is being a good network citizen. Don't blast a dozen requests the moment the page loads. Instead:
- Wait 5 seconds after SW start before doing anything — let the page finish its own loading first
- Wait 10 seconds between each post — slow drip, not a flood
- Work newest-first — most recent posts are most likely to be relevant
- Retry on failure — if a request fails, wait 10 seconds and try again
```js
let caching = false; // guard against concurrent caching runs

async function cacheAllPosts(forceReload = false) {
  if (caching) return; // already running
  caching = true;
  await delay(5000); // wait 5 seconds before starting

  // Get the list of posts
  let res;
  if (isOnline) {
    res = await fetch('/api/get-posts', { credentials: 'same-origin', cache: 'no-store' });
    if (res.ok) await cache.put('/api/get-posts', res.clone());
  } else {
    res = await cache.match('/api/get-posts');
  }
  if (!res) { caching = false; return; }

  let posts = await res.json();
  // newest-first: cache each post in turn, one slow drip at a time
  for (let post of posts) {
    await cachePost(post, forceReload);
  }
  caching = false;
}

async function cachePost(post, forceReload) {
  let postURL = `/post/${post.id}`;
  if (!forceReload) {
    let cached = await cache.match(postURL);
    if (cached) return; // already have it, skip
  }
  await delay(10000); // wait 10 seconds — be polite to the network
  try {
    let res = await fetch(postURL, { credentials: 'omit', cache: 'no-store' });
    if (res.ok) {
      await cache.put(postURL, res.clone());
    }
  } catch (err) {
    await delay(10000); // failed — retry in 10 seconds
    await cachePost(post, forceReload);
  }
}

function delay(ms) {
  return new Promise(res => setTimeout(res, ms));
}
```

When to Call It
Call cacheAllPosts() from main() — every time the SW starts, it begins the background caching process:
```js
async function main() {
  console.log(`Service Worker (v${version}) is starting`);
  await sendMessage({ requestStatusUpdate: true });
  await cacheLoggedOutFiles();
  cacheAllPosts(); // no await — fire and forget in the background
}
```

No await here — you don't want SW startup to block waiting for all posts to cache. Fire it off and let it run in the background.
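The effect of skipping the `await` can be seen with a runnable sketch. Everything here is a mock: `delay` fakes the slow caching run, and `cacheAllPostsSketch`/`mainSketch` are hypothetical stand-ins for the real functions:

```js
// Sketch of why the missing await matters. The names here are
// hypothetical; delay stands in for the real background caching work.
function delay(ms) { return new Promise(res => setTimeout(res, ms)); }

let log = [];

async function cacheAllPostsSketch() {
  await delay(50); // pretend this is the long background caching run
  log.push('caching done');
}

async function mainSketch() {
  cacheAllPostsSketch(); // no await — startup doesn't wait for it
  log.push('startup done');
}
```

Awaiting `mainSketch()` resolves as soon as startup work is done; the background run finishes on its own later.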
On activation with forceReload = true, re-cache everything fresh:
```js
async function handleActivation() {
  await clients.claim();
  await clearCaches();
  await cacheLoggedOutFiles(/* forceReload= */ true);
  cacheAllPosts(/* forceReload= */ true); // also refresh all posts
}
```

Seeing It Work ✅
In the Network tab after a clean SW install:
- Page loads normally
- After ~5 seconds — the `/api/get-posts` request fires
- After ~10 more seconds — the first blog post is fetched
- After ~10 more seconds — the second blog post is fetched
Check the Application tab → Cache Storage — both posts are now cached even though they were never visited. Go offline, navigate to them — they load perfectly.
"They won't get all of them necessarily all at once — they'll get a few dozen each time they visit. Over time, somebody would have gotten all of those posts."
The Point of All This
"This still feels like a website — and that's kind of the point. We're not trying to turn this into an application. We're just trying to turn this into the kind of user experience that somebody would expect if they weren't thinking about all the vagaries of bad networks or servers being down. They would just kind of expect it to work — and we're now giving them that."
Next up: Storing Form Data in IndexedDB — saving unsubmitted posts when the user goes offline.
Lecture 2: Storing Form Data in IndexedDB 💾
The Problem: Losing Unsaved Work Offline
The user is writing a blog post. They go offline. They click "Add Post". It fails. Now they're frantically copying their text into a separate editor, hoping they remember to come back and post it later.
This is a solvable problem — and a common one. The same pattern applies to any form: comments, drafts, messages.
The Solution: Auto-Save to IndexedDB
As the user types in the form, silently save the content to IndexedDB in the background. If the post fails or the tab is closed, the content is still there on the next visit.
Why IndexedDB and not localStorage?
- Service workers cannot access `localStorage` — they can only use IndexedDB
- IndexedDB is accessible from both the page and the service worker — this is the key that makes the full pattern work
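The raw IndexedDB API is callback-based (`IDBRequest` objects with `onsuccess`/`onerror` handlers), which is why promise wrappers exist. A minimal sketch of the promisifying trick: `fakeRequest` is a stand-in for a real `IDBRequest` so the snippet runs anywhere, and this is essentially what small wrapper libraries do under the hood.

```js
// Wrap a callback-style, IDBRequest-like object in a Promise so it can be awaited.
function promisifyRequest(request) {
  return new Promise((resolve, reject) => {
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

// Simulated IDBRequest that "succeeds" asynchronously with a result
function fakeRequest(result) {
  let req = { result, onsuccess: null, onerror: null };
  setTimeout(() => req.onsuccess && req.onsuccess(), 0);
  return req;
}
```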
```js
// In blog.js — listen for changes on the form fields
postTitleInput.addEventListener('change', saveFormData);
postBodyInput.addEventListener('change', saveFormData);

let formData = {};

function saveFormData() {
  formData = {
    title: postTitleInput.value,
    body: postBodyInput.value
  };
  // save to IndexedDB using a promise-based library
  idb.set('post-backup', formData);
}
```

Kyle uses a small promise-wrapper library over the raw IndexedDB API (which is notoriously callback-heavy). On page load, check if a backup exists and restore it:
```js
async function restoreFormData() {
  let backup = await idb.get('post-backup');
  if (backup) {
    postTitleInput.value = backup.title;
    postBodyInput.value = backup.body;
  }
}
```

Friendly Offline Message
When the user clicks "Add Post" while offline, instead of a cryptic failure, check isOnline and show a helpful message:
```js
// Exposed from blog.js so the Add Post page can access it
function getOnlineStatus() {
  return isOnline;
}
```

"You seem to be offline currently. Please try posting once you come back online."

The form data is safe in IndexedDB — the user can close the tab and come back later without losing anything.
Clearing the Backup on Successful Post
Once the post is successfully sent to the server, the backup is no longer needed. The service worker handles this — in the router, after a successful /api/add-post request:
```js
if (reqURL === '/api/add-post' && res.ok) {
  // clear the IndexedDB backup — post was successfully saved
  await idb.delete('post-backup');
}
```

This keeps the service worker and page in sync. The SW detects the successful POST and cleans up the backup automatically — no extra logic needed on the page.
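The cleanup step can be isolated and exercised on its own. In this sketch, `idb` and the in-memory `backups` map are mocks standing in for the real wrapper and IndexedDB store; in the actual SW this logic runs inside the fetch handler's routing.

```js
// Mocked sketch of the SW-side cleanup after the router sees a response.
let backups = new Map([['post-backup', { title: 'Draft', body: 'text' }]]);
const idb = {
  delete: async (key) => { backups.delete(key); }
};

async function afterResponse(reqURL, res) {
  if (reqURL === '/api/add-post' && res.ok) {
    // the post reached the server, so the local backup is stale
    await idb.delete('post-backup');
  }
}
```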
The Background Sync Connection
Kyle flags that this IndexedDB pattern is the manual version of what Background Sync was designed for:
- User goes offline mid-post → data saved to IndexedDB
- Background Sync detects connectivity restored
- SW grabs the content from IndexedDB, posts it to the server automatically
- SW clears the backup
Without Background Sync (due to limited browser support), the user has to manually click "Add Post" again when back online. But the data is safe either way.
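The four steps above can be sketched end to end with mocks. In a real service worker this would live in a `sync` event handler (registered from the page via the Background Sync API); here `idb` and `sendToServer` are stand-ins so the flow is runnable anywhere.

```js
// Mocked sketch of the Background Sync flow described above.
let idbStore = new Map();  // stand-in for IndexedDB
let serverPosts = [];      // stand-in for the server

const idb = {
  get: async (key) => idbStore.get(key),
  set: async (key, val) => { idbStore.set(key, val); },
  delete: async (key) => { idbStore.delete(key); }
};

async function sendToServer(post) { serverPosts.push(post); }

// What the SW's sync handler would do once connectivity returns:
async function onSync() {
  let backup = await idb.get('post-backup');
  if (backup) {
    await sendToServer(backup);      // post it automatically
    await idb.delete('post-backup'); // then clear the backup
  }
}
```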
The Full UX Flow
```text
User typing → change event → save to IndexedDB

User goes offline → clicks Add Post
  → SW detects POST failure
  → page shows "you seem to be offline"
  → data still in IndexedDB

User closes tab, comes back later
  → page loads → checks IndexedDB → restores form data

User comes back online → clicks Add Post
  → SW detects successful POST
  → SW clears IndexedDB backup
  → clean slate
```

Section Recap 🎯
- Proactive background caching — fire-and-forget from `main()`, 5s delay to start, 10s between posts, newest-first, retry on failure
- IndexedDB for form persistence — auto-save on `change`, restore on page load, clear on successful server POST
- Both patterns share a theme — the SW working quietly in the background so the user never has to think about connectivity