Over-Engineering Simple Problems: Browserify
I've been working on a project recently in Node.js with Koa and a series of other technologies. The user interface that I'm using to test the API as I go is incredibly basic - we'll be building a full client in the future with all the bells and whistles you'd expect of a modern web app. In the meantime, a bit of vanilla JavaScript and spectre.css are enough to get going with.
The Problem
One problem that I ran into was needing to replicate a bit of server-side logic (user-input-based pattern matching) on the front end, to show the output without having to run the incredibly intensive thing that the project does (incredibly specific, yes). I didn't want to bust out any sort of Webpack infrastructure for the basic client, which is currently a single page made up of a few server-rendered Handlebars templates.
On the other hand, there was no browser-compatible version of the library I was using on the server (Picomatch). To confound matters further, I didn't feel like installing and learning yet another command-line tool. What I wanted was a system where I could Browserify libraries on demand without having to do any additional setup.
The Solution Part 1: Quick And Dirty
There was no escaping the fact that Browserify will need to be used at some point, but there has to be a better way than manually bundling every library we might want to add as a script tag to the HTML page. Luckily, Browserify is fully available via a programmatic API as well as through the command line; a quick `npm i browserify` is enough to get started.
```javascript
const browserify = require('browserify')
const fs = require('fs-jetpack')

const RE_VENDOR = /^\/vendor\/(.*)$/

// .. other set up code ..

app.use(async (ctx, next) => {
  const matches = ctx.method.toUpperCase() === 'GET' && RE_VENDOR.exec(ctx.path)
  // false if the method is not GET,
  // null if the path doesn't start with /vendor/,
  // truthy (an array-like object) when we pass both criteria
  if (matches) {
    const [_, library] = matches
    ctx.body = browserify(fs.path('node_modules', library)).bundle()
    return
  }
  return next()
})
```
Version 1 of this solution is naive, but that's often a good foundation to work from. Creating a middleware for Koa is pretty easy - the URL pattern we needed to match is incredibly basic, so a regular expression does the trick - and Browserify's `.bundle()` method returns a stream, which Koa handles natively, streaming the output back to the client.
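To make the matching behaviour concrete, here's what that expression returns for a couple of paths (a standalone sketch, runnable in plain Node):

```javascript
const RE_VENDOR = /^\/vendor\/(.*)$/

// A matching path yields an array-like result with the capture group
// at index 1 - the library name we want to bundle.
console.log(RE_VENDOR.exec('/vendor/picomatch')[1]) // picomatch

// A non-matching path yields null, so the middleware falls through to next().
console.log(RE_VENDOR.exec('/api/users')) // null
```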
This does the trick, and any sensible developer would be happy with what they'd built and get on with the rest of their day. But that's not over-engineering, is it?
The Solution Part 2: Wrestling With Our Demons
Why should the server have to re-compile the library we want every time we request it? It's not like it's changing at all. Sure, this is a development environment and it only takes ~300ms to complete the request (round-trip), but that's 298ms too long, gosh darn it! The next step to creating just the best Koa middleware would be caching our output (did I hear someone say 'add configuration'? No? Ok).
An excellent library that I commonly use for handling file system interactions is fs-jetpack - synchronous by default, but with `async`-compatible versions of each function. This is quite a good default to have, as I will usually only be touching the file system during server boot (everything after that would be handled via interaction with The Cloud™). In this case, though, the asynchronous functions will come in handy.
One of the issues with version 1 is that it relies on the library exporting its internals in just the right way to be usable in the browser without extra introspection. Version 2 solves this with a quick hack - on request, it writes a small file to disk that attaches the root require of the library to the global object:
```javascript
// .. previous code ..
const matches = ctx.method.toUpperCase() === 'GET' && RE_VENDOR.exec(ctx.path)
if (matches) {
  const [_, library] = matches
  const outputDir = fs.dir('bundles') // <- Added
  const shim = `${ library }.vendor.js` // <- Added
  await outputDir.writeAsync(shim, `global['${ library }'] = require('${ library }')`) // <- Added
  ctx.body = browserify(outputDir.path(shim)).bundle() // <- Changed
  return
}
// .. rest of code ..
```
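To see what actually lands on disk, the template string above can be pulled out into a tiny helper (the helper name is mine, for illustration only):

```javascript
// Hypothetical helper mirroring the shim template used in the middleware
const shimSource = library => `global['${library}'] = require('${library}')`

console.log(shimSource('picomatch'))
// global['picomatch'] = require('picomatch')
```

When Browserify bundles that one-line file, the bundled require call resolves the real library and the assignment exposes it on the browser's global (window) object.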
Great! Now the library is attached to the window when it's included in the browser. Unfortunately, the development environment also reloads when the code is updated via nodemon, and creating a new file counts as a change. This kills the process and breaks our pipe, meaning that we never get the output in the browser. After a quick spot of research on configuring nodemon, I found the section of the docs that covers adding a property to your package.json file. With the following configuration, a subset of files will be ignored when checking for changes:
"nodemonConfig": {
"ignore": [
"**.vendor.js"
]
},
The Solution Part 3: In, Out, Shake It All About
So the library can be accessed on the window object in the browser via its package name, and we don't reload our server every time we request the package. But what about that load time? We're still bundling the package every time we request the library, and now we're also writing to disk on every request as well! The final part of our solution is two-fold: only write to disk when we actually need to, and write the bundle to disk when we generate it as well.
The former condition is fairly simple - we just need to do an existence check via our file system interface ("just handle errors when you try to read because of race conditions" doesn't apply here). The latter is a bit more awkward - we need to pipe our output to a file, then attach a read stream for the new file to the Koa response body. Even in an asynchronous environment, this maneuver takes some work.
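The deferred-promise pattern the solution relies on is small enough to sketch inline (a hypothetical minimal version - the published package may differ in detail):

```javascript
// A deferred exposes its promise's resolve/reject functions,
// so code elsewhere can settle the promise later.
class Deferred {
  constructor () {
    this.promise = new Promise((resolve, reject) => {
      this.resolve = resolve
      this.reject = reject
    })
  }
}

// Usage: resolve from one asynchronous context, await in another
const defer = new Deferred()
setTimeout(() => defer.resolve('finished'), 10)
defer.promise.then(value => console.log(value)) // finished
```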
I've previously published the `defer-class` package to smooth over the lack of a deferred promise in the native implementation, and it will be useful here. The pattern itself is simple: create a deferred object, pass it around wherever you might need it, then wait on it for another asynchronous construct to resolve it. In our case, the stream returned from `browserify#bundle()` is our other asynchronous construct. Combining what we previously had with our need for conditional writes and writing to disk gives us this as our final result:
```javascript
const browserify = require('browserify')
const fs = require('fs-jetpack')
const Defer = require('defer-class') // <- Added

const RE_VENDOR = /^\/vendor\/(.*)$/

app.use(async (ctx, next) => {
  const matches = ctx.method.toUpperCase() === 'GET' && RE_VENDOR.exec(ctx.path)
  if (matches) {
    const [_, library] = matches
    const outputDir = fs.dir('bundles')
    const shim = `${ library }.vendor.js`
    const bundle = `${ library }-bundle.vendor.js`
    // Serve the cached bundle if we've already built it
    if (outputDir.exists(bundle)) {
      ctx.body = outputDir.createReadStream(bundle)
      return
    }
    // Only write the shim when it doesn't already exist
    if (!outputDir.exists(shim)) {
      await outputDir.writeAsync(shim, `global['${ library }'] = require('${ library }')`)
    }
    const defer = new Defer()
    const output = outputDir.createWriteStream(bundle)
    output.on('finish', defer.resolve)
    browserify(outputDir.path(shim)).bundle().pipe(output)
    await defer.promise
    ctx.body = outputDir.createReadStream(bundle)
    return
  }
  return next()
})
```
Conclusion
The final result above still doesn't encompass the entire solution - it's missing error handling for the streams, the real version includes additional logging, and the packages that can be bundled should be limited to those present in the app's package.json file. Actually implementing those is a task that I leave for the reader - fairly easy, to be sure. Now all that you have to do in your project is include a script tag such as `<script src="/vendor/picomatch" type="application/javascript"></script>`, and any script after it can access `window.picomatch`. Libraries with names that don't play nicely with JavaScript identifiers (such as our previously used `defer-class` library) can be accessed with bracket notation and aliased to an easier name: `const Defer = window['defer-class']`.