Array buffer

Import a binary file as an array buffer


The Test

In cases where a small amount of arbitrary binary data needs to be available synchronously within a module, it can be advantageous to inline the data into the JavaScript bundle. This test bundles a module that imports a binary file, expecting an ArrayBuffer as the result of the import.

import buffer from 'import-binary-somehow';

The build result should be a single JavaScript bundle containing the binary, which should be encoded in an efficient text format like base64.
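Whichever bundler is used, the inlined data ends up as a base64 string that has to be decoded back into an ArrayBuffer at runtime. A minimal decoder looks something like this sketch, using the standard atob (available in browsers and modern Node.js):

```javascript
// Decode a base64 string into an ArrayBuffer.
function base64ToArrayBuffer(base64) {
  const binary = atob(base64); // base64 → binary string
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i); // copy each byte
  }
  return bytes.buffer; // the underlying ArrayBuffer
}
```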



You can use browserify’s brfs transform to inline a binary file as a base64 string, then convert it to an ArrayBuffer with the base64-arraybuffer decoder.


Parcel 2 supports fs.readFileSync, which inlines the contents of the file as base64 and converts it into a Node.js Buffer.

However, the node Buffer polyfill is over 8.5k (gzipped), so it adds a lot of unnecessary overhead.

It seems likely that Parcel 2's plugin system will provide a solution once it's documented.



Rollup plugins are simple to write. Here, the plugin generates a module that exports the file as a base64 string, wrapped in a call to a function that converts base64 to an ArrayBuffer. The conversion function lives in a virtual module, so it isn't duplicated and can be code-split.


arraybuffer-loader is an off-the-shelf loader that encodes imported files as base64, then exports the decoded contents as an ArrayBuffer. This can also be achieved by chaining url-loader's base64 option with a quick base64-to-ArrayBuffer implementation. Since that's essentially what arraybuffer-loader does, we'll use the one that was already built.

One important caveat with arraybuffer-loader: it's designed to output code that runs in both the browser and Node.js. As a result, Webpack inlines a complete polyfill for Node.js' Buffer implementation even though that codepath never executes in the browser. To prevent this, set Webpack's node.Buffer configuration option to false.
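Put together, the configuration might look like this sketch (webpack 4 syntax, where the node.Buffer option exists; the .bin extension is a placeholder):

```javascript
// webpack.config.js
module.exports = {
  module: {
    rules: [
      {
        test: /\.bin$/,              // which imports get the loader
        use: ['arraybuffer-loader'],
      },
    ],
  },
  node: {
    // Prevent webpack from bundling its Buffer polyfill for the
    // loader's Node.js-only codepath.
    Buffer: false,
  },
};
```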