Uploading Large Files in Express

I’m back from taking a break. I’ve been working on a number of projects that culminated in a crunch-time scenario: there was a lot to get done before I went on a vacation where I would have no computer and an Internet connection ranging from flaky to nonexistent. When going on trips I often bring some movies with me to watch when I’m taking a break and retreating to my lodging. For this trip, I did something a bit different than usual. Instead of copying my media directly to my mobile device, I decided I would copy it to a Raspberry Pi and set the Pi up as a movie streaming server. (More on my particular scenario at the end of this post.) Early last year I posted something on using Express and NodeJS for streaming. For this trip, I copied that project to a Pi that had a lot of storage and did some other things to prepare it for my environment. These included:

  • Setting up the Pi as a Wireless Access Point
  • Setting my Application to Run on Boot
  • Adding the Ability to Upload files

While I didn’t perform these changes during my trip, I realized there was some other functionality I wanted this solution to have, including:

  • Converting videos of other formats to MP4
  • Ensuring the MP4’s metadata was arranged for streaming
  • Adding some service to convert uploaded files as needed

Many of these changes could be topics of their own. Let’s talk about uploading files.

Uploading a File with express-fileupload

My first solution for uploading a file was to make a simple web form allowing a file to be selected and to add a router to my web application to handle file uploads. I’ll start off by telling you that this didn’t work. Skip ahead to the next section if you don’t want to read about this initial failed attempt. The web form itself is simple. A minimalist version of it follows.

<form action='/upload' enctype='multipart/form-data'  method='post'>
	<input type='file' id='videoUploadElement' name='video' />
	<input type='submit' />
</form>

On the server-side, I added a router to accept my file uploads and write them to a target folder. Upon receiving the bits of the file, it writes them to a designated folder, preserving the original file name.

//Do not use. For demonstration purposes only. While this
//appears to work, the entire file is loaded into memory
//before being written to disk. I've found that with large files
//this can easily slow down or halt the Pi. Use the streaming
//router shown in the next section instead.

const express = require('express');
const router = express.Router();

router.post('/', async(req,res) => {
	try {
		if(!req.files) {
			console.log(req.body);
			res.send({
				status: false,
				message: 'No File'
			});
		} else {
			let video = req.files.video;
			// Wait for the file to finish being written before responding.
			await video.mv('./uploads/' + video.name);
			res.send({
				status: true,
				message: ' File Uploaded',
				data: {
					name: video.name,
					mimetype: video.mimetype,
					size: video.size
				}
			});
		}
	} catch(err) {
		res.status(500).send(err);
	}
});
module.exports = router;

Testing it out on a small file, it works just fine. Testing it out on a large file locks up the Pi. The issue here is that **all** of the file must be loaded into memory before it is written to storage. The files I was uploading were feature-length high-definition videos. I tried uploading a 4.3 GB video and watched the Pi go from less responsive to completely unresponsive. This wasn’t going to work for my needs.

Uploading and Streaming to Storage with Busboy

Instead of using express-fileupload, I used connect-busboy. Using Busboy, I’m able to stream the bits of the file to storage as they are being uploaded. The complete file does **not** need to be loaded into memory. When the upload form sends a file, the busboy middleware makes the file’s data and metadata available through the request object. To save the file, create a file stream to which the data will be written and pipe the upload stream from busboy into that file stream.

//Uses Busboy. See: https://github.com/mscdex/busboy

const express = require('express');
const router = express.Router();
const path = require('node:path');
const Busboy = require('connect-busboy');
const fs = require("fs");

const UPLOAD_FOLDER = 'uploads';

if(!fs.existsSync(UPLOAD_FOLDER)) {
    fs.mkdirSync(UPLOAD_FOLDER, { recursive: true });
}

router.post('/', (req, res, next) => {
    try {
        console.log('starting upload');
        console.log(req.busboy);

        req.busboy.on('field', (name, val, info) => {
            console.log(name);
        });

        req.busboy.on('file', (fieldname, uploadingFile, fileInfo) => {
            console.log(`Saving ${fileInfo.filename}`);
            const targetPath = path.join(UPLOAD_FOLDER, fileInfo.filename);
            const fileStream = fs.createWriteStream(targetPath);
            uploadingFile.pipe(fileStream);
            fileStream.on('close', () => {
                console.log(`Completed upload ${fileInfo.filename}`);
                res.redirect('back');
            });
        });
        req.pipe(req.busboy);
    } catch (err) {
        res.status(500).send(err);
    }
});

module.exports = router;

A fairly complete version of my app.js follows, including the reference to the router that has the streaming upload method.

const express = require('express');
const createError = require('http-errors');
const path = require('path');
const { v4: uuidv4 } = require('uuid');
const Busboy = require('connect-busboy');
require('dotenv').config();

const app = express();
app.use(Busboy({
   immediate: true,
   limits: { fileSize: 10 * 1073741824 }, // 10 gigabyte file size limit
   highWaterMark: 4 * 1048576 // Set 4 megabyte buffer
}));

app.use(express.static('public'));


const libraryRouter = require('./routers/libraryRouter');
const videoRouter = require('./routers/videoRouter');
const uploadRouter = require('./routers/uploadRouter');
const streamRouter = require('./routers/streamRouter');

app.use('/library', libraryRouter);
app.use('/video', videoRouter);
app.use('/upload', uploadRouter);
app.use('/streamUpload', streamRouter);


app.use(function (req, res, next) {
   console.log(req.originalUrl);
   next(createError(404));
});
app.set('views', path.join(__dirname, 'views'));
app.engine('html', require('ejs').renderFile);
app.set('view engine', 'html');

With this in place, I have a working upload feature. I don’t want to stop there, though. FFmpeg runs on the Pi, and I may take advantage of it to do additional transformations on media to prepare it for streaming. I’m going to add code for handling that processing next.
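
I haven’t written that part yet, but here is a rough sketch of the direction I have in mind: run the same FFmpeg -movflags faststart step from my earlier streaming post from Node once an upload completes. This is a sketch only, not code from the project; the function name and paths are placeholders.

// A sketch only - not yet part of the project. Runs FFmpeg to
// rewrite an uploaded file so its metadata sits at the front of
// the file, which helps it stream. Names and paths are placeholders.
const { spawn } = require('node:child_process');

function prepareForStreaming(sourcePath, targetPath) {
    return new Promise((resolve, reject) => {
        const ffmpeg = spawn('ffmpeg', [
            '-i', sourcePath,          // the uploaded file
            '-c', 'copy',              // copy streams; no re-encode
            '-movflags', 'faststart',  // move metadata to the front
            targetPath
        ]);
        ffmpeg.on('error', reject);
        ffmpeg.on('close', (code) => {
            if (code === 0) resolve(targetPath);
            else reject(new Error(`ffmpeg exited with code ${code}`));
        });
    });
}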

Getting the Code

If you want to try the code out, it is on GitHub. You can find it here: https://github.com/j2inet/VideoStreamNode.

About my Scenario

I only brought the Pi for movies, but I ended up using it for a lot more. I had an iPhone and an Android phone with me on the trip. I managed to fill up the storage on my iPhone and needed a place to move some video. The Android phone had plenty of space, but there was no direct method to transfer data between the two devices. At one point I also wanted to edit video using Adobe Rush, only to find that Adobe doesn’t allow Rush to be installed on Samsung’s latest generation of phones at this time. My iPhone had Rush, so I had to get the video transferred from my Android phone to my iPhone to do this.

Had these use cases come to mind beforehand, I probably would have acquired a USB adapter for my iPhone (the Android phone will connect to my USB-C drives with no special adapters needed). I’m sure there are apps that could be used to bridge this gap, but my iPhone has no coverage when I’m out of the country, so I didn’t have a chance to try such apps out. Some part of me feels that it is a bit silly that data can’t move between these two systems without some intermediary, such as another computer or a dongle and drives. But given that we are still not at a point where media in text messages is reliably transferred without quality loss, it isn’t surprising.

Given more time, I might make an app specifically for transferring data between devices in the absence of a common network. (This is not something I am promising to do though).



Video Streaming with Node and Express

I’ve got a range of media that I’m moving from its original storage to hard drives. Among this media are some DVDs that I’ve collected over time. It took a while, but I managed to convert the collection of movies and TV shows to video files on my hard drive. Now that they are converted, I wanted to build a solution for browsing and playing them. I tried using a drive with DLNA built in, but the DLNA clients I have appear to have been built with a smaller collection of videos in mind. They present an alphabetical list of the video files. Not the way I want to navigate.

I decided to instead make my own solution. To start, though, I wanted to make a solution that would stream a single video file. Unlike most HTML resources, which are relatively small, video files can be several gigabytes. Rather than have the web server return the file in its entirety, I need the web server to present the file in chunks. My starting point is a simple NodeJS project that presents HTML pages through Express.

const express = require('express');
const fileUpload = require('express-fileupload');
const session = require('express-session');
const bodyParser = require('body-parser');
const createError = require('http-errors');
const path = require('path');
const { uuid } = require('uuidv4');
require('dotenv').config();

var sessionSettings = {
   saveUninitialized: true,
   secret: "sdlkvkdfbjv",
   resave: false,
   cookie: {},
   unset: 'destroy',
   genid: function (req) {
      return uuid();
   }
}

const app = express();
app.use(session(sessionSettings));
if (app.get('env') === 'production') {
   app.set('trust proxy', 1);
   sessionSettings.cookie.secure = true;
}
app.use(express.static('public'));
app.use(bodyParser.json());
app.use(fileUpload({
   createParentPath: true // create the target folder if it does not exist
}));
app.use(function (req, res, next) {
   console.log(req.originalUrl);
   next(createError(404));
});
app.set('views', path.join(__dirname, 'views'));
app.engine('html', require('ejs').renderFile);
app.set('view engine', 'html');

module.exports = app;

With the above application, any files that are put in the folder named “public” will be served as static content when requested. The stylesheet, JavaScript, HTML, and other static content will be placed in that folder. The videos will be in another folder that is not part of the project. The path to this folder is specified by the setting VIDEO_ROOT in the .env file.
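
For reference, the .env file only needs that one setting for this part of the project; the path below is just a placeholder for wherever the video files actually live.

# .env - the path is a placeholder; point it at your own video folder
VIDEO_ROOT=/mnt/media/videos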

For this to stream files, there are two additional routes that I am going to add. One route will return a list of all of the video IDs. The other route will return the video itself.

For this first iteration of video streaming, I’m going to return file names as video IDs. At some point during the development of my solution this may change, but for testing streaming the file name is sufficient. The route handler for the library will get a list of the files and return it in a structure that is marked with a date. The files it returns are filtered to only include those with an .mp4 or .m4v extension.

const fs = require('fs');
const express = require('express');
require('dotenv').config();

var router = express.Router();

var fileInformation = { 
    lastUpdated: null, 
    fileList: []
}

function isVideoFile(path) { 
    return path.toLowerCase().endsWith('.mp4')||path.toLowerCase().endsWith('.m4v');
}

function updateFileList() { 
    return new Promise((resolve,reject)=> {
        console.log('getting file list');
        console.log([process.env.VIDEO_ROOT])
        fs.readdir(process.env.VIDEO_ROOT, (err, files) => {
            if(err) reject(err);
            else {
                var videoFiles = files.filter(x=>isVideoFile(x));
                fileInformation.fileList = videoFiles;
                fileInformation.lastUpdated = Date.now();
                resolve(fileInformation);
            } 
        });
    });
}

router.get('/',(req,res,next)=> {
    console.log('library')
    updateFileList()
    .then(fileList => {
        res.json(fileList);
    })
    .catch(err => {
        console.error(err);
        res.status(500).json(err)
    });
});

module.exports = router;
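
A response from this route looks something like the following. The timestamp comes from Date.now(), and the file names here are just placeholders.

{
    "lastUpdated": 1650000000000,
    "fileList": [
        "SomeMovie.mp4",
        "AnotherMovie.m4v"
    ]
}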

The video element in an HTML page will download a video in chunks (if the server supports range headers). The video element sends a request with a header stating the byte range being requested. In the response, a header states the byte range that is being sent. Our Express application must read the range header and parse out the range being requested. The range header contains a starting byte offset and may or may not contain an ending byte offset. The value of the Range header may look something like the following.

bytes=0-270
bytes=500-

In the first example, there is a starting and ending byte range. In the second, the request only specifies a starting byte. It is up to the server to decide how many bytes to send. This header is easily parsed with a couple of String.split operations and integer parsing.


function getByteRange(rangeHeader) {
    var byteRangeString = rangeHeader.split('=')[1];
    var byteParts = byteRangeString.split('-');
    var range = [];
    range.push(Number.parseInt(byteParts[0]));
    if(byteParts[1].length == 0 ) {
        range.push(null);
    } else {
        range.push(Number.parseInt(byteParts[1]))
    }
    return range;
}
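
Running the two example header values from above through this function gives the following.

getByteRange('bytes=0-270');  // [0, 270]
getByteRange('bytes=500-');   // [500, null]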

There is the possibility that the second number in the range is not there, or is present but is outside of the range of bytes for the file. To handle this, there’s a default chunk size defined that will be used when the byte range is not specified. But the range is also checked against the file size and clamped to ensure that there is no attempt to read past the end of the file.

const CHUNK_SIZE = 2 ** 18;
//...
var start = range[0];
if(range[1] == null)
    range[1] = Math.min(fileSize - 1, start + CHUNK_SIZE);
var end = range[1];
// Byte ranges are inclusive, so the last valid offset is fileSize - 1.
end = Math.min(end, fileSize - 1);

In the response, there are headers defining the range of bytes being returned and its length. We build out those headers, set them on the response, and then write the range of bytes. To write out the bytes, a read stream is created from the video file and piped to the response stream.

const contentLength = end - start + 1;
const headers = { 
    "Content-Range": `bytes ${start}-${end}/${fileSize}`,
    "Accept-Ranges":"bytes",
    "Content-Length": contentLength,
    "Content-Type": getContentType(videoID)
};

console.log('headers', headers);
res.writeHead(206, headers);
const videoStream = fs.createReadStream(videoPath, {start, end});
videoStream.pipe(res);
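
The post shows the video route in pieces. For anyone wanting to see how those pieces might fit together, here is a sketch of one possible videoRouter. The ':id' route parameter and the getContentType helper are my own stand-ins rather than the project’s actual structure; see the GitHub repository for the real router.

// A sketch of how the pieces above might fit together. The ':id'
// parameter and getContentType are stand-ins for the real project.
const fs = require('fs');
const path = require('path');
const express = require('express');
require('dotenv').config();

const router = express.Router();
const CHUNK_SIZE = 2 ** 18;

function getContentType(videoID) {
    // Both extensions served here are MP4 containers.
    return 'video/mp4';
}

function getByteRange(rangeHeader) {
    // The same parsing helper shown earlier in the post.
    const byteParts = rangeHeader.split('=')[1].split('-');
    return [
        Number.parseInt(byteParts[0]),
        byteParts[1].length === 0 ? null : Number.parseInt(byteParts[1])
    ];
}

router.get('/:id', (req, res) => {
    // Note: a real implementation should validate the ID to prevent
    // path traversal outside of VIDEO_ROOT.
    const videoID = req.params.id;
    const videoPath = path.join(process.env.VIDEO_ROOT, videoID);
    const fileSize = fs.statSync(videoPath).size;

    const rangeHeader = req.headers.range;
    if (!rangeHeader) {
        res.status(400).send('Range header required');
        return;
    }

    const range = getByteRange(rangeHeader);
    const start = range[0];
    // Byte ranges are inclusive, so clamp to the last valid offset.
    const end = Math.min(
        range[1] == null ? start + CHUNK_SIZE : range[1],
        fileSize - 1);

    const headers = {
        "Content-Range": `bytes ${start}-${end}/${fileSize}`,
        "Accept-Ranges": "bytes",
        "Content-Length": end - start + 1,
        "Content-Type": getContentType(videoID)
    };

    res.writeHead(206, headers);
    fs.createReadStream(videoPath, { start, end }).pipe(res);
});

module.exports = router;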

The server can now serve video files for streaming. For the client side, some HTML and JavaScript is needed. The HTML contains a video element and a <div/> element that will be populated with a list of the videos.

<!DOCTYPE html>
<html>
    <head>
        
        <link rel="stylesheet" href="./style/main.css" />
        <script src="scripts/jquery-3.5.1.min.js"></script>
        <script src="scripts/main.js"></script>
    </head>
    <body>
        <div id="videoBrowser" ></div>
        <video id="videoPlayer" autoplay controls></video>
    </body>
</html>

The JavaScript will request a list of the videos from the /library route. For each video file, it will create a text element containing the name of the video. Clicking on the text sets the src attribute on the video element.

function start() { 
    fetch('/library')
    .then(data=>data.json())
    .then(data=> { 
        console.log(data);
        var elementRoot = $('#videoBrowser');
        data.fileList.forEach(x=>{
            var videoElement = $(`<div>${x}</div>`);
            $(elementRoot).append(videoElement);
            $(videoElement).click(()=>{
                var videoURL = `/video/${x}`;
                console.log(videoURL);
                $('#videoPlayer').attr('src', videoURL );
            })
        });
        
    });
}

$(document).ready(start);

Almost done! The only thing missing is adding these routes to app.js. As it stands now, app.js only serves static files. The routes need to be registered before the catch-all 404 handler so that requests can reach them.

const libraryRouter = require('./routers/libraryRouter');
const videoRouter = require('./routers/videoRouter');
app.use('/library', libraryRouter);
app.use('/video', videoRouter);

I started the application (npm start) and at first I thought that it was not working. The problem was in the encoding of the first MP4 file that I tried. There is a range of different encoding options that one can use for MP4 files. Looking at the encoding properties of two MP4 files (one streamed successfully, the other did not), there was no obvious difference at first.

The problem was with metadata stored in the file. A discussion of video encodings is a topic that could be several posts of its own, but the short explanation is that we need to ensure that the metadata is at the beginning of the file. We can use ffmpeg to write a new file with the metadata moved to the front. Unlike re-encoding a file, this process leaves the video data untouched. I used the tool on a movie and it completed within a few seconds.

./ffmpeg  -i Ultraviolet-1.mp4  -c copy -movflags faststart Ultraviolet.mp4

With that change applied, the videos now stream fine.

If you would like to try this code out, it is available on GitHub at the following URL.

https://github.com/j2inet/VideoStreamNode