How can I read a JSON file into server memory using Node.js?

I’m experimenting with Node.js and need to read a JSON file into memory so I can quickly access the data in my code. I’m unsure whether it’s better to use a .json file or a .js file for this purpose.

I know there are database solutions like MongoDB and Alfred, but for now, I just need a simple way to load JSON data into memory using Node.js.

What’s the best approach to achieve this?

If you’re just loading a JSON file into memory at the start of your program, the simplest way is to use Node.js’s built-in fs module with readFileSync().

const fs = require('fs');

const jsonData = JSON.parse(fs.readFileSync('data.json', 'utf8'));

console.log(jsonData);

This reads the JSON file synchronously, blocking until the data is in memory before your code continues executing. That's fine when you only load the file once at startup and don't need non-blocking behavior.

However, if the file is large, it might slow down your server startup.

For a non-blocking solution, use fs.readFile(), which allows Node.js to handle other operations while the JSON file is being read:

const fs = require('fs');

fs.readFile('data.json', 'utf8', (err, data) => {
    if (err) {
        console.error('Error reading file:', err);
        return;
    }
    // JSON.parse throws on malformed input, so guard it
    // rather than letting the exception crash the process.
    try {
        const jsonData = JSON.parse(data);
        console.log(jsonData);
    } catch (parseErr) {
        console.error('Invalid JSON:', parseErr);
    }
});

This is the recommended approach on a running server, since it avoids blocking the event loop while the file is read from disk.

If the JSON file doesn’t change frequently and is small, you can simply use:

const jsonData = require('./data.json');
console.log(jsonData);

This method is quick and easy, but Node.js caches the module: if the JSON file is modified on disk, you'll need to restart your server (or clear the cache entry) for the changes to take effect.