Minecraft Overviewer on an Azure storage static website

I recently started a Minecraft server on a cheap Azure VM. I like to run Minecraft Overviewer to generate a fancy map. The rendered map looks like this; you can scroll around and zoom, it’s great:

Rendered Minecraft map, focused on a village with a river running through the centre

I’m serving the map from an Azure Storage Account using the static website feature. I’ll go through how I’ve set it up:

  1. Render the map on the same machine as the Minecraft server (saves copying files between machines)
  2. Upload to Azure Storage with AzCopy
  3. Set up to run every day (early morning)

Rendering

To install Overviewer, follow the instructions on the Overviewer Documentation.

To define what to render, I’m using the configuration file. My config file looks like this:

worlds["world"] = "/home/me/path/to/minecraft/world"

renders["normal"] = {
	"world": "world",
	"title": "World",
	"rendermode": "smooth_lighting"
}

outputdir = "/home/me/path/to/output"

To run it:

/home/me/path/to/overviewer.py --config=/home/me/path/to/world.config

This generates the map as a website and saves it to /home/me/path/to/output. Once this is done (it takes a while) I upload it to an Azure Storage Account.

Rendering a copy

I found that if anyone is in the world when the render runs (and you’re rendering the live world files), the chunks people are working in can render incorrectly, so I render from a copy instead. Be careful when copying the world before rendering, though: Overviewer uses the file modified dates to determine whether chunks need to be re-rendered.

As recommended, I’m using cp’s -p flag (I’m running this on Ubuntu) when copying, to preserve the modified dates. It seems to be working nicely. My copy command looks like this:

cp -p -r /home/me/path/to/minecraft/server/world/* /home/me/path/to/minecraft/world
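As a quick sanity check that -p really does carry the modification times across (the file name and dates below are made up purely for the demo, using throwaway temp directories):

```shell
# Demo that cp -p preserves the modification time Overviewer checks.
# These are temp dirs and a dummy region file, not the real world data.
src=$(mktemp -d)
dst=$(mktemp -d)
touch -d '2020-01-01 00:00:00' "$src/r.0.0.mca"
cp -p "$src/r.0.0.mca" "$dst/r.0.0.mca"
# Both files report the same mtime, so Overviewer treats the copy
# as unchanged and only re-renders chunks that actually changed:
stat -c %Y "$src/r.0.0.mca"
stat -c %Y "$dst/r.0.0.mca"
```

Without -p the copy gets a fresh timestamp and every chunk looks "new" to Overviewer, forcing a full re-render each time.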

Uploading

Set up the storage account

To enable the static website on the storage account:

  1. Open the storage account
  2. Select Static website
  3. Set static website to Enabled
  4. Enter index.html as the Index document name - this is the entry point for the website generated by Overviewer
  5. Click Save

Screenshot of Storage Account in Azure Portal. Showing Static website section selected, static website enabled, and 'index.html' as the Index document name.

Enabling this creates a new container named $web. This is where the Overviewer render files will be uploaded.
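If you prefer the command line to the portal, the same thing can be done with the Azure CLI. A sketch, with the account name as a placeholder:

```shell
# Enable static website hosting on the storage account
# (equivalent to the portal steps above).
# <storage-account> is a placeholder for your account name.
az storage blob service-properties update \
    --account-name "<storage-account>" \
    --static-website \
    --index-document index.html
```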

Authenticating with the storage account

I’ve made use of a system-assigned managed identity for authenticating between the VM and the Storage Account. With this, I don’t have to manage any passwords or tokens - Azure manages this for me.

Enable managed identity

To set up the managed identity:

  1. Open the virtual machine page
  2. Select Identity
  3. Set status to On
  4. Click Save

This creates a new managed identity assigned to the virtual machine.

Screenshot of VM in Azure Portal. Showing Identity section selected, System assigned tab shown, with status set to 'On'.
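The portal steps above have a one-line Azure CLI equivalent, if you'd rather script it (resource group and VM names are placeholders):

```shell
# Enable the system-assigned managed identity on the VM.
# <resource-group> and <vm-name> are placeholders.
az vm identity assign \
    --resource-group "<resource-group>" \
    --name "<vm-name>"
```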

Set up storage account permissions

The identity that was just created can’t do anything. To give it permissions to upload to the storage account:

  1. Open the storage account
  2. Select Access control (IAM)
  3. Click Add > Add role assignment
  4. Select Storage Blob Data Contributor for Role
  5. Assign access to: Virtual Machine
  6. Select the subscription the virtual machine is in
  7. Select the virtual machine
  8. Click Save

Screenshot of Storage Account in Azure Portal. Showing Access control section selected, Role assignments tab shown. New role assignment modal shown with Storage Blob Data Contributor roles, Assigning access to Virtual Machine, Minecraft virtual machine selected.

The virtual machine will only show up in this list if it has a managed identity. If you try to do this step before enabling the managed identity, you won’t see anything here.
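This role assignment can also be scripted with the Azure CLI, which is handy if you rebuild the VM later. A sketch, with every angle-bracketed value a placeholder:

```shell
# Look up the VM's managed identity principal ID, then grant it
# Storage Blob Data Contributor on the storage account.
# All <...> values are placeholders.
principal_id=$(az vm show \
    --resource-group "<resource-group>" \
    --name "<vm-name>" \
    --query identity.principalId --output tsv)

az role assignment create \
    --assignee "$principal_id" \
    --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```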

Upload with AzCopy

Install AzCopy on the machine if you don’t have it yet.

It can take a few minutes for the role assignment to take effect. If you get permission errors when you run these next commands, just wait a bit. It took about 10 minutes to start working for me.

First, I run the login command:

azcopy login --identity

The --identity flag in this command tells AzCopy to log in using the System Assigned Managed Identity.

Next, upload the Overviewer output:

sudo azcopy sync --delete-destination true "/home/me/path/to/output" "https://<storage-account>.blob.core.windows.net/%24web"

I’m using the sync command so it uploads straight into the container, rather than also copying the containing folder like copy does. I’m sure there is a way to get copy to not upload the containing folder, but sync worked so I’m rolling with that.
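For what it’s worth, copy can be pointed at the folder’s contents instead of the folder itself with a wildcard on the source path. I haven’t battle-tested this myself, but it should look something like:

```shell
# Untested alternative: the trailing /* makes copy upload the contents
# of the output folder rather than the folder itself.
azcopy copy "/home/me/path/to/output/*" \
    "https://<storage-account>.blob.core.windows.net/%24web" \
    --recursive
```

Unlike sync, though, copy won’t delete blobs that no longer exist locally, so sync is still the better fit here.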

Then it’s done. Navigating to the Primary endpoint that was shown on the Static website page of the storage account will show the map!

Bandwidth…

If you have the Storage Account and the VM in the same Azure region, the bandwidth for the upload should be free. That said, if enough people start viewing the map you could still end up with a pretty decent bandwidth bill.

Automating

This works great and all, but I don’t plan on logging in to run this every time I want to update the map, so I’ve automated it with cron. I’m way out of my depth with Linux anything, so I just followed this How To Add Jobs To cron Under Linux or UNIX guide.

In summary:

  1. Run crontab -e to open the cron editor
  2. Add 0 16 * * * /bin/bash -c /home/me/path/to/render-and-upload.sh to run it every morning at 2am (I’m in AEST and the machine runs in UTC, so 1600 UTC is 0200 AEST)

Overviewer is pretty resource-hungry. If you’re rendering on the same machine as the server, probably don’t run it when people are playing.

Script

The render-and-upload.sh script that does all the work looks something like this:

#!/bin/bash

# Copy before rendering
rm -r /home/me/path/to/minecraft/world/*
cp -p -r /home/me/path/to/minecraft/server/world/* /home/me/path/to/minecraft/world

# Render
/home/me/path/to/overviewer.py --config=/home/me/path/to/world.config

# Authenticate
azcopy login --identity

# Clear the container before uploading. Sync is meant to clear out files that should not be there, but I had issues with it not detecting changes 100% of the time, which left the map looking strange.
sudo azcopy remove --recursive "https://<storage-account>.blob.core.windows.net/%24web"

# Upload
sudo azcopy sync --delete-destination true "/home/me/path/to/output" "https://<storage-account>.blob.core.windows.net/%24web"

And it’s done for real. Automated map generation and upload with no usernames, passwords, or tokens floating around.
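One refinement I’d consider (a sketch, not something from the setup above): as written, the script carries on to the remove and sync steps even if the render fails, which could blank the live map. Bash can be told to stop at the first error by adding this just under the shebang:

```shell
#!/bin/bash
# Fail fast: if the copy or render errors out, stop before the
# remove/upload steps can wipe the live site.
# -e: exit on any failed command
# -u: treat unset variables as errors
# -o pipefail: a pipeline fails if any command in it fails
set -euo pipefail
```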