GIS, Open-Source

Corner UTM Grid Labels in QGIS Print Composer

In this post I’ll demonstrate generating UTM grid labels in the QGIS print composer, formatted to emphasise the relevant information in a consistent way. Printing the full 6- or 7-figure reference at each grid interval is unnecessary and frequently unhelpful, as we often just need to read off or locate a 3-figure reference.

Abbreviated UTM Grid References?

When reading a UTM grid reference on a typical 1:25000-50000 scale topographic map featuring a 1km grid interval, a 6-figure grid reference (3 figures for Eastings and 3 for Northings) describes a location to 100m precision and is typically sufficient for locating a position on the map.

A full UTM grid reference describes a position to 1m precision. Therefore, we typically highlight the significant figures on the map to make reading off a 6-figure reference easier.

Of course we might still want to know the full reference, so there should be some full grid references given. For consistency, these are often placed in the corners of the map. To find the full reference for a shortened reference, go to the left/bottom of the map and count up.
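As a concrete sketch (using made-up coordinate values), abbreviating a full reference to a 6-figure one just keeps the 10km, 1km and 100m digits of each axis:

```python
# Hypothetical full UTM position, 1 m precision:
easting = 345678    # 6 digits
northing = 6234567  # 7 digits

def six_figure(easting, northing):
    """Abbreviate a full UTM position to a 6-figure (100 m) reference:
    the 10 km, 1 km and 100 m digits of each axis."""
    e, n = str(easting), str(northing)
    return e[-5:-2] + n[-5:-2]

print(six_figure(easting, northing))  # -> "456345"
```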

Here’s an example from a local state-issued topographic map:

Screen Shot 2017-05-10 at 12.02.15

QGIS print composer has the ability to draw and label a UTM grid. However, its styling options are somewhat limited out-of-the-box, restricted to a full reference at each interval.

Rendering sub/superscript labels

Most Unicode-compatible fonts contain glyphs for superscript and subscript digits:


QGIS does not handle superscript/subscript formatting internally, but we can achieve the same effect by substituting the numbers with their Unicode sub/superscript equivalent characters. Note that this technique is limited to the common Arabic numerals 0-9.

In the Python code below the function UTMFullLabel performs two operations:

  • Determine the non-significant figures of the reference to conditionally format. The index of those figures depends on whether the label is an Easting (6 digits) or a Northing (7 digits).
  • Substitute those figures with their subscript equivalents

The function UTMMinorLabel merely returns the significant figures. There are two functions as two separate grids are defined in the print composer, and using two formatting functions avoids also handling grid interval logic in Python.

from qgis.utils import qgsfunction
from qgis.gui import *

@qgsfunction(args="auto", group='Custom')
def UTMMinorLabel(grid_ref, feature, parent):
    #keep only the two principal (km) digits
    return "{:0.0f}".format(grid_ref)[-5:-3]

@qgsfunction(args="auto", group='Custom')
def UTMFullLabel(grid_ref, axis, feature, parent):
    gstring = "{:0.0f}".format(grid_ref)
    rstr = gstring[-3:]   #last 3 digits (100m/10m/1m)
    mstr = gstring[-5:-3] #the two principal (km) digits
    #leading digits: 1 for a 6-digit Easting, 2 for a 7-digit Northing
    lstr = ''
    if len(gstring) == 6:
        lstr = gstring[0:1]
    elif len(gstring) == 7:
        lstr = gstring[0:2]
    return "{0}{1}{2}m{3}".format(sub_scr_num(lstr), mstr, sub_scr_num(rstr), 'E' if axis == 'x' else 'N')

def sub_scr_num(inputText):
    """Converts any digits in the input text into their Unicode subscript
    equivalent. Expects a single string argument, returns a string."""
    subScr = (u'\u2080',u'\u2081',u'\u2082',u'\u2083',u'\u2084',u'\u2085',u'\u2086',u'\u2087',u'\u2088',u'\u2089')
    outputText = ''
    for char in inputText:
        charPos = ord(char) - 48
        if charPos < 0 or charPos > 9:
            outputText += char
        else:
            outputText += subScr[charPos]
    return outputText

For superscript formatting, the equivalent substitution table is:

supScr = (u'\u2070',u'\u00B9',u'\u00B2',u'\u00B3',u'\u2074',u'\u2075',u'\u2076',u'\u2077',u'\u2078',u'\u2079')
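As an aside, in Python 3 the same digit substitution can be written more compactly with str.translate; this sketch is equivalent to the sub_scr_num function above:

```python
# Translation tables mapping ASCII digits to their Unicode subscript
# and superscript equivalents; all other characters pass through.
SUB = str.maketrans("0123456789",
                    "\u2080\u2081\u2082\u2083\u2084\u2085\u2086\u2087\u2088\u2089")
SUP = str.maketrans("0123456789",
                    "\u2070\u00B9\u00B2\u00B3\u2074\u2075\u2076\u2077\u2078\u2079")

def sub_scr_num(text):
    """Convert any digits in text to Unicode subscript; other characters untouched."""
    return text.translate(SUB)

print(sub_scr_num("058m"))  # digits become subscript, 'm' is untouched
```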

Now that we can generate long and short labels they need to be placed appropriately on the grid:

  • Full labels for the grid crossings closest to the corners of the map
  • Abbreviated labels for all other grid positions

This logic is handled in QGIS’s expression language (it could be handled in Python too by passing in the map extent). In plain English, the procedure is:

  • Get the extent (x/y minimum and maximum values) of the map.
  • Test which is closer to the map boundary
    • The axis minimum/maximum plus/minus the grid interval
    • The supplied grid index being labelled
    • Neither – they’re coincident (i.e. the first label is at the axis)

If the supplied value is closer to the map boundary, or they’re coincident, then this must be the first label and it should be rendered as a full reference.

  CASE WHEN @grid_axis = 'x' THEN
    CASE
      WHEN x_min(map_get(item_variables('main'),'map_extent')) + 1000 >= @grid_number THEN UTMFullLabel( @grid_number, @grid_axis)
      WHEN x_max(map_get(item_variables('main'),'map_extent')) - 1000 <= @grid_number THEN UTMFullLabel( @grid_number, @grid_axis)
      ELSE UTMMinorLabel(@grid_number)
    END
  WHEN @grid_axis = 'y' THEN
    CASE
      WHEN y_min(map_get(item_variables('main'),'map_extent')) + 1000 >= @grid_number THEN UTMFullLabel( @grid_number, @grid_axis)
      WHEN y_max(map_get(item_variables('main'),'map_extent')) - 1000 <= @grid_number THEN UTMFullLabel( @grid_number, @grid_axis)
      ELSE UTMMinorLabel(@grid_number)
    END
  END

Finally to tie all of this together, in the Print Composer three grids are defined:

  • 1000m UTM grid for lines and labels
  • 1000m UTM grid for external tick marks
  • 1 arc-second interval secondary graticule

Two UTM grids are required as QGIS can either draw grid lines or external ticks. Both were desired in this example.

Screen Shot 2017-05-10 at 12.59.16
Define two 1000m-interval UTM grids and an arc second-interval Lat/Lon grid

Screen Shot 2017-05-10 at 13.00.36

For the grid rendering the labels, set the interval to 1000m, apply the custom formatting described above, and only label northings on the left/right and eastings on the top/bottom.

The formatting of the labels should look something like the screenshot above.

The only thing I haven’t covered here in the interest of clarity is the selective rotation of the Y-axis labels as seen in the state-issued topo map. This could be achieved by using an additional grid and setting the rotation value appropriately.

GIS, Open-Source

Generative Pseudo-Random Polygon Fill Patterns in QGIS

QGIS doesn’t support pseudo-random fill patterns out-of-the-box. However, using the Geometry Generator we can achieve the same effect.

Random fill pattern? Here’s a polygon with such a fill. These are useful if, say, we want to represent an area of intermittent water.

Screen Shot 2017-05-05 at 19.53.11
Clipped Grid with 100% randomness

A typical way to achieve this effect would be to generate a random fill texture and use a pattern fill. That approach is less computationally intensive to render; however, QGIS cannot render a texture fill as a vector even in vector output formats, which means all such fills will be rasterised. If that means nothing to you, don’t worry.

The typical way to generate a randomised fill pattern is firstly to draw a bounding box around the feature of interest, create a grid of points that covers the feature, then only retain those points that also intersect the feature of interest. In effect, the feature geometry is used as a clipping mask for a uniform grid.

For the points that remain after clipping, we can optionally add an amount of randomness to the X,Y value of each grid intersection between zero and the size of a grid element. With no randomness, of course we see the grid pattern.
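Setting the QGIS specifics aside, the grid-plus-jitter portion of that procedure can be sketched as a standalone Python helper (hypothetical, for illustration; the clipping step needs QGIS itself):

```python
import math
import random

def jittered_grid(x_min, y_min, width, height, interval, rand):
    """Points of a grid covering a width x height box; each point is
    offset by up to rand * interval (rand in 0..1) in both axes.
    rand = 0 gives the regular grid, rand = 1 a full cell of offset."""
    count_x = int(math.ceil(width / interval))
    count_y = int(math.ceil(height / interval))
    pts = []
    for i in range(count_x + 1):
        for j in range(count_y + 1):
            px = x_min + i * interval + rand * random.uniform(0, interval)
            py = y_min + j * interval + rand * random.uniform(0, interval)
            pts.append((px, py))
    return pts
```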

The QGIS geometry generator comes in useful here: the geometry of a feature, in this case a polygon, is fed to a PyQGIS script that returns a multipoint geometry, which is then symbolised.

The Python code is below:

from qgis.core import *
from qgis.gui import *
from qgis.utils import qgsfunction
import math
import random

#Define a grid based on the interval and the bounding box of
#the feature. The grid will minimally cover the feature and be centre aligned
#
#Create a multi-point geometry at the grid intersections where
#the grid is enclosed by the feature - i.e. apply a clipping mask
#
#rand determines the amount of X/Y randomness (0-1) a particular
#point is allowed to have within its grid square
@qgsfunction(args='auto', group='Custom')
def fillGrid(xInterval, yInterval, rand, feature, parent):
  box = feature.geometry().boundingBox()

  #Create a grid that minimally covers the boundary
  #using the supplied intervals and centre it
  countX = int(math.ceil(box.width() / xInterval))
  countY = int(math.ceil(box.height() / yInterval))

  #Align the grid
  gridX = countX * xInterval
  gridY = countY * yInterval
  dX = gridX - box.width()
  dY = gridY - box.height()
  xMin = box.xMinimum() - (dX/2)
  yMin = box.yMinimum() - (dY/2)

  points = []
  #+1 to draw a symbol on the n+1th grid element
  for xOff in range(countX+1):
    for yOff in range(countY+1):

      ptX = xMin + xOff*xInterval + rand * random.uniform(0, xInterval)
      ptY = yMin + yOff*yInterval + rand * random.uniform(0, yInterval)

      pt = QgsPoint(ptX, ptY)
      point = QgsGeometry.fromPoint(pt)
      if feature.geometry().contains(point):
        points.append(pt)

  return QgsGeometry.fromMultiPoint(points)

Finally, in the symbology options for the layer, select ‘Geometry Generator’ as the fill for the layer, and call the Python function with values for X/Y intervals and the proportion (0-1) of randomness to add.

Screen Shot 2017-05-05 at 19.40.31

Note that the multipoint geometry returned is independent of the zoom level and also uses the co-ordinate reference system of the source layer, not the display layer. The eagle-eyed may have noticed in the screenshots above that a transformation appears to have been applied to the grid.

Indeed it has. The source layer is in WGS-84, while the map has a UTM projection.

Tackling scale and CRS dependence are topics for a later post.

Electronics, LED, Open-Source

Building an LED Techno Hat

Pam Rocking The Finished Hat

Rainbow Serpent Festival has become my annual ritualistic ‘turn the phone off, dance till my legs hurt, catch up with old friends, meet new ones, reflect and re-focus’ 21st century pilgrimage. It’s also a great place to check out fantastic visual art, and even if you’re not part of the official program there’s plenty of opportunity to roll your own. As the photographer Bill Cunningham once said:

The best fashion show is definitely on the street. Always has been, and always will be

Apparently this is the era of wearable tech, and with addressable colour LEDs coming down in price and up in brightness, density and packaging, let’s make a sound-reactive LED techno hat. You can never have too much eye candy on the dance floor.

The job was threefold:

  1. Get the Raspberry Pi driving the LED array
  2. Battery power everything from a small backpack
  3. Make it remote-controllable, preferably from a phone or other WiFi device

Pictures during and after construction

Fadecandy powered up on the bench
WS2811/SMD5050 Strip Cutting
Electronics all packed up
Components on the bench
Labelled Components




  • One (awful) hat. This one cost me about AUD$15 on eBay and is as horrendously purple as it looks
  • Lots of epoxy adhesive. It’s the only stuff I could get to stick the LED silicone tubes.
  • Usual craft and electronics bench equipment

Electronics Components

Raspberry Pi

The Pi is easily available and cheap. I wanted to use the Beaglebone, but the order didn’t arrive in time. Once overclocked to 1GHz, performance ended up being adequate.

The Pi boots into Debian from an SD card. The TP-Link dongle is run as an access point using hostapd + udhcpd; there are plenty of guides on the web for getting this working. I went with the TP-Link because, unlike other adaptors, it is compatible with the stock Debian hostapd binary; I gave up rebuilding the drivers/hostapd after running into a pile of kernel header issues.

All that remained was to install the GCC toolchain (build-essential) for building Fadecandy and the JVM for PixelController. I hijacked an existing init.d script to start both of these at boot.

Pi Audio Input

The Pi doesn’t have an analog audio input. The C-Media USB dongle is USB audio class-compliant and worked fine. I had to bastardise the one you can see in the picture, soldering a 100uF electrolytic capacitor across the input pins and borrowing +5V from the USB bus. This was due to the DC offset of the AGC output and its power requirements; the Adafruit product page for this device discusses this. I set the AGC to 40dB and left it there. Input gain in ALSA was set to ~75.


Fadecandy Open Pixel Control Server

This communicates with the Fadecandy over USB and exposes an Open Pixel Control socket to which we send the LED data. It’s also capable of re-mapping the pixel array to compensate for snaked or other non-sequential wiring. This wasn’t needed here, but I did take advantage of the colour and gamma correction for the array.
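For context, an Open Pixel Control message is trivially simple: a 4-byte header followed by raw RGB data. Here’s a minimal sketch of building and sending one in Python (assuming fcserver’s default port of 7890):

```python
import socket

def opc_packet(pixels, channel=0):
    """Build one Open Pixel Control message: a 4-byte header
    (channel, command 0 = "set pixel colours", 16-bit big-endian
    payload length) followed by 3 bytes of RGB per pixel."""
    data = bytearray()
    for r, g, b in pixels:
        data += bytes((r, g, b))
    header = bytes((channel, 0, len(data) >> 8, len(data) & 0xFF))
    return header + bytes(data)

# Sending a frame of 64 dim-red pixels to fcserver's default port:
# sock = socket.create_connection(("127.0.0.1", 7890))
# sock.sendall(opc_packet([(32, 0, 0)] * 64))
```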

I could have driven the LEDs directly from the Pi via the DMA hack, or added a DMX512 (RS-485) output and used an off-the-shelf driver board, but the Fadecandy does so much more. By performing temporal dithering, gamma correction and keyframe interpolation in 16-bit-per-channel colour, it handles low brightness levels, fades and transitions far more subtly and effectively than bit-banging a 24-bit colour buffer straight into the LEDs.
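For illustration, gamma correction on its own is just a lookup table. A naive 8-bit sketch (gamma value assumed here, not Fadecandy’s actual implementation) shows why the low end suffers without higher internal precision:

```python
# Naive 8-bit gamma lookup table. A gamma of ~2.2 (assumed)
# compresses low brightness levels, so many distinct dim inputs
# collapse to output 0 or 1 - exactly the visible stepping that
# Fadecandy's 16-bit processing and temporal dithering avoid.
GAMMA = 2.2
lut = [round(255 * ((i / 255) ** GAMMA)) for i in range(256)]

def correct(rgb):
    """Apply the gamma table to one (r, g, b) tuple."""
    return tuple(lut[c] for c in rgb)
```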

PixelController LED matrix software

This Java application generates the eye candy. It has a few different visual generators, effects and mixers, and its Java codebase makes it cross-platform. Whether the Pi would be up to the task was questionable, but given that the Pi now sports official support from Oracle in the form of an ARM hard-float JVM (there’s a lot of floating-point code in PixelController), I was prepared to give it a shot.

I had grand plans to roll my own effects engine, but time was against me. PixelController had most of what I needed; flexible effects generation/preset control, headless ‘console’ mode running on the Pi and OSC remote control.

PixelController doesn’t talk OPC out-of-the-box nor can it talk natively to the FadeCandy via libUSB, so I wrote a patch to get it to talk to the FadeCandy OPC server over TCP. At the time of writing the patches haven’t been merged, but it’s here on Github if you want to try it.  (Edit: It was merged). There’s no multi-panel support or gamma/colour-correction offload support, but it’s good enough for this scenario.

TouchOSC Android

TouchOSC was used to remote-control PixelController. PixelController even contains a TouchOSC layout file. I tweaked the layout a bit to my own preference, but this part was a lot easier than expected!

TouchOSC/PixelController find each other using mDNS; no manual config was required.

LED Array Construction

The Fadecandy has 8 outputs, each of which can drive 64 LEDs.

I received 4 strips of 144 LEDs. They look like this with the silicone shielding removed.

A bit of simple math, cutting and soldering transforms these into eight 64-LED strips, with eight 8-LED strips left over; these will find a home on another project.

Power Considerations

Each WS2812 LED apparently draws 60mA @ 5V at full brightness, but I don’t trust this figure given they’re Ali Baba specials. Firing up some test code and a current meter, I got these values:

Nominal Value:

100% on 64 LEDs -> (60/1000) * 64 = 3.84A @5V

Tested Values:

50% on 64 LEDS -> 1.7A @ 5V

100% on 64 LEDS -> 3.3A @ 5V

They’re pulling a bit less current than expected, and the power curve appears fairly linear.

Scaling this up:

512 LEDS (8*64) -> 26.4A @ 5V (132W) @ 100% brightness

Wow, that’s pulling quite a lot of power, and this thing is going to be running off batteries.

Fortunately, a 12V lead-acid battery is used with a DC-DC step-down converter, for which we get the more manageable figure of 11A @ 12V. A lead-acid battery was chosen for its high output-current handling capability; it was certainly not chosen for its low weight! A few Li-Po packs were investigated, but getting high output current at a sensible price wasn’t proving fruitful.

We also need to take into account that this peak load will very rarely be reached. In practice the LEDs are extremely bright, and running them at full power was unnecessary most of the time. Unfortunately, scaling output to 50% or 75% most of the time meant losing some dynamic range in the LED output signal.

The step-down converter I used (not the 100W one that I ordered and didn’t receive) was rated to 25W RMS. Peak wasn’t quoted, but we can assume it’s about 25/0.707 ≈ 35W. As I was running at 50% maximum brightness most of the time with non-white output, I didn’t expect to experience (nor did I) any problems.
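The figures above can be sanity-checked with a quick power-budget calculation (pure arithmetic, using the measured per-strip current):

```python
# Back-of-the-envelope power budget from the measured figures above.
LED_COUNT = 8 * 64            # 512 LEDs
MEASURED_A_PER_64 = 3.3       # A @ 5V at 100% brightness (measured)
V_LED = 5.0
V_BATT = 12.0

current_5v = (LED_COUNT / 64) * MEASURED_A_PER_64  # total current at 5V
power = current_5v * V_LED                         # total power in watts
current_12v = power / V_BATT                       # draw from the 12V battery

print(current_5v, power, current_12v)  # roughly 26.4A, 132W, 11A
```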

Wiring for power

18AWG power wire from LEDs is rated to 16A (up to 300V typically)

Two of these run in parallel from the converter give us a 32A @ 5V capacity, which is more than plenty.

In the end I skimped on this part of the construction, given the under-speccing of the DC-DC converter.

Room for improvement?

Pi -> Beaglebone

Performance from the Pi was OK, but it felt underpowered; the effect frame rate was lower than I’d hoped for. I expect moving to the Beaglebone, with its ARMv7 core’s VFP + NEON support in addition to the higher clock rate, would yield better results. I’m sure further code optimisation and making use of the GPU in PixelController would have helped too.


Bigger Power Supply

The 100W 24V-5V stepdown converter didn’t arrive in time. This would have required two 12V batteries wired in series and beefing up of the 5V cabling.

Remote Display

The wearer can’t see what the damn thing is doing. Either get an extrovert friend to wear it while you press the buttons, or the remote app needs a remote view.

Actually, the whole thing would run just fine on a phone. Android supports USB OTG host mode, so a phone could drive the Fadecandy directly.