New kit

I upgraded to an OS WMR100: about as close to a proper weather station as it gets. As well as measuring temperature and humidity like the RMS300, it also has wireless wind and rain gauges, the option for a UV sensor and sufficiently improved range that putting the sensors outdoors is feasible.

Finding a suitable place to put the outdoor parts (the temperature sensor and wind gauge are designed to be mounted on an included pole; the rain gauge is meant to sit close to ground level, with screws to make sure it’s level; both should be in open space) was awkward, but generally it all works.
There are a few issues: the signal from the wind gauge can be quite flaky (the console is well inside range, though there are walls which will affect that) and the cover on the temperature sensor is plastic, so it’s quite poor at stopping the sun getting through.
Ideally, the two would be in separate places so that one could be in clear space and the other shaded.

As before, I use the WMR100 USB client to get data from the console and dump it into a text file, then parse that; unlike before, the data goes into PostgreSQL and the graphs are drawn with matplotlib, rather than using rrdtool for both.

The combination has some advantages (rrdtool doesn’t really work for rain data, and matplotlib is a lot more customizable, although the documentation sucks). It’s a huge dataset to deal with (a month’s worth of wind speed/direction data is 192k data points if every measurement is received; there are roughly 4 million lines in data.log right now, and it grows at a rate of 19 thousand lines a day), so a lot of work had to be done to get processing fast enough to be usable.
Lots of time was spent with the Python profiler and PG query analyzer, adding table indexes and looking for slow code.
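The parse-and-store step can be sketched roughly like this. The line format and the table layout mentioned in the comment are assumptions for illustration (the real data.log format from the WMR100 client differs), but the shape of the job is the same:

```python
import re
from datetime import datetime

# Hypothetical line format -- the real data.log format from the WMR100
# client will differ; adjust the regex to match whatever it emits.
LINE_RE = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<sensor>\w+) (?P<channel>\d+) (?P<value>-?\d+(?:\.\d+)?)$"
)

def parse_line(line):
    """Turn one log line into (timestamp, sensor, channel, value), or None."""
    m = LINE_RE.match(line.strip())
    if not m:
        return None
    return (
        datetime.strptime(m.group("ts"), "%Y-%m-%d %H:%M:%S"),
        m.group("sensor"),
        int(m.group("channel")),
        float(m.group("value")),
    )

# Parsed rows would then go into PostgreSQL with something like
#   INSERT INTO readings (ts, sensor, channel, value) VALUES (%s, %s, %s, %s)
# with an index on (sensor, ts), which is the sort of thing the query
# analyzer points you at once the table hits millions of rows.
```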

Data also goes to Weather Underground – they do a load of analysis and graph drawing that I haven’t got around to myself.

Stats and stuff, two years on

I meant to post this long before now but other stuff and general procrastination got in the way.

So last year it seemed like a good idea to take some kind of look at the year that had just gone, through the lens of the stats I gather on Daytum. This is the same again for the year that’s about to end.
I suppose these are like a much duller version of Nicholas Felton’s annual reports. Getting to the point, a comparison between this year and last:

So the tl;dr version: twice as much tea (six cups a day!?) and water, lots less orange juice.

The other stuff I mentioned last year:

  • 442 car journeys (about twice every 2 days)
  • 132 shaves (about twice every 5 days)
  • 250 bus journeys (i.e. 2 every 3 days)

Last time I said I’d like an S60-native interface; since then I’ve swapped my Nokia N96 for an HTC Desire (running Android). An iPhone app recently entered some sort of beta, and apparently the API is “built out enough for the iPhone app” that they could give access to it, but there’s no sign of that so far.

I also said I’d struggle to justify the price of a “pro” account – eventually I wanted more “displays” (views onto the data to which some criteria can be applied, e.g. a pie chart of drinks in the last year), so I upgraded.
The privacy options were a nice addition but I’m not sure that privacy is something you should have to pay for.

It feels as if progress on the whole site has slowed to a crawl — I’m not sure whether the issue is time, finance or something completely different.

Weather, part 3

In part two I said I hadn’t done humidity yet. Now I have – it was fairly easy to do and I was short of something better to do in the wee hours of the morning.

Twelve hours:

One day:

I was asked in #bitfolk whether the station did anything other than temperature and I remembered that WSDL gave the dew point so thought I might as well add that too.

The maths required to calculate dew point and the background information is given in the Wikipedia article but for want of an excuse to use the LaTeX plugin for WordPress, the dew point [math]T_d[/math] is given by:

[math]T_d = \frac{b \, \gamma(T, RH)}{a - \gamma(T, RH)}[/math]

Where [math]\gamma(T, RH)[/math] is given by

[math]\gamma(T, RH) = \frac{a T}{b + T} + \ln \left( \frac{RH}{100} \right)[/math]

The bottom pair of lines in the temperature graphs in the last post show the dew point.
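For the curious, the formula is only a few lines of Python. The constants here are the commonly quoted Magnus pair (a = 17.27, b = 237.7 °C); the Wikipedia article lists a few alternative fits:

```python
import math

# Magnus approximation constants (one commonly quoted pair; other
# fits exist with slightly different values and validity ranges).
A = 17.27
B = 237.7  # degrees C

def gamma(temp_c, rh):
    """gamma(T, RH) from the formula above."""
    return (A * temp_c) / (B + temp_c) + math.log(rh / 100.0)

def dew_point(temp_c, rh):
    """Dew point in degrees C, for temperature in degrees C and RH in %."""
    g = gamma(temp_c, rh)
    return (B * g) / (A - g)
```

At 100% humidity the dew point equals the air temperature, which makes a handy sanity check.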

Weather, Part Deux

So nearly a week ago I got a weather station. Since then, I’ve hacked up some Python to deal with the temperature data and spew it into graphs. I haven’t got to doing the same with the humidity data; that can be Part Trois…

Anyway, graphs (updated every 15 minutes). The last 24 hours:

The last week:

If anyone’s particularly interested, I could post the Python that does this. It’s not the most exciting (or, I don’t doubt, the best-written) code in the entire world, but there’s probably someone weird enough to want to see how dire a job I can do of chopping a string up.
I added an optimisation to only parse lines which haven’t been seen before, using a temp file, so after the initial run the longest part is now uploading the images.
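The trick is simple enough to sketch. The function name and state-file handling here are made up, but the idea is just to remember how far through the log the previous run got:

```python
import os

def new_lines(log_path, state_path):
    """Yield only the lines appended to log_path since the last run.

    The byte offset reached on the previous run is kept in a small
    state file, so each run seeks straight past already-parsed data
    instead of re-reading the whole log every time.
    """
    offset = 0
    if os.path.exists(state_path):
        with open(state_path) as f:
            offset = int(f.read().strip() or 0)
    with open(log_path) as log:
        log.seek(offset)
        while True:
            line = log.readline()
            if not line:
                break
            yield line
        offset = log.tell()
    with open(state_path, "w") as f:
        f.write(str(offset))
```

One caveat: if the log is ever truncated or rotated, the saved offset needs resetting by hand.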

Apparently weather stations are a “conversation killer, if there ever was one!!!”. Who would have thought it?

Weather Nerdery

I’ve wanted to set up/build a weather station for a while; laziness and apathy were mostly what stopped me. Last week, I bit the bullet and bought an Oregon Scientific RMS300 (change from 35 quid from OS themselves).
It comes in two parts: a base station with a screen and a built-in temperature/humidity sensor, and an included wireless sensor (labelled “outside” by default, as a serving suggestion).

The supplied (via download) software, “OS Weather”, is…pretty shit. It doesn’t work on Windows 7 (“trial version” available “end of June”, apparently) and I couldn’t be bothered setting up a virtual machine to screw about with it.

Next up was Weather Station Data Logger. It’s good, but I have more than enough machines running 24/7 without adding a Windows one (and, again, I don’t particularly want a Windows VM). The machines I do have running near-enough 24/7 run Debian GNU/Linux, so the ideal solution would run under that, ideally headless.

Enter the WMR100 module which will do all the work of getting the data out of the base station and its wireless sensors and present them in a fairly easy to manipulate format:


Next magic trick will be to get the data I collected with WSDL and the data that’s being collected with WMR100, stick it all together in some way, and start getting it into graphs of some description. rrdtool’s the obvious candidate.
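A first pass at the sticking-together step might look like this. The CSV layout (ISO timestamp, reading) is an assumption, since neither logger emits exactly that, but ISO 8601 timestamps sort correctly as plain strings, which makes merging by time easy:

```python
import csv
from heapq import merge

def merged_readings(wsdl_path, wmr100_path):
    """Lazily merge two (ISO-timestamp, value) CSV files into time order."""
    def rows(path):
        with open(path, newline="") as f:
            for row in csv.reader(f):
                yield row  # [timestamp, value]
    # heapq.merge requires each input to be sorted already, which log
    # files appended in time order are; neither file is fully loaded
    # into memory, which matters with millions of lines.
    return merge(rows(wsdl_path), rows(wmr100_path), key=lambda r: r[0])
```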

Daytum, some interesting statistics

Almost a year ago, I signed up for Daytum.

The premise of the site derives from Nicholas Felton’s “annual reports” (linked in the previous post): keep track of something you do (trips to the gym, drinks, distance walked, whatever – at one point I noticed someone was tracking their trips to the toilet and what they did while they were there…) and the site presents it with some pretty graphs in the same sort of visual style as he uses, along with some basic analysis (time since the last entry, etc.). To be honest, I didn’t expect to keep it up (fnar) for anywhere near a year, but for some reason I’ve found it strangely compelling.

To the numbers…

After a year, I suppose now is as good a time as any to have a look at some of the numbers. From New Year’s Day 2009 to midnight on the 28th December, I’ve had:

  1. 958 cups of tea
  2. 719 glasses of water (with or without diluting juice)
  3. 620 glasses of orange juice
  4. 474 glasses of Irn Bru

After that, the quantities get a lot smaller (best of the rest: 192 cups of coffee; bringing up the rear: 4 bottles of M&S Christmas Orange/Grape/Cranberry stuff).

That still means, though, that in an average day I get through about 3 cups of tea, 2 glasses of water, 2 glasses of orange juice and a glass of Irn Bru (amongst other things).

Some other stuff I tracked:

  1. circa 485 car journeys (1.34 a day)
  2. 230 bus journeys (almost twice every 3 days)
  3. 141 shaves (i.e. about twice every five days, or once every 2 weekdays if you assume every one was on a weekday)

New things I’d like to see

On the last part, one feature that I think might be useful, analytically, would be separating weekends and weekdays, or perhaps taking account of the academic year (i.e. separate terms, or term-time and otherwise). Daytum do let you download the data in CSV format, so it’s possible to deal with those issues; I just can’t really be bothered doing it myself.
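The weekday/weekend split is only a few lines over that CSV. The column names here (“timestamp”, “item”) are guesses, as I don’t know Daytum’s actual export layout:

```python
import csv
from collections import Counter
from datetime import datetime

def weekday_weekend_totals(csv_path):
    """Count entries per item, split into weekday vs weekend buckets.

    Assumes (hypothetically) a CSV with 'timestamp' and 'item' columns;
    the real Daytum export layout may well differ.
    """
    totals = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            ts = datetime.strptime(row["timestamp"], "%Y-%m-%d %H:%M:%S")
            # weekday() is 0=Monday .. 6=Sunday, so >= 5 means Sat/Sun.
            bucket = "weekend" if ts.weekday() >= 5 else "weekday"
            totals[(row["item"], bucket)] += 1
    return totals
```

Term-time versus holidays would work the same way, just with a date-range lookup in place of `weekday()`.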
One other feature I’d use would be a more native interface for my current mobile OS, Symbian S60. Not having to use the phone’s web browser and the iPhone interface (which works relatively well, in fairness) would be nice. Daytum are apparently working on an API though, so it should be possible to build some sort of application to do it.

The last thing is the future – what Cool New Thing™ should I start tracking this year?
I think I’d be interested in the total distance I travel, or better still a breakdown of time and/or distance in a car/on a train/walking. I’ve toyed with using Nokia Sports Tracker on my phone (Nokia N96 all-black) but it flattens the battery in no time at all because it insists on using the phone’s GPS and data connection – a whole day would take a miniature fusion reactor to keep it going. Which kinda sucks.

Something I think Daytum could do better at is showcasing the novel things people use the service for.
There’s a premium version, which gets you more features (separate your stuff into pages, some privacy settings) as well as more categories and items to store data with. I use the free version, mostly because I’d struggle to justify the cost (four US Dollars per month). If I bumped against some of the limits of the free account (here’s the opportunity to promote novel, or otherwise, uses and make some money out of it), I’d consider upgrading.

Mmmm, braaaaains

A week past on Saturday I went to Glasgow’s Zombie Walk to take photos of the brain-munching undead…it was fun, and here’s hoping next year’s even better!

Some of the photos I took that have made it to Flickr so far…


[Photos: Zombie Bride; A respirator: bloody useless against zombies; Alex; Parcel of braaaains?]

One month later…

I did mention there’d be computer assemblage porn in my last post, but in the end that didn’t happen. I guess at some point I’ll need to have the side off to connect some of the extra ports that were included, though, so there’s some potential for computer pr0n yet.

I did, however, perform a comparison with the i7 CPU in that machine doing a useful real-world task: compiling Firefox (from mozilla-central) and Thunderbird (from comm-central) on Linux.
VMware Workstation only allows a maximum of two processors to be assigned to a VM, but it still manages to do the job ~25% faster than my laptop (a Dell Latitude D820, T7200, 2GB RAM, running Ubuntu), churning out a completed build of either from scratch in about twenty minutes.

Running Debian in a virtual machine was always going to be a requirement (it’s useful for a lot of things, including some far too risky to attempt on the live machines; it’s something I also do on the X2 3800+ and something the i7’s extra 6 “processors” come in handy for).

The machine was running the Windows 7 Release Candidate, but using either VMware or VirtualBox on there is just one big clusterfsck. The Debian installer locks solid at some point along the way in both, and using NAT in VMware is also busted.
Fortunately I’ve got spare Vista licenses up the wazoo, so it wasn’t hard to step back to Vista Ultimate while waiting for a fixed VMware and the release of Windows 7.

New Anubis

My just-over-three-year-old desktop machine was due a replacement.

    type     |                       name
-------------+---------------------------------------------------
 Motherboard | Asus P6T Deluxe
 CPU         | Intel i7 920 D0 Stepping Retail
 RAM         | OCZ 6GB PC3-12800C8 DDR3 (3x2GB) (OCZ3G1600LV6GK)
 Case        | Lian-Li PC-A71B
 Graphics    | XFX Radeon HD 4890 1GB
 HDD         | Western Digital Caviar Black WD6401AALS
 PSU         | Antec TruePower New 650W
 DVD-RW      | LG GH22NS40
 Backup HDD  | Western Digital Caviar Green WD10EADS
 CPU Cooler  | Thermalright Ultra-120 Extreme
 Fan         | Noctua NF-P12 120mm
(11 rows)

Initially, it’ll be running some version of Windows 7 (though I also have a spare Vista license), but given that it’s not yet released, there’s no point considering how much that’ll cost for now.

The plan is to put the two WD6401AALSs in a RAID array, then back up in some manner to the WD10EADS. Given the price of RAM (£70.99 for 3x2GB), I’m half-tempted to go nuts and have 12GB of RAM.

Some sort of computer-assemblage pr0n to follow – the motherboard and CPU cooler should be with me tomorrow…


Before Christmas I hired a Canon 100mm f/2.8 macro lens from Lenses For Hire. It’s a really nice lens, although, as with any macro lens, the depth of field at the minimum focus distance (32cm) is rather small (~1mm at f/2.8, ~4mm at f/11).

[Photos: Aeonium; Daisy Doo]

So I bought one…a few shots from the other day:

[Photos: IMG_1131; IMG_1203-Edit; IMG_1252-Edit]

Next order of business is a new camera bag and a decent tripod.