When running go generate, comments like this are parsed and the given command is executed - in our case go-bindata -pkg main -o bindata.go frontend/

go-bindata is a binary that generates a new .go file containing the bytes of all specified files, making them accessible by name. That's where the previously undefined Asset function comes from.
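To make this concrete, here is a rough, simplified sketch of the kind of code go-bindata emits. The real generated file is considerably more elaborate (compression, file metadata, helpers like AssetNames), and the HTML content here is made up for illustration:

```go
package main

import "fmt"

// _bindata holds the embedded file contents, keyed by file name.
// go-bindata fills this with the bytes of every file it was given.
var _bindata = map[string][]byte{
	"frontend/index.html": []byte("<html>...</html>"),
}

// Asset returns the embedded file contents for the given name,
// or an error if no such file was embedded.
func Asset(name string) ([]byte, error) {
	if data, ok := _bindata[name]; ok {
		return data, nil
	}
	return nil, fmt.Errorf("asset %s not found", name)
}

func main() {
	data, err := Asset("frontend/index.html")
	fmt.Println(string(data), err)
}
```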

One can easily imagine using this approach to build self-contained binaries which serve an entire JavaScript single-page application - without needing any dependencies installed on the deployment target.

Having self-contained binaries helps with a whole class of problems related to deployment, testing and provisioning - and you basically get this for free with Go, without tedious management of Makefiles or Rakefiles.

While there are plenty more good things to say about Go, I'll leave it at this: it's a great language, and you should hack with it sometime, too.

package main

import (
	"encoding/xml"
	"fmt"
	"io"
	"net/http"
)

// these structs reflect the eurofxref xml data structure
type envelop struct {
	Subject string `xml:"subject"`
	Sender  string `xml:"Sender>name"`
	Cubes   []cube `xml:"Cube>Cube"`
}

type cube struct {
	Date      string     `xml:"time,attr"`
	Exchanges []exchange `xml:"Cube"`
}

type exchange struct {
	Currency string  `xml:"currency,attr"`
	Rate     float32 `xml:"rate,attr"`
}

// EUR is not present because all exchange rates are a reference to the EUR
var desiredCurrencies = map[string]struct{}{
	"USD": struct{}{},
	"GBP": struct{}{},
}

var eurHistURL = "http://www.ecb.europa.eu/stats/eurofxref/eurofxref-hist-90d.xml"

var exchangeRates = map[string][]exchange{}

func downloadExchangeRates() (io.Reader, error) {
	resp, err := http.Get(eurHistURL)
	if err != nil {
		return nil, err
	}
	if resp.StatusCode != http.StatusOK {
		return nil, fmt.Errorf("HTTP request returned %v", resp.Status)
	}
	return resp.Body, nil
}

func filterExchangeRates(c *cube) []exchange {
	var rates []exchange
	for _, ex := range c.Exchanges {
		if _, ok := desiredCurrencies[ex.Currency]; ok {
			rates = append(rates, ex)
		}
	}
	return rates
}

func updateExchangeRates(data io.Reader) error {
	var e envelop
	decoder := xml.NewDecoder(data)
	if err := decoder.Decode(&e); err != nil {
		return err
	}
	for _, c := range e.Cubes {
		if _, ok := exchangeRates[c.Date]; !ok {
			exchangeRates[c.Date] = filterExchangeRates(&c)
		}
	}
	return nil
}

func init() {
	if reader, err := downloadExchangeRates(); err != nil {
		fmt.Printf("Unable to download exchange rates. Is the URL correct?\n")
	} else {
		if err := updateExchangeRates(reader); err != nil {
			fmt.Printf("Failed to update exchange rates: %v\n", err)
		}
	}
}

func main() {
	fmt.Printf("%v\n", exchangeRates)
}

There are a few things to note:

we're using a map[string]struct{} to define which currencies we're interested in. This adds a little more code, since we have to filter the exchange rates, but also cuts down memory usage.

we cache all exchange rates in memory and never update them. Since we're dealing with historic data only, this shouldn't be a problem.
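The set-like map from the first note works because the empty struct occupies zero bytes, so only the keys cost memory; membership is a plain map lookup. A standalone sketch:

```go
package main

import "fmt"

// desiredCurrencies acts as a set: struct{} values take no space,
// so the map stores only the currency codes we care about.
var desiredCurrencies = map[string]struct{}{
	"USD": {},
	"GBP": {},
}

// isDesired reports whether a currency is in the set.
func isDesired(currency string) bool {
	_, ok := desiredCurrencies[currency]
	return ok
}

func main() {
	fmt.Println(isDesired("USD"), isDesired("JPY"))
}
```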

Next, we add a tiny HTTP wrapper:

// in addition to the earlier imports, this part needs
// "encoding/json", "log" and "regexp"

// accept strings like /1986-09-03 and /1986-09-03/USD
var routingRegexp = regexp.MustCompile(`/(\d{4}-\d{2}-\d{2})/?([A-Za-z]{3})?`)

func exchangeRatesByCurrency(rates []exchange) map[string]float32 {
	var mappedByCurrency = make(map[string]float32)
	for _, rate := range rates {
		mappedByCurrency[rate.Currency] = rate.Rate
	}
	return mappedByCurrency
}

func newCurrencyExchangeServer() http.Handler {
	r := http.NewServeMux()
	r.HandleFunc("/", func(w http.ResponseWriter, req *http.Request) {
		if !routingRegexp.MatchString(req.URL.Path) {
			w.WriteHeader(http.StatusBadRequest)
			return
		}
		parts := routingRegexp.FindAllStringSubmatch(req.URL.Path, -1)[0]
		requestedDate := parts[1]
		requestedCurrency := parts[2]

		enc := json.NewEncoder(w)
		if _, ok := exchangeRates[requestedDate]; !ok {
			w.WriteHeader(http.StatusNotFound)
			return
		}
		var exs = exchangeRates[requestedDate]
		if requestedCurrency == "" {
			enc.Encode(exchangeRatesByCurrency(exs))
		} else {
			for _, rate := range exs {
				if rate.Currency == requestedCurrency {
					enc.Encode(rate)
					return
				}
			}
			w.WriteHeader(http.StatusNotFound)
		}
	})
	return r
}

func main() {
	log.Printf("listening on :8080")
	log.Fatal(http.ListenAndServe(":8080", newCurrencyExchangeServer()))
}
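The routing regexp is worth a closer look: the first capture group grabs the date, the second, optional group grabs the three-letter currency code. A small standalone sketch of how it splits paths (the example paths are made up):

```go
package main

import (
	"fmt"
	"regexp"
)

// same pattern as in the server: a date group and an optional currency group
var routingRegexp = regexp.MustCompile(`/(\d{4}-\d{2}-\d{2})/?([A-Za-z]{3})?`)

func main() {
	for _, path := range []string{"/1986-09-03", "/1986-09-03/USD"} {
		// the first submatch slice holds the full match plus both groups;
		// an unmatched optional group comes back as the empty string
		parts := routingRegexp.FindAllStringSubmatch(path, -1)[0]
		fmt.Printf("date=%q currency=%q\n", parts[1], parts[2])
	}
}
```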

Note the call to runtime.GC(), which forces a garbage collection before measuring. This is important to get a correct memory usage report; without it, the numbers would vary from run to run.

Turns out the memory footprint is acceptable, without any optimizations:

all data since 1999, all currencies: 5.137 MB

all data since 1999, only USD and GBP: 0.836 MB

last 90 days, all currencies: 0.211 MB

last 90 days, only USD and GBP: 0.137 MB

Let’s wrap it up:

In less than 200 lines of code we managed to create a fully functional currency exchange rate API. Compared to the original version, we do not cache exchange rates to disk, in favor of keeping everything in memory. This reduces the total lines of code considerably and also removes the need for a separate importer binary.

The API is not perfect, however:

the data source does not contain data for weekends or holidays. For anything production-ready we'd want to write a fallback which serves older exchange rates instead of just returning a 404.
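Such a fallback could, for example, walk backwards from the requested date until it finds a day with data. A standalone sketch (the map type and contents here are simplified placeholders, not the actual exchangeRates structure from the post):

```go
package main

import (
	"fmt"
	"time"
)

// placeholder data: only one weekday has rates, mimicking a gap
// caused by a weekend
var exchangeRates = map[string][]string{
	"2014-01-03": {"USD", "GBP"},
}

// nearestAvailableDate walks back day by day from the requested date
// until it finds one with data, giving up after maxLookback days.
func nearestAvailableDate(requested string, maxLookback int) (string, bool) {
	day, err := time.Parse("2006-01-02", requested)
	if err != nil {
		return "", false
	}
	for i := 0; i <= maxLookback; i++ {
		date := day.AddDate(0, 0, -i).Format("2006-01-02")
		if _, ok := exchangeRates[date]; ok {
			return date, true
		}
	}
	return "", false
}

func main() {
	// 2014-01-05 was a Sunday; the fallback lands on the preceding Friday
	fmt.Println(nearestAvailableDate("2014-01-05", 7))
}
```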

However, I’ll leave it for now. You can find the entire source in this gist.

1: If anyone knows a higher-precision, open data source for historical currency exchange rates, I'd love to know. Leave a comment.

In the last post regarding open source side projects, I presented traq, a CLI time tracking application I use for my everyday work.

Today I decided to present umsatz, my open source accounting application, and walk you through its setup. But first, let's introduce umsatz:

umsatz was written to ensure that my bookkeeping information is kept safe - that is, only locally accessible, not from the internet.

It's not that my information is particularly sensitive. It's just that I like having control over my data, and I do not trust third parties like Google or Apple to keep my information safe.

I use umsatz to track all my freelance-related incomes and expenses, organize them by account, and get a basic overview of what's due. Some more details about umsatz are available at umsatz.deployed.eu.

Now, let’s set umsatz up.

Assuming you want to run umsatz on a Raspberry Pi and you've got all the RPi accessories at hand, all you need is an empty USB stick as secondary backup storage.