However, I wanted to use Chroma.js to generate the legend colors dynamically. So I needed a new getColor function.

Chroma.js has a variety of methods to return colors. The one I chose uses scale and classes. These can then be passed as variables to a getColor function to return colors for use in the legend and map.

scale can be a single color or an array of two colors (either as hex values or color names). In my case, the first is a light blue and the second is a darker blue. Chroma.js will then return gradients between these two colors. See the colorHex variable below.

classes is an array of legend ‘breaks’ for the color gradients. For example, they could be the numerical values from the Leaflet tutorial getColor function above (eg 10, 20, 50, etc). See the classBreaks variable below.

This getColor function can then be used as described in the Leaflet tutorial to set choropleth polygon fill colors. It can also be used to create the legend by looping through the classes to get a color for each legend entry.
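The binning logic behind such a getColor function can be sketched in plain Python (the real code uses Chroma.js in Javascript; the break values and hex colors below are made-up placeholders, not the ones from my map):

```python
from bisect import bisect_right

# Hypothetical class breaks and pre-computed gradient colors; in the
# real code Chroma.js generates the colors from the two-color scale.
class_breaks = [0, 10, 20, 50, 100, 200, 500, 1000]
colors = ["#eff3ff", "#c6dbef", "#9ecae1", "#6baed6",
          "#4292c6", "#2171b5", "#084594"]  # one fewer than breaks

def get_color(value):
    """Return the color of the interval the value falls into."""
    # bisect_right counts the breaks <= value, which (minus one)
    # indexes the interval; clamp to the first/last color.
    i = bisect_right(class_breaks, value) - 1
    return colors[max(0, min(i, len(colors) - 1))]
```

Note that there is one fewer color than there are breaks, because n breaks define only n − 1 intervals; that detail matters for the legend, as described next.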

However, there is an important consideration when creating the legend. Using scale and classes, Chroma.js returns only classes − 1 colors. For example, the classBreaks array with 10 elements will return only 9 colors. To work around this I push a dummy element (‘999’) onto the array so Chroma.js returns 10 colors, and then ignore the dummy element when creating the legend.

The legend code below includes a hard-coded zero (0) value set to the color white (#ffffff). It loops through classBreaks, each time using the getColor function to return the legend color based on the break value.
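As a rough illustration of that legend loop (plain Python standing in for the site's Javascript; getColor is passed in, and the break values are placeholders):

```python
def build_legend(class_breaks, get_color):
    """Build (label, color) legend entries from the class breaks.
    Zero is hard-coded to white; the dummy '999' element that was
    pushed to satisfy Chroma.js is skipped."""
    entries = [("0", "#ffffff")]            # hard-coded zero -> white
    for brk in class_breaks:
        if brk in (0, 999):                 # skip zero and the dummy
            continue
        entries.append((str(brk), get_color(brk)))
    return entries

# Usage with a stand-in getColor that always returns one color:
legend = build_legend([0, 10, 20, 999], lambda v: "#2171b5")
```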

During the 2020 COVID-19 pandemic in Canada I wanted to get a better understanding of the geographical distribution of COVID-19 related activity changes across Canada.

Google has helpfully provided freely available global “Community Mobility Reports” which show changes in Google location history compared to a baseline, by country and country sub-region. These report changes in activity by location category: Workplace, Retail & Recreation, Transit Stations, Grocery & Pharmacy, Parks, and Residential locations. For Canada it is available by province. As of April 19, the data contained daily values from Feb 15 to Apr 11.

The Community Mobility Reports data is available as a single csv file for all countries at the Google Community Mobility Report site. In addition, Google provides a feature to filter for a specific country or sub-region (eg state or province) and download the result in PDF format.

As the COVID-19 lockdowns occurred across Canada you would expect that people were less likely to be in public spaces and more likely to be at home. The Community Mobility Reporting location history allows us to get some insight into whether or not this happened, and if it did, to what degree and how this changed over time.

I also created an Excel version of this heat map visualization using Pivot Table & Chart plus conditional formatting. This Excel file, described in more detail below, is available in the Github repository.

More detail and screenshots of the visualizations are provided below:

Heatmaps
Heatmaps are grids where columns represent dates and rows represent provinces/territories. Each heatmap is a grid representing a single mobility report category. The grid cell colors represent the value of percent change, which can be positive or negative. The changes can be observed as lockdowns occurred: visits to public locations decreased relative to baseline, while residential locations inversely increased relative to baseline as people sheltered in place at home.

1) Heatmap created using Excel / Power Query: For this heatmap visualization the global csv data was transformed using Excel Power Query. The Excel file has two Pivot Table and Chart combos. The Excel files and Power Query M code are available in the Github repository.

2) Heatmap created using D3.js: For this heatmap visualization the global csv data was transformed using Excel Power Query. The heatmap visualization was created using slightly modified code from ONSvisual.

Bar charts
These were created using Excel / Power Query to visualize percent change by province/territory and location category. They allow comparison between provinces by date and category. This Excel / Power Query file can also be used for analytical purposes to slice and dice the global data by date, country, sub-regions 1 & 2, and category. The Excel files are available in the Github repository.

Recently I developed a Django web application for a friend. I wanted to be able to get it up and out onto the internet so she could view it and get others to do user testing.

Normally I host my web applications on my Webfaction account. But I wanted to keep this project separate so I went over to Digital Ocean and spun up a virtual server to host the Django web application.

We didn’t have a domain name for this site yet, so we just used the Digital Ocean provided IP address to access the site, created a Self-Signed SSL Certificate for Apache, and forced all traffic to HTTPS to enable secure browsing.
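For reference, the self-signed certificate setup on a typical Ubuntu/Apache droplet looks roughly like this (paths and commands follow the common Debian/Ubuntu layout; adjust for your own server):

```shell
# Generate a self-signed certificate valid for one year
sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
  -keyout /etc/ssl/private/apache-selfsigned.key \
  -out /etc/ssl/certs/apache-selfsigned.crt

# Enable Apache's SSL module and restart
sudo a2enmod ssl
sudo systemctl restart apache2

# To force HTTPS, the port-80 VirtualHost gets a permanent redirect:
#   Redirect permanent "/" "https://your-droplet-ip/"
```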

I have used Digital Ocean for development purposes and to share projects with others from time to time. I have also used Vultr.

Vultr – use this link to open an account and get a $20 credit (summer 2016 promotion).

Both have monthly price plans that can also be pro-rated depending on how many days you use them so you can spin them up for a few minutes or days and pay accordingly. I am using the basic plans on both which are good enough for my demonstration development web application purposes.

Beware though that unlike shared hosting providers such as Webfaction, Digital Ocean and Vultr virtual servers are yours to own and manage. You get root access to the server and are responsible for managing the server: the OS, how it is configured and secured, etc.

That is definitely something you should consider, and there are lots of new things to learn. However, there are tutorials and lots of help available on the internet, so I recommend at least attempting to set up either or both to learn first hand. The cost is small and the risk very low. The knowledge gained and benefits are well worth it.

Many comparison reviews seem to prefer Vultr over Digital Ocean for performance and features. However, Digital Ocean was first and has done an excellent job of documenting how to set up many web applications. But Vultr has good documentation too. You can’t really go wrong with either for development projects. If you don’t like one, just switch to the other!

One of my Django applications includes a form where a user can enter and submit a property address.

The user submitting the form might not know the postal code, so I left it optional. However, the postal code is a key piece of information for this particular application, so I wanted to ensure that I was getting it.

I also wanted to geocode the address immediately to get its latitude and longitude so it could be shown to the user on a Leaflet.js map.

There are lots of free geocoding services for low intensity usage, but I ended up using Google Geocoding, which is free under a certain usage level. You just need to create a Geocoding API project and use the credentials to set up the geocoding.

To interact with the geocoding API I tried the Python geocoding modules geopy and geocoder, but in the end just used the Python Requests module instead as it was less complicated.

When the user clicks the Submit button, behind the scenes Requests submits the address to Google’s Geocoding API and gets a JSON response containing the latitude, longitude and postal code, which are then written to the application database.
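A minimal sketch of that request/parse step (the endpoint and JSON field names are Google's Geocoding API; the helper names and structure are my own illustration, not the actual view code):

```python
GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

def parse_geocode(data):
    """Pull latitude, longitude and postal code out of a Google
    Geocoding API JSON response; returns None if nothing matched."""
    if data.get("status") != "OK" or not data.get("results"):
        return None
    result = data["results"][0]
    location = result["geometry"]["location"]
    postal_code = None
    for comp in result.get("address_components", []):
        if "postal_code" in comp["types"]:
            postal_code = comp["long_name"]
    return {"lat": location["lat"], "lng": location["lng"],
            "postal_code": postal_code}

def geocode(address, api_key):
    """Submit an address to the Geocoding API with Requests."""
    import requests  # imported here so parse_geocode stays dependency-free
    resp = requests.get(GEOCODE_URL,
                        params={"address": address, "key": api_key})
    return parse_geocode(resp.json())
```

In the Django view, geocode() would be called on form submission and the returned values saved to the model before rendering the map.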

I will update the code in future to check whether the user’s postal code is correct and replace it if it is not. I will wait to see how the postal code accuracy looks, since making geocoding API requests too often could bump me over the free usage limit.

The Django View that contains the code described above is shown below.

I have a Django web application that needed an interactive map with shapes corresponding to Canadian postal code FSA areas that were different colors based on how many properties were in each FSA. It ended up looking something like the screenshot below.

This exercise turned out to be relatively easy using the awesome open-source Javascript map library Leaflet.js.

One of the biggest challenges was finding a suitable data source for the FSAs. Chad Skelton, a (now former) data journalist at the Vancouver Sun, wrote a helpful blog post about his experience finding a suitable FSA data source. I ended up using his BC FSA data source for my map.

Statistics Canada hosts Canada Post FSA boundary files for all of Canada. As Chad Skelton notes, these have boundaries that extend out into the ocean, among other challenges.

Here is a summary of the steps that I followed to get my choropleth map:

1. Find and download the FSA boundary file. See above.

2. Convert the FSA boundary file from SHP to geoJSON using QGIS.

3. Create a Django queryset to produce the counts of properties by FSA to be added to the Leaflet map layer.

4. Create the Leaflet.js map in an HTML page: basically an HTML DIV that holds the map, plus a separate Javascript script that loads Leaflet.js and the FSA geoJSON boundary data and processes it to create the desired map.
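Step 3's aggregation would be a one-line values/annotate queryset in Django (the model and field names below are hypothetical); the equivalent grouping in plain Python looks like this:

```python
from collections import Counter

# In Django the queryset would be roughly (names hypothetical):
#   Property.objects.values('fsa').annotate(count=Count('id'))
def count_by_fsa(properties):
    """Count properties per FSA (the first 3 characters of a
    Canadian postal code, e.g. 'V6B' from 'V6B 1A1')."""
    return Counter(p["postal_code"][:3].upper() for p in properties)
```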

I made two changes to the Leaflet tutorial code:

1. How the legend colors and intervals are assigned is changed, but otherwise the legend functions the same.

2. How the color for each FSA is assigned is significantly changed. The tutorial had the color in its geoJSON file, so it only had to be referenced directly. My colors were coming from the View, so I had to change the code to include a new function that matches FSAs in both my Django view data and the geoJSON FSA boundary file and returns the appropriate color based on the Django view data count.
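That matching function can be sketched like this (plain Python standing in for the site's Javascript; the CFSAUID property is the FSA identifier used in the Statistics Canada boundary file, though your geoJSON attribute name may differ):

```python
def feature_color(feature, fsa_counts, get_color):
    """Match a geoJSON FSA feature to the Django view counts and
    return its fill color; unmatched FSAs fall back to white."""
    fsa = feature["properties"].get("CFSAUID")  # FSA id attribute assumed
    count = fsa_counts.get(fsa, 0)
    return "#ffffff" if count == 0 else get_color(count)
```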

I happened to see Cuba Gooding Jr’s first tweet about 30 minutes or so after he created it.

Update: @cubagoodingjr is no longer active, so I am no longer getting tweets from it.

At the time his profile said he had 559 followers. A few minutes later I refreshed his profile and saw the follower count had increased to 590 and every few minutes it kept increasing. By the end of the day he had about 4,000 followers.

I thought it would be interesting to track how his follower count changed going forward. So I used the Twitter API to get his follower count once per day, save the data, and then created a web page to visualize his follower count per day.

After 2 days Cuba had about 7,000 followers, which averaged out to about 175 new followers per hour. However, the number of new followers slowed down quickly to 30 or so per day, and after about 3 months he only gets about 10 new followers per day. In fact, some days he has a net loss of followers, ie more people unfollowed him than followed him that day.

For the technically inclined: I set up an automatic retrieval of Cuba’s follower count using Twitter’s API and the Tweepy Python module, scheduled to run daily using a cron job.

The follower counts get put into a database. I created a PHP web page application to retrieve the data from the database and create a web page that uses the Google Charts API to render a simple line chart showing Cuba’s regularly updated follower count by day.
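The daily job might be sketched like this (the sqlite storage and function names are my own illustration, not the actual code; the commented-out Tweepy calls follow its v3-era API and need real credentials):

```python
import sqlite3

def save_follower_count(conn, day, count):
    """Append one day's follower count; the cron job calls this daily."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS followers (day TEXT, count INTEGER)")
    conn.execute("INSERT INTO followers VALUES (?, ?)", (day, count))
    conn.commit()

def fetch_follower_count(screen_name):
    """Fetch the current follower count via Tweepy (sketch only)."""
    import tweepy  # imported here so the storage helper runs without it
    # auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
    # auth.set_access_token(access_token, access_token_secret)
    # api = tweepy.API(auth)
    # return api.get_user(screen_name).followers_count
    raise NotImplementedError("requires Twitter API credentials")
```

The PHP page then simply reads the rows back out of the database and feeds them to Google Charts as (day, count) pairs.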

Wirebase was a concept created in 2001, before broadband, wired or wireless, was widely available.

The concept was shopped around to a bunch of Vancouver-area VCs and potential investors but ultimately got lost in the dot-com bubble crash.

The idea was to create ‘wirebases’: modular components integrated into an automated banking machine or deployed as stand-alone units in a public area.

At the time I was contracting with a western Canadian credit union banking services bureau that was in the process of building its first national automated banking machine switching system, so we were privy to the details of ATMs and banking machine switching. Automated banking machines have computers inside; at the time they were regular PCs and could have peripherals connected just like any ordinary PC, eg via internal slots, etc.

The Wirebase set-up is illustrated in diagram below.

The Wirebase module is connected to the internet via a proprietary network solution that is similar to that used by the global ABM network.

Here is a demo web application / SaaS called SpeedVisit that individuals and companies can use to quickly and easily record visits from people and capture basic customer demographic information.

Features include:

Groups – create a group so you can add other users, locations and referrers to it.

Users – add additional users to your group as you want.

Locations – add locations where visits will take place.

Referrers – add referrers to record how a visitor was referred.

The definition of ‘visit’ is pretty broad. It could be customers, co-workers, classmates, or attendees in many situations, including:

Retail Stores

Staff Meetings

Convenience Stores

Restaurants

Street Vendors

Art Studios

Non-Profits

Community Clubs

Sports Events

Social Events

Trade Shows

The demographic information that you can record currently includes:

Gender

Age

Zipcode or postal code

Income

Prev Visit

Referrer

Location

Purchase

Email

Comments

The site has a mobile-friendly responsive design, so it works on phones, tablets, laptops and desktops.

The data is totally portable, with easy export of all visits to a csv file. The site is set up with SSL so your data will be securely transferred between your browser and the server.

This is a screenshot of the top of the visit capture page, which is the central feature of the application and workflow. Simply use the radio buttons and choosers to categorize the visitor and add the record. The page will then be ready for the next visitor.

The data is available for others to view, along with aggregated statistics, etc. This is a screenshot of the recorded visit data.

You can sign into the demo account to see what a sample account and visits look like.

Car2Go provides developer access to real-time vehicle locations, parking availability, and service area boundaries for all of their city locations around the world.

I am using the Car2Go API to power a website called www.cardivvy.com that I created for my own use after struggling with the official Car2Go map on their site when using my mobile phone. I wanted a way to find cars by street locations.

The site includes links to view vehicle locations, parking lot locations and service area boundaries on a Google Map. You can click into each city, view available cars alphabetically by street location or parking lot location, and then click through to a car’s current location on a Google Map. Each city’s Car2Go service area can be viewed on a Google Map.

I am waiting for ZipCar to get their API up and running, which should be available Fall 2015; then I will integrate that into the cardivvy site too so I can see where cars from both services are available.