Monitoring Freezer Temperature
http://piotr.gabryjeluk.pl/blog:monitoring-freezer-temperature
by Gabrys
Sun, 25 Sep 2016 22:11:15 +0000
For some time we'd suspected that the freezer in our apartment wasn't working correctly. Recently my wife reached into it to get some ice cubes but found water in the tray instead. That clearly meant the temperature had risen above the freezing point and stayed there for some time. She put a thermometer on the fridge, placed its sensor inside the freezer and started watching the readings. The temperature reached around -20°C and stayed there, fluctuating between -15°C and -20°C.

The webcam reader

I decided it was time to record the temperature, so I brought in the laptop, adjusted the screen angle, set the brightness to maximum and took a shot with the webcam to check that the thermometer was in the camera's view:

mplayer tv:// -vo png -frames 3

This takes 3 frames from the webcam and saves them as PNG files: 00000001.png, 00000002.png and 00000003.png. If you wonder why I need 3 frames: the first one is always under- or overexposed and out of focus, the second is usually OK, and the third is almost always good (as far as webcams go). Long story short, I'm giving the camera some time to auto-adjust its settings.

export DISPLAY=:0 selects the default X display, so I can run the command from an SSH session

Then there are some hard-coded paths to the tmp and out directories

ristretto ./white.png & starts an image viewer showing white.png, a big all-white image. This makes the screen emit enough white light that the LCD thermometer is properly lit

I use xte to move the mouse around: this blocks the automatic screen dimming, so the screen stays at 100% brightness

Then comes the mplayer command from before, changed to take 5 frames (just to be sure) and given -noconsolecontrols. mplayer tends to hang when you start it without a proper terminal on stdin unless you pass this option

Then I copy the 5th frame to the output file, whose path contains the current date and time

Finally, I kill the ristretto process I started before
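The script itself didn't survive in this archive, but the steps above can be sketched roughly like this (directory names and the xte coordinates are my assumptions, not the author's exact script):

```shell
#!/bin/sh
# Rough sketch of a dup.sh following the steps described above.
export DISPLAY=:0                  # default X display, so this works from SSH

TMP=$HOME/tmp                      # hard-coded work directory (assumed)
OUT=$HOME/out                      # hard-coded output directory (assumed)
cd "$TMP" || exit 1

ristretto ./white.png &            # show a big all-white image to light the LCD
viewer=$!

# wiggle the pointer to block the automatic screen dimming
xte 'mousemove 100 100' 'mousemove 200 200'

# 5 frames; -noconsolecontrols avoids hanging without a terminal on stdin
mplayer tv:// -vo png -frames 5 -noconsolecontrols

# keep the 5th frame under a date-stamped name (ddmmyyyy-HHMMSS,
# matching the file names parsed later by the Python script)
cp 00000005.png "$OUT/$(date +%d%m%Y-%H%M%S).png"

kill "$viewer"                     # close the image viewer again
```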

I run this command in a while loop like this:

while sleep 28 ; do ./dup.sh ; done

./dup.sh takes around 2 seconds, so I get about 2 photos a minute.

I put that into a screen session.

Now I serve the files in the out directory using the Python HTTP server:

python -m SimpleHTTPServer

This starts an HTTP server on port 8000, serves all the files and generates an index of them when you request / (on Python 3, the equivalent is python3 -m http.server).

The processing

What could be easier for copying the HTTP-exposed files than a plain old wget command?

wget -r http://dell:8000/

This creates a directory named dell:8000 and downloads all the PNG files into it.

Once you have that directory you may later want to update it with only the newer files:

rm dell\:8000/index.html
wget -nc -r http://dell:8000/

The -nc switch makes wget skip files that are already there. We explicitly remove index.html so it downloads a fresh listing (one that includes the newer files).

I wanted to make a program that takes a photo of the 7-segment display and reads it, producing a number that, along with the date, could be used to graph the temperature over time.

First, let's use ImageMagick to:

crop the picture to only the interesting part: the OUT reading

shear the picture so the LCD segments are vertical and horizontal, not skewed

bump contrast/gamma to remove the noise

auto-adjust the visual properties of the picture so it's more consistent across different times (like night vs day)

make it black and white. It turned out to work best if I only use the green channel; R and B turned out noisier than G.

You can see the bottom segment is not really visible, but that's OK: all the digits can be read correctly without that segment being visible, so we'll only consider the six segments that we can easily read.
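Put together, these steps amount to a single convert invocation — the same argument list the Python script calls via subprocess, shown here standalone (input.png and output.png are placeholder names):

```shell
convert input.png \
    -gamma 1.5 -auto-level -brightness-contrast 35x80 \
    -shear -14x0 -crop 260x70+320+227 \
    -channel Green -separate -resize 40x20 output.png
```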

Now comes the meat: the Python program that reads the file and returns the temperature:

#!/usr/bin/env python
# encoding: utf-8
from PIL import Image

# Points that hold each segment:
# d1 is the first digit
d1_sA = [(12,1), (13,1), (14,1), (15,1)]
d1_sB = [(18,2), (18,3), (18,4), (18,5)]
d1_sC = [(18,8), (18,9)]
d1_sE = [(11,8), (12,9), (11,9), (12,8)]
# Don't need this, as we only ever have 1 and 2 in the first place
d1_sF = []
d1_sG = [(13,7), (14,7), (15,7), (16,7)]

# d2 is the second digit
d2_sA = [(24,1), (25,1), (26,1), (27,1)]
d2_sB = [(29,2), (29,3), (29,4), (29,5)]
d2_sC = [(29,8), (29,9)]
d2_sE = [(22,8), (22,9)]
d2_sF = [(22,2), (22,3), (22,4), (22,5)]
d2_sG = [(24,6), (25,6), (26,6), (24,7), (25,7), (26,7)]

# d3 is the small digit (first after the decimal point)
d3_sA = [(35,4), (36,4), (37,4)]
d3_sB = [(38,5), (38,6), (38,7)]
d3_sC = [(38,9)]
d3_sE = [(34,9)]
d3_sF = [(34,5), (34,6), (34,7)]
d3_sG = [(35,8), (36,8)]

# Now the tricky part: for each segment I define a threshold below which
# I consider it "lit".  0 means completely black, 255 is white.
# Because of uneven lighting, the value is different for each segment
# (and digit, but we ignore that).
tA = 200
tB = 170
tC = 120
tE = 115
tF = 140
tG = 170
# A threshold for the "-" sign
tSIGN = 200
# All of those were obviously updated on the go to match the files

# A nice debugging function that prints which segments the code considers lit.
# Also, if you wondered what A, B, C, E, F, G meant, here's the schematic:
def print_digit(segs):
    # Only print this if there's a second argument to the script passed
    if len(sys.argv) == 2:
        return
    print '''
     {A}{A}
     {A}{A}
    {F}    {B}
    {F}    {B}
    {F}    {B}
     {G}{G}
     {G}{G}
    {E}    {C}
    {E}    {C}
    {E}    {C}
    '''.format(
        A='###' if 'A' in segs else '   ',
        B='###' if 'B' in segs else '   ',
        C='###' if 'C' in segs else '   ',
        E='###' if 'E' in segs else '   ',
        F='###' if 'F' in segs else '   ',
        G='###' if 'G' in segs else '   ',
    )

# This doesn't do anything spectacular, just passes the coordinates for each
# of the segments and the average value of the first 9 pixels of the image:
# (0,0) - (3,3)
def read_digit(im, pointsA, pointsB, pointsC, pointsE, pointsF, pointsG, avg9px):
    segs = ''
    segs += 'A' if read_segment(im, pointsA, tA - avg9px) else ''
    segs += 'B' if read_segment(im, pointsB, tB - avg9px) else ''
    segs += 'C' if read_segment(im, pointsC, tC - avg9px) else ''
    segs += 'E' if read_segment(im, pointsE, tE - avg9px) else ''
    segs += 'F' if read_segment(im, pointsF, tF - avg9px) else ''
    segs += 'G' if read_segment(im, pointsG, tG - avg9px) else ''
    print_digit(segs)
    # A list of all digits and their representation on the 7-segment display:
    if segs == 'ABCEF': return 0
    if segs == 'BC': return 1
    if segs == 'ABEG': return 2
    if segs == 'ABCG': return 3
    if segs == 'BCFG': return 4
    if segs == 'ACFG': return 5
    if segs == 'ACEFG': return 6
    if segs == 'ABC': return 7
    if segs == 'ABCEFG': return 8
    if segs == 'ABCFG': return 9
    # A special case for the first digit: it doesn't display "0",
    # it just doesn't light any segments
    if segs == '': return 0

# The function that takes the PIL image object, gets a list of (x,y)
# coordinates and checks if their average value is smaller than the
# threshold passed: segment "on".
# For 0 points passed it returns False: segment "off"
def read_segment(im, points, threshold=128):
    val = 0
    for point in points:
        val += im.getpixel(point)
    return val < threshold * len(points)

# Nothing interesting in here, just for printing the date from the file name
def get_date(file_name):
    time_str = file_name.split('-')[-1].replace('.png', '')
    d1, d2, m1, m2, y1, y2, y3, y4 = file_name.split('/')[-1].split('-')[0]
    return '{}{}/{}{}/{}{}{}{} '.format(m1, m2, d1, d2, y1, y2, y3, y4) \
        + '{}{}:{}{}:{}{}'.format(*list(time_str))

# A bunch of imports in the middle of the file
# Don't do that at home ;-)
import sys
import subprocess

# This will be for example: pngs/dell\:8000/24092016-101055.png
file_in = sys.argv[1]
# And this: processed-pngs/dell\:8000/24092016-101055.png
file_out = 'processed-' + file_in

# Calling ImageMagick as discussed in the article
subprocess.check_call(['convert', file_in,
    '-gamma', '1.5', '-auto-level', '-brightness-contrast', '35x80',
    '-shear', '-14x0', '-crop', '260x70+320+227',
    '-channel', 'Green', '-separate', '-resize', '40x20', file_out])
# Reading what it created
im = Image.open(file_out)

# Now a thing I added at some point later.
# Because of different lighting throughout the day, and because the
# ImageMagick command above was not good at compensating for it (in spite
# of auto-level and high contrast), some of the images were darker than
# the others.  In most images (the ideal scenario for the code) the first
# 9 pixels of the image, (0,0) to (3,3), were just white (or very close),
# but in the darker images the whole image was darker, and I used the
# first 9 pixels to detect how much darker.
first9px = im.getpixel((0,0)) + im.getpixel((0,1)) + im.getpixel((0,2)) \
         + im.getpixel((1,0)) + im.getpixel((1,1)) + im.getpixel((1,2)) \
         + im.getpixel((2,0)) + im.getpixel((2,1)) + im.getpixel((2,2))
# This is the compensation: for most of the images it's 0 or very little,
# but for darker images it's more
avg9px = 255 - first9px / 9

num = '{}{}{}.{}'.format(
    '-' if read_segment(im, [(2,6), (3,6), (4,6)], tSIGN - avg9px) else '+',
    read_digit(im, d1_sA, d1_sB, d1_sC, d1_sE, d1_sF, d1_sG, avg9px),
    read_digit(im, d2_sA, d2_sB, d2_sC, d2_sE, d2_sF, d2_sG, avg9px),
    read_digit(im, d3_sA, d3_sB, d3_sC, d3_sE, d3_sF, d3_sG, avg9px),
)
if 'None' not in num:
    print get_date(file_in) + '\t' + num
# If there's a second argument to the script passed, show the original
# image for comparison
if len(sys.argv) > 2:
    Image.open(file_in).show()
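If you want to sanity-check the segment-to-digit table in isolation, here's a minimal standalone sketch (Python 3 here; the names are mine, not the original script's):

```python
# Standalone version of the six-segment -> digit lookup used by the script
# (the bottom segment D is ignored, as explained in the post).
SEGMENTS_TO_DIGIT = {
    'ABCEF': 0, 'BC': 1, 'ABEG': 2, 'ABCG': 3, 'BCFG': 4,
    'ACFG': 5, 'ACEFG': 6, 'ABC': 7, 'ABCEFG': 8, 'ABCFG': 9,
    '': 0,  # the first digit shows nothing at all for zero
}

def decode_digit(segs):
    """Return the digit for a set of lit segments, or None if unreadable."""
    return SEGMENTS_TO_DIGIT.get(segs)

print(decode_digit('ABEG'))    # 2
print(decode_digit('ABCEFG'))  # 8
print(decode_digit('AG'))      # None -> the reading gets filtered out later
```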

Even though this script is so simple, after tweaking the thresholds most of the files were recognized correctly. Those that weren't had one or more unrecognized digits, so they were easy to filter out. For over a day's worth of images, only 2 or 3 minutes had a missing reading.

I loaded the data into LibreOffice and generated this pretty graph:

Timelapse video

Another approach to visualizing the data was to create a video.

The plan:

Annotate the images with the recorded time

Compose the video from the single frames, putting 60 frames into each second of the resulting video

60 frames a second, with roughly 2 frames captured a minute, means a day of recording is compressed to about 48 seconds of video.
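The arithmetic, spelled out (a quick sanity check of the numbers above, not from the original post):

```python
frames_per_minute = 2    # dup.sh produces about 2 photos a minute
playback_fps = 60        # the timelapse plays 60 frames per second

frames_per_day = frames_per_minute * 60 * 24
seconds_of_video = frames_per_day / playback_fps

print(frames_per_day)    # 2880
print(seconds_of_video)  # 48.0
```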

Working around a buggy sound card
http://piotr.gabryjeluk.pl/blog:work-around-buggy-soundcard
by Gabrys
Tue, 02 Dec 2014 23:34:29 +0000
I found that the laptop I bought recently has a weird issue with its sound card. It plays music fine and lets you change the mixer levels, nothing special, but when you change a mixer setting while music is playing, the playback gets choppy and the process handling the mixer tends to freeze. You also cannot pause the music; sometimes it repeats the same sample over and over, which gets very annoying.

My approach to this issue was to use PulseAudio's software mixer capabilities, so the hardware mixer is never adjusted (other than for the initial setup).

Step 1. Getting PulseAudio to use the software mixer

I found no good documentation on this, other than that PulseAudio uses "paths" to convert the desired volume into hardware mixer settings, and that if it lacks the ability to do so it turns the volume up and down in software. Reading between the lines, I realized that if I remove the paths I'll get what I want. Just do this, restart PulseAudio, and you'll have PulseAudio doing all the mixing in software:
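The command itself is missing from the archived post. On a typical Debian-style install the paths directory lives under /usr/share/pulseaudio/alsa-mixer, so the step presumably looked something like this (a reconstruction, not the author's exact commands; check the path on your distro):

```shell
# Move PulseAudio's ALSA mixer "paths" out of the way, then restart the daemon.
sudo mv /usr/share/pulseaudio/alsa-mixer/paths \
        /usr/share/pulseaudio/alsa-mixer/paths.disabled
pulseaudio -k   # kill the running daemon; it gets respawned (or: pulseaudio --start)
```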

Verifying it works

Open two terminal windows. In each of them open alsamixer. Both should display very simple controls and say Card: PulseAudio. Use F6 to switch to the real card in one of the windows. Real cards usually have a plethora of controls, and so should yours.

Now change the output volume in the alsamixer that still says "Card: PulseAudio". If the volume reported by the other alsamixer does not change, it works as expected!

If you now rename the paths directory back and restart the pulseaudio process, updating the PulseAudio volume should update the real device volume as well.

Step 2. Integrating this with XFCE (or other DE)

The way xfce4 handles the volume is twofold:

There's an xfce4-volumed daemon. It only handles the key bindings, binding the volume-up, volume-down and mute keys to the real device controls.

There's a panel applet called "Audio Mixer" that shows the current volume level and lets you change the volume as well.

Both components are optional, and neither seems to work with PulseAudio (even through PulseAudio's ALSA emulation, which tricks alsamixer, for instance).

We're going to overcome this problem by replacing these components with two other programs.

pa-applet will replace the "Audio Mixer" applet. The difference is that pa-applet docks to the systray, so it is not XFCE-specific. The downside is that we have no control over the specific position of the icon; all we know is that it will appear in the systray (so don't forget to add the systray applet to the panel).

To install this software you need to clone the source and run ./autogen.sh, ./configure, make, sudo make install. You will get errors about missing deps, for instance:

configure: error: Package requirements (glib-2.0) were not met: No package 'glib-2.0' found

You need to find the proper *-dev package for each unmet dependency (for this one it's libglib2.0-dev).

I came across one additional problem. The software seems not to have been updated after some GTK+ 3 changes, and it uses a deprecated function. Because the build is configured so that every warning stops compilation, it won't compile unless you open the src/Makefile file and remove the -Werror flag from it.
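One quick way to drop the flag is a sed one-liner; the demo below runs on a throwaway copy rather than the real src/Makefile:

```shell
# Demonstrate stripping -Werror from a Makefile line with sed.
printf 'CFLAGS = -Wall -Werror -O2\n' > /tmp/Makefile.demo
sed -i 's/ -Werror//' /tmp/Makefile.demo
cat /tmp/Makefile.demo   # CFLAGS = -Wall -O2
```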

Add the executable bit, restart the laptop, and everything should be working correctly at this point.

Setting the hw mixer controls

Since PulseAudio won't update the hardware controls after you follow this little howto, it's quite important to first set all the important channels to 100% (or another number, to your taste). The way to update the hw controls was described before.

Open the terminal, type alsamixer and hit F6. This gives you the list of ALSA devices; besides the PulseAudio emulation, your real sound cards should be displayed as well. Choose the right one and you should be given access to tweak the real card's controls.

Note that if you removed the PulseAudio paths, the PulseAudio controls won't update when you tweak the hw controls.

Share screen over HTTP and M-JPEG
http://piotr.gabryjeluk.pl/blog:share-screen-over-http-and-mjpeg
by Gabrys
Mon, 13 Oct 2014 20:14:05 +0000
Here's a script to share your screen in a way that you can use a simple web browser to see it:

After spending some time figuring out why this doesn't work out of the box, I learned that most of the tutorials are wrong about one thing. They claim that when sharing within the same (sub)network you don't need any special configuration on the client. That's wrong. You need to install the cups-browsed package on the client:
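On Debian/Ubuntu-style systems that would be (assuming apt as the package manager):

```shell
sudo apt-get install cups-browsed
```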

Ad Filtering On Android Phones
http://piotr.gabryjeluk.pl/blog:ad-filtering-on-adroid-phones
by Gabrys
Fri, 06 Apr 2012 21:44:43 +0000
Recently I installed a CM9-based Ice Cream Sandwich ROM on my HTC Desire and it's cool, but I found one thing extremely frustrating: the ads. They are everywhere and slow things down. I hadn't noticed them before, because I had always used AdAway (and AdFree before that), but with this ROM those ad blockers refused to work.

The problem seems to be that /system/etc/hosts is a real file on the flash filesystem, and if modified it can only be about 2 kilobytes big, so regular hosts-based ad blocking can't be used here. For some reason, creating a symbolic link to /data also fails.

I thought the ad blocking could be done differently: by a DNS server. If we can't put much into the /etc/hosts file, let's set up a DNS server (say dnsmasq) that reads the hosts-to-be-blocked file, returns 127.0.0.1 for those domains, and acts as a proxy DNS server for every other domain. But there was a problem with setting the DNS server in the Android settings. It seems the ROM has hardcoded OpenDNS servers and you can only override this per WiFi network. There is no way to do this for 3G.

But then I found that the ROM comes with an iptables executable (possibly from busybox, I haven't really checked) and, more interestingly, an iptables-capable kernel. After a bit of trial and error (and Google searching) I found commands to redirect all DNS traffic coming from the phone to one predefined server:
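The exact commands are missing from the archive; a standard way to DNAT all outgoing DNS to one server looks like this (reconstructed, not the author's exact lines; 87.118.110.215 is the FoolDNS resolver discussed in the post; run as root):

```shell
# Redirect all outgoing DNS (UDP and TCP port 53) to one DNS server.
iptables -t nat -A OUTPUT -p udp --dport 53 -j DNAT --to-destination 87.118.110.215
iptables -t nat -A OUTPUT -p tcp --dport 53 -j DNAT --to-destination 87.118.110.215
```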

The shell throws out one-line "TODO (implement getprotobyname)" errors, but these are harmless to us. This is effective until you flush iptables (with iptables -t nat -F) or reboot the phone.

At first I wanted to set up the ad-blocking DNS server myself, but then I figured someone must have done it already. I wasn't wrong; there were at least two relevant Google results: AdBarricade and FoolDNS. So you just need to choose one and you're set.

But how do you compare the two? I mentioned I used AdAway and AdFree before, and their block lists are very good (personal opinion), so I decided to download the hosts file generated by AdAway (it has an option to write the hosts file anywhere, not just to plain /etc/hosts), shuffle it a bit, and for the top 1000 hosts check whether AdBarricade and FoolDNS block them. The results were:

For our purpose (measured by how many of the AdAway-listed hosts are blocked), FoolDNS is the winner. That's why 87.118.110.215 was used in the iptables commands above.

Once we have our DNS-based, iptables-powered ad filtering, we may want to make it persistent. The BCM ROM has a script that's called during the boot process, /etc/boot.d/99bash, and we can just append the two abovementioned lines to it so they run each time the phone boots.

It's worth noting that libqt4-webkit is no longer a valid package in experimental; it is now a transitional package that pulls in libqtwebkit4, so QtWebKit and Qt versions are now quite independent. You can use Qt 4.6 or 4.7 with QtWebKit 2.0.

I needed QtWebKit 2.0 on ARM, so I built it from deb-src. It went fine with no modifications to the source package.

When it comes to Qt's 4.7 beta release, I needed to patch it slightly: disable precompiled headers (for some reason, after generating them, they were not found during installation), add the -fast configure flag (not sure if this is needed), and tweak the src/core/io/io.pri file, because for some reason the linux-* {} section of the qmake file was not properly applied. Also, to compile everything properly, I enabled Qt3 support.