I am a contributor to GIMP, CrossFire,
XSession, RPlay and
some other programs. Besides contributing some code
to GIMP, I also try to take care of the GIMP
bugs in Bugzilla and keep the GIMP web site on life
support.

A few years ago (1993-1994), I was the main
author of DEU (Doom Editing Utilities), the first Doom level
editor that was able to create new levels from scratch,
and I co-authored the Unofficial Doom
Specs and later the Unofficial
Quake Specs (reverse-engineering these file formats was
fun, especially the BSP part). I wish I had some spare
time left to maintain DEU, which is getting quite old now,
not to mention finishing and distributing my GIMP plug-ins
for WAD/PAK files.

Recent blog entries by Raphael

Thanks to Alex Graveley for linking to a very interesting new research result from Ed Felten and others, explaining that encryption keys can be easily retrieved from the memory of a running system by power-cycling it. Contrary to what most people think, it is possible to retrieve almost all data from a DRAM chip several seconds or minutes after a power cut. Many companies (including the one I work for) require hard disk encryption for all laptop computers in order to ensure that any sensitive information stored on the machine cannot be retrieved even if the machine is stolen.

However, the report published by the Princeton researchers shows that if the machine is running or is in suspended mode, then an attacker can steal it and get both the encrypted hard disk and the decryption key. This key must be stored in the RAM of the running system so that it can access the files on disk. The attack consists of briefly removing the power from the machine and rebooting it using a small program that saves the contents of the memory to some external storage. Once this is done, the hard disk encryption key can be retrieved from the saved data. Some machines have a mechanism that clears their memory after a reboot (this is often the case with ECC memory). But even in this case, it is still possible to retrieve the decryption key by cooling down the memory chips, removing them from the machine and inserting them into another machine that will extract the valuable information.

This is a serious problem for anybody who relies on hard disk encryption for protecting confidential data: an attacker who has physical access to the machine (even for just a brief moment) may be able to retrieve the decryption key and get full access to the contents of the disk. Leaving the machine unattended in suspended mode or with the screen locked may be the same as leaving it fully open.

There are not many ways to avoid this problem, besides preventing physical access to the machine or using some software or hardware self-destruction mechanism in case the machine is tampered with. If the machine is suspended, the research paper (PDF) explains that it may be possible to clear or obscure the key before suspending the system so that it cannot be retrieved easily. The user would then have to re-enter the disk encryption key before resuming the system, or enter a password to decrypt that key. This is not trivial to implement because the system cannot read any information from the encrypted disk until the user has entered the right password, so all software needed for entering passwords and setting input and output devices to a known state must be available before the system is resumed.

It is not possible to implement the same protection when the screen is simply locked, because there will usually be some software that wants to access the hard disk while the screen is locked. The paper describes a way to make it slightly more difficult to retrieve the key from RAM: if the system does not need to access the disk for a while, it could scramble the key (in a reversible way) and spread it over a larger area in memory in such a way that a single bit error over the whole area would make the key unusable. As soon as the key is needed again, it is reassembled and used until it is not needed anymore. This can provide some limited protection because the cold boot attack does not always get a perfect copy of the RAM. But even with this additional level of protection, it looks like a locked screen is a very weak protection against data theft.
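To illustrate the key-spreading idea (this is a sketch of the general technique, not the paper's exact scheme), one way to do it is to store the key as the XOR of many random blocks scattered in memory: recombining the blocks yields the key, and a single flipped bit in any block yields a different, unusable key.

```python
import os
from functools import reduce

KEY_LEN = 32  # e.g. a 256-bit disk-encryption key

def scatter(key: bytes, n_blocks: int = 64) -> list[bytes]:
    """Split the key into n_blocks XOR shares spread over memory.
    Any single-bit error in any share corrupts the recovered key."""
    shares = [os.urandom(len(key)) for _ in range(n_blocks - 1)]
    # Last share makes the XOR of all shares equal to the key.
    last = bytes(b ^ reduce(lambda a, c: a ^ c, chunk)
                 for b, chunk in zip(key, zip(*shares)))
    return shares + [last]

def gather(shares: list[bytes]) -> bytes:
    """Recombine the shares; only done while the key is in use."""
    return bytes(reduce(lambda a, c: a ^ c, chunk) for chunk in zip(*shares))

key = os.urandom(KEY_LEN)
shares = scatter(key)
assert gather(shares) == key

# A single flipped bit anywhere destroys the recovered key:
damaged = [bytearray(s) for s in shares]
damaged[3][0] ^= 0x01
assert gather([bytes(s) for s in damaged]) != key
```

A real implementation would of course keep the recombined key out of memory as much as possible and wipe it immediately after use; this sketch only shows the single-bit-fragility property.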

Besides a full text version of the book, that mini-site also contains some simple illustrations made by the author. Looking at the bottom of the “Outer” image, I saw the following copyright notice: “Vernor Vinge, 2006 (Using the GIMP)”

This is not entirely surprising, considering that he is a friend of Free Software and even a member of the award committee for the FSF Award for the Advancement of Free Software. But still, it is nice to see a well-known SF author who is also a GIMP user. As a GIMP developer and SF fan, this made me happy.

I doubt that Vernor Vinge will ever read this blog, but for the next illustrations I would recommend using Inkscape for the line art and text. This would lead to better results than using GIMP alone.

After reading Xan’s article The Cyclomatic Horror From Outer Space analyzing the complexity of some GTK functions, I was curious to run the same test on the GIMP source tree in order to see which parts of the code would be the hardest to test. The test itself is very simple and can be summarized as counting the number of decision points in every function of a program (so you get an idea of the number of possible code paths).
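The counting idea can be sketched in a few lines of Python (a simplified McCabe count over Python source, not a reimplementation of pmccabe, which works on C): start each function at 1 and add one for every branching construct found in its body.

```python
import ast

# Node types that add a decision point (simplified: a BoolOp with
# more than two operands is still counted as a single decision).
DECISIONS = (ast.If, ast.For, ast.While, ast.ExceptHandler,
             ast.BoolOp, ast.IfExp)

def cyclomatic(source: str) -> dict[str, int]:
    """Return {function name: complexity}, starting each function at 1."""
    result = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            result[node.name] = 1 + sum(
                isinstance(n, DECISIONS) for n in ast.walk(node))
    return result

print(cyclomatic("def f(a, b):\n    if a and b:\n        return 1\n    return 0"))
# The 'if' and the 'and' each add one: {'f': 3}
```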

I did as suggested and started with "apt-get install pmccabe", followed by "pmccabe app/*/*.c | sort -nr | head -10" to get the 10 functions with the highest (worst) results. This gave me the following table:

Cyclomatic complexity   Lines of code   Function name
113                     892             gimp_display_shell_canvas_tool_events
100                     123             layers_actions_update
 69                     416             update_box_rgb
 61                     359             border_region
 60                     445             siox_foreground_extract
 56                     452             render_image_tile_fault
 54                     327             combine_inten_a_and_inten_a_pixels
 53                     194             gimp_plug_in_procedure_add_menu_path
 47                     275             gimp_drawable_offset
 46                     240             gimp_vector_tool_oper_update

According to the CMU page on cyclomatic complexity, numbers between 21 and 50 reveal a "complex, high risk program" and numbers above 50 only occur in an "untestable program (very high risk)".

What does this mean for GIMP? Not much. But if you touch one of these functions, please be careful… you might break things and it will be very hard to find where the bugs are hiding in that code.

Some GIMP users who follow tutorials written for Adobe Photoshop are sometimes confused when they see statements like “Save your image using quality 8 or 9 in order to get good results” because this obviously does not match the scale from 0 to 100 used by GIMP (and other software based on the IJG JPEG library).

While working on some improvements for GIMP’s JPEG plug-in, I investigated the compression levels used by various programs and cameras. The analysis of several sample images allowed me to build the following mapping table (slightly updated since I posted a similar message to the gimp-web mailing list in August):

Photoshop quality 12 <= GIMP quality 98, subsampling 1×1,1×1,1×1
Photoshop quality 11 <= GIMP quality 96, subsampling 1×1,1×1,1×1
Photoshop quality 10 <= GIMP quality 93, subsampling 1×1,1×1,1×1
Photoshop quality 9 <= GIMP quality 92, subsampling 1×1,1×1,1×1
Photoshop quality 8 <= GIMP quality 91, subsampling 1×1,1×1,1×1
Photoshop quality 7 <= GIMP quality 90, subsampling 1×1,1×1,1×1
Photoshop quality 6 <= GIMP quality 91, subsampling 2×2,1×1,1×1
Photoshop quality 5 <= GIMP quality 90, subsampling 2×2,1×1,1×1
Photoshop quality 4 <= GIMP quality 89, subsampling 2×2,1×1,1×1
Photoshop quality 3 <= GIMP quality 89, subsampling 2×2,1×1,1×1
Photoshop quality 2 <= GIMP quality 87, subsampling 2×2,1×1,1×1
Photoshop quality 1 <= GIMP quality 86, subsampling 2×2,1×1,1×1
Photoshop quality 0 <= GIMP quality 85, subsampling 2×2,1×1,1×1

The quality settings in Adobe Photoshop include not only the compression factor that influences the quantization tables, but also the type of chroma subsampling performed on the image. The higher quality levels use no subsampling, while the lower ones use 2×2 subsampling. The strange transition between Photoshop quality 6 and 7 (quality 6 having a higher equivalent IJG quality than 7) can be explained by the difference in subsampling: since quality 6 has less color information to encode, the size of the file will be smaller anyway, even if more coefficients are preserved in the quantization step.
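The same two knobs (quantization quality and chroma subsampling) are exposed by most JPEG encoders, so the mapping above can be applied in scripts too. As a rough illustration using the Pillow library (not GIMP itself; the exact quantization tables differ between encoders, so treat the numbers as approximations):

```python
from io import BytesIO
from PIL import Image

img = Image.new("RGB", (64, 64), "red")  # stand-in for a real photo

# Roughly Photoshop quality 10: IJG quality 93, no chroma subsampling.
# In Pillow, subsampling=0 means 4:4:4 (1x1) and subsampling=2 means 4:2:0 (2x2).
hi_q = BytesIO()
img.save(hi_q, "JPEG", quality=93, subsampling=0)

# Roughly Photoshop quality 0: IJG quality 85 with 2x2 (4:2:0) subsampling.
lo_q = BytesIO()
img.save(lo_q, "JPEG", quality=85, subsampling=2)
```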

You may also be surprised by the fact that the default GIMP JPEG quality level (85) matches the lowest quality offered by Photoshop: quality 0. This makes sense if you consider that the default “Save” offered by Photoshop is designed for high-quality images, so the losses should be minimized. But if you want to save images for web publishing, then Photoshop has a separate “Save for Web” feature that can save images using lower quality levels.

This mapping between Photoshop and GIMP quality levels for JPEG is not exact and is intentionally pessimistic for GIMP. There is some safety margin, so it is possible to decrease the GIMP quality level a bit and still get a file that is as good as the one saved by Photoshop.

Reminder: if you think that you will need to re-edit an image later, then you should never save it only in JPEG format. Always keep a copy in XCF format (GIMP’s native file format) so that you can edit it without losing any additional information.

Another reminder: using a JPEG quality level below 50 or above 98 is not a good idea. Saving a JPEG image using quality 99 or 100 is just a waste of disk space. You should use a different file format instead (such as PNG or TIFF). And below quality 50, you lose so much that it would be better to rescale your image and use a lower resolution before trying to compress it further.

Last Wednesday, I went to the gas station because my car was a bit thirsty. When I wanted to insert my card and pay for the fuel, I was greeted with this ridiculous error message: “The exception unknown software exception (0x0eedfade) occurred in the application at location 0x77e73887.”

This is so wrong.

The error message goes to the wrong target: the customer cannot do anything about it anyway, so why does it appear on the screen? The touch screen was frozen so I could not even press the OK button. In cases like this, the software should just log the error and blank the screen or display some customer-oriented error message such as “Out of order”. There should be a way to trap these errors (any kind of software error) and redirect them to the company that maintains these terminals instead of throwing them at the customer.

The exception “unknown software exception” shows that things are definitely not under control. How can one trust a system that displays such a stupid error message?

Minor detail: the error message is in English only, while the user interface of this terminal defaults to French and supports multiple languages (Dutch and German, but not English). Trapping the error and displaying “Out of order” in multiple languages would have been more appropriate and more customer-friendly.

If you ask Google about this error message by searching for the error code and address, you will find several matches revealing that various applications are affected by this random crash: Internet Explorer, Photoshop, some Delphi applications and other specialized software. This looks like a mysterious Windows crash that confuses everybody.

Using Windows instead of a more robust embedded operating system is just asking for trouble. The main advantage may be that some customers are already familiar with the Windows error dialogs and can recognize them from a distance, so they know that they should go away and not even bother reading the error message.