2014-10-30

The Dell U2212HM's OSD menu shows a bar labeled "Energy Use" that goes from 1 (least watts) to 20 (most). I wanted to see what the actual correlation was, so I broke out my Kill A Watt. The simple testing method was to change the brightness from lowest to highest, which tracked with the energy use bar. (Aside: setting the contrast almost to black results in a 1w decrease, even with brightness at 100%.)

Monitor resolution was 1080x1920 (portrait mode), with Gmail open while changing brightness. Values listed are averages: there was up to a 0.8w variance from the start of a bar to the end. (For example, going from 3 bars to 4 showed 13.8w, and right before going to 5 it read 14.4w.)

Energy Use to Watt Correlation

Bars   Watts (avg)
  1    11.8
  2    12.4
  3    13.4
  4    14.0
  5    14.6
  6    15.3
  7    15.8
  8    16.4
  9    17.2
 10    17.7
 11    18.2
 12    18.8
 13    19.4
 14    20.0
 15    21.0
 16    22.5
 17    23.8
 18    25.0
 19    26.4
 20    27.9
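One thing the table suggests is that the scale isn't linear: the per-bar increase is small through 14 bars and roughly doubles after that. A few lines of Python make this easy to check (the list is just the watt column from the table retyped; nothing else is assumed):

```python
# Watt readings for bars 1..20, copied from the table above.
watts = [11.8, 12.4, 13.4, 14.0, 14.6, 15.3, 15.8, 16.4, 17.2, 17.7,
         18.2, 18.8, 19.4, 20.0, 21.0, 22.5, 23.8, 25.0, 26.4, 27.9]

# Increase in watts from each bar to the next.
steps = [round(b - a, 1) for a, b in zip(watts, watts[1:])]
print(steps)

# Average step below and above the 14-bar knee.
print(round((watts[13] - watts[0]) / 13, 2))   # bars 1-14: ~0.63 w/bar
print(round((watts[19] - watts[13]) / 6, 2))   # bars 15-20: ~1.32 w/bar
```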

With a colorful video playing, at 20 bars energy use, the power consumption was around 26w. A full-screen PDF of the manual (islands of black text on oceans of white) resulted in the highest consumption, at 28.1w. Dell's conservative 'normal' value in the specs is 30w; the Kill A Watt measurement supports this.

Based on an energy cost calculator, the yearly cost to leave this monitor on 24/7 at the highest observed energy use (28w, or 0.672 kWh/day) at 12 cents per kWh is $29.43/yr ($2.45/mo, $0.08/day).

Using 'good enough' brightness for me (15 bars, 21w) and assuming a more realistic 16 hours a day of use (ignoring marathons of sexual transgression with the red bulls) gives 0.336 kWh/day, costing $0.04/day, $1.23/mo, or $14.72 a year. (x2 for its twin)
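Both scenarios reduce to the same formula, so anyone can redo the arithmetic with their own electricity rate. A minimal Python sketch (the 12 cents/kWh rate and a flat 365-day year are the assumptions; `yearly_cost` is just a name picked here):

```python
def yearly_cost(watts, hours_per_day, dollars_per_kwh=0.12):
    """Return (kWh/day, $/day, $/mo, $/yr) for a constant load."""
    kwh_day = watts / 1000 * hours_per_day
    d_day = kwh_day * dollars_per_kwh
    return kwh_day, d_day, d_day * 365 / 12, d_day * 365

# The two scenarios from the post: 28w worst case on 24/7,
# and 21w at 15 bars for 16 hours a day.
for w, h in [(28, 24), (21, 16)]:
    kwh_day, d_day, d_mo, d_yr = yearly_cost(w, h)
    print(f"{w}w x {h}h: {kwh_day:.3f} kWh/day, "
          f"${d_day:.2f}/day, ${d_mo:.2f}/mo, ${d_yr:.2f}/yr")
# 28w x 24h: 0.672 kWh/day, $0.08/day, $2.45/mo, $29.43/yr
# 21w x 16h: 0.336 kWh/day, $0.04/day, $1.23/mo, $14.72/yr
```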

I remain opposed to running the numbers on the 27", at least until I set up NirCmd so I can turn all three off with a hotkey.