The font size is normally a negative number. ChooseFont would give you -13 for a 10-point Arial font.

On the screen the nFactor is 15. On the printer it's 4. If I do exactly the same thing, the font size is way too small. If I substitute a 15 for the 4, it's OK. Obviously the calculation of the factor is incorrect.

The way you set up the mapping and extents should allow you to specify dimensions such that the same values produce the same physical sizes on both the screen and the printer, i.e., a font of height 15 on the printer should be nearly the exact same size as a font of height 15 on the screen.

Is that your intent? (It's often a good scheme, but what confuses me is that with it you wouldn't want that nFactor type of formula.)

With the mapping mode set this way there are 1440 logical units per physical inch (on both the screen and the printer). Since a point is approximately 1/72 of an inch, there are approximately 20 logical units per point. So the font size in logical units (which is what you specify when creating the font) should be 20 times the desired font size in points.

Maybe the other thing that confuses me is that when you use ChooseFont and select a 10-point font, it returns -13 in the lfHeight field of the LOGFONT structure. This is the field I have been using as the font size. Should I be using iPointSize?

>> When I multiply the fontsize by 20 the font turns out too large.
Too large in what sense? You need to multiply the point size you want by 20. So if you want a 12-point font, you should use 240. This should yield a font that is about 1/6 of an inch high. (Actually you probably want to use -240, not 240.)

What are you using and what are you getting?

>> I originally tested this at home with a HP deskjet
>> printer. It worked fine. When I
>> got to work and tried it on our Laser printers the
>> fonts came out too large?
The same code? The whole point is that this is hardware independent, i.e., you get the same size results regardless of the resolution (pixels per inch) of the device.

>> you use choosefont and select a 10 point font, it
>> returns -13 in the lfHeight field of the logfont field
I don't use ChooseFont, so I'm not sure what is going on there, but one issue is that a negative height indicates that you are specifying the height of the entire "line", which includes the ascent, descent, and leading. A positive height indicates you are specifying only the ascent and descent height, so the font returned will be a little larger than this. Because of this you often want to use negative heights in LOGFONT. So why is it returning -13, which is too small for a 10-point font? I would have to guess that is because it is returning the height in device units, not logical units (because it doesn't know what sort of mapping mode you use). So it is saying the font should be 13 pixels high (which is just about right for a 10-point font if you have a standard 96 pixels-per-inch monitor). So you must convert these device units to logical units before using them. In your case you will multiply by 1440 and divide by the number of pixels per inch.

>> If I do use iPointSize should I
>>
>> logfont.lfHeight = cf.iPointSize*20;
>>
>> or
>>
>> logfont.lfHeight = -cf.iPointSize*20;
Yes.

It depends on what you want. Positive allows you to specify the height of the characters themselves, not the height of the space "they use"; i.e., if you have multiple lines, they will each use up more space than what you specify, because of the space between the lines. Negative means you are specifying the total space you want for a line, including the space separating the lines.

When you specify a font in most programs, like word processors, you are usually specifying the total spacing (negative), not the space used just by the character.

>> if I set logfont.lfHeight = 0, it defaulted to no
>> height. How can I determine what logfont
>> should be to default to the correct font?
I'm guessing, but you probably need to set lfHeight to the height in device units (pixels). So remember that a point is 1/72 of an inch. So

PixelHeight = PixelsPerInch*PointSize/72

You get PixelsPerInch from GetDeviceCaps() using LOGPIXELSY.

Note that if PixelsPerInch is 96 (typical), then the formula 96*10/72 yields 13, which is what you found above: a 10-point font is 13 pixels high on your monitor.

Note, sometimes I find that 96 is returned for a monitor even when the monitor has a very different resolution. There is nothing you can really do about that; you have to work with what the OS "knows". That can probably be fixed by the user: they need to ensure that the display setup specifies the right monitor and driver. If not, neither the OS nor your program can be blamed for working with the wrong figures.
