Newsgroups: comp.windows.ms.programmer
Path: utzoo!utgpu!watserv1!daemon
From: tom@mims-iris.waterloo.edu (Tom Haapanen)
Subject: Re: Font madness...
Message-ID: <1991Feb25.234350.28096@watserv1.waterloo.edu>
Sender: daemon@watserv1.waterloo.edu
Organization: University of Waterloo, WATMIMS Research Group
References: <1991Feb25.204847.18731@watserv1.waterloo.edu>
Date: Mon, 25 Feb 1991 23:43:50 GMT
Lines: 24

I previously wrote:
> If I do EnumFonts() I can figure out what point sizes are available
> on a given hDC (by doing a GetTextMetrics() and other magic to convert
> a LOGFONT height into points).  But if I know what pointsize I want
> (say, 8-point Helv), how do I select that size?  CreateFontIndirect()
> and CreateFont() both want the height in device pixels, and including
> the internal leading of the font.  How do I get 8-point?

Well, it appears (after some CodeView-style investigation) that the "user
units" are just twips after all, and it's quite easy to get the right size
font.  I only wish that they referred to the units as twips instead of
this ambiguous "user unit".

This also explains something else I had wondered about...  Way back,
when I was using Excel 2.0 and switched from an EGA display to a VGA
display, my worksheet font specs suddenly started getting saved as
8.25, 9.75 and 13.75 points, and I couldn't figure out why.  But it
appears that those are the actual sizes available for Helv under the
VGA driver; Excel just wasn't rounding them to the nominal sizes...

Sorry to waste everybody's time!

[ \tom haapanen --- university of waterloo --- tom@mims-iris.waterloo.edu ]
[ "i don't even know what street canada is on"               -- al capone ]
