What is causing Calibri to lose ClearType between 9 and 14 pt?
What exactly is it that makes GDI+ switch to binary aliasing when rendering the default Microsoft Office font Calibri between 9pt and 14pt, even with ClearTypeGridFit specified?

It's somewhat disconcerting. How many other fonts are also affected by whatever is behind this, and at what sizes? Is there a workaround? (Excluding GDI, which doesn't have the same text layout features?)

Here's the code I used to generate the image:

// Requires: using System.Drawing; using System.Drawing.Text; using System.Windows.Forms;
private void Form1_Paint(object sender, PaintEventArgs e)
{
    // Ask GDI+ for ClearType with hinting ("grid fitting").
    e.Graphics.TextRenderingHint = TextRenderingHint.ClearTypeGridFit;

    var height = 0;
    for (var i = 1; i <= 17; i++)
    {
        using (var font = new Font("Calibri", i))   // size i is in points
        {
            var text = "ClearTypeGridFit " + i + "pt";
            e.Graphics.DrawString(text, font, SystemBrushes.ControlText, 0, height);
            height += (int)e.Graphics.MeasureString(text, font).Height;
        }
    }
}
Faso answered 1/5, 2015 at 12:4 Comment(5)
Graphics.DrawString() only produces decent output on high-DPI devices. Printers, not monitors. Use TextRenderer.DrawText(e.Graphics, text, font, new Point(0, height), SystemColors.ControlText); instead. – Befoul

@HansPassant Understood. The question is still interesting and important to me. This is what I'm working with. – Faso

Clearly you are going to wait until DevEx gets off their butt and does something about it. Meanwhile, use a font that behaves better; the XP fonts don't have this problem. Segoe UI is fine too. – Befoul

Probably will. In the meantime, I'm really hoping someone is knowledgeable enough to shed light on this mystery. Since I'm not in control of the font, who knows where else this will come up with other fonts? – Faso

@HansPassant People are going to run into this issue and will get a lot more out of your comment if you write it as an answer instead, even if it's not the exact answer to the question. – Elrod
Calibri comes with an EBLC table and an EBDT table, which tell text engines that for certain point sizes they should not use their own scaling algorithms, but should instead use bitmaps stored directly in the font.

Each font size can come with its own list of "the following glyphs must be bitmapped at this size", called a "strike", so one glyph can have multiple bitmaps for multiple sizes. There can be gaps, though, and when that happens the bitmaps need to be scaled, and things can go catastrophically wrong.

For instance, Calibri has strikes for point sizes 12, 13, 15, 16, 17 and 19, with an example bitmap for A being:

<ebdt_bitmap_format_1 name="A">
  <SmallGlyphMetrics>
    <height value="8"/>
    <width value="7"/>
    <BearingX value="0"/>
    <BearingY value="8"/>
    <Advance value="7"/>
  </SmallGlyphMetrics>
  <rawimagedata>
    10102828 447c8282  
  </rawimagedata>
</ebdt_bitmap_format_1>

This bitmap is referenced by the font size 12 strike, and is encoded as a 7x8 pixel bitmap. Since 12 is the lowest value, we run into problems when we use a font size lower than 12: suddenly we have to scale a bitmap. This can only go horribly wrong.
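EBDT format 1 stores byte-aligned image data, so for a 7-pixel-wide glyph each byte of the rawimagedata above is one row of pixels, most significant bit first. A quick sketch (Python here, purely to decode the hex dump quoted above) makes the embedded "A" visible:

```python
# Decode the EBDT format-1 bitmap for "A" quoted above.
# Format 1 image data is byte-aligned: with a 7px-wide, 8px-tall glyph,
# each byte holds exactly one row, most significant bit first.
raw = bytes.fromhex("10102828447c8282")
WIDTH, HEIGHT = 7, 8  # from SmallGlyphMetrics

rows = []
for byte in raw:
    # Take the top WIDTH bits of the byte as one row of pixels.
    bits = format(byte, "08b")[:WIDTH]
    rows.append(bits.replace("1", "#").replace("0", "."))

print("\n".join(rows))
```

which prints an unmistakable 7×8 "A":

```
...#...
...#...
..#.#..
..#.#..
.#...#.
.#####.
#.....#
#.....#
```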

If you look at something like WordPad, you can see that Microsoft's Uniscribe engine (used with GDI+; the modern equivalent is Direct2D with DirectWrite as the text engine) can scale these bitmaps down quite well (shown are sizes 5 through 20), but even Microsoft's own technology has clear limitations. At font sizes 5, 6, and 7 the bitmaps are pretty horrible, and even 8, 10, and 11 look kind of wonky:

[Image: "A" at sizes 5 through 20]

Scaled up:

[Image: "A" at sizes 5 through 20, scaled up 3x]

Things get more interesting because not every glyph is represented in every strike, so while "A" has a bitmap at point size 12, there are glyphs for which the lowest point size with an explicit bitmap may be 13, or 15, or 16, or 17, or even 19.

This means you have three problems:

  1. A font might "demand" that the text engine use its bitmaps, instead of rasterising the vector outlines with the text engine's own algorithms, and
  2. There is no magic font size above which all characters are rendered "nicely" and below which all characters are rendered "poorly". A font can have any number of "strikes", containing any subset of the font's encoded glyphs, effectively meaning that each character can have its own rules about when the text engine should switch from rasterised vector to embedded bitmap, and
  3. Text engines are entirely free to completely ignore the font's "demands" and do their own thing anyway, and finding out which engine does what is, despite having the internet at our disposal, virtually impossible. It's one of those things that no one seems to document.

The easiest way to find out which fonts will do this is to simply check whether the font has an EBDT table at all - if it does, the font instructs engines to use bitmaps for very small (and sometimes very large) font sizes. If you want the specifics, you can run the font through TTX (part of fontTools) and then find the <EBDT> table start, to see what's really going on.
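Checking for an EBDT table doesn't even need a full font parser: the sfnt table directory at the start of every TTF/OTF file lists all table tags. A minimal sketch using only Python's standard library (the in-memory `header` bytes below are a fabricated stand-in for a real .ttf; with an actual font you'd read the first few kilobytes of the file instead):

```python
import struct

def font_tables(data: bytes) -> list[str]:
    """Return the table tags from an sfnt (TTF/OTF) table directory."""
    # Offset table: sfntVersion (4 bytes), numTables (uint16),
    # then searchRange/entrySelector/rangeShift (ignored here).
    num_tables, = struct.unpack_from(">H", data, 4)
    tags = []
    for i in range(num_tables):
        # Each 16-byte directory entry: tag, checksum, offset, length.
        tag, = struct.unpack_from(">4s", data, 12 + 16 * i)
        tags.append(tag.decode("latin-1"))
    return tags

def has_embedded_bitmaps(data: bytes) -> bool:
    return "EBDT" in font_tables(data)

# Synthetic two-entry table directory standing in for a real font file.
header = struct.pack(">IHHHH", 0x00010000, 2, 32, 4, 0)
header += struct.pack(">4sIII", b"EBDT", 0, 0, 0)
header += struct.pack(">4sIII", b"glyf", 0, 0, 0)

print(has_embedded_bitmaps(header))  # True, as it would be for Calibri
```

With a real file, `font_tables(open("calibri.ttf", "rb").read())` (hypothetical path) gives the same answer as scanning TTX output, just much faster.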

Prepare to be overwhelmed, though. Calibri alone has bitmaps specified for well over a thousand glyphs, for example.

Codification answered 1/5, 2015 at 19:40 Comment(2)
Those strikes don't match up with 9pt - 14pt, but your explanation makes sense otherwise. I thought all modern text rendering engines used vector rendering (cached, obviously) - would you say it is generally true that most fonts are rendered using the embedded rasters? – Faso
For professional TTF-OpenType fonts like Calibri, embedded bitmaps are going to be far better than what a vector rasterizer can produce (rastertragedy.com explains why in the most amazing detail). As such, good text engines will make use of them - but not all of them do. And just because you're on Windows doesn't mean you're always using the same engine: GDI+ I think uses Uniscribe, but modern versions of Windows will use Direct2D with DirectWrite. That said, I've not programmed for Windows since Windows 95, so that deserves verification. – Elrod
