Re: Two encoding problems
On 22 Mar 2004 at 6:27, Tony Mechelynck wrote:
> Thanks for drawing my attention to the 'printfont' option (which
> works on my system and is set to "Courier_New:h10"). But I don't
> think that, without a functional 'printencoding', it will help me
> print a file in UTF-8 with mixed Latin, Cyrillic and Arabic
> characters on a printer which, AFAIK, understands only graphics or
> cp1252 text (e.g. the file
> http://users.skynet.be/antoine.mechelynck/index.htm ).

It is still a Vim todo item to be able to print multiple character
sets in one file. While Unicode provides a single encoding, printing
a very large character repertoire is a much harder problem, usually
solved by switching between fonts with different encodings.
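To make the font-switching idea concrete, here is a minimal sketch (not Vim's actual implementation) of splitting mixed-script UTF-8 text into runs of consecutive same-script characters, so that each run could be printed with a font covering that script. The codepoint ranges are deliberately simplified; real script detection uses the full Unicode script property data.

```python
def script_of(ch):
    """Very rough script classification by codepoint range (illustrative only)."""
    cp = ord(ch)
    if 0x0400 <= cp <= 0x04FF:
        return "Cyrillic"
    if 0x0600 <= cp <= 0x06FF:
        return "Arabic"
    return "Latin"  # catch-all bucket for everything else

def split_runs(text):
    """Return (script, substring) runs of consecutive same-script characters."""
    runs = []
    for ch in text:
        s = script_of(ch)
        if runs and runs[-1][0] == s:
            runs[-1] = (s, runs[-1][1] + ch)
        else:
            runs.append((s, ch))
    return runs

# Each run would then be rendered with a script-appropriate font.
print(split_runs("abc\u0430\u0431\u0432"))
```

A printing back end would walk these runs, selecting a different printer font (and, on legacy printers, a different 8-bit encoding) for each script before emitting the text.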
> Actually, I got the wrong end of the stick too. To avoid hollow
> boxes I must "send TT as graphics" instead of the default "send TT
> as bitmaps". In any case, IIUC, the idea is to bypass the printer
> fonts completely (which is slower but safer I suppose) when
> printing text that might include "exotic" characters like the
> accented consonants of Esperanto (c, g, h, j, s with circumflex and
> u with breve).

Sending TT as graphics gets around the problem of missing or
mismatched fonts on the printer. In a way, cheaper printers on
Windows are more predictable, since Windows will always pre-raster
the text, so you always get what you see on screen. Using
Ghostscript to pre-raster PostScript files before sending them to
the printer is really the same thing.
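For reference, pre-rastering with Ghostscript looks roughly like this; the device name and resolution are assumptions that depend entirely on the target printer, so treat this as a sketch rather than a recipe:

```shell
# Rasterize a PostScript file into printer-ready raster data so the
# printer's own fonts are never used (everything arrives as graphics).
# "ljet4" is one of Ghostscript's raster printer devices; substitute
# whatever device matches your printer, and adjust -r (dpi) to taste.
gs -dBATCH -dNOPAUSE -sDEVICE=ljet4 -r600 \
   -sOutputFile=output.prn input.ps
```

The resulting file is then sent straight to the printer, exactly as Windows does when it pre-rasters TrueType text.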
The only other reliable method is to include a copy (possibly a
subset) of the font in the data sent to the printer, but that
requires a more expensive printer with an embedded font interpreter
and that then just opens up another Pandora's box of issues.
Laugh, and the world ignores you. Crying doesn't help either.