Hi,
On Thu, Jan 28, 2016 at 9:52 AM, Chris Sherlock
<chris.sherlock79@gmail.com> wrote:
> I did a calculation from http://www.sven.de/dpi and apparently I have a
> pixel density of 267.02 PPI.
> Wondering if this is skewing the calculations.
The OS usually doesn't report the actual monitor DPI to the application;
it just reports 96 DPI (if no scaling is applied). In Windows you can
change the scale level - 100%, 125%, 150%, 200%, ... - which just
adjusts what DPI is reported to the application (100% - 96 DPI, 125% -
120 DPI, 150% - 144 DPI, 200% - 192 DPI, ...) so the application can scale
accordingly (at least fonts). I'm not sure how OS X does this.
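
A tiny standalone illustration of that mapping (not LibreOffice code, just
the arithmetic described above - the reported DPI is 96 scaled by the
percentage):

// scale level -> DPI reported to applications
#include <cstdio>

int main()
{
    const int aScales[] = { 100, 125, 150, 200 };
    for (int nScale : aScales)
        std::printf("%d%% -> %d DPI\n", nScale, 96 * nScale / 100);
}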
You should check what DPI is actually reported to LO: OutputDevice
has mnDPIX and mnDPIY - check where they are set and to what values.
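
If it helps, a minimal sketch of the kind of temporary tracing one could
drop into a VCL code path under investigation (only builds inside a
LibreOffice tree; TraceDeviceDPI and pDev are hypothetical names, and if I
remember correctly GetDPIX()/GetDPIY() are the public accessors for
mnDPIX/mnDPIY):

// Temporary debug aid: log the DPI a given OutputDevice ended up with.
#include <sal/log.hxx>
#include <vcl/outdev.hxx>

void TraceDeviceDPI(const OutputDevice* pDev)
{
    SAL_DEBUG("device DPI: " << pDev->GetDPIX() << " x " << pDev->GetDPIY());
}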
DPI influences how pixels are converted to and from actual units (mm,
inch, twips, ...). It could be that we convert between units too often
and the error accumulates if the DPI is set to an unusual value.
Tracking this down could be tricky.
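
To make the "error accumulates" point concrete, here is a small
self-contained sketch (not LibreOffice code; the 1440-twips-per-inch
conversion and the 240-twip line height are just illustrative values). It
rounds each small conversion separately and compares the result with
converting the total once: a "nice" DPI like 96 stays exact, an unusual
one like 267 drifts by several pixels.

// Rounding each per-item conversion separately accumulates error at odd DPIs.
#include <cmath>
#include <cstdio>

// 1 inch = 1440 twips
static long TwipsToPixels(long nTwips, long nDPI)
{
    return std::lround(static_cast<double>(nTwips) * nDPI / 1440.0);
}

static void Demo(long nDPI)
{
    const long nLineTwips = 240;   // e.g. a 12pt line height
    const int  nLines     = 20;

    long nSumOfRoundedLines = 0;
    for (int i = 0; i < nLines; ++i)
        nSumOfRoundedLines += TwipsToPixels(nLineTwips, nDPI);

    long nTotalConvertedOnce = TwipsToPixels(nLineTwips * nLines, nDPI);

    std::printf("DPI %3ld: per-line sum = %ld px, converted once = %ld px, drift = %ld px\n",
                nDPI, nSumOfRoundedLines, nTotalConvertedOnce,
                nSumOfRoundedLines - nTotalConvertedOnce);
}

int main()
{
    Demo(96);    // standard reported DPI -> exact, no drift
    Demo(267);   // unusual physical DPI  -> per-line rounding drifts
}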
Regards, Tomaž