
I recently purchased an HP Spectre X360 with a 13-inch display and 4K resolution.

Problem: Attempting to run the system at HD (1080p) makes everything blurry and difficult to read in Windows 10.

Why use HD 1080p resolution? Windows 10's scaling functionality is not complete, and many applications don't scale well (tigervnc and Unity3D, for example). I imagine that in a couple of years most applications will be DPI aware, and this problem will generally only affect older applications that are no longer actively maintained.
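
For anything you write yourself, opting in to DPI awareness is a one-line call. The sketch below (Python on Windows, using only the standard ctypes module) declares the current process DPI-aware so Windows renders it at native resolution instead of bitmap-stretching it; it's just an illustration of what the opt-in looks like, not a fix for third-party applications.

    import ctypes

    # Declare this process DPI-aware so Windows renders its windows at native
    # resolution instead of bitmap-stretching them on a high-DPI display.
    # SetProcessDPIAware (in user32) is the oldest, simplest opt-in; newer
    # Windows versions add finer-grained control via SetProcessDpiAwareness.
    ctypes.windll.user32.SetProcessDPIAware()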

Why is running a 4K display at HD resolution a problem on some systems? I know what you're thinking: if I set my display to 1920x1080, everything should scale perfectly, since it's an exact divisor of 3840x2160. Each HD pixel should be represented by four physical pixels, and everything should look pretty good, just not as sharp. The issue appears to be the graphics driver trying to be clever when it shouldn't. Rather than outputting an HD signal to the monitor, it outputs a 4K signal and applies a scaling algorithm to adjust the picture. It's this scaling algorithm that makes everything look ill-defined and blurry.
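
To make the arithmetic concrete, here's a small illustrative check in Python (nothing Windows-specific, just the divisibility argument above): a target resolution only pixel-doubles cleanly when it divides the native resolution by the same whole number in both dimensions.

    def scale_factor(native, target):
        """Return the integer scale factor if target divides native evenly, else None."""
        nw, nh = native
        tw, th = target
        if nw % tw == 0 and nh % th == 0 and nw // tw == nh // th:
            return nw // tw
        return None

    native_4k = (3840, 2160)
    print(scale_factor(native_4k, (1920, 1080)))  # 2 -> each 1080p pixel covers a clean 2x2 block
    print(scale_factor(native_4k, (2560, 1440)))  # None -> 1.5x, so the scaler must interpolate (blur)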

Solution: You have to convince Windows that your display supports HD as a native resolution. This is easily done with CRU (Custom Resolution Utility), authored by ToastyX, which can be found here:

https://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU  

It was as simple as adding 1920x1080, rebooting, and selecting that resolution. I added it to both the detailed and standard resolutions, and all is well in that department.
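
If you'd rather confirm from a script that the new mode really is in the list Windows exposes (instead of eyeballing the Settings dialog), here's a rough sketch using the Win32 EnumDisplaySettingsW call through Python's ctypes. The DEVMODEW field layout is typed out by hand from the Win32 headers, so treat it as a starting point rather than a reference.

    import ctypes
    from ctypes import wintypes

    class DEVMODEW(ctypes.Structure):
        # Hand-written layout of the Win32 DEVMODEW structure; the fields we
        # care about are dmPelsWidth, dmPelsHeight and dmDisplayFrequency.
        _fields_ = [
            ("dmDeviceName", ctypes.c_wchar * 32),
            ("dmSpecVersion", wintypes.WORD),
            ("dmDriverVersion", wintypes.WORD),
            ("dmSize", wintypes.WORD),
            ("dmDriverExtra", wintypes.WORD),
            ("dmFields", wintypes.DWORD),
            ("dmPositionX", wintypes.LONG),
            ("dmPositionY", wintypes.LONG),
            ("dmDisplayOrientation", wintypes.DWORD),
            ("dmDisplayFixedOutput", wintypes.DWORD),
            ("dmColor", ctypes.c_short),
            ("dmDuplex", ctypes.c_short),
            ("dmYResolution", ctypes.c_short),
            ("dmTTOption", ctypes.c_short),
            ("dmCollate", ctypes.c_short),
            ("dmFormName", ctypes.c_wchar * 32),
            ("dmLogPixels", wintypes.WORD),
            ("dmBitsPerPel", wintypes.DWORD),
            ("dmPelsWidth", wintypes.DWORD),
            ("dmPelsHeight", wintypes.DWORD),
            ("dmDisplayFlags", wintypes.DWORD),
            ("dmDisplayFrequency", wintypes.DWORD),
            ("dmICMMethod", wintypes.DWORD),
            ("dmICMIntent", wintypes.DWORD),
            ("dmMediaType", wintypes.DWORD),
            ("dmDitherType", wintypes.DWORD),
            ("dmReserved1", wintypes.DWORD),
            ("dmReserved2", wintypes.DWORD),
            ("dmPanningWidth", wintypes.DWORD),
            ("dmPanningHeight", wintypes.DWORD),
        ]

    user32 = ctypes.windll.user32

    def list_modes():
        """Enumerate the display modes the primary display reports to Windows."""
        modes = set()
        i = 0
        dm = DEVMODEW()
        dm.dmSize = ctypes.sizeof(DEVMODEW)
        while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
            modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
            i += 1
        return sorted(modes)

    if __name__ == "__main__":
        for width, height, hz in list_modes():
            print(f"{width}x{height} @ {hz} Hz")

Running it before and after the CRU change (and reboot) should show 1920x1080 appearing in the second run if the override took.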

Additional Issues: The auto brightness control was also driving me a little crazy. To turn it off on the HP Spectre X360, I had to disable the power management features in the Intel graphics driver and select a non-HP power plan in Power Options.
