Software developers - DPI scaling

Summer-Time-Fun

Well-Known Member
Apr 1, 2007
529
271
I'm sure many of us know how small things can look on some of the new LCD/TFT widescreens running at 2048x1152 native resolution. It's true that there is a DPI scaling option in WinXP and Vista, but some software developers still refuse to make their applications "resolution aware". Their windows and text look too small on newer screens.
It's something to push for and complain about. If you have software on your system that ignores or won't compensate for a larger DPI setting, and the application's UI is way too small, please email the developers of your software. Simply lowering your display resolution from the native default looks like crap, as we all know.
I won't name any software products, but I'm amazed to see so many of the big players ignoring the larger LCD sizes.

Code:
http://blogs.techrepublic.com.com/window-on-windows/?p=740
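For anyone wondering what "resolution aware" actually involves on the developer side, here is a minimal sketch of the Win32 route (assuming Vista or later for SetProcessDPIAware; the button width is a made-up example, not from any real product):

Code:
// Minimal sketch: opting a Win32 app into DPI awareness and scaling one size.
// SetProcessDPIAware() is Vista+; kBaseButtonWidth is purely illustrative.
#include <windows.h>

int main()
{
    // Promise Windows we will scale ourselves, so it won't stretch us.
    SetProcessDPIAware();

    // Read the DPI the user actually selected (96 is the classic default).
    HDC screen = GetDC(NULL);
    int dpiX = GetDeviceCaps(screen, LOGPIXELSX);
    ReleaseDC(NULL, screen);

    // Scale any hard-coded 96-DPI pixel size by dpi/96.
    const int kBaseButtonWidth = 75;                      // designed at 96 DPI
    int scaledWidth = MulDiv(kBaseButtonWidth, dpiX, 96); // e.g. 94 at 120 DPI

    return scaledWidth > 0 ? 0 : 1;
}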
 

techie

SuupaOtaku
Jul 24, 2008
568
4
Could be worth considering, but I must say you cannot put the whole blame on us software guys.

After all, the screen is hardware... I almost always set my screen to the max resolution to have as many links and icons on the screen at the same time as possible.

Icons the size of a house (256x256) are just a waste of space on my system.

To keep resolution changes from looking like junk, I would say there is a lot left to wish for from both software AND hardware people. It's not that the hardware MUST only support the latest, biggest and so on.

But you are definitely right, compatibility and cross-user friendly systems are important.
I took that to a new level in web design when some friends of mine started screaming for "cross-browser" compatible layouts.

I told them, there is a simple strategy.

a) I will not write two style sheets when one works fine.
b) What doesn't work well in all browsers should perhaps not be in a style-sheet to begin with.

Then came MSIE 6... and 7... and now 8...

And they go further and further away from the rest of the planet.
Needless to say, I for one don't care for writing source code to support stuff that doesn't work right to begin with.

PS / If I had a screen like the one you talk about, I would be running 16x16 icons all over the place, joyfully getting ten things done at the same time with an Excel sheet open for viewing columns A - ZA on one page. My gosh, that would be heaven, not having to scroll around the table layouts.
 

Summer-Time-Fun

Well-Known Member
Apr 1, 2007
529
271
Hello techie

You know, to be honest, my comment in the body of the first post, where I used the word lazy, was the wrong word choice. I was in the moment, I guess. By no means are software developers lazy. Not for the hours they spend coding. So I must apologize for that remark.
I really just wanted the comment to be an idea in the back of people's minds, because when things get talked about, they're usually kept in sight and circulation. I was also a little agitated when I made the post.
I saw a beautiful Samsung 21” LCD that ran at 1600x900, but I got the 22” figuring it would be larger and still use the same native rez; the 22” was actually 1680x1050 native. So I returned it for the 23” because the label at the store said that the 23” ran at the same native rez as the 22”. It was a misprint, so it turns out that the 23” actually runs at 2048x1152. At that point I'm thinking: how fricking high are they going to crank the darn res up? That looks like the max of my graphics card.

To make a long story short, I later found that the DPI setting in Windows worked well without changing the quality of the icons and fonts, because the DPI setting doesn't stretch the icons like the alternate setting in the Appearance tab does. (Well, the icons in the toolbar don't look as good for some reason, but the rest of the system looks the same, just a little larger.)
But I feel the same as you in regards to big buttons and huge toolbars, though 2048x1152 is ridiculous to the point where it just hurts my eyes.
I figured I could get a larger screen with everything the same size, but with more space. My screen ended up being a football field. But at least I can put more girls on it, they just look smaller, and I don’t have to turn them sideways. :XD:

I understand that developers were not exposed to the higher resolutions with some of the older software, but today, if MS can offer an independent DPI setting that enlarges their toolbars, fonts, and dialog boxes without losing any quality or sharpness, why are companies like Adobe and ESET (to name a couple) not following the new standards with their latest software? I'm aware that some windows and dialog boxes are not dynamically expandable, but this DPI setting I'm referring to seems to be independent of the actual hardware screen rez. I think it's emulating a lower screen resolution, but in a layered kind of manner, without stretching the icons, which happens when the Appearance tab is used.
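For reference, the reason fonts can grow with the DPI setting without going blurry is that a DPI-aware app re-creates them at the new size instead of stretching pixels. A rough sketch of the standard point-to-pixel conversion on Win32 (illustrative only, not how any particular vendor's product is written):

Code:
// Sketch: creating a font that follows the user's DPI setting.
// At 96 DPI a 10pt font is ~13px tall; at 120 DPI the same 10pt is ~17px.
// The glyphs are re-rasterized at the new size, so they stay sharp.
#include <windows.h>

HFONT CreateUiFont(int pointSize)
{
    HDC screen = GetDC(NULL);
    int dpiY = GetDeviceCaps(screen, LOGPIXELSY);
    ReleaseDC(NULL, screen);

    // Classic Win32 conversion: negative height = character height in pixels.
    int pixelHeight = -MulDiv(pointSize, dpiY, 72);

    return CreateFontW(pixelHeight, 0, 0, 0, FW_NORMAL,
                       FALSE, FALSE, FALSE, DEFAULT_CHARSET,
                       OUT_DEFAULT_PRECIS, CLIP_DEFAULT_PRECIS,
                       CLEARTYPE_QUALITY, DEFAULT_PITCH | FF_DONTCARE,
                       L"Tahoma");
}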
 

techie

SuupaOtaku
Jul 24, 2008
568
4
Hello techie

You know, to be honest, my comment in the body of the first post, where I used the word lazy, was the wrong word choice. ...
---
...misprint, so it turns out that the 23” actually runs at 2048x1152. At that point I'm thinking: how fricking high are they going to crank the darn res up? That looks like the max of my graphics card...

---
I understand that developers were not exposed to the higher resolutions with some of the older software, but today, if MS can offer an independent DPI setting that enlarges their toolbars, fonts, and dialog boxes without losing any quality or sharpness, why are companies like Adobe and ESET (to name a couple) not following the new standards with their latest software? I'm aware that some windows and dialog boxes are not dynamically expandable, but this DPI setting I'm referring to seems to be independent of the actual hardware screen rez. I think it's emulating a lower screen resolution, but in a layered kind of manner, without stretching the icons, which happens when the Appearance tab is used.

On quote one, not to worry, I wasn't taking it personally.
I agree with a lot of what you say here, because I for one think the biggest threat to software development is hubris.

In my soon-to-be 15 years of development-related work and PC work in both hardware and software, I have often been met with other developers' lack of interest in sharing knowledge about certain routines and practices.

It is as if they feel they are privy to something secret and special solely because they had a different manual or access to a different branch of the same field's education.

I, for instance, have focused on simplified GUIs for DBMS interaction, combined with web development, and have hardly had a need to focus much at all on GDI, DirectX and other graphical routines, until now that I am getting into game and GUI development partly using 3D tools.

And here the problems begin...

In your quote 2, you remark on the excellent resolution, but at the same time state "That looks like the max of my graphics card....", which is where the root of a lot of the problem lies.

In the event you are not a game developer, you don't need to write code using the biggest VGA adapters, and I doubt we'll ever see a need to write pure HTML or C++ using a 2 GB VGA adapter.

Much like when I started to develop applications using Visual Studio, I was on a Windows 98 box using FAT32 as the filesystem.
The little change of upgrading to NTFS opened up the ability for me to compile software for both FAT32 and NTFS environments. So by changing my OS setting I can now build for other environments.

In other words, if you don't have the hardware at hand it's very difficult to write code for it, and assuming all developers have access to all the latest native systems is optimistic at best. Even if you can write the code, that does not mean you can test it properly, no matter how closely you follow a defined API.

In the third quote above, you mention the comparison between MS and Adobe, or many other large software houses not following the same principles.

This is simply because Microsoft tends to follow MS software routines, and expands on them for their own use before making them available to the greater public of developers.

The next and even larger issue is the fact that where one developer works with DirectX 9, another uses OpenGL or other VGA routines for fancy footwork on the pixels.

Both are quite different to work with, just as it goes with different language bases: C, C++, VB, C#, Python, Perl, Java, the list goes on...

Even if two applications are written in C++, that does not mean they're even remotely similar in coding. That also depends on code style choices, compiler and IDE tools, and much more. I studied C++ on Borland, and work with VS.
Quite a lot is different in the source code and development environment even if the underlying target is the same, and supposedly the software is to do the exact same thing.
 

guy

(;Θ_Θ)ゝ”
Feb 11, 2007
2,079
43
The biggest problem is that there is no single standard for how graphics are rendered on screen. There are big differences between the metrics of print and screen scaling/dimensions, and no standard way of deciding how raster and vector rendering follow those metrics -- and in reality, there should not be any standards.

Take an online newspaper website with content that is 1000px wide, utilizing CSS to control the metrics for font size and page layout, viewed on both a 12" 1024x768 (120dpi) laptop screen and a 24" 1920x1200 (96dpi) desktop screen. There are two basic options:
1) Render according to the pixel metric. E.g.: a 12px font renders at 12px height, therefore the online newspaper will look physically bigger on the 24" screen due to the lower dpi, but 1000px width is rendered as 1000px and the 24" screen will have 920px of unused width.
2) Render according to the print metric. E.g.: a font is enlarged in pixel terms on the 12" screen so it matches the physical size it has on the 24" screen (e.g.: 12px→15px, since 15px÷120dpi = 1/8in, and 12px÷96dpi = 1/8in). However that means when displayed on the 12" screen, the 1000px width increases to 1250px, and the 1024 horizontal resolution will cause the page to be cut off (the arithmetic is sketched in the snippet below).
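Just to make the arithmetic in option 2 concrete, here is a tiny sketch of the conversion (the function name is mine, purely for illustration):

Code:
// Sketch of print-metric scaling: a pixel length designed at one DPI is
// rescaled so its physical size is preserved at another DPI.
#include <cstdio>

int ScaleForDpi(int px, int designDpi, int targetDpi)
{
    // Round to nearest pixel: px * targetDpi / designDpi.
    return (px * targetDpi + designDpi / 2) / designDpi;
}

int main()
{
    std::printf("%d\n", ScaleForDpi(12, 96, 120));   // 15   -> same 1/8in height at 120dpi
    std::printf("%d\n", ScaleForDpi(1000, 96, 120)); // 1250 -> wider than 1024, so it clips
    return 0;
}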

So which way is right? The answer is neither, because really it is up to the designer to decide how s/he wants the page to be rendered, although, as mentioned, inconsistency in how different browsers interpret CSS metrics is a nightmare for web developers.



There are a further two issues when it comes to rendering GUIs. The first is that there is no standard library for GUI components. Applications coded in Visual Basic, C++, C#, .NET, and (especially) Java all access different libraries for drawing application components (textboxes, labels, menus, etc). Part of the reason it can't be standardized is that it would break a lot of legacy support, since forcing an app to adjust to different DPIs could cause components to be rendered off-screen and thus become useless/un-clickable, etc. Another part is that these languages are shared across multiple platforms (especially Java, between Windows, Unix, OSX, etc), and those hosts all have their own way of handling display metrics. But perhaps the biggest reason is that it just doesn't make sense to standardize them: different languages are born out of different necessities, and while C++ is a lovely language for developing games, .NET is a much better fit for developing business web applications. And there isn't a lot of common ground regarding display resolution between games and web apps.

The second reason is that it's just not practical right now. In order for every application to be dpi-aware, virtually every component and piece of content would have to be treated as vector. This is easily accomplished with text and 3D-rendered graphics, but think about games that use bitmap sprites, or applications that use custom bitmaps to render their interfaces. If they are forced to increase in size due to high dpi, then those graphics will become pixelated due to upsampling. The reason we don't see it happening with icons very much is that a lot of designers are now embedding 192x192 and 256x256 raster icons in their iconsets. But icons are small in file size; if a game developer must suddenly double the resolution of raster content, the image data grows with the square of the scale factor. Which means greater load on the HDD (for file access), CPU (for file processing), and GPU (for image rendering).
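As an aside on the embedded-icon point, the usual trick is to ship several raster sizes and pick the smallest one that is at least as large as the current DPI asks for, so the image is only ever scaled down. A rough sketch (the available sizes and function are just illustrative):

Code:
// Sketch: choosing which embedded icon size to draw for a given DPI,
// assuming the iconset ships 16/32/48/192/256 px variants. Picking the
// smallest size >= the requested size means we only ever downscale,
// avoiding the pixelation that comes from upsampling.
#include <cstdio>

int PickIconSize(int baseSizePx, int dpi)
{
    const int available[] = {16, 32, 48, 192, 256};
    const int count = sizeof(available) / sizeof(available[0]);
    const int wanted = baseSizePx * dpi / 96;   // e.g. 48px at 144 DPI -> 72px

    for (int i = 0; i < count; ++i)
        if (available[i] >= wanted)
            return available[i];                // smallest size that still fits
    return available[count - 1];                // fall back to the largest we have
}

int main()
{
    std::printf("%d\n", PickIconSize(48, 96));  // 48  (no scaling needed)
    std::printf("%d\n", PickIconSize(48, 144)); // 192 (wanted 72, next size up is 192)
    return 0;
}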



For what it's worth, the pixel/raster measurement is not going to die anytime soon, because of the way high-bandwidth data transmission works. While games can be rendered as vector (polygons which the GPU renders in real-time), vector graphics simply cannot be transmitted over-the-air, namely for broadcast transmission (HDTVs). There's just no way with our technology that you can record a movie/tv show as a scale-free vector and just have the receiving unit scale it according to whatever dpi it has -- that would take up incredible amounts of bandwidth. It just has to be rasterized in order to fit into the (current) ATSC spec. It will probably take a leap into quantum computing before true raster-less imaging is possible.
 

techie

SuupaOtaku
Jul 24, 2008
568
4
Thanks Guy, very concise.
but think about games that use bitmap sprites, or applications that use custom bitmaps to render their interfaces. If they are forced to increase in size due to high dpi, then those graphics will become pixelated due to upsampling.

This issue is something I personally don't like much, and in some cases I ended up making graphics for the largest possible scenario and scaling them down in the display instead. Same problem, different approach, better result.

The reasoning is simple: "Disk space is cheap, but re-sampling is not, from a quality point of view."
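For what it's worth, on the GDI side the quality of that downscale depends a lot on the stretch mode. A quick sketch of the idea (hypothetical bitmap handles, plain GDI assumed rather than whatever a real renderer would use):

Code:
// Sketch: drawing a large source bitmap at a smaller on-screen size with GDI.
// HALFTONE averages source pixels when shrinking, which looks far better than
// the default stretch mode. The handles passed in are hypothetical.
#include <windows.h>

void DrawDownscaled(HDC hdcDest, HBITMAP hLargeBitmap,
                    int srcW, int srcH, int dstX, int dstY, int dstW, int dstH)
{
    HDC hdcMem = CreateCompatibleDC(hdcDest);
    HGDIOBJ old = SelectObject(hdcMem, hLargeBitmap);

    SetStretchBltMode(hdcDest, HALFTONE);   // higher-quality shrink
    SetBrushOrgEx(hdcDest, 0, 0, NULL);     // GDI docs ask for this after HALFTONE

    StretchBlt(hdcDest, dstX, dstY, dstW, dstH,
               hdcMem, 0, 0, srcW, srcH, SRCCOPY);

    SelectObject(hdcMem, old);
    DeleteDC(hdcMem);
}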
 

Summer-Time-Fun

Well-Known Member
Apr 1, 2007
529
271
Good points from both of you; techie, Guy, thanks!

Again, the title of this post was totally wrong. But I think we all agree that there is a problem somewhere along the line at the consumer end; you guys know more about the technical side of things than I do.
@techie, I've worked in computer repair departments and I know exactly what you mean about the lack of communication. Everyone is worried that the next guy is going to take their job. Distrust is a problem for evolution.
Now when I said my graphics card was at the max, I'm only pointing out the fact that it's overkill even with today's latest graphics cards. I've developed many game levels, and it's great for games or forensic work, but that's all it's good for. These flat screens are being sold to the general public.

@Guy, in regards to what you said about vectors, that "there should be no standard": I understand vectors are mathematical lines and are not dependent on pixels. I'm guessing this is sort of what makes standardizing LCDs to a native resolution complex.
In any case I have no idea how developers handle the metrics, but it's something that needs to be dealt with if flat screens are going to have only one native resolution, even if it means making multiple standard icon sizes, or just using mipmaps, which makes more sense to me, and with today's drive sizes, room is not an issue; we're probably only talking 1 MB to maybe 50 MB larger depending on the program size. (Text I think is vector, or no?) If this code is not available in the libraries, can it be added at some point? Obviously I don't see the entire picture, because I'm not a software/hardware developer, just a consumer.
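On the "room is not an issue" point, a rough back-of-the-envelope check (purely illustrative: uncompressed 32-bit images and a made-up icon count) suggests the extra space really is small:

Code:
// Rough estimate of the disk cost of shipping a full mip chain per icon
// (256, 128, 64, 32, 16 px) as uncompressed 32-bit images. Real iconsets
// are compressed, so this over-estimates; the icon count is made up.
#include <cstdio>

int main()
{
    const int sizes[] = {256, 128, 64, 32, 16};
    long long bytesPerIcon = 0;
    for (int i = 0; i < 5; ++i)
        bytesPerIcon += 4LL * sizes[i] * sizes[i];   // 4 bytes per pixel (RGBA)

    const int iconCount = 100;                       // hypothetical application
    double totalMB = bytesPerIcon * iconCount / (1024.0 * 1024.0);

    std::printf("%lld bytes per icon, ~%.1f MB for %d icons\n",
                bytesPerIcon, totalMB, iconCount);
    // Prints roughly: 349184 bytes per icon, ~33.3 MB for 100 icons
    return 0;
}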

It's not the developers' fault; this is where the title of my original post was wrong. From my point of view, it just seems like everything today is about production, and we end up in a world of half-working products and half-satisfied customers. And quality is another topic entirely. I've got music equipment rack units and amps that are 30 years old still running strong, yet I've been through 2 professional Netgear router switches, a Motorola modem, a $500 t.c electronics multi-effects processor... (all modern stuff). I've been told that I'm lucky to have gotten three years out of my Netgear. I just wish there was less production and more work with the technology we already have. It can be done.

Again, no disrespect to developers, programmers.
 

techie

SuupaOtaku
Jul 24, 2008
568
4
I believe a lot of the problems come from people with too many toys and too much time on their hands.

I would love to see better graphics and simpler implementation of 3D routines in development, but the problem it brings is twofold.

a) The consumers get their hands on many nice "toys" and expect everyone to be at the same standard level. I know how hard this is, and often remind myself that the people I write code for are far behind in hardware upgrades.

b) The code involved in newer design studios is hardly going to apply to code styles used three or more years ago. I am only now learning .NET, since I am finding that 99.99% of what my clients, and thereby I, need is fully supported by VB 5 (which has been off the shelves for over 5-6 years now).

So in order to use the new graphics, hardware and so on, the developers have to follow along and write code for the latest hardware, whereas most clients don't have it.

The reason they don't have it is not only a lack of interest in getting it: 1) common users don't think they will need it, and 2) companies do not spend that kind of money either. Especially not in times of credit crunches.

Someone once asked me if they should upgrade from Windows XP on a P4 Hyper-Threaded 2x3GHz system and I said...

No... there are two reasons not to.
1) I use the same to write the code and...
2) Just because your computer can calculate 1+1=3 faster, you still won't type faster than 60 words per minute.

There has to be a balance between what the market expects the clients to jump on and what the clients actually need to do what they are doing.

It is not going to "halt evolution" in any major way anyway; however, I feel people have forgotten that only 20 years ago, no normal person, except students at well-supplied universities, had ever heard of PCs on any larger scale.

Our world is overwhelmed by them to such an extent that kids today don't use their imagination anymore.

I just had a discussion with a friend of mine a day ago about whether that couldn't be a reason we see a massive increase in ADHD too. The individual outlet for imagination and creativity has been taken over by virtual lifestyles and pre-made games and toys.

Ask any 5-year-old today if they know how to make a pig from a pinecone and four toothpicks, and they will ask you if that's available on Wii or Nintendo, with a nice "pimp-my-pinecone" kit.

Regardless, I think universal standards are good, as long as they do not restrict the choices of the developers and their individual style of work.
Nor should they be restricted to a single platform or environment, and as such, it puts even greater pressure not on software developers per se, but rather on communication between the hardware manufacturers and those writing the supporting LIBs (DLL or COM files) used by developers, making the implementation of supporting functions simpler for all.