Display resolution versus Image Size (especially in older games)

From Nexus Mods Wiki


There is a tendency to think the largest available texture image size is always preferable. This is not always true. Larger textures do produce more detailed images, but can those details make a material difference on your display screen, or are you wasting video processing resources and possibly creating "stutter" for yourself? Much depends upon the game in question and the display you are using, as well as the texture image resolution (often called "image size").

Image size

The "vanilla game" default texture image resolution for Bethesda games is 512x512 pixels (a pixel being an individually addressable dot on the display screen). The largest "high resolution" textures used at the time of older games such as "Fallout New Vegas" (released in 2010) were 2048x2048 pixels. Each pixel ("picture element") includes a "color" component, and the intensity of each pixel is variable. In color imaging systems, a color is typically represented by three or four component intensities, such as red, green, and blue (RGB), or cyan, magenta, yellow, and black (CMYK). The number of distinct colors that can be represented by a pixel depends on the number of bits per pixel (bpp). A 1 bpp image uses one bit for each pixel, so each pixel can be either on or off. Each additional bit doubles the number of colors available, so a 2 bpp image can have 2^2 = 4 colors, and a 3 bpp image can have 2^3 = 8 colors:

  • 8 bpp, 2^8 = 256 colors
  • 16 bpp, 2^16 = 65,536 colors ("Highcolor")
  • 24 bpp, 2^24 = 16,777,216 colors ("Truecolor")

- (Source: Wikipedia: Pixel).

This becomes important when considering the impact of your choice of image resolution: the resulting in-memory image size is not just width x height, but width x height x bpp.
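The two relationships above (bpp to color count, and width x height x bpp to raw size) can be sketched in a few lines of Python. The function names here are illustrative, and the sizes assume uncompressed textures (games typically store textures compressed on disk, but the uncompressed footprint is what matters for this discussion):

```python
def colors(bpp):
    """Number of distinct colors representable at a given bits-per-pixel."""
    return 2 ** bpp

def texture_bytes(width, height, bpp):
    """Raw (uncompressed) in-memory size of a texture, in bytes."""
    return width * height * bpp // 8

print(colors(24))                     # 16777216 ("Truecolor")
print(texture_bytes(512, 512, 24))    # 786432 bytes (~0.75 MB): vanilla texture
print(texture_bytes(2048, 2048, 24))  # 12582912 bytes (12 MB): "high res" texture
```

Note how quadrupling the edge length (512 to 2048) multiplies the memory cost by sixteen.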

Display resolution

In 2010, 70% of LCD monitors were 19 inches or less. (An additional 6% were 20-21 inches, and only 1% were 25-27 inches, the largest size released that year.) - Statista.

The highest resolution a 19 inch monitor is capable of is 1680x1050 pixels (width x height) at 104.3ppi ("pixels per inch") in a 16:10 aspect ratio, requiring 1.68MP ("mega pixels", counted here as 2^20 pixels) in total for the display. Much more common was 1280x1024 at 86.3ppi in a 5:4 ratio, requiring 1.25MP. At the upper end were 21-22 inch units at 1920x1080 and 100.1-102.5ppi in a 16:9 ratio, requiring 1.98MP. - Desktop LCD Display Comparison.
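These ppi and MP figures can be reproduced with a little arithmetic (the helper names below are illustrative; ppi is the diagonal pixel count divided by the diagonal size in inches, and the MP figures match only if 1 MP is taken as 2^20 pixels):

```python
import math

def ppi(width, height, diagonal_inches):
    """Pixels per inch: diagonal resolution over diagonal screen size."""
    return math.hypot(width, height) / diagonal_inches

def megapixels(width, height):
    """Total pixels in binary megapixels (1 MP = 2**20 pixels)."""
    return width * height / 2**20

print(round(ppi(1680, 1050, 19), 1))     # 104.3
print(round(ppi(1280, 1024, 19), 1))     # 86.3
print(round(megapixels(1920, 1080), 2))  # 1.98
```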

The larger "Cinematic HD" TV screens (30+ inches) available today were not even considered. And the "pixel size" on each monitor is not the same (that's the "ppi" measure).

Color Depth

"True Color" uses 24-bits (3 bytes) to convey 8 bits of each of R, G, B. As of 2018, 24-bit color depth is used by virtually every computer and phone display and the vast majority of image storage formats. Almost all cases where there are 32 bits per pixel mean that 24 are used for the color, and the remaining 8 are the alpha channel (transparency/opacity) or unused. So each 32-bit "pixel" consists of 4 bytes of data (times the "mega pixels" of the display), meaning it takes roughly 8MB to display 2MP on the screen of a 22 inch monitor.

Video Memory

Your video card essentially has to render a display equal to the monitor's "mega pixels" value every time your "field of view" changes even one pixel's worth in any direction. Extended/wider "fields of view" increase the amount of memory required to render the scene. Generally it is desirable for your video pipeline (basically the RAM allocated to video) to be able to work on rendering adjacent cells for the "next" screens in anticipation while it is displaying the "current" screen on the monitor (in case you suddenly switch viewing directions). If there isn't enough video memory available to compute the positions of the next screens, or it has to wait on retrieving the images from disk, then it has to take another cycle (or more) to finish rendering the scene, probably resulting in "stutter". (This is why high-speed "Graphics Processing Units" (GPUs) with dedicated VRAM were developed.)

Bear in mind that a modern video card's dedicated memory is measured in "Giga-Bytes" of VRAM (1024 MB/GB), or approximately 128 "2MP displays" per GB. But what it renders in memory is based upon the collected images' size and color depth, which in aggregate is larger than what is required for display on the monitor. When you increase the game settings related to the number of adjacent cells it "buffers" ("grids to load" and "interior/exterior cell buffer"), you increase the amount of memory required to "pre-render" adjacent cells, most of which will not be used. Therefore, choose your game settings against the video card's VRAM size with your monitor's needs in mind, and vice versa.
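The "128 '2MP displays' per GB" figure works out as follows (a sketch; the helper name is illustrative, and it assumes 32-bit pixels, binary megapixels, and binary MB/GB):

```python
MIB = 1024 * 1024

def framebuffer_mib(mp, bytes_per_pixel=4):
    """Memory for one rendered screen, in MiB, at 32-bit (4-byte) pixels."""
    return mp * 2**20 * bytes_per_pixel / MIB

per_screen = framebuffer_mib(2)  # one ~2MP screen
print(per_screen)                # 8.0 MiB per screen
print(1024 / per_screen)         # 128.0 "2MP displays" per GB of VRAM
```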

Impact of choice of Image size on Display

Consequently, a texture image of a given size gets rendered on screen in the screen resolution and aspect ratio demanded by the monitor. (Multiple individual images make up the total rendered screen image.) The video card takes care of this conversion and passes along the final rendered display of the various images in that resolution as that many total "mega-pixels". When the image size of a model is smaller than the number of pixels needed to fill its respective area of the screen, the graphics card fills in the gaps by "interpolation" of the surrounding pixels. The result can be a "grainy" appearance to that model's image. Higher resolution images require less interpolation and so look less grainy, but consume more video processing memory per image. (The screen display requirements remain the same.)

Large displays can make the "grainy" image more apparent than a smaller display would. Smaller screens in effect "waste" processing power and time on higher resolution images they can't display anyway. (This latter point is the basic problem with VWD/LOD images in general that are not optimized to reduce polygon count and texture detail.)

There are basically two types of graphic improvements: foreground and background. Everything you see up close (such as weapons, armor, and NPC bodies) is "foreground". Its image scope (the relative size of the object in question, not the texture size of the image itself) tends to be small, as these are individual models. Everything in the middle to far distance is "background" (technically Visible When Distant/Level of Detail (VWD/LOD)). Background images encompass the entire horizon and objects seen at a great distance, so their image scope tends to be larger. Both count against video memory; choose between them carefully.

Image size matters. A 4096 x 4096 image with 16-bit color depth is 32 MB of data. If your screen is displaying 100 different models with textures that size, that's 3.2 GB of data just for processing the textures. Since a 32-bit program can only address up to 4 GB of data in total, you can see how trying to display a lot of high resolution models will run the game out of memory very quickly. (Thank you for the perspective, madmongo.) Now, VRAM is not directly addressed by the program; it is controlled by the video card, which can hide the actual addressing of more than 4GB from the program, but you get the idea. If you are running on a laptop (which usually doesn't have dedicated VRAM but uses system RAM for video instead), that memory comes out of what is available to Windows (the OS) and the game; and if you are running a 32-bit version of Windows, the 4GB overall limitation still applies.
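madmongo's back-of-the-envelope figures can be checked directly (a sketch; the function name is illustrative, and it assumes uncompressed textures and binary MB/GB):

```python
def texture_mib(width, height, bpp):
    """Uncompressed texture size in MiB."""
    return width * height * (bpp // 8) / (1024 * 1024)

one = texture_mib(4096, 4096, 16)  # one 4K texture at 16-bit color
total_gib = 100 * one / 1024       # 100 such textures, in GiB
print(one)                         # 32.0 MiB each
print(total_gib)                   # 3.125 GiB total
print(total_gib < 4)               # True, but perilously close to the 32-bit ceiling
```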

Another thing to consider is that larger images take longer to pass through the video pipeline. When image sizes larger than 2048x2048 were not considered in the game design (and they probably weren't at the time of older games), bottlenecks are to be expected. The use of "ENBoost" can help if you have a 64-bit version of Windows:

The idea is to reduce [32-bit] RAM and as result - fewer CTDs. Memory transfered to enbhost.exe processes and you can use up to 128 GB of [video] memory for x64 OS instead of a bit less than 4 with default game. What this means? You can install more mods, especially if you have cool video card with 3+ GB of VRAM and want 8K textures everywhere. - (Source: ENBoost mod description)

ENBoost is independent of, and installed separately from, the "ENB Series" graphical post-processor package. It is also bundled with the various game-specific "patch" files of the "ENB Series", available on the "ENB Downloads" page, but does not require them.

Frame Rate (FPS) versus Monitor Refresh Rate (Hz)

This issue is similarly related to the "display vs image" issue in that it confuses many people. The article Frame Rate (FPS) vs Refresh Rate (Hz) gives an excellent explanation. If you've educated yourself this far, you might as well finish the job properly.


Conclusion

The size of the monitor you are displaying the game on should dictate the upper limit of your choice of image size. 4K resolution images are wasted on a 15 inch display; you can't really see the difference. They are intended for large screen TVs, which otherwise make the "grainy" aspects of lower resolution images more apparent.

The trade-off is that older games were not designed to deal with images of that size rendered on those larger "mega-pixel" displays; the video pipeline will choke and stutter. In addition, the more pixels the video card needs to render, the more "art assets" it needs to pull into its buffer to process efficiently. Retrieving those assets from the hard disk is orders of magnitude slower and introduces stutter. This is where a faster video card with more dedicated VRAM becomes essential. If the video card can't keep up with the display demands, you get a stutter you can't overcome except by using "ENBoost" to increase the buffer size (which will still be slower than VRAM) and reducing the size of the images, and therefore the memory demands.

If you push the game too hard then things like the music can start to stutter. A forum moderator recounted that he was running a game (Fallout New Vegas) at 4K maxed with an army of companions in high res gear, on top of texture packs and everything else. Without "ENBoost" the whole lot would have fallen apart. When he swapped out his "way too big 4K display" for his old 2K one: the problem went away. No other changes were needed.

It's all inter-related.