Your monitor can only display a small range of intensities, even with gamma correction's roughly logarithmic scale. A blackbody spectrum's power varies greatly depending on its temperature. So if we try to truthfully display not just chromaticity (hue and saturation), but brightness as well, we can only display a partial range of temperatures, bounded by too-dark-to-see (black) on one side and too-intense-to-display-truthfully on the other.
With a 1000 K to 10000 K scale...
Why does the range get smaller and smaller?
It goes like this. The blackbody curve of wavelength vs. power is a bump with a blunt front and a long tail. As its temperature increases, the bump advances from invisible infrared, through the narrow strip of visible wavelengths, into invisible ultraviolet, leaving just its long tail behind and visible. But the bump is also growing with temperature, so by the time you are seeing the tail, the tail is thicker than the bump's center was when it went through. So the 2000 K normalized view is narrow because the blunt front of the bump is plowing into the visible strip, with intensity increasing so quickly that we can only show a small range of temperatures. Around 6000 K the bump's center is passing through the visible strip, and the bump's growth, fighting the down-slope on the backside of the bump, yields a slower increase in intensity. And by 10000 K, all that is left is the long flat tail, so intensity increases only slowly with temperature as the tail grows gradually thicker. A 30000 K normalized scale looks similar to the 10000 K one - a wide stretch of slowly fading blue. (Which is why, traveling near light-speed, the blue-shifted stars in front of you would still look blue, long after the stars behind you had been red-shifted into invisibility.)
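The slowdown is easy to see numerically from Planck's law. Here is a minimal sketch (the 550 nm sample wavelength and the temperature list are illustrative choices, not anything prescribed above) that evaluates the spectral radiance of a blackbody at a mid-visible wavelength for several temperatures, showing the steep climb at low temperatures and the much gentler growth once only the tail remains in the visible strip:

```python
import math

# Physical constants (SI units)
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Blackbody spectral radiance (Planck's law), W / (m^2 * sr * m)."""
    a = 2.0 * h * c**2 / wavelength_m**5
    b = h * c / (wavelength_m * k * temp_k)
    return a / (math.exp(b) - 1.0)

# Sample a mid-visible wavelength (550 nm, green) across temperatures.
lam = 550e-9
prev = None
for t in (2000, 3000, 6000, 10000, 30000):
    radiance = planck(lam, t)
    growth = "" if prev is None else f"  ({radiance / prev:.1f}x the previous)"
    print(f"{t:>6} K: {radiance:.3e}{growth}")
    prev = radiance
```

Running this, the step from 2000 K to 3000 K multiplies visible radiance by well over an order of magnitude (the blunt front plowing in), while the step from 10000 K all the way to 30000 K gains only about a factor of ten (the tail slowly thickening), which is why the normalized scales cover ever wider temperature ranges.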