Now, holy crap--that's almost as big as the Giganto-Tron monitor. It runs at standard HD resolution (1920x1080) and has an LED backlight. AOC brand, with which I have no quarrel.
So, 24" for $100 or 27" for $130. And get this: their regular prices are $200 and $200, respectively, making the 27" the obvious choice when they're not on sale. (Which $5 says rarely happens.)
The thing is, I can remember when 27" was it for televisions, when there wasn't a bigger screen available at any price. When projection TVs came along they were about 50" but expensive, meaning you could buy a good used car for the price of a 50" projection television. (A really good used car.) For all practical purposes, until the 1990s TVs were maxed out at 27", and that was it.
And they were expensive. I used my tax refund in 1996 to buy a 25" TV, and it cost me $250. At the time, that was a car payment ($267/mo for my green Escort) and I'd only been able to swing it because the thing went on sale at a really good price. It was a rock-bottom TV with only an RF input, no RCA jacks. That didn't really matter, because even laserdiscs were still analog video, NTSC standard. HD didn't exist yet; the HD standard had only barely been established a couple of years earlier, and the mandatory switchover wasn't set to take effect for quite a while. DVDs hadn't hit the market yet, either.
For a while, about a decade before plasma and LCD screens took over, there were 35" CRTs.
Now, it used to be that manufacturing LCD screens was something of a black art. You had to accept that one or two pixels would be bad; the warranties were quite specific on that point because it was impossible to make perfect LCDs at a reasonable price. And the bigger the screen, the bigger the problem, because doubling the size in each dimension quadrupled the number of pixels. A square screen 1,000 pixels on a side has 1,000,000 pixels; with 2,000 pixels on a side it's 4,000,000.
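That square-law scaling is brutal for yields. A quick sketch of the arithmetic, using a made-up per-pixel defect rate purely for illustration:

```python
# Pixel count grows with the square of linear size, so the odds of a
# completely defect-free panel fall off sharply as screens get bigger.

def pixels(side):
    """Pixel count for a square screen `side` pixels on a side."""
    return side * side

# Hypothetical per-pixel defect probability -- an invented number,
# just to show the shape of the problem.
DEFECT_RATE = 1e-7

def chance_of_perfect_panel(side):
    """Probability that every pixel is good, assuming independent defects."""
    return (1 - DEFECT_RATE) ** pixels(side)

print(pixels(1000))  # 1000000
print(pixels(2000))  # 4000000 -- double the side, quadruple the pixels

# Quadrupling the pixel count drags the perfect-panel odds down:
print(chance_of_perfect_panel(1000) > chance_of_perfect_panel(2000))  # True
```

Doubling the side doubles nothing about the defect rate, but it quadruples the number of chances for a defect, which is why the big panels were the hard ones.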
I don't know how they solved the problem, but they did. They now make 55" displays (and bigger!) with zero defects, and they do it reliably enough that you can buy a 55" TV for the same number of dollars a 50" projection TV cost in the early 1980s--about $1,500 depending on brand--and those are dollars worth considerably less than they were 35 years ago. For a TV that shows breathtakingly clear and vivid images and lets you surf the Internet if you so desire.
CRTs are inherently analog devices. They were expensive because a lot of human labor was required to build them. Actually making the tube could be automated, but when it came to final assembly and testing of the thing, a robot couldn't do it. Especially aligning the yoke.
CRTs operate using magnetic fields to steer electron beams. There's a set of electromagnets on the "neck" of the tube, where it widens out from the narrow electron gun to the screen itself. These electromagnets (collectively called the "yoke") are constantly changing their fields, and the electron beam deflects based on the strength and direction of the magnetic field from the yoke. By steering the beam across the screen, and varying the voltage to the electron guns, the circuitry in the TV paints a picture 30 times a second. (Actually, it does so in fields, painting half the picture 60 times a second, odd lines in one field, even lines in the next, alternating.)
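That interlaced scan order can be sketched in a few lines--a toy model, not real video circuitry, just the line-ordering idea:

```python
# Toy model of NTSC-style interlaced scanning: each full frame is painted
# as two half-resolution fields, odd-numbered lines first, then even.

def interlaced_fields(num_lines):
    """Return the two fields (as lists of 1-based line numbers)
    that together make up one complete frame."""
    odd_field = list(range(1, num_lines + 1, 2))   # lines 1, 3, 5, ...
    even_field = list(range(2, num_lines + 1, 2))  # lines 2, 4, 6, ...
    return odd_field, even_field

# One 30-times-a-second frame is two 60-times-a-second fields:
odd, even = interlaced_fields(6)
print(odd)   # [1, 3, 5]
print(even)  # [2, 4, 6]
```

Each field carries half the lines, so the set refreshes 60 times a second while only delivering a complete picture 30 times a second.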
...and in the factory, someone would have to hook the TV up to a test fixture and twiddle potentiometers in the thing until everything was lined up. Analog circuits are fussy, and if everything was not adjusted just right, the image wouldn't look right. All three color guns (red, blue, green) had to point at exactly the same spot on the screen, which meant three electron beams originating in slightly different places had to converge on the same spot every time.
Later--near the end of CRTs--the potentiometers were replaced with computer controls. Instead of looking at the display and tweaking pots, the test tech would hook the thing up to a computer and use that to set the convergence. A human still had to look at the screen and change settings; it was just done digitally instead of using small screwdrivers.
But LCDs are a different story. They're inherently digital. There's no convergence to worry about; if R G and B don't line up, the panel is defective. The hard thing with LCD screens is connecting all those thousands of rows and columns, but once done there's nothing to adjust or tinker with. (If you do tinker with it, you're ruining the display, which is why there are "DO NOT REMOVE" warnings all over the ribbon connectors.)
For HDTV, LCDs are natural because the signal requires little processing. Demodulate and extract the video and audio streams, route them to the appropriate places. HDTV on a CRT is a pain, because you need to take a digital signal and convert it to analog, and you must do it well enough that the circuitry in the CRT can understand the signal you're sending. The reverse is also true; analog video on an LCD is also troublesome, because (again) you must convert the signal, but from analog to digital.
Because there's really very little to tweak with an LCD, the manufacture of them can be very highly automated. Even getting the color settings correct can be automatic; a robot can place a color sensor against the screen and check it. Since there's no convergence to worry about, with the right programming, the base color and white balance can be set by machine without a human hand ever touching it.
Which is why LCDs cost a bare fraction of what CRTs cost.
A 27" LCD monitor for $130 (even for $200!) is insanely cheap compared to what they used to cost. Big CRT monitors used to cost tons of money, thousands of dollars if you wanted something bigger than 15". The 20" CRT I had cost something like $300 in 2000; the 22" LCD that replaced it cost $329 seven years later...and the 24" that replaced that one cost $150.
Now I routinely see 24" monitors under $100, and I'm confident that should the time come I'll be able to replace the Giganto-Tron with a similar sized monitor for not a lot more than that.
Oh--the Apple Cinema Display 30 debuted in 2004 with an MSRP of $3,200. I'm not sure when this one was made; at the end of their run in 2010 they fetched "only" $1,800 new. Some of that is the Apple premium price, but not all of it. $200 is a bit more than 10% of that price. It's not as fancy--no aluminum frame or anything--but I'd bet the display image is every bit as good, if not better.
It's kind of hard to believe, how cheap electronics have become just within my lifetime thus far. I mean, I was around and cognizant of the deflation for most of that time, yet I'm still astounded at how far it's gone. In 2017 I carry around a telephone which could emulate, in real time, the first computer I owned; which could, in fact, do that with any computer I owned up to the turn of the century. (The Celeron 333 emulation might chunk a little bit. A little bit.) This phone is a more capable computer than the best PC money could buy in 1990, and the phone company gave it to me for switching to their service. And it's a CHEAP phone.
I mean, dang. In 1985 a 20 MB hard drive would have been enough to store all my writing and games and everything (obviating the 100+ 170k floppy disks I had) but now I strain at keeping everything on a bevy of disks five hundred times bigger. (I do have rather a lot of anime, though. Video's pretty big.)
A lot of the low-hanging fruit has been picked, and now prices hold steady as the dollar inflates. The $500 computer of 2017 is vastly more capable than the $500 computer of 2007. The $250 computer of 2017 can easily replace the $500 computer of 2007, matching every one of its capabilities (and inevitably exceeding it on storage and RAM, as no one makes 160 GB mechanical hard drives any more). And you can buy, for $50, a computer which will do any basic task--word processing, e-mail--you'd use a computer to do. (Raspberry Pi.)
It's the 21st century. Dang.