VGA is analog and delivers poorer picture quality than any digital alternative. It's subpar for everybody, and if anything its drawbacks are greater for those who don't game. I don't see why you wouldn't use a digital connection (DVI, HDMI, DisplayPort, etc.), as they are far superior in nearly every respect and cost essentially the same.
The monitor is for my dad's computer; he doesn't play computer games. My computer looks like this: [image] Old pic, by the way.
Flying home into Kansas City. Once we get near the airport, it's deemed unsafe to land, so we circle for about 30 minutes with 45 minutes of fuel left. Then we decide to land in Omaha, Nebraska. Now we're sitting here refueling while tornadoes are headed straight for where we live. Fun.
Well, many people already have loads of VGA leads, so they use them rather than buying a new one. The only reason I switched in the end was that my new monitor came with VGA, DVI, and DisplayPort cables, which meant I could make both screens digital. That monitor is an Asus PB238Q, a fairly decent IPS screen, and when I compared the cables to see whether there was any difference, I couldn't tell them apart. So there are certainly advantages to using a digital connector, but if you already own a VGA cable it will be fine in 95% of scenarios.

- - - Updated - - -

My sister has that laptop in white (probably with different internals).
Never had any issues with VGA on the uni's 1920x1200 IPS panels. No discernible difference between my laptop running through an HDMI-to-VGA adapter to those monitors and the PC running through DVI to them. The monitor even reports itself as running at 60Hz over VGA, as does Windows. No loss of focus, no random lines down the screen, a 100% sharp image. The DACs in a modern VGA driver are pretty damn good these days, although I should note that I was using a 50cm shielded cable.

The conspiracy theorist in me is relatively convinced that the popularity of these newer digital formats is partly so that manufacturers can control who has access to them; outside the Western world some of these formats are harder to come across. Plus HDMI natively supports DRM, which is in use on a few current-gen consoles to prevent unlicensed HDMI displays from being used, and which also limits which displays some Blu-ray content can be played on.

From a technical point of view, there is nothing wrong with VGA. In fact, VGA has no hard restriction on the frequency of pixel data pushed across it, so it can theoretically support any resolution, whereas the digital types are clocked in and out at a set frequency.
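To put rough numbers on that last point: single-link DVI, for instance, is clocked at a fixed 165 MHz TMDS ceiling, so whether a mode fits is simple arithmetic, while VGA's limit is just whatever the DAC and cable can manage. Here's a minimal sketch in Python; the ~25% blanking overhead is my own ballpark assumption, since real timing standards (CVT, reduced blanking) use different figures.

```python
# Back-of-the-envelope pixel clock: active pixels per frame times refresh
# rate, padded ~25% for blanking intervals (a rough assumption; reduced
# blanking timings use less overhead).
def pixel_clock_mhz(width, height, refresh_hz, blanking_factor=1.25):
    return width * height * refresh_hz * blanking_factor / 1e6

SINGLE_LINK_DVI_MHZ = 165  # single-link DVI's fixed TMDS clock ceiling

for w, h, hz in [(1920, 1080, 60), (2560, 1440, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= SINGLE_LINK_DVI_MHZ else "exceeds"
    print(f"{w}x{h}@{hz}Hz needs ~{clk:.0f} MHz, which {verdict} "
          f"single-link DVI's {SINGLE_LINK_DVI_MHZ} MHz clock")
```

Dual-link DVI, HDMI, and DisplayPort raise the ceiling, but each revision is still a fixed clock; VGA's ceiling is analog bandwidth rather than a number in a spec.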
Should I swap out a side intake fan and the stock Intel cooler on an i7 for a Corsair H80i with liquid cooling directly to the CPU? I'm afraid that if I do, my GPU and PSU won't get any airflow unless air travels through the radiator.
https://en.m.wikipedia.org/wiki/High-bandwidth_Digital_Content_Protection and there are some other bits going on too. No other intakes available?
I got a GoPro Hero 3+ Silver yesterday; now I've just got to figure out what exactly to do with it. I took a short drive with the camera on its suction cup mount, but the audio quality wasn't that great. There's already plenty of great test footage showing off the camera's quality, and I didn't really feel like editing mine. I might try mounting it on my Traxxas Slash 2WD RC car that's been collecting dust in my closet and see what kind of footage I get from that. I'm very pleased with the video quality so far, and I almost want to start using it as a dash cam.
Same here, absolutely no difference in quality between VGA and HDMI for me, so I was perfectly fine using it. Dunno where the claim that VGA is always worse comes from, because it clearly isn't the case. I'd probably still be using VGA right now (or at least through a DVI adapter, since most recent cards don't have a native VGA port), but I've since switched to DisplayPort because I got a G-Sync monitor. Like I said, though, VGA was perfectly fine. I definitely don't miss the screws.
Perhaps some are more sensitive to it than others. I used VGA up until about a year ago, when one of my cables broke (melted, actually); the difference between DVI and VGA was almost night and day. I guess I could've been using outdated drivers, though I doubt it. If VGA and DVI truly are identical in picture quality, then I suppose I was using a faulty cable. Who knows.
I think it might be a cable thing. I'm very sensitive to little flaws in audio or video, and I've never noticed a difference between VGA and HDMI on my Iiyama screen, or between VGA on the Iiyama and DisplayPort on my new BenQ screen. If you showed me a picture of something displayed on each panel, I definitely wouldn't be able to tell them apart.
One of the main advantages is that digital connections either work or they don't; you don't get the gradual quality loss you can see with cheap analog cables.
Perhaps quality degradation is a thing? Those VGA cables had been in relatively constant use since at least the mid-'90s.
If a digital cable degraded that much, the picture would glitch out or stop working altogether; there's no gradual quality change.
Sorry, I think I was a bit unclear... I'm trying to think of reasons why there was such a quality difference between analog (VGA) and digital (in this case, DVI). Specifically, I'm wondering whether degradation of the VGA cables, which had been in use for over 20 years, could have been a factor.
Two front intakes - one for the HDD, one empty
One side intake
One rear exhaust
One top exhaust

I doubt it would fit in the front, so it has to be the side.