|
Resolved: it was a setting on my monitor; I had to specify the HDMI input as a PC input.
I have an LG 24" monitor and just upgraded my computer. I have a GTX 560 1GB, and I plug a micro-HDMI gender adapter into an HDMI cable going to my monitor.
It looks... not as good as DVI. Isn't it supposed to be better, or about the same?
I can also just hook dvi up, should I just do that?
I feel like HDMI should be better, but it doesn't look as crisp and I can't explain why.
It looks great with DVI.
http://www.lg.com/uk/it-products/monitors/LG-tv-monitor-M2362D.jsp
This is the monitor I'm having trouble with.
I'm using windows 7 64-bit also if that makes a difference.
|
Well, it could be an issue with the cable or drivers. The card could also be confused about which resolution to use.
|
Either the cable works or it doesn't. Are you sure your settings are the same with HDMI and DVI?
What does better/worse mean here? Colors? Resolution?
|
Hmm, well, I just have some generic HDMI cable, and then it's converted.
It looks pretty much the same as it does on my big TV from my laptop, but I expect the quality isn't as great on a TV (and I run it at 1100 resolution there).
I also use this hdmi cable for halo on xbox360 and it looks amazing on there (from tv to xbox 360)
How can I tell if the drivers are up to date? I just got this computer an hour ago. I don't think it's the cable, but it could be.
I have no idea if the settings are the same between HDMI and DVI; I just know that I plug DVI in and it looks great, and I plug HDMI in and it doesn't.
If I compare text in the same box (like this browser) between the HDMI monitor and my DVI monitor (I have dual monitors), it looks slightly blurred (very slightly)... like it's crisp but not crisp.
It's not super blurred like it would be if I put a goofy resolution up, but it looks... off.
I'm sorry that's an awful explanation.
|
On October 13 2011 12:18 Clues wrote: either the cable works or it doesn't. Are you sure your settings are the same with hdmi and dvi?
What does better/worse mean here? Colors? Resolution?
Since when are cables a go/no-go type item? It's entirely possible.
I know some graphics drivers will use the wrong resolution and scale over HDMI until they're manually set.
|
Get drivers via the nvidia website.
Once you've updated those, right click your desktop, open the nvidia control panel, and check settings.
|
On October 13 2011 12:28 JingleHell wrote: Get drivers via the nvidia website.
Once you've updated those, right click your desktop, open the nvidia control panel, and check settings.
What settings specifically am I looking for? I do have the NVIDIA control panel already and have been messing around with it, but I can't get it to look right. I'm downloading the drivers now, but I assume they're the same since this computer is just under a day old.
The text is always a little off... like... jagged and blurry at the same time.
Would an optimized HDMI setting look equivalent to DVI?
If so, should I just leave it on DVI then?
|
Uhm, you could try turning your display off and back on, and make sure your cable is plugged in firmly. I occasionally get some display wonkiness where I have to turn it off and back on.
What resolution and color settings is the control panel using?
|
I initially had the same issue on my Samsung monitor. I had to go into the monitor's menu to set the HDMI input as a PC, and it suddenly looked good.
|
On October 13 2011 12:36 xdthreat wrote: I initially had the same issue on my Samsung monitor. I had to go into the monitor's menu to set the HDMI input as a PC, and it suddenly looked good.
Yeah, that can be an issue too. But it could easily be using the wrong resolution and scaling.
|
On October 13 2011 12:35 JingleHell wrote: Uhm, you could try turning your display off and back on, and make sure your cable is firmly plugged. I occasionally get some display wonkies where I have to turn it off and back on.
What resolution and color settings is the control panel using?
I'm on 1920x1080
no scaling (GPU does the scaling)
60 Hz (should it be 59? I don't know what the difference is)
dynamic range is full (0-255); it was set to limited (16-235) (I couldn't tell the difference)
dynamic contrast enhancement checked, color enhancement checked
It had edge enhancement at 0%, so I put it up to 100%; seemed like it's a little better but still off
Same with noise reduction: it was 0%, I put it up to 100%
deinterlacing checked, with "Use inverse telecine"
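For what it's worth, the full vs. limited dynamic range setting is a classic cause of a washed-out or "off" picture over HDMI: limited range only uses levels 16-235, so if the GPU sends one range and the monitor expects the other, blacks and whites get crushed or lifted. A minimal sketch (not NVIDIA's actual implementation, just the standard level math) of how limited-range levels expand to full range:

```python
def limited_to_full(y: int) -> int:
    """Expand a limited-range (16-235) 8-bit video level to full range (0-255)."""
    # Clamp into the legal limited-range window first.
    y = max(16, min(235, y))
    # 16 maps to 0 and 235 maps to 255; the 219-level span stretches to 255.
    return round((y - 16) * 255 / 219)

# Black, mid-gray, and white in limited range:
print(limited_to_full(16), limited_to_full(126), limited_to_full(235))  # → 0 128 255
```

If the monitor applies this expansion to a signal that was already full range (or skips it on a limited one), everything looks subtly wrong even at the correct resolution.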
|
On October 13 2011 12:36 xdthreat wrote: I initially had the same issue on my Samsung monitor. I had to go into the monitor's menu to set the HDMI input as a PC, and it suddenly looked good.
Hmmm, this could be exactly it. My monitor is supposed to be a monitor/TV... and I just have it set to "HDMI".
I don't know how to specify that it's for PC, though.
Maybe the HDMI inputs on my monitor are only for TV?
Anyway maybe the quality is the same on DVI and HDMI?
I might be about to head out and buy a new 27" Samsung for my main display, because I'm putting this monitor on my other computer, which will hook up via DVI anyway, I guess.
|
I've always had that same question in my head, so since this thread came up: is HDMI better than DVI if your monitor and graphics card support it?
From what I know, on old monitors it's better to use DVI than VGA because the quality is better and you get more Hz, due to the DVI cable having more pins so it delivers data quicker.
|
Pretty sure that DVI is better. I can't even get 120 Hz with HDMI, and I think there's also more delay when using HDMI.
edit: They're actually the same thing. The difference is that HDMI also carries audio. Source: Google.
|
This isn't an issue specific to you, OP. I have a GTX 470 and have the EXACT same issue. I am using the highest-quality DVI and HDMI cables you can buy (gold-plated and all), and the issue always occurs, regardless of which monitor I use.
There is no "technical" reason I have received from Nvidia support, but even tech support told me I should be using DVI over HDMI at all times, unless I also need to carry audio.
I am not an engineer, so take this with a grain of salt, but I feel that the micro-HDMI ports our GPUs have don't carry the signal as well as DVI or dual-link DVI does. Science and engineering say I'm wrong, but everybody with the same type of Fermi GPU I have has the exact same problem.
Use DVI; it also allows for active aspect-ratio scaling, which HDMI does not :-)
|
On October 13 2011 12:24 JingleHell wrote: On October 13 2011 12:18 Clues wrote: either the cable works or it doesn't. Are you sure your settings are the same with hdmi and dvi? What does better/worse mean here? Colors? Resolution?
Since when are cables a go/no-go type item? It's entirely possible. I know some graphics drivers will use the wrong resolution and scale over HDMI until they're manually set.
Maybe since HDMI transmits a digital signal, not analog. There is no difference in displayed quality between different brands of HDMI cable, nor is there a quality difference between HDMI and DVI.
|
On October 13 2011 13:50 Angry_Fetus wrote: Maybe since HDMI transmits a digital signal, not analog. There is no difference in displayed quality between different brands of HDMI cable, nor is there a quality difference between HDMI and DVI.
I think the video data for HDMI and DVI is NOT transmitted with an error-correcting code. (The audio data and some other stuff in HDMI is run through a (64,56) BCH code, so up to a few errors per 64 bits sent can be corrected automatically in real time without incident.) There is some encoding done, but it's there to improve the physical signaling, not to correct errors.
So although it's digital, if a 0 is interpreted as a 1 or vice versa, there will be an error in the output. Any digital communications system is obviously still sent as an analog waveform, so errors can occur; it's just that the data is represented digitally.
The spec says that under certain signal conditions the bit error rate should be no more than 10^-9, i.e. negligible. With any kind of short cable run on a cable of reasonable quality, there should be a negligible number of errors, and an error in the color information of one pixel in one frame probably isn't noticeable anyway.
With long cable runs, with poorer cables, the probability of error may be non-negligible and you may get garbage output or a visibly distorted image.
For something like a 6-foot cable from a computer to a monitor, the cheapest thing will work pretty much perfectly, though.
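To put that 10^-9 figure in perspective, here's a back-of-the-envelope estimate (assumed numbers: raw 1080p60 video at 24 bits per pixel; real TMDS links carry 10 bits per 8-bit byte plus blanking, so the true bit rate is somewhat higher) of how few flipped bits that worst-case error rate actually implies:

```python
# Rough estimate of HDMI bit errors at the spec's worst-case BER of 1e-9.
# Assumes raw 1080p60 video at 24 bits/pixel (ignores 8b/10b TMDS
# overhead and blanking intervals, which would raise the bit rate a bit).
width, height, bpp, fps = 1920, 1080, 24, 60
ber = 1e-9

bits_per_frame = width * height * bpp       # ~49.8 million bits per frame
bits_per_second = bits_per_frame * fps      # ~3 Gbit/s of video data
errors_per_second = bits_per_second * ber   # expected flipped bits per second

print(f"{errors_per_second:.1f} bit errors/s")  # → 3.0 bit errors/s
```

A handful of wrong bits per second, each affecting one pixel for one frame, which is consistent with the point above: a healthy digital link doesn't produce a uniformly "soft" image the way the OP describes.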
|
So then, if I want the best sound, should I send audio over HDMI and video over DVI?
Or, since I don't have speakers (only the monitor/TV's sound), should I just use the audio cable and not worry about it?
|
Press auto adjust on the monitor.
|
As someone said before, you might need to change your monitor's settings to let it know it is connected to a PC. I had to do this for my LG monitor; it solved the color/image quality problem.
|