|
After seeing someone else use their main TV as a monitor for their computer, I fell in love with the concept. Unfortunately, I seem to be having an issue with poor quality/colouring.
My laptop, which is drastically underpowered compared to my PC, gets better quality on the TV. I am wondering if I am meant to install some kind of program to help combine the two.
I am using an HDMI cable to connect both ports.
TV Specs: Panasonic TX-L32S10 32-inch Widescreen Full HD 1080p LCD TV
http://www.amazon.co.uk/Panasonic-TX-L32S10-32-inch-Widescreen-Freeview/dp/tech-data/B0020HRID8/ref=de_a_smtd
Computer Specs:
Power Supply: Modular 850W Power Supply (80 Plus Certified)
Processor: Core i7 2600K 8MB Cache Socket 1155
CPU Cooler: Xigmatek Scorpion HDT-S1283 CPU Cooler
Graphics Card 1: Radeon HD6970 2GB GDDR5 PCI-Express Graphics Card
Memory: 8GB Corsair DDR3 1600MHz C9 Dual Channel Memory Kit (4 x 2GB)
Motherboard: Asus P8P67 Intel P67 (Socket 1155) Motherboard
Hard Disk Drive One: Samsung 1TB Spinpoint F3 32MB Cache SATA 300
Hard Disk Drive Two: Corsair Force 120GB SATA II 2.5"
Thank you!
Edit: What is the terminology for what I am trying to do here? When I try to google my question, I can't really seem to put it into words.
|
Are you using an HDMI input or the PC input on your TV? Did you check the resolution of the desktop when your PC is connected to your TV? Make sure it's 1080p and not something else. Are you getting any black bars around your screen? Maybe you need to fix some settings in Catalyst Control Center.
See if this helps:
1. Go to Catalyst Control Center.
2. Go to "My Digital Flat-Panels" (you may need to change to advanced view under preferences).
3. Go to "HDTV Support (Digital Flat-Panel)".
4. Under "HDTV modes supported by this display", look for 1080p60 and add it.
|
Are you using the same HDMI cable to connect your laptop to the TV (just to rule out something being wrong with your cable)? You shouldn't need any software other than drivers for your hardware. Check your Windows display settings when it's connected to your TV so you have the correct resolution and such. Other than that I'm not sure; maybe the TV defaults to some weird setting.
|
I wouldn't recommend using a large TV as a monitor. Even a computer monitor over 27" is going to look bad at 1080p. You need a higher than 1080p resolution monitor when going over 24" size.
|
On February 13 2013 04:56 materia99 wrote: I wouldn't recommend using a large TV as a monitor. Even a computer monitor over 27" is going to look bad at 1080p. You need a higher than 1080p resolution monitor when going over 24" size. I disagree, I enjoy using my 40 inch TV as a monitor. I'm sure you're going to say something about contrast ratio, input lag, or PPI. But I don't see any difference between my 1080p 24 inch monitor and my 40 inch TV.
|
On February 13 2013 04:59 KentHenry wrote:On February 13 2013 04:56 materia99 wrote: I wouldn't recommend using a large TV as a monitor. Even a computer monitor over 27" is going to look bad at 1080p. You need a higher than 1080p resolution monitor when going over 24" size. I disagree, I enjoy using my 40 inch TV as a monitor. I'm sure you're going to say something about contrast ratio, input lag, or PPI. But I don't see any difference between my 1080p 24 inch monitor and my 40 inch TV. Some people don't really see a difference between HD Ready and Full HD. Some people don't mind bigger pixel sizes on large monitors/screens, while some prefer 22" Full HD monitors because of the smaller pixel size. So that's not the point. --- When I was trying out how it would look on my 46" TV, I had to play with settings a lot, both in the Nvidia drivers and on the TV, and still didn't get what I wanted. Btw, I absolutely love the picture I get when playing on my consoles, so it's not the TV's problem at all. IIRC I had to set much higher contrast and digital vibrance in the drivers so colours wouldn't look so "washed out".
|
I have been using a 37" TV as a monitor for about 3 years now. It works great; however, when I first set it up my image looked like ass because of a sharpening setting on my TV. I have a Vizio. Anyway, I would highly recommend looking at all the image settings your TV has, especially anything about sharpening. I turned that all the way down.
|
32" at 1080p is not going to give you a sharp picture no matter what you do. That doesn't really matter for playing games or watching films, but working with text won't be so nice.
|
For text and images, most TVs are definitely worse than computer monitors. The only ones that are similar are those with 4:4:4 support but those are fairly rare when I last checked; most TVs are still 4:2:2 so you have problems with computer stuff. Not a problem for gaming or videos obviously but a problem for static objects that are viewed closely.
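To make the 4:2:2 point concrete, here's a toy Python sketch (not real scaler code, just an illustration of the idea): in 4:2:2, each horizontal pair of pixels shares one chroma sample, so a one-pixel-wide colour transition gets averaged across two pixels, which is exactly what smears the edges of small coloured text.

```python
def subsample_422(chroma_row):
    """Average each horizontal pair of chroma samples (as 4:2:2 does),
    then duplicate them back out, roughly what a TV's scaler produces."""
    out = []
    for i in range(0, len(chroma_row) - 1, 2):
        avg = (chroma_row[i] + chroma_row[i + 1]) / 2
        out += [avg, avg]
    return out

# A hard colour edge, sharp in 4:4:4...
row = [240, 240, 240, 16, 16, 16]
print(subsample_422(row))  # ...smears in 4:2:2: [240.0, 240.0, 128.0, 128.0, 16.0, 16.0]
```

Luma keeps full resolution either way, which is why black-on-white text survives better than coloured text.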
Since your laptop is fine, we can probably rule that out. I don't believe AMD has the same colour space problems Nvidia has with HDMI so I think we can also rule that out. Speaking of poor quality/colouring, do you mean undersaturated colours or oversaturated colours?
Anyway, three things you can try messing around with (remember the default setting so it still works with your laptop):
- Colour Pixel Format (found in My Digital Flatpanels -> Pixel Format)
- Your TV's colour space. It'll be somewhere in the TV's on-screen menu.
- The RGB settings in your driver/TV menu.
I believe what you want is RGB Full Range of some sort, if it isn't already. I'm not on my computer right now so I can't check for you. Only possible "solutions" I can currently think of.
|
I have a 46" 1080p Samsung LCD (not LED), 3 years old. I have an EVGA Nvidia GTX 570 HD outputting over HDMI. Believe me when I say the quality is gorgeous. I'm a pretty discerning viewer. The thing about PPI is that it all depends on how far you are from the device. A phone needs about 300 PPI for true pixel imperceptibility; a computer needs ~200 PPI, like the Retina display MacBook Pros that are 1800p. For a TV that you're sitting 4 feet away from, you can't see pixels. At least I can't.
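The PPI/viewing-distance trade-off above can be sketched with a little Python. The 1-arcminute figure is the usual rule of thumb for 20/20 vision, so treat the output as ballpark; a 46" 1080p set at 4 feet comes out right around that threshold, which fits some people noticing pixels and others not.

```python
import math

def ppi(diagonal_in, width_px, height_px):
    """Pixels per inch from a display's diagonal size and resolution."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixel_angle_arcmin(ppi_value, distance_in):
    """Angular size of one pixel in arcminutes at a viewing distance.
    ~1 arcminute is the commonly cited resolving limit of 20/20 vision."""
    pixel_size_in = 1.0 / ppi_value
    return math.degrees(math.atan2(pixel_size_in, distance_in)) * 60

tv = ppi(46, 1920, 1080)
print(round(tv))                             # ~48 PPI for a 46" 1080p TV
print(round(pixel_angle_arcmin(tv, 48), 2))  # ~1.5 arcmin at 4 feet: borderline
```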
My dad just bought a 55" Panasonic plasma. The TV is gorgeous but the display from the computer just doesn't seem as flawless as mine. It might've been because it was coming from an iMac using a display port to HDMI converter though. It's good but not great. Blu rays look good. I didn't mess around with it enough to make a definite judgement if it was the TV or not. I think it would be possible to get a good picture if I had tried.
A couple of tips I recommend. Turn sharpness down to 0. Turn it to cinema mode if you have it and begin calibrating from there. I'm not sure about your TV, so you'll have to just mess with the settings until you find what looks good. You basically want the least amount of TV post processing possible. A lot of TVs have all these dejudder and high contrast color post processing etc. Those will make it look like utter crap. Windows 7 has a pretty good monitor calibrator that's a good start. Make sure to turn off all eco functions. You'll just have to manually adjust brightness and contrast and color vividness etc.
As far as the advantages, just to get you excited about doing this: there are many.
Once you get it set up, you'll absolutely love it. I have 2x1TB HDs, 2x2TB HDs, and 1x3TB HD (and a 256GB Crucial M4 SSD). They are all full of ...uh... HD content. Plus all of the other content the internet has to offer (Starcraft streams, Justin.tv, Hulu, Amazon VODs, Crackle, etc.). I have been paying for cable TV for 3 years to get the internet discount and never hooked it up because it's not worth the hassle. I watched the Super Bowl on CBS's website. I think in about 15 years, cable and satellite TV packages will basically be nonexistent. Comcast knows it too. Pretty much the only reason to watch TV the traditional way is for sports. But now with the Xfinity thing where you can watch sports (among other) channels online, that's pretty much over too.
http://xfinity.comcast.net/watch-live-tv/
You get 4 ESPNs online plus other sports channels. Pretty soon people will watch more content on their iPads, smart TVs, smartphones, and computers than over traditional cable TV, where you pay for 500 channels and watch 3 shows.
|
You should definitely be able to get "console quality" and better on your TV; I don't understand the people saying you shouldn't use your TV as a monitor (for gaming and media, that is). I have a comparable setup and get a great picture on my 46" TV. Make sure you try disabling all image processing options in your TV; they often have a tendency to ruin the feed from your PC. My view lagged like hell (similar to low fps) with bad colours until I disabled some of those options, and then it looked perfect.
|
I had the same problem before, and the easiest solution?
Use a VGA or DVI connection instead of HDMI! Try it, and I'm 99% certain you'll get much better image quality! It's something to do with the HDMI output and the cable; I forget the exact problem, but using a different connection is the easiest solution, although you won't have audio over either VGA or DVI.
|
On February 13 2013 22:11 EngrishTeacher wrote: I had the same problem before, and the easiest solution?
Use a VGA or DVI connection instead of HDMI! Try it, and I'm 99% certain you'll get a much better image quality! Something to do with the HDMI output and the cable, I forget the exact problem but using a different connection will be the easiest solution, although you won't have audio over either VGA or DVI.
That's basically it. It's also a possible physical workaround for the problem.
The problem is how GPUs handle HDMI output to TVs. For instance, nVidia's HDMI implementation will output RGB limited-range (16-235) by default. This theoretically makes sense since HDMI is typically used for TVs and video content (Blu-Ray, DVD, whatever), in general, has a limited range of 16-235. But this doesn't make sense for desktop purposes since desktops and monitors are meant to deal with RGB full range (0-255). You're essentially restricting output for no reason.
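A toy illustration of the two ranges involved (not driver code): full range uses 0-255, while limited "video" range maps black to 16 and white to 235. If the GPU sends limited range but the TV interprets it as full range, blacks come out grey and whites come out dim, which matches the washed-out look people describe.

```python
def full_to_limited(v):
    """Compress a full-range value (0-255) into limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):
    """Expand a limited-range value (16-235) back to full range (0-255)."""
    return round((v - 16) * 255 / (235 - 16))

print(full_to_limited(0), full_to_limited(255))   # 16 235
print(limited_to_full(16), limited_to_full(235))  # 0 255
# The mismatch: a TV treating limited-range "black" (16) as a full-range
# value displays 16/255 grey instead of true black.
```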
Why do the drivers do this then? When you connect your GPU to the TV, the GPU's driver looks up the TV's EDID. When the driver isn't certain that the TV can display full range (that is 0-255) and sees that you're using HDMI, it assumes the worst and defaults to limited range. AMD and nVidia both do this, I believe.
Now I don't know if this will solve the OP's problem, since a lot of TVs are utter shit with this sort of stuff (i.e. a TV without 4:4:4 support will never look as good as a monitor for desktop things), but it's worth a try. This doesn't really explain why his laptop is better, though.
Anyway, since we're talking about limited and full range, people using HDMI with their monitors should check if the GPU's output is full or limited range. To make it easier for nVidia users, this toggle will fix the problem. AMD's toggle solution is located somewhere in the drivers.
|
The key thing is to make sure your laptop is displaying a 1920x1080 resolution on the TV. If you just did something simple like telling your laptop to duplicate its desktop on the other screen, it might be sending an image at the same resolution as your laptop's own display. That would be really bad, as you would add the further problem of running at a non-native, lower resolution.
|