Which Stream Would You Rather Watch?


Twitch has recently upped its max bitrate to allow for 1080p 60fps streaming… which got me thinking: stuck at 6000Kbps, would it be better to watch a stream with higher image quality but a lower framerate, or lower image quality but a higher framerate?

Which would you rather watch (not including CS:GO streams)? Vote below:

  • 1080p 30fps (Higher Image Quality, Less Smooth)
  • 1080p 60fps (Lower Image Quality, More Smooth)



When I watch streams, I’m watching to be entertained and enjoy the game. I’d prefer great image quality.


Smooth gameplay. Especially for action and fast-paced games.


My unrestricted answer is 1080p down-scaled in OBS to 720p/60fps as my preferred compromise between quality and smoothness, but I also voted for my preference between the two options above.


I agree with @Auth, what I’d prefer isn’t on there. If I have to choose between the two I guess I’d go with the 30fps option.

However, I think the better compromise is 720p@60fps, especially considering windowed (non-full screen with chat) viewing, mobile, etc.


That’s a great point I hadn’t even considered. I regularly have a #teamstrats stream running on my second monitor (not maximized) or I’m watching it in the Twitch theater mode (which I honestly like); neither instance is actually full screen, so I’m not getting the benefit of full 1080p.


I’m a little confused as to how a lower resolution would be better…?

If you’re streaming at 60fps, then you have 100Kb available per frame. At any given amount of data, you are never going to get a BETTER image with a lower resolution… only the same or worse…
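Back-of-the-envelope, assuming Twitch’s 6000Kbps cap is spread evenly across frames (real encoders don’t divide it quite this evenly, and audio takes a slice too), the per-frame budget works out like this:

```python
# Per-frame bit budget at a constant bitrate (ignores audio and the fact
# that real encoders spread bits unevenly between frames).
def kbits_per_frame(bitrate_kbps: float, fps: float) -> float:
    return bitrate_kbps / fps

for fps in (30, 60):
    print(f"6000 Kbps at {fps}fps -> {kbits_per_frame(6000, fps):.0f} Kb per frame")
# 6000 Kbps at 30fps -> 200 Kb per frame
# 6000 Kbps at 60fps -> 100 Kb per frame
```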


Whichever results in less buffering, especially for tablet devices. For a while I could never watch a stream over Chromecast if it was at 60fps on the streamer’s end; it always resulted in stutter.

For the streams I do watch, I’m typically there for the streamer and less for the game itself. As long as we aren’t talking 480p, the quality is a “meh” to me.


Neither would cause more buffering than the other. They’re both 6000Kbps.
The only factor when it comes to buffering is how much data needs to be loaded. If the amount of data is the same, it will cause the same amount of buffering.
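To put a number on it (a rough sketch that assumes a constant bitrate and ignores protocol overhead and audio), the data a viewer has to pull down is identical for every option in the poll:

```python
# Data downloaded per minute at a constant bitrate; resolution and
# framerate never enter the formula.
def megabytes_per_minute(bitrate_kbps: float) -> float:
    return bitrate_kbps * 60 / 8 / 1000  # kilobits/sec -> megabytes/min

for label in ("1080p30", "1080p60", "720p60"):
    print(f"{label} @ 6000 Kbps -> {megabytes_per_minute(6000):.0f} MB per minute")
# Every line prints 45 MB per minute, because only the bitrate matters.
```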


IMO, the down-scaling that OBS offers actually looks pretty damn good, and being able to shove that out at 60fps will look better than a less-crisp 1080p/60fps. The fact that I (and, presumably, a lot of other people) don’t watch streams in full screen makes the 1080p not as big a deal, whereas the smoothness of 60fps would translate well. 1080p/60fps might still have some of the encoding/compression visual noise that would be obvious to a viewer even when running it windowed, whereas losing a few source pixels by streaming in 720p would be less noticeable.

Assuming someone is using a 1080p screen and watching a video in a window a quarter of the screen’s width and height, they’re watching in a roughly 270p window. The difference between 1080p and 720p at that size is negligible, as there are either 4 or ~2.7 source pixels per window pixel in each dimension (respectively). When I’m using Theater Mode it looks like my screen becomes ~930p (screenshot and measured in MS Paint like a boss), so the stretching a 720p source would undergo to fit would be pretty insignificant, IMO, and wouldn’t bother me in the least (I’m totally fine with watching 720p videos and such, generally). To me, the slight compromise in image quality in favor of added smoothness by going 720p/60fps is better than either 1080p/30fps (less smooth than I’d like) or 1080p/60fps (higher chance of pixelation when there’s lots of movement on screen).
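If anyone wants to redo that napkin math, here’s the same arithmetic as a quick sketch (the ~270p window and ~930p theater-mode height are just the rough estimates from above):

```python
# Linear downscale factor: how many source rows get squeezed into each row
# of the viewing window. Values below 1 mean the source is stretched up.
def scale_factor(source_height: int, window_height: int) -> float:
    return source_height / window_height

for source in (1080, 720):
    for window in (270, 930):  # small window vs. theater-mode estimate
        print(f"{source}p source in a ~{window}p window: "
              f"{scale_factor(source, window):.2f}x per dimension")
# 1080p -> 270p: 4.00x    720p -> 270p: 2.67x
# 1080p -> 930p: 1.16x    720p -> 930p: 0.77x (a slight stretch)
```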


You’re right. I was injecting the streamer perspective as well: maximum experience for a user with minimum requirements/cost.

All things being equal, yeah you might as well get the source as high as it can go (assuming the server is transcoding for you).


@Bradum Can we ask how awesome YT is?


I think you are missing a major thing though, a facecam, with cleavage.


I like watching 1080p60 game content when it’s available on YouTube, but I think that’s usually achieved by uploading a high-bitrate capture afterwards, not by live-capturing and transcoding. That said, 1080p60 still seems to work for me even when livestreaming under dubious conditions.

My current rig is an i7-2600K with an R9 290, and my upstream bitrate is limited to about 3500kbps (ugh), so I tend to stream at 720p60 (downsampled from 1080p). 3500kbps isn’t technically enough for it, but on YouTube it seems to work well enough. It even seems to work for 1080p60, though individual frames can get pretty blotchy. If I want to showcase something pretty in a game, I’ll just stop moving the camera and let the frame build progressively to a higher-quality image over the 5-10 frames of data it needs.

Here’s a Wildlands 1080p60 stream test on the setup I described above; I was surprised it worked at all. It was streamed via the “AMD ReLive” tool back when it was still working for me. OOPS: this was actually an uploaded high-bitrate recording. This Division 720p60 video was actually streamed (though my video card was overheating, causing hitching).

UPDATE: The file I was looking at before for Wildlands wasn’t livestreamed. Duh. I TAKE IT ALL BACK.


The reason YouTube is different is that it actually allows a higher bitrate for 60fps videos… YT allows 8Mbps for 1080p30 and 12Mbps for 1080p60, while Twitch only allows 6Mbps no matter what, and that’s only just gone up this month from the 3.5Mbps they used to allow.
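Putting rough numbers on those caps (simple division, ignoring audio and encoder overhead), the per-frame budgets are pretty far apart:

```python
# Per-frame bit budget implied by each platform's cap.
CAPS = [
    ("YouTube 1080p30", 8000, 30),
    ("YouTube 1080p60", 12000, 60),
    ("Twitch 1080p60", 6000, 60),
]

for label, bitrate_kbps, fps in CAPS:
    print(f"{label}: {bitrate_kbps / fps:.0f} Kb per frame")
# YouTube 1080p30: 267 Kb per frame
# YouTube 1080p60: 200 Kb per frame
# Twitch 1080p60:  100 Kb per frame
```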


Still confused… Given it contains the same amount of information, why would 1080p60 have more pixelation than 720p60?

When it comes to video compression, the resolution is the number of unique pixels it has AVAILABLE for use, not the number of unique pixels it MUST use.
At any given bitrate, 720p60 will never produce a better image than 1080p60.

EDIT: I guess the only argument FOR the lower resolution is that, if you’re watching at quarter screen or something, any data the encoder spends on pixels that don’t exist on your screen is essentially wasted… Compressing the video to match the pixels actually available will always be the most efficient… But if you’re watching a Twitch stream at quarter screen, do you really care that much about minute differences in image quality?
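One crude way to put numbers on how thinly the bitrate gets spread (a sketch only; raw bits-per-pixel is a blunt metric, and real encoders reuse data between frames and spend bits where the picture needs them):

```python
# Bits available per pixel per frame at a fixed 6000 Kbps.
# Treat these strictly as relative comparisons.
RESOLUTIONS = {"1080p": (1920, 1080), "720p": (1280, 720)}

def bits_per_pixel(bitrate_kbps: float, fps: int, width: int, height: int) -> float:
    return (bitrate_kbps * 1000) / (fps * width * height)

for name, (w, h) in RESOLUTIONS.items():
    for fps in (30, 60):
        bpp = bits_per_pixel(6000, fps, w, h)
        print(f"{name}{fps} @ 6000 Kbps -> {bpp:.3f} bits/pixel/frame")
# 1080p30: 0.096    1080p60: 0.048
#  720p30: 0.217     720p60: 0.109
```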


Honestly, 1080p60 is great, but I gotta also say that not all viewers can handle something of that quality — y’never know where they might be viewing from or what type of internet they’ve got. For all we know, they could have a potato for a router and a microwave for a modem. :blush:


Yeah, but resolution isn’t what determines how much bandwidth they need, bitrate is…

Bitrate is the amount of data the encoder is allowed to use for the video; resolution is just the number of pixels available for the encoder to use… So a 1080p60 stream encoded at 6Mbps will require the same internet connection as a 720p60 stream encoded at 6Mbps.

Same goes for 1080p30 vs 1080p60 at 6Mbps. Since 1080p60 has double the frames but uses the same amount of data, the video will be smoother, but each frame only has half as much data to work with because there are twice as many frames to encode…

Which is what prompted my question… The resolution and bitrate don’t really matter. What I’m interested in knowing is, at any given resolution, whether people would rather watch a smoother stream with lower image quality (60fps), or a less smooth stream with higher image quality (30fps).


Kinda lame actually. While YouTube allows for 21:9 videos, YouTube Live doesn’t allow you to stream in 21:9 :frowning:


@Bradum Can a non-partner even stream in 1080p? Referencing YouTube’s recommended bitrates, which are a little higher than the 6Mbps Twitch gives non-partners.