We have discussions and fights here every day about whether the RX 480 or GTX 1060 is faster. You could go on and on about DX12, Vulkan, future-proofing, etc., but at the end of the day, they're within about 10% of each other in almost any game.
I'd argue that there are far more important things than that when choosing a graphics card. We tend to overlook the features and simply focus on performance, performance, and performance. Performance is important, for sure, but at the end of the day, it's the overall experience that matters, and a lot of factors come into play there.
So, let's take a look at the other, less commonly discussed topics. Take note that my experience consists of using an AMD Hawaii card (later rebadged as Grenada) from 2015 to 2016, and NVIDIA Pascal from 2016 to 2017.
AMD has FreeSync and NVIDIA has G-SYNC. Buy the monitor that your GPU is compatible with. No surprise here. Is it that simple?
Not really. You need to take note that not all FreeSync monitors are equal, as FreeSync lets each manufacturer implement the technology themselves. You need to check reviews for each individual monitor to see how well it handles FreeSync. For the first generation of FreeSync, which covers practically all monitors on the market today, the term is interchangeable with the VESA standard 'Adaptive Sync'.
In the future, however, AMD will start certifying FreeSync 2 monitors, since FS2 has higher standards than Adaptive Sync. Monitors that do not pass the certification will likely be advertised as 'Adaptive Sync', and will likely perform worse than FS2 monitors. More information about FS2: http://www.pcworld.com/article/3153254/displays/amds-freesync-2-makes-cutting-edge-hdr-monitors-even-more-glorious.html
The G-SYNC ecosystem is much simpler, since the module comes directly from NVIDIA, which means performance is roughly equal from one monitor to another. However, you do pay a premium for the convenience, as G-SYNC monitors are roughly 100-200 dollars more expensive than equivalent FreeSync monitors.
Learn more about G-SYNC and FreeSync here: https://www.reddit.com/r/hardware/comments/666i4e/gsync_and_freesync_a_primer_on_similarities_and/
When it comes to 4K TVs, take note that on the AMD side only the Polaris (and Vega?) platform has HDMI 2.0 support, which means apart from the RX 460, 470, and 480 (plus their rebadged 5xx variants), all AMD products are stuck with HDMI 1.4, or 30 Hz at 4K. Even that HDMI 2.0 support seems to be a bit buggy, as discussed in another sub (simply search for RX 480 HDMI 2.0).
The vast majority of NVIDIA cards on the market, Maxwell and Pascal, already have proper HDMI 2.0 support. That is no guarantee you can game smoothly at 4K, as some cards are simply too weak, but outside of gaming, using a 4K TV as a monitor, which is getting more and more common these days due to aggressive pricing from TV manufacturers, is completely viable.
AMD has been removing analog output from their graphics cards since Hawaii (R9 290) in 2013, and NVIDIA followed suit in 2016 with the release of Pascal (the 10-series). The last and most powerful AMD card with analog output is the HD 7970 (later rebadged as the R9 280X), and the last and most powerful NVIDIA cards with analog output are the GTX 980 Ti and Titan X Maxwell.
Though in my opinion, if you still have a monitor with only analog input, you need to upgrade your monitor first before your graphics card.
While video playback seems like the lightest task for a PC, the truth is it's more complicated than that. For a video to be played efficiently, you need dedicated hardware for that specific video codec. This is called 'hardware decoding' (playback) or 'hardware encoding' (producing the video). When the dedicated hardware is not present, the decoding/encoding falls back to the CPU, which is WAY less efficient at it, as a CPU is designed for general-purpose work, not that one specific task. This is called 'software encoding' or 'software decoding'.
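As a concrete illustration of the hardware/software split, ffmpeg lets you pick between the two at decode time. This is a minimal sketch, not from the original post: the accelerator names ("dxva2", "cuda") are real ffmpeg options, but which ones work depends on your ffmpeg build and GPU, and "input.mp4" is a placeholder file name.

```python
# Sketch: build an ffmpeg command that decodes a video and discards the
# frames, either on the CPU (software decoding) or on the GPU's dedicated
# decode block (hardware decoding). Assumes ffmpeg is installed.

def build_decode_cmd(src, hwaccel=None):
    """hwaccel=None  -> software (CPU) decoding;
    hwaccel="dxva2" or "cuda" -> hardware decoding, if supported."""
    cmd = ["ffmpeg", "-loglevel", "error"]
    if hwaccel:
        cmd += ["-hwaccel", hwaccel]
    cmd += ["-i", src, "-f", "null", "-"]  # decode only, discard output
    return cmd

# Pass the list to subprocess.run() if ffmpeg is available:
print(" ".join(build_decode_cmd("input.mp4", hwaccel="dxva2")))
```

Watching CPU usage while running the two variants makes the efficiency gap from the paragraph above very visible.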
Almost any video card on the market today supports the H.264 (AVC) codec. However, not all cards support H.265 (HEVC). In addition, there is a competing codec called VP9. Thankfully, cards that support H.265/HEVC also support VP9, so don't let that confuse you. The first version of H.265 was approved in 2013 and has been widely adopted as the successor to H.264/AVC. Its main advantage is that it needs less storage space and internet bandwidth for the same quality as H.264, so it is highly favored for 4K video; almost all 4K streaming at the moment uses H.265. This is why I strongly suggest getting a video card that supports H.265: when it comes to 4K, everything becomes 4x heavier, and decoding a 4K video on the CPU is a heavy task even for a Core i7. Expect the vast majority of videos, 4K or not, to be distributed in H.265 within the next couple of years. YouTube has been transitioning to VP9 since 2014, starting with 4K, with other resolutions following shortly after.
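To put rough numbers on the bandwidth claim, here is a back-of-the-envelope sketch. The bitrates are illustrative assumptions, not measurements: HEVC is commonly quoted as needing roughly half the bitrate of H.264 at similar quality.

```python
# Back-of-the-envelope: why codec efficiency matters at 4K.
# Assumed (not measured) bitrates: ~32 Mbps for 4K H.264 vs
# ~16 Mbps for 4K H.265 at similar quality.

def gb_per_hour(mbps):
    # Mbit/s -> GB per hour: seconds per hour / 8 bits per byte / 1000 MB per GB
    return mbps * 3600 / 8 / 1000

h264 = gb_per_hour(32)  # 14.4 GB per hour of video
h265 = gb_per_hour(16)  #  7.2 GB per hour at similar quality
print(h264, h265)
```

Halving the bitrate halves both the download and the disk space, which is exactly why 4K streaming jumped on H.265 so quickly.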
Similar to the HDMI 2.0 situation, on the AMD side only the Polaris platform, which consists of the RX 460, 470, and 480 (plus their rebadged 5xx variants), has H.265 support. NVIDIA, on the other hand, has supported H.265 since Maxwell, which means all NVIDIA 9- and 10-series cards support it. Take note that the GTX 970 and 980 only have partial support for H.265 decoding, but thankfully keep full H.265 encoding, which is important for the next point.
Both vendors support live streaming as of today, with ReLive for AMD and ShadowPlay for NVIDIA. Just like video playback, streaming utilizes the dedicated video encoder hardware on your graphics card. To put it concretely: you don't need to buy a Core i7 or Ryzen 7 for streaming, as your GPU can do it far more efficiently.
In addition to live streaming, NVIDIA has GameStream, which lets you play your PC games on NVIDIA devices such as the SHIELD TV and SHIELD tablet. There is also a third-party app, Moonlight (http://moonlight-stream.com/), that lets you stream to another PC, Android, iOS, a Raspberry Pi, and other devices. Just like ShadowPlay, the encoding is done on the GPU and therefore puts virtually no load on the CPU. There is no AMD equivalent at the moment; the closest alternative is Steam In-Home Streaming, though it's only compatible with PCs and the Steam Link, and only works, as the name implies, in-home.
This is also where you can immediately feel the benefit of the H.265 codec, as you can run GameStream at higher quality within the same available bandwidth. Keep in mind that the client device (phone, PC, etc.) has to support H.265 decoding for this to work.
A few additional notes:
You can use just about any streaming software to achieve the same function as AMD ReLive or NVIDIA ShadowPlay. For example, OBS has an option to switch the encoder from CPU to GPU (albeit named differently).
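For reference, the same CPU-vs-GPU encoder choice shows up in ffmpeg's encoder names; this is a hedged sketch (which encoders are actually available depends on your ffmpeg build and GPU, and the file names are placeholders):

```python
# The encoder name is the switch between software and hardware encoding:
#   libx264    -> software (CPU) encoder
#   h264_nvenc -> NVIDIA's hardware encoder (NVENC)
#   h264_amf   -> AMD's hardware encoder (AMF/VCE)

def encode_cmd(src, dst, encoder="libx264"):
    """Build an ffmpeg command that re-encodes src to dst with the
    chosen H.264 encoder."""
    return ["ffmpeg", "-i", src, "-c:v", encoder, dst]

print(" ".join(encode_cmd("gameplay.mkv", "out.mp4", encoder="h264_nvenc")))
```

Swapping one encoder name for another is all it takes to move the encoding load off the CPU, which is essentially what OBS's encoder dropdown does for you.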
Streaming via hardware encoding has lower quality than CPU encoding. GPU encoders have made great gains over the years that bring them closer to CPU quality, but they still lose the quality war. They are, however, completely sufficient for casual day-to-day use. NVIDIA's NVENC is notably better, closer to pure CPU encoding, than AMD's AMF/VCE.
FreeSync is cheaper but less consistent. If you do proper research, you can save money over G-SYNC; consequently, you need an AMD GPU.
The GTX 9-series and 10-series are generally better for 4K TVs than other cards.
If you want to watch videos comfortably within the next couple of years, get a GTX 9-series, 10-series, or RX 4xx/5xx card.
You don't need to buy a Core i7 or Ryzen 7 for streaming; use your GPU.
You can play your PC games on phones, tablets, etc. only if you have an NVIDIA GPU.