
I am doing video rendering these days and one thing I am totally confused about is if someone uses a cheap laptop for rendering videos.

  1. Does video rendered on a high-end i7 laptop look better than video rendered on a dual-core laptop? (Does the Intel HD graphics, which is used in both, matter?)

  2. Does video rendering degrade the processor's performance over time (running at 100% for minutes at a stretch)?

Run5k

5 Answers


Does an i7 render better picture quality than a dual core? (Does the Intel HD graphics, which is used in both, matter?)

No, it doesn't. They both render at whatever quality you tell them to. However, rendering is a computationally heavy task, so rendering on an i7 will be a lot faster than on a low-end dual-core processor.
And no, the integrated graphics processor (Intel HD Graphics in this case) will not matter, since rendering uses only the CPU. However, some rendering applications might use your IGP (Intel HD) or GPU (your discrete graphics card, if one is present) to render an image, which will lead to a completely different result. Most consumer-grade CPUs do better at rendering than regular GPUs, and far better than IGPs, both in quality (because of better computational algorithms) and in speed; however, this does not apply to this case. So keep this in mind, as it varies from application to application. (Credit to @CliffArmstrong for the suggestion.)


Does the processor degrade after a short amount of time because I use it to render videos? (rendering uses 100% CPU for minutes)

No, processors do not degrade. They are manufactured so that you don't have to replace them regularly. Check this answer for more detailed information.


If the application which is currently rendering makes use of multithreading, then newer processors which also have a higher core count would be able to perform the same task a lot faster.
For example, let us say we have a newer 8-core i7 processor and an older regular dual-core processor, and that each core runs 2 threads. That gives us a 16-thread processor and a 4-thread processor. Theoretically, if the application made use of all the cores and we specify the output to be 1080p (Full HD), the i7 would render the image about 4 times faster than the dual core (assuming all cores run at the same frequency in both processors). The output would still be 1080p either way, so they render the same quality image, just in different amounts of time.
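As a rough illustration only (a minimal Python sketch with a made-up per-frame function, not how any real renderer works), splitting frames across worker processes changes how long the job takes, not what the frames look like:

```python
from multiprocessing import Pool, cpu_count

def render_frame(frame_index: int) -> int:
    # Stand-in for the real per-frame work (filters, scaling, colour grading).
    # The result depends only on the input, never on which CPU runs it.
    return sum(i * i for i in range(10_000)) + frame_index

if __name__ == "__main__":
    frames = range(1_000)
    # More cores means more worker processes and a shorter wall-clock time,
    # but every frame comes out identical on a dual core and an 8-core i7.
    with Pool(processes=cpu_count()) as pool:
        rendered = pool.map(render_frame, frames)
    print(f"rendered {len(rendered)} frames using {cpu_count()} logical CPUs")
```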

And while processors are assigned such heavy tasks, they produce a lot more heat, which is where the real danger lies. Proper cooling is a must when performing such tasks, as @Tetsujin mentioned in his answer, or else your CPU could begin to throttle itself down in order to reduce heat.

Fanatique

So long as the machine can keep itself cool enough, the only difference will be the time taken.

When rendering video, even on a 12-core Xeon, I intentionally ramp the fans up to maximum. Even though the machine is perfectly capable of keeping itself cool, it considers "cool enough" to be 1°C under PROCHOT, which is Intel's specified maximum temperature for the processor (98°C for this particular processor; you'd have to check Intel's figures for your own).

I just like to give it a bit more headroom, but maybe that's just me being a little paranoid.

On the other hand, if it can't keep itself under PROCHOT, it will eventually cause short-term crashes/BSODs or even long-term damage.

Cooling is paramount when doing intensive tasks.
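If you want to keep an eye on things yourself, a rough Python sketch like the one below will print CPU load and the hottest reported sensor while you render. It assumes the third-party psutil package is installed, and temperature sensors are only exposed on some platforms (e.g. Linux):

```python
import time
import psutil  # third-party package: pip install psutil

while True:
    load = psutil.cpu_percent(interval=1)  # average CPU load over one second
    temps = psutil.sensors_temperatures() if hasattr(psutil, "sensors_temperatures") else {}
    readings = [entry.current for entries in temps.values() for entry in entries]
    hottest = max(readings) if readings else float("nan")
    print(f"CPU load {load:5.1f}%   hottest sensor {hottest:5.1f} °C")
    time.sleep(4)
```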

Tetsujin

When running the exact same software encoder (program) with exactly the same options and configuration on two different processors, you will get the exact same result. The only difference will be the time taken to do the encoding.

Using the exact same program with the exact same configuration and the exact same input should give the same output quality whether it runs on a Xeon, an i7, an i3 or even a Celeron processor.

If you use the built-in hardware video encoders or decoders, then you may get different results, as they might be set up or optimised differently between processor generations, and newer hardware may support newer features. In the same way that a 5-year-old copy of ffmpeg might be slower or yield slightly different results for a given configuration than a newer version, the different hardware video encoders can be thought of as different versions of the "software", albeit versions that cannot be upgraded without replacing the hardware.
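As a rough sketch of that difference (the file names and quality settings below are placeholders, and it assumes an ffmpeg build on PATH with libx264 and Intel Quick Sync/QSV support): the software encode depends only on the ffmpeg version and its settings, so any CPU produces the same bits, while the hardware encode goes through the encoder block baked into your particular processor generation:

```python
import subprocess

# Software encode: output depends only on the ffmpeg build and these settings,
# so a Celeron and an i7 produce the same file (the i7 just finishes sooner).
software = [
    "ffmpeg", "-y", "-i", "input.mp4",
    "-c:v", "libx264", "-preset", "medium", "-crf", "20",
    "output_software.mp4",
]

# Hardware encode via Intel Quick Sync: output depends on the fixed-function
# encoder built into that particular CPU generation, so results can differ
# between chips even with identical settings.
hardware = [
    "ffmpeg", "-y", "-i", "input.mp4",
    "-c:v", "h264_qsv", "-global_quality", "20",
    "output_hardware.mp4",
]

subprocess.run(software, check=True)
subprocess.run(hardware, check=True)
```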

The processor itself will not likely degrade, but as the processor runs hotter, the fans run harder, the power supply works harder, and overall the system works harder and hotter than it otherwise would if you weren't doing the encoding. In theory this extra work puts extra strain on your system, but in practice your system should be designed well enough that the difference in working lifetime between using it this way and not using it at all is as near to nothing as makes no difference.

If you have a power supply or cooling system that is not designed or specified well enough to match the load of your system, then you might cause a failure sooner than would otherwise happen.

Running demanding tasks on an underpowered PSU may cause it to overheat and burn out components within the PSU, or it might "brown out" and cause system instability. Unless you bought a bargain-basement pre-built machine or built it yourself with the smallest supply you could find, this should not be the case.

Mokubai

On a laptop, generally no. However, many laptops are not built to last. The CPU may not degrade, but something else will. It is abusive to use a laptop in this way, even a "gaming" laptop.

Running at higher voltages and above rated clock speeds will shorten the life of many workstation and desktop components. This applies to graphics cards too. For a CPU this is not necessarily gradual degradation, but outright failure.

A GPU can leave the factory with latent issues, and working it hard can reveal those faults. This is why we have ECC RAM in graphics cards now. I'm not going to mention any brands, but there is a reason there is a warranty.

mckenzm

This will depend on how much of the computation happens on the CPU and how much happens on the GPU.

In general, CPUs will do more of the serial work where a lot of branching happens, and GPUs will do more of the work that performs the same operation on a large amount of data (e.g. on every pixel).

Also, the number of cores only helps if the rendering makes use of multiple cores. A lot of applications don't fully utilize all cores, so an 8-core processor (or a quad-core with hyperthreading) will almost never give an 8-fold increase in speed.

An application that's not optimized for multithreading at all will not see any speedup.
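You can put rough numbers on that with Amdahl's law; the 90% parallel fraction in this small sketch is just an illustrative assumption, not a measured figure for any particular renderer:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    # Amdahl's law: the serial part of the job never gets faster,
    # no matter how many cores the parallel part is spread across.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (1, 2, 4, 8):
    print(f"{cores} cores -> {amdahl_speedup(0.9, cores):.2f}x speedup")
# 8 cores gives roughly 4.7x, not 8x; with a 0.0 parallel fraction it stays at 1x.
```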

To answer your questions:

  1. No, it will look exactly the same, as the exact same operations are performed.

  2. Considering what I wrote above, it depends on whether the temperature of your CPU rises above a certain threshold, which will cause the CPU to tune itself down so as not to increase the temperature any further. So if the CPU is doing a lot of work, it will slow down after some time at full load, especially in laptops (small case, poor cooling). If by degrading you mean long-term degradation, then refer to the answers above (tl;dr: they don't degrade by much).