I ran some tests with MC 7.0 and a BMD UltraStudio SDI on a notebook and realized that I got different CPU loads while capturing from different HD-SDI sources. Source 1 was the output of an Adrenaline system with HD boards installed. Source 2 was a Sony video mixer (DFS-800).
What's the difference? 8 vs. 10 bit?
All other settings were the same: 1080i25. I ran the test with different codecs too (185 / 185X / 120) - same result.
regards - M.
visit me @ https://www.facebook.com/Kreationist
Moses.M: What's the difference? 8 vs. 10 bit?
8-bit means Y/Cb/Cr is coded with 8 bits per code word, which gives 256 different luma/chroma values.
10-bit means Y/Cb/Cr is coded with 10 bits per code word, which gives 1024 different luma/chroma values.
The CPU load with 10-bit material is probably higher than with 8-bit.
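To put rough numbers on that, here is a quick sketch (my own back-of-the-envelope figures, not from Avid or Blackmagic; it assumes plain 4:2:2 sampling at 1920x1080, 25 frames/s) comparing the raw payload the CPU has to move and repack at each bit depth:

```python
# Rough comparison of 8-bit vs. 10-bit 4:2:2 capture at 1080i25.
# Assumption: 4:2:2 sampling = 2 samples per pixel (1 luma + shared chroma).

def capture_stats(bits_per_sample: int, width: int = 1920, height: int = 1080,
                  fps: float = 25.0) -> dict:
    levels = 2 ** bits_per_sample                 # distinct code values per sample
    samples_per_frame = width * height * 2        # 4:2:2 -> 2 samples per pixel
    bits_per_frame = samples_per_frame * bits_per_sample
    mbit_per_s = bits_per_frame * fps / 1e6       # uncompressed payload rate
    return {"levels": levels, "uncompressed_Mbit_s": round(mbit_per_s, 1)}

for depth in (8, 10):
    print(f"{depth}-bit:", capture_stats(depth))
# 8-bit : 256 levels,  ~829.4 Mbit/s uncompressed
# 10-bit: 1024 levels, ~1036.8 Mbit/s uncompressed (about 25% more data)
```

On top of the roughly 25% larger payload, 10-bit samples don't fall on byte boundaries, so packing and unpacking them typically costs some additional CPU as well.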
Joachim
Joachim Claus
I know what 8 and 10 bit mean - I want to know whether that is what causes the higher CPU load...