In the late 80s our diesel maintenance supervisor at work gave me a printout of cold pour tests. There was quite a range, from a conventional straight 30 or 40 at the thick end down to a synthetic 5W-30 at the thin end.
It's been a long time now, and I can't recall the paper perfectly, but the tests were done at several cold temperatures, and with the oil cold-soaked for different lengths of time.
The differences were, predictably, more pronounced as the test temperatures got colder, but what surprised me was that cold-soak time was a significant factor in flow rate. I think they tested at 16 hrs and 48 hrs, and there was a very measurable difference. That surprised me, since I wouldn't have expected the oil to get any colder after sitting in the sump for 16 hours after the engine was last run.
So my question (and maybe it was in the video but I missed it) is: how long were the oils sitting outside at -20 C before the start of the test? If the tester kept them at room temperature and only took them outside immediately before the test, then the test wasn't really evaluating the impact of cold weather on flow rate. That might explain why, in the 2nd test, the conventional oil flowed as well as the synthetic.
I know oils are better now, but I also remember buying oil (QS 10W-30) by the case of 24 and carrying the whole lot around in the trunk of my thirsty, oil-guzzling '68 Impala. I had to add a couple of quarts at -30 C, and the oil (which had been sitting in the trunk for weeks) was almost impossible to pour out of the can. It was thick like molasses. I know it's 5W-30 vs. my 10W-30, and -30 vs. -20, but still, that conventional oil looked to be pouring extremely well for winter temperatures.