It seems that 4K is now everywhere: on our phones, on our TVs. We're already talking about 8K, and even 16K. All this is great, but higher bitmap resolutions bring a growing problem.

Although I'm not an expert in video, cinema or film, there is a looming problem with computer processing and data networking as demands rise toward 8K and 16K. I have a fairly new computer. It ain't a beast, but it boasts a recent Intel i7 with 24 GB of RAM, and I'm already having difficulty handling 4K. Computer processing keeps progressing, but I ask myself what kind of CPU monsters we will need in a few years to manipulate the eventual replacement of 4K, whether that is 8K or whatever else some experts are talking about on the Web.
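To put those demands in perspective, here is a rough back-of-the-envelope sketch (my own numbers and assumptions, not from any article) of the uncompressed data rates at each resolution:

```python
# Rough uncompressed data rates for UHD resolutions and beyond.
# Assumptions (mine, for illustration only): 8 bits per channel,
# 3 channels (RGB, no chroma subsampling), 60 frames per second.

RESOLUTIONS = {
    "4K":  (3840, 2160),
    "8K":  (7680, 4320),
    "16K": (15360, 8640),
}

BITS_PER_PIXEL = 8 * 3  # 8-bit RGB
FPS = 60

for name, (width, height) in RESOLUTIONS.items():
    bits_per_second = width * height * BITS_PER_PIXEL * FPS
    print(f"{name:>3}: {width}x{height} -> "
          f"{bits_per_second / 1e9:,.1f} Gbit/s uncompressed")
```

Each step quadruples the pixel count, so 16K carries sixteen times the raw data of 4K, which is exactly the kind of growth that makes bitmap video so punishing on CPUs and networks.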

Not long ago I stumbled across an article that opened up a different perspective on the replacement of 4K. No, it ain't 8K. It is vector video. There is already a codec in development at the University of Bath. Here is a roughly two-year-old preview of what their codec looked like then.

[youtube https://www.youtube.com/watch?v=4_f1ukhy2ZE]

Work on the subject has presumably continued, but the last time I looked I couldn't find much. Maybe some corporation took over the research.

To be followed…

(via redsharknews.com)