What is necessary to play 8K60 footage?
How does playback happen anyway?
That depends on the codec used to encode that 8K footage.
The codecs commonly used for 8K right now are VP9 and H.265 (HEVC).
All of the YouTube videos you have linked report the following codec string:
vp09.00.51.08.01.01.01.01 (272) / opus (251)
Google is a major contributor to developing the VP9 codec, and thus YouTube prefers to stream this encoding for 8K videos. (It saves them a lot of bandwidth over H.265)
That vp09.00 prefix indicates VP9 Profile 0, which is 8-bit with 4:2:0 chroma subsampling; the next two fields, 51 and 08, are the level (5.1) and bit depth (8-bit).
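To make the field layout concrete, here is a minimal sketch that splits a "vp09.PP.LL.DD..." codec string into its first three fields (profile, level, bit depth). The helper name is mine; only the string itself comes from the YouTube stats above.

```python
def parse_vp9_codec_string(codec: str):
    """Split a 'vp09.PP.LL.DD...' codec string into (profile, level, bit_depth)."""
    fields = codec.split(".")
    profile = int(fields[1])                  # "00" -> Profile 0
    level = f"{fields[2][0]}.{fields[2][1]}"  # "51" -> level 5.1
    bit_depth = int(fields[3])                # "08" -> 8 bits per sample
    return profile, level, bit_depth

print(parse_vp9_codec_string("vp09.00.51.08.01.01.01.01"))  # (0, '5.1', 8)
```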
To play 8K videos smoothly, you're either going to need a very fast CPU for software decoding, or a GPU with support for hardware accelerated decoding of one of those codecs.
For Software Decoding:
Libraries like FFmpeg or libvpx can decode video regardless of your hardware's features, so a sufficiently fast CPU is one way to play 8K video. 8K streams have very high bitrates, though (up to roughly 240 Mbps), so even an i9-9900K may not be fast enough to decode that much data in real time without dropping frames.
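The compressed bitrate is only part of the story; what makes software decoding hard is the volume of raw pixels the decoder must produce each second. A quick back-of-the-envelope calculation for 8K60 in 8-bit 4:2:0:

```python
# Raw (decoded) data rate of 8K60 video in 8-bit 4:2:0.
# This is what the decoder must output every second, regardless
# of how well the stream was compressed.
width, height, fps = 7680, 4320, 60
bytes_per_pixel = 1.5  # 4:2:0 at 8-bit: 1 luma byte + 0.5 chroma bytes per pixel
raw_rate = width * height * bytes_per_pixel * fps
print(f"{raw_rate / 1e9:.2f} GB/s")  # 2.99 GB/s of decoded pixels
```

Nearly 3 GB/s of output is a heavy load for a general-purpose CPU, which is why fixed-function hardware decoders matter so much at this resolution.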
For Hardware Decoding:
You will need a GPU that supports 8K resolution and has a fast decoder for the codec used by the video.
According to WikiChip, the Intel UHD 630 Graphics in your i9-9900K can only decode any of those codecs at a maximum of 4K resolution. It won't be useful for 8K videos, but it's definitely an ideal hardware accelerator for 4K content using the latest codecs.
Your GTX 1070 can decode at resolutions up to 8K, but its codec support at that resolution is limited.
According to NVIDIA's Video Encode and Decode GPU Support Matrix, your GTX 1070 can only decode VP9 in 8-bit, or H.265 with 4:2:0 chroma subsampling.
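As a sketch, the support-matrix check amounts to a simple lookup. The table entries below are illustrative, taken from the claims above rather than queried from the driver, and the function name is mine:

```python
# Illustrative sketch of the GTX 1070's decode capabilities
# (values based on NVIDIA's support matrix as summarized above).
GTX_1070_DECODE = {
    # codec: (max dimension, supported bit depths, chroma formats)
    "hevc": (8192, {8, 10, 12}, {"4:2:0"}),
    "vp9":  (8192, {8}, {"4:2:0"}),
}

def can_hw_decode(codec, width, height, bit_depth, chroma="4:2:0"):
    """Return True if this (assumed) decode matrix covers the given stream."""
    if codec not in GTX_1070_DECODE:
        return False
    max_dim, depths, chromas = GTX_1070_DECODE[codec]
    return max(width, height) <= max_dim and bit_depth in depths and chroma in chromas

print(can_hw_decode("vp9", 7680, 4320, 8))   # True: VP9 Profile 0 at 8K
print(can_hw_decode("vp9", 7680, 4320, 10))  # False: no 10-bit VP9 on this GPU
```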
This explains why your chrome://gpu/ page displays support for "Decode vp9 profile0 up to 8192x8192 pixels".
Since all of the YouTube links you tested are (probably) VP9 Profile 0 videos, the GPU is being utilized by Chrome to play those videos.
What is the bottleneck in the system?
That likely comes down to the sheer magnitude of 8K video. The dedicated video decode block in the GPU is probably saturated: it's a separate fixed-function unit, so overall CPU and GPU usage can sit well below 100% while the decoder itself is at its limit. I'm also not sure that NVIDIA has ever said 8K@60 Hz would be optimal on the GTX 1070, only that it is supported.
See update below...
Why does the 60 FPS footage look smooth despite dropping a ton of frames, while the 24 FPS footage is choppy?
I can't fully explain this one, but it's possible that 3D videos are streamed with the full stereo data and the player renders them as 2D with local processing, which would add CPU overhead.
Update:
I tested that 3D video on my GTX 1070 and only 2 frames were dropped out of 1136 (52 s); playback was also very smooth. My CPU is a Ryzen 5 3600X, so by no means should your i9-9900K be the bottleneck either. Make sure you're running the latest video drivers and the latest version of Chrome. Your Windows 10 version could affect this too (I'm on 1809 Pro).