Probably not; most film stock doesn't have that resolution, maybe some 70mm does. It's roughly 2K at most, depending on film grain and lens optics. 70mm can theoretically reach 4K (Dunkirk, shot on 65mm, is an example).
Yes, this can bring some benefits. Imagine: with standard 1080p 4:2:0 the picture is only 8-bit and the chroma resolution is just 960×540. Instead of 1080p or 2K with 4:4:4 or 4:2:2 chroma sampling, it's possible to deliver 4K 4:2:0 at 10-bit.
In full-chroma (4:4:4) mode you get 1920×1080 or 2048×1080 resolution in color, which is the same chroma resolution that 4K 4:2:0 gives you.
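To make the numbers concrete, here's a minimal Python sketch (the helper function and format list are just my own illustration, not anything from a spec) that prints the chroma-plane size for each case:

```python
def chroma_plane_size(width, height, subsampling):
    """Return (chroma_width, chroma_height) for a given Y'CbCr subsampling."""
    if subsampling == "4:4:4":   # full chroma resolution
        return width, height
    if subsampling == "4:2:2":   # chroma halved horizontally
        return width // 2, height
    if subsampling == "4:2:0":   # chroma halved horizontally and vertically
        return width // 2, height // 2
    raise ValueError(f"unknown subsampling: {subsampling}")

formats = [
    ("1080p 4:2:0", 1920, 1080, "4:2:0"),
    ("1080p 4:4:4", 1920, 1080, "4:4:4"),
    ("2K DCI 4:4:4", 2048, 1080, "4:4:4"),
    ("4K UHD 4:2:0", 3840, 2160, "4:2:0"),
]

for name, w, h, sub in formats:
    cw, ch = chroma_plane_size(w, h, sub)
    print(f"{name:14s} luma {w}x{h}  chroma {cw}x{ch}")
```

Running it shows 1080p 4:2:0 carrying only 960×540 of color, while both 1080p 4:4:4 and 4K 4:2:0 carry 1920×1080 of color, which is the point above.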
Anyway, it's possible to upscale with a neural-network scaler, but that cannot add genuinely new detail to the picture. It can, however, reconstruct edges and lines in the luma plane and similar structures.
Often probably not. Productions do shoot on RED and ARRI cameras at 4K, 6K and 8K, but in post-production the target format is usually 2K DCI.
Almost 99% of cinemas are 2K, so that shit quality is the ceiling for them. Very few films are finished in native 4K.
Many of them are heavily affected by luma and chroma noise, which makes things worse.
Producing a 2K DCI from an 8K source is crazy when you consider the total cost of making a movie. A hybrid 4K / 2K pipeline, with the 2K effects shots upscaled by a neural network, gives a better cost ratio.
It's possible to reconstruct an HDR picture from RAW or high-bit-depth video material, but it's a complicated process. HDR10 is a static system, with one set of metadata for the whole video.
HDR10+ and Dolby Vision are dynamic, carrying metadata per frame (or per scene).
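Roughly, the difference looks like this; a minimal sketch with simplified, made-up field names (the real HDR10 / HDR10+ / Dolby Vision metadata is far richer):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StaticHdrMetadata:          # HDR10-style: one record for the whole title
    max_cll: int                  # Maximum Content Light Level (nits)
    max_fall: int                 # Maximum Frame-Average Light Level (nits)
    mastering_peak: int           # mastering display peak luminance (nits)

@dataclass
class DynamicHdrMetadata:         # HDR10+ / Dolby Vision-style: per frame or scene
    frame_peaks: List[int]        # peak luminance measured for each frame
    frame_averages: List[float]   # average luminance for each frame

# HDR10: the display tone-maps the entire film with these single values.
hdr10 = StaticHdrMetadata(max_cll=1000, max_fall=400, mastering_peak=1000)

# Dynamic HDR: the display can re-tone-map every frame (or scene) separately,
# so a dark scene isn't compressed the same way as a 1000-nit highlight scene.
dynamic = DynamicHdrMetadata(
    frame_peaks=[120, 250, 980, 90],
    frame_averages=[40.0, 60.0, 310.0, 25.0],
)
print(hdr10, dynamic, sep="\n")
```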
Yes, there are a few films with properly graded HDR, but many are not. They are just a simple conversion from SDR, sometimes with the white level shifted higher.
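What such a "fake" conversion roughly amounts to is shown below; a toy sketch (function names, gamma and the 300-nit white level are my assumptions): linearize the SDR signal, push reference white up, and PQ-encode it. No real highlight detail appears, the picture just gets brighter.

```python
import numpy as np

# SMPTE ST 2084 (PQ) inverse-EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Map absolute luminance in cd/m^2 (0..10000) to a PQ code value in 0..1."""
    y = np.clip(np.asarray(nits, dtype=np.float64) / 10000.0, 0.0, 1.0)
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

def fake_hdr_from_sdr(sdr, white_nits=300.0, gamma=2.4):
    """Naive SDR->HDR10: linearize the SDR signal, push reference white from
    ~100 nits up to `white_nits`, then PQ-encode. Nothing above the original
    SDR white is recovered."""
    linear = np.clip(sdr, 0.0, 1.0) ** gamma      # approximate SDR EOTF
    return pq_encode(linear * white_nits)

sdr_ramp = np.linspace(0.0, 1.0, 5)               # black .. SDR white
print(fake_hdr_from_sdr(sdr_ramp))
```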
High Frame Rate: old films run at 24 fps (23.976 fps on video), but new productions can do 48, 50 or 60 fps. Is it good? Absolutely!
Some modern TVs can insert intermediate frames to boost the frame rate (the process works well on high-resolution pictures without blur and noise). It's good, but not perfect: with blurred pictures or a lot of noise, the intermediate frames cannot be reconstructed without visible picture errors.
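As a toy illustration of what "inserting an intermediate frame" means, here is a naive 50/50 blend of two frames (everything here is my own example; real TVs use motion-compensated interpolation, which is exactly what breaks down on blurry or noisy material):

```python
import numpy as np

def blend_midframe(frame_a, frame_b):
    """Naive intermediate frame: a plain average of two neighbouring frames.
    Real TV interpolation estimates motion vectors and shifts pixels along
    them; a simple blend like this just ghosts moving objects, and bad motion
    estimates (noise, blur) are what cause visible interpolation errors."""
    a = frame_a.astype(np.float32)
    b = frame_b.astype(np.float32)
    return ((a + b) / 2.0).round().astype(np.uint8)

# Two tiny synthetic 8-bit "frames": a bright block moving two pixels right.
frame1 = np.zeros((4, 8), dtype=np.uint8)
frame2 = np.zeros((4, 8), dtype=np.uint8)
frame1[1:3, 1:3] = 255
frame2[1:3, 3:5] = 255

print(blend_midframe(frame1, frame2))   # the block shows up as two half-bright ghosts
```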