These are very good questions, and all of us streaming video developers run into the same issues and share the same frustration when it comes to plugin-free, near-real-time streaming video in browsers.
Let me address your questions to the best of my knowledge (I have implemented both WebRTC and Media Source Extensions in recent years for streaming server software).
- " if it is possible to create a MediaStream and append buffers to it, like the MediaSource"
This one is easy: it is NOT possible. The MediaStream API:
https://developer.mozilla.org/en-US/docs/Web/API/MediaStream
does not expose access to a MediaStream object's frame buffer; it handles everything internally through WebRTC, getting frames either from getUserMedia (a local webcam) or from an RTCPeerConnection (the network). With a MediaStream object you never manipulate frames or segments directly.
And, of course, video.srcObject = mediaSource will not work: video.srcObject expects a MediaStream object created by the WebRTC API, nothing else.
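To make the difference concrete, here is a minimal sketch of the two attachment paths (the MIME string and the segment URL are placeholders I made up, not from your setup):

```typescript
// Media Source Extensions path: you create the source and push segments into it yourself.
function attachViaMediaSource(video: HTMLVideoElement): void {
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);      // note: video.src, not video.srcObject
  mediaSource.addEventListener('sourceopen', async () => {
    const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
    const seg = await fetch('/segments/init.mp4');   // hypothetical segment URL
    sb.appendBuffer(await seg.arrayBuffer());        // you decide what goes into the buffer
  });
}

// MediaStream path: the browser owns the frame buffer; you only hand the object over.
async function attachViaMediaStream(video: HTMLVideoElement): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  video.srcObject = stream;                          // MediaStream only; there is no appendBuffer()
}
```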
- "I could not find in the documentation if browsers handle src and srcObject differently"
Hell yes, browsers do treat video.src and video.srcObject very differently, there is no documentation about it, and much of it doesn't make technical sense; politics plays a large role.
Notorious examples from the Chrome browser:
a. Media Source Extensions (video.src) supports AAC audio, but WebRTC (video.srcObject) does not, and likely never will. The reason is that Google backed Opus, a royalty-free codec that made it into the WebRTC specs as mandatory to implement, and Google is pushing Opus to be the new "royalty-free" audio king; so there is no AAC support in video.srcObject, and the whole hardware world now has to implement Opus.
So Google clearly can ship AAC support in Chrome, and legally does so for Media Source Extensions (video.src), but it will never add AAC support to WebRTC.
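You can check both claims from the browser console; a small sketch, assuming Chrome and the standard AAC-LC codec string:

```typescript
// MSE path (video.src): is AAC-LC accepted?
const aacInMse = MediaSource.isTypeSupported('audio/mp4; codecs="mp4a.40.2"');  // true in Chrome

// WebRTC path (video.srcObject): which audio codecs can the receiver decode?
const audioCaps = RTCRtpReceiver.getCapabilities('audio');
const rtcAudioCodecs = audioCaps ? audioCaps.codecs.map(c => c.mimeType) : [];

console.log({ aacInMse, rtcAudioCodecs });  // Opus, G.711 and friends show up; AAC does not
```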
b. Chrome uses different strategies for H264 video decoders in video.src and video.srcObject.
This makes no sense, but it's a fact. For example, on Android only devices with hardware H264 decoding support H264 in WebRTC (video.srcObject). Older devices without a hardware H264 decoder will not play H264 video over WebRTC, yet the same devices will play the same H264 video via Media Source Extensions (video.src). So video.src evidently falls back to a software decoder when no hardware is available; why the same cannot be done for WebRTC is anyone's guess.
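You can probe the H264 situation the same way instead of guessing per device (avc1.42E01E is just the usual Baseline profile string):

```typescript
// H264 via Media Source Extensions (video.src)
const h264InMse = MediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E"');

// H264 via WebRTC (video.srcObject): absent on devices without a hardware decoder
const videoCaps = RTCRtpReceiver.getCapabilities('video');
const h264InWebRtc = !!videoCaps &&
  videoCaps.codecs.some(c => c.mimeType.toLowerCase() === 'video/h264');

console.log({ h264InMse, h264InWebRtc });  // can disagree on older Android hardware
```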
Lastly, your VP8 stream will not play on iOS at all: not via Media Source Extensions (iOS doesn't support MSE in the first place, ha ha ha), and not via WebRTC (iOS only supports H264 video for WebRTC, ha ha ha ha). You are asking why Apple does that? Ha ha ha ha ha.
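The practical takeaway: detect support at runtime and fall back, rather than hard-coding per-platform assumptions (the "transport"/"codec" labels below are just my own naming for the sketch, not any standard API):

```typescript
// Pick a codec/transport combination the current browser can actually play.
function pickPlayback(): { transport: 'mse' | 'webrtc'; codec: 'vp8' | 'h264' } | null {
  const hasMse = typeof MediaSource !== 'undefined';
  const rtcCaps = typeof RTCRtpReceiver !== 'undefined'
    ? RTCRtpReceiver.getCapabilities('video')
    : null;
  const rtcCodecs = rtcCaps ? rtcCaps.codecs.map(c => c.mimeType.toLowerCase()) : [];

  if (hasMse && MediaSource.isTypeSupported('video/webm; codecs="vp8"')) {
    return { transport: 'mse', codec: 'vp8' };       // e.g. desktop Chrome / Firefox
  }
  if (rtcCodecs.includes('video/vp8'))  return { transport: 'webrtc', codec: 'vp8' };
  if (rtcCodecs.includes('video/h264')) return { transport: 'webrtc', codec: 'h264' };  // e.g. iOS Safari
  if (hasMse && MediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E"')) {
    return { transport: 'mse', codec: 'h264' };
  }
  return null;  // nothing usable
}
```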