That stream can include, for example, a video track (produced by either a hardware or virtual video source such as a camera, video recording device, screen sharing service, and so forth), an audio track (similarly, produced by a physical or virtual audio source like a microphone, A/D converter, or the like), and possibly other track types.
However, the page contains only references and examples showing how to use the camera. The API also supports window and screen sharing, but that is not mentioned anywhere on the page.
I found the page https://mozilla.github.io/webrtc-landing/gum_test.html, which shows that Firefox does have this support (it is very simple to try). That demo covers everything, yet examples of changing the media source are missing on MDN.
I was trying to understand why the demo has this support while MDN has no examples for it; maybe someone forgot to update the documentation?
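For reference, here is a minimal sketch of the kind of example I mean. It assumes the non-standard, Firefox-only `mediaSource` constraint that the gum_test demo appears to use; in current browsers the standard way to capture a window or screen is `navigator.mediaDevices.getDisplayMedia()` instead. The helper and function names below are my own, chosen for illustration:

```javascript
// Build a constraints object for the requested source.
// "camera" uses the default video input; "screen" / "window"
// use the legacy Firefox-only mediaSource constraint
// (as in the gum_test demo).
function buildConstraints(source) {
  return source === "camera"
    ? { video: true, audio: false }
    : { video: { mediaSource: source }, audio: false };
}

// Browser-only: attach the captured stream to a <video> element.
// (Hypothetical wrapper for illustration; not runnable outside a browser.)
async function startCapture(source) {
  // Modern browsers: prefer getDisplayMedia() for screen/window capture.
  const stream =
    source === "camera"
      ? await navigator.mediaDevices.getUserMedia(buildConstraints(source))
      : await navigator.mediaDevices.getDisplayMedia({ video: true });
  document.querySelector("video").srcObject = stream;
}
```

Something this short on the MDN page would already make the window/screen options discoverable.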