{"id":1266,"date":"2024-08-12T22:20:46","date_gmt":"2024-08-12T14:20:46","guid":{"rendered":"https:\/\/www.fanyamin.com\/wordpress\/?p=1266"},"modified":"2024-08-12T22:50:43","modified_gmt":"2024-08-12T14:50:43","slug":"gstreamer-new-plugin-webrtcsink-and-webrtcsrc","status":"publish","type":"post","link":"https:\/\/www.fanyamin.com\/wordpress\/?p=1266","title":{"rendered":"gstreamer new plugin: webrtcsink and webrtcsrc"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/www.fanyamin.com\/wordpress\/wp-content\/uploads\/2024\/08\/image-1723474237549.png\" alt=\"file\" \/><br \/>\n<a href=\"https:\/\/gitlab.freedesktop.org\/gstreamer\/gst-plugins-rs\/-\/tree\/main\/net\/webrtc\">https:\/\/gitlab.freedesktop.org\/gstreamer\/gst-plugins-rs\/-\/tree\/main\/net\/webrtc<\/a><\/p>\n<h1>webrtcsink and webrtcsrc<\/h1>\n<p>All-batteries included GStreamer WebRTC producer and consumer that try their<br \/>\nbest to do The Right Thing\u2122.<\/p>\n<p>It also provides a flexible and all-purpose WebRTC signalling server<br \/>\n(<a href=\"signalling\/src\/bin\/server.rs\">gst-webrtc-signalling-server<\/a>) and a JavaScript<br \/>\nAPI (<a href=\"gstwebrtc-api\">gstwebrtc-api<\/a>) to produce and consume compatible WebRTC<br \/>\nstreams from a web browser.<\/p>\n<h2>Use case<\/h2>\n<p>The <a href=\"https:\/\/gstreamer.freedesktop.org\/documentation\/webrtc\/index.html\">webrtcbin<\/a> element in GStreamer is extremely flexible and powerful, but<br \/>\nusing it can be a difficult exercise. 
When all you want to do is serve a fixed<br \/>\nset of streams to any number of consumers, <code>webrtcsink<\/code> (which wraps<br \/>\n<code>webrtcbin<\/code> internally) can be a useful alternative.<\/p>\n<h2>Features<\/h2>\n<p><code>webrtcsink<\/code> implements the following features:<\/p>\n<ul>\n<li>\n<p>Built-in signaller: when using the default signalling server, this element<br \/>\nwill perform signalling without requiring application interaction.<br \/>\nThis makes it usable directly from <code>gst-launch<\/code>.<\/p>\n<\/li>\n<li>\n<p>Application-provided signalling: <code>webrtcsink<\/code> can be instantiated by an<br \/>\napplication with a custom signaller. That signaller must be a GObject, and<br \/>\nmust implement the <code>Signallable<\/code> interface as defined<br \/>\n<a href=\"src\/signaller\/mod.rs\">here<\/a>. The <a href=\"src\/signaller\/imp.rs\">default signaller<\/a><br \/>\ncan be used as an example.<\/p>\n<p>An <a href=\"examples\/webrtcsink-custom-signaller\/README.md\">example<\/a> is also available to use as a boilerplate for<br \/>\nimplementing and using a custom signaller.<\/p>\n<\/li>\n<li>\n<p>Sandboxed consumers: when a consumer is added, its encoder \/ payloader \/<br \/>\nwebrtcbin elements run in a separately managed pipeline. This provides a<br \/>\ncertain level of sandboxing, as opposed to having those elements running<br \/>\ninside the element itself.<\/p>\n<p>It is important to note that at this moment, encoding is not shared between<br \/>\nconsumers. 
While this is not on the roadmap at the moment, nothing in the<br \/>\ndesign prevents implementing this optimization.<\/p>\n<\/li>\n<li>\n<p>Congestion control: the element leverages transport-wide congestion control<br \/>\nfeedback messages in order to adapt the bitrate of individual consumers' video<br \/>\nencoders to the available bandwidth.<\/p>\n<\/li>\n<li>\n<p>Configuration: the level of user control over the element is slowly expanding;<br \/>\nconsult <code>gst-inspect-1.0<\/code> for more information on the available properties and<br \/>\nsignals.<\/p>\n<\/li>\n<li>\n<p>Packet loss mitigation: webrtcsink now supports sending protection packets for<br \/>\nForward Error Correction, modulating the amount as a function of the available<br \/>\nbandwidth, and can honor retransmission requests. Both features can be<br \/>\ndisabled via properties.<\/p>\n<\/li>\n<\/ul>\n<p>It is important to note that full control over the individual elements used by<br \/>\n<code>webrtcsink<\/code> is <em>not<\/em> on the roadmap, as it will act as a black box in that<br \/>\nrespect; for example, <code>webrtcsink<\/code> reserves control over the bitrate for<br \/>\ncongestion control.<\/p>\n<p>A signal is available, however, for the application to provide the initial<br \/>\nconfiguration for the encoders <code>webrtcsink<\/code> instantiates.<\/p>\n<p>If more granular control is required, applications should use <code>webrtcbin<\/code><br \/>\ndirectly; <code>webrtcsink<\/code> will focus on just doing the right thing, although<br \/>\nit might expose more interfaces to guide and tune the heuristics it employs.<\/p>\n<h2>Building<\/h2>\n<blockquote>\n<p>Make sure to install the development packages for some codec libraries<br \/>\nbeforehand, such as libx264, libvpx and libopusenc; exact names depend<br \/>\non your distribution.<\/p>\n<\/blockquote>\n<pre><code class=\"language-shell\">cargo build<\/code><\/pre>\n<h2>Usage (embedded 
services)<\/h2>\n<p><code>webrtcsink<\/code> can optionally instantiate a signalling server and a web server.<\/p>\n<p>This is the simplest setup for testing, but may not always be desirable.<br \/>\nFor instance, one may prefer to host the services on different machines, or to ensure<br \/>\nthat a crash of the webrtcsink host doesn't take down the signalling server or web site.<\/p>\n<p>Head over to the following section if you want to learn how to run the services individually.<\/p>\n<p>In the terminal, from the root of the <code>net\/webrtc<\/code> crate:<\/p>\n<pre><code>gst-launch-1.0 videotestsrc ! webrtcsink run-signalling-server=true run-web-server=true<\/code><\/pre>\n<p>In your browser of choice, navigate to <a href=\"http:\/\/127.0.0.1:8080\/\">http:\/\/127.0.0.1:8080\/<\/a>, and click on the stream<br \/>\nidentifier under &quot;Remote streams&quot;. You should see a test video stream and hear a test tone.<\/p>\n<h2>Usage (standalone services)<\/h2>\n<p>Open three terminals. In the first one, run the signalling server:<\/p>\n<pre><code class=\"language-shell\">cd signalling\nWEBRTCSINK_SIGNALLING_SERVER_LOG=debug cargo run --bin gst-webrtc-signalling-server<\/code><\/pre>\n<p>In the second one, run a web browser client (it can produce and consume streams):<\/p>\n<pre><code class=\"language-shell\">cd gstwebrtc-api\nnpm install\nnpm start<\/code><\/pre>\n<p>In the third one, run a webrtcsink producer from a GStreamer pipeline:<\/p>\n<pre><code class=\"language-shell\">export GST_PLUGIN_PATH=&lt;path-to-gst-plugins-rs&gt;\/target\/debug:$GST_PLUGIN_PATH\ngst-launch-1.0 webrtcsink name=ws meta=&quot;meta,name=gst-stream&quot; videotestsrc ! ws. audiotestsrc ! 
ws.<\/code><\/pre>\n<p>The stream produced by webrtcsink will appear in the web page opened earlier<br \/>\n(automatically served at <a href=\"https:\/\/localhost:9090\">https:\/\/localhost:9090<\/a>) under the name &quot;gst-stream&quot;;<br \/>\nif you click on it, you should see a test video stream and hear a test tone.<\/p>\n<p>You can also produce WebRTC streams from the web browser and consume them with<br \/>\na GStreamer pipeline. Click on the &quot;Start Capture&quot; button and copy the<br \/>\n&quot;Client ID&quot; value.<\/p>\n<p>Then open a new terminal and run:<\/p>\n<pre><code class=\"language-shell\">export GST_PLUGIN_PATH=&lt;path-to-gst-plugins-rs&gt;\/target\/debug:$GST_PLUGIN_PATH\ngst-launch-1.0 playbin uri=gstwebrtc:\/\/127.0.0.1:8443?peer-id=[Client ID]<\/code><\/pre>\n<p>Replace the &quot;peer-id&quot; value with the previously copied &quot;Client ID&quot; value. You<br \/>\nshould see the playbin element open a window and show you the content<br \/>\nproduced by the web page.<\/p>\n<h2>Configuration<\/h2>\n<p>The webrtcsink element itself can be configured through its properties; see<br \/>\n<code>gst-inspect-1.0 webrtcsink<\/code> for more information. In addition, the<br \/>\ndefault signaller exposes properties of its own, in<br \/>\nparticular for setting the signalling server address. Those properties<br \/>\ncan be accessed through the <code>gst::ChildProxy<\/code> interface, for example<br \/>\nwith gst-launch:<\/p>\n<pre><code class=\"language-shell\">gst-launch-1.0 webrtcsink signaller::uri=&quot;ws:\/\/127.0.0.1:8443&quot; ..<\/code><\/pre>\n<h3>Enable 'navigation', a.k.a. user interactivity with the content<\/h3>\n<p><code>webrtcsink<\/code> implements the <a href=\"https:\/\/gstreamer.freedesktop.org\/documentation\/video\/gstnavigation.html\"><code>GstNavigation<\/code><\/a> interface, which allows interacting<br \/>\nwith the content, for example moving with your mouse, entering keys with the<br \/>\nkeyboard, etc. 
On top of that, a <code>WebRTCDataChannel<\/code>-based protocol has been<br \/>\nimplemented; it can be activated with the <code>enable-data-channel-navigation=true<\/code><br \/>\nproperty, allowing a client to send GstNavigation events using the WebRTC data channel.<\/p>\n<p>The <a href=\"gstwebrtc-api\">gstwebrtc-api<\/a> and <code>webrtcsrc<\/code> implement the protocol as well,<br \/>\nand they can be used as clients to control a remote server.<\/p>\n<p>You can easily test this feature using the <a href=\"https:\/\/gstreamer.freedesktop.org\/documentation\/wpe\/wpesrc.html\"><code>wpesrc<\/code><\/a> element with the following pipeline,<br \/>\nwhich will start a server that allows you to navigate the GStreamer documentation:<\/p>\n<pre><code class=\"language-shell\">gst-launch-1.0 wpesrc location=https:\/\/gstreamer.freedesktop.org\/documentation\/ ! queue ! webrtcsink enable-data-channel-navigation=true meta=&quot;meta,name=web-stream&quot;<\/code><\/pre>\n<p>You can control it from the video playing in your web browser (at<br \/>\n<a href=\"https:\/\/127.0.0.1:9090\">https:\/\/127.0.0.1:9090<\/a> if you followed the previous steps in this README) or<br \/>\nwith the following GStreamer pipeline as a client:<\/p>\n<pre><code class=\"language-shell\">gst-launch-1.0 webrtcsrc signaller::producer-peer-id=&lt;webrtcsink-peer-id&gt; enable-data-channel-navigation=true ! videoconvert ! 
autovideosink<\/code><\/pre>\n<h3>Sending HTTP headers<\/h3>\n<p>During the initial signalling server handshake, you have the option to transmit<br \/>\nHTTP headers, which can be utilized, for instance, for authentication purposes or sticky sessions:<\/p>\n<pre><code class=\"language-shell\">gst-launch-1.0 webrtcsink signaller::uri=&quot;ws:\/\/127.0.0.1:8443&quot; signaller::headers=&quot;headers,foo=bar,cookie=\\&quot;session=1234567890; foo=bar\\&quot;&quot;<\/code><\/pre>\n<h2>Testing congestion control<\/h2>\n<p>For the purpose of testing congestion in a reproducible manner, a<br \/>\n<a href=\"https:\/\/github.com\/tylertreat\/comcast\">simple tool<\/a> has been used; it has only been exercised on Linux, but it is<br \/>\nalso documented as usable on macOS. The client web browser has to be launched<br \/>\non a separate machine on the LAN to test for congestion, although specific<br \/>\nconfigurations may allow running it on the same machine.<\/p>\n<p>The testing procedure was:<\/p>\n<ul>\n<li>\n<p>identify the server machine network interface (e.g. with <code>ifconfig<\/code> on Linux)<\/p>\n<\/li>\n<li>\n<p>identify the client machine IP address (e.g. 
with <code>ifconfig<\/code> on Linux)<\/p>\n<\/li>\n<li>\n<p>start the various services as explained in the Usage section (use<br \/>\n<code>GST_DEBUG=webrtcsink:7<\/code> to get detailed logs about congestion control)<\/p>\n<\/li>\n<li>\n<p>start playback in the client browser<\/p>\n<\/li>\n<li>\n<p>run a <code>comcast<\/code> command on the server machine, for instance:<\/p>\n<pre><code class=\"language-shell\">$HOME\/go\/bin\/comcast --device=$SERVER_INTERFACE --target-bw 3000 --target-addr=$CLIENT_IP --target-port=1:65535 --target-proto=udp<\/code><\/pre>\n<\/li>\n<li>\n<p>observe the bitrate sharply decreasing; playback should slow down briefly,<br \/>\nthen catch back up<\/p>\n<\/li>\n<li>\n<p>remove the bandwidth limitation, and observe the bitrate eventually increasing<br \/>\nback to a maximum:<\/p>\n<pre><code class=\"language-shell\">$HOME\/go\/bin\/comcast --device=$SERVER_INTERFACE --stop<\/code><\/pre>\n<\/li>\n<\/ul>\n<p>For comparison, the congestion control property can be set to &quot;disabled&quot; on<br \/>\nwebrtcsink and the above procedure applied again; the expected result is<br \/>\nfor playback to simply grind to a halt until the bandwidth limitation<br \/>\nis lifted:<\/p>\n<pre><code class=\"language-shell\">gst-launch-1.0 webrtcsink congestion-control=disabled<\/code><\/pre>\n<h2>Monitoring tool<\/h2>\n<p>An example client\/server application for monitoring per-consumer stats<br \/>\ncan be found <a href=\"https:\/\/gitlab.freedesktop.org\/gstreamer\/gst-plugins-rs\/-\/tree\/main\/net\/webrtc\/examples\">here<\/a>.<\/p>\n<h2>License<\/h2>\n<p>All the Rust code in this repository is licensed under the<br \/>\n<a href=\"http:\/\/opensource.org\/licenses\/MPL-2.0\">Mozilla Public License Version 2.0<\/a>.<\/p>\n<p>Code in <a href=\"gstwebrtc-api\">gstwebrtc-api<\/a> is also licensed under the<br \/>\n<a href=\"http:\/\/opensource.org\/licenses\/MPL-2.0\">Mozilla Public License Version 2.0<\/a>.<\/p>\n<h2>Using the AWS KVS 
signaller<\/h2>\n<ul>\n<li>\n<p>Set up AWS Kinesis Video Streams<\/p>\n<\/li>\n<li>\n<p>Create a channel from the AWS console (<a href=\"https:\/\/us-east-1.console.aws.amazon.com\/kinesisvideo\/home?region=us-east-1#\/signalingChannels\/create\">https:\/\/us-east-1.console.aws.amazon.com\/kinesisvideo\/home?region=us-east-1#\/signalingChannels\/create<\/a>)<\/p>\n<\/li>\n<li>\n<p>Start a producer:<\/p>\n<\/li>\n<\/ul>\n<pre><code>AWS_ACCESS_KEY_ID=&quot;XXX&quot; AWS_SECRET_ACCESS_KEY=&quot;XXX&quot; gst-launch-1.0 videotestsrc pattern=ball ! video\/x-raw, width=1280, height=720 ! videoconvert ! textoverlay text=&quot;Hello from GStreamer!&quot; ! videoconvert ! awskvswebrtcsink name=ws signaller::channel-name=&quot;XXX&quot;<\/code><\/pre>\n<ul>\n<li>Connect a viewer @ <a href=\"https:\/\/awslabs.github.io\/amazon-kinesis-video-streams-webrtc-sdk-js\/examples\/index.html\">https:\/\/awslabs.github.io\/amazon-kinesis-video-streams-webrtc-sdk-js\/examples\/index.html<\/a><\/li>\n<\/ul>\n<h2>Using the WHIP Signaller<\/h2>\n<h3>WHIP Client<\/h3>\n<p>The WHIP client signaller uses BaseWebRTCSink.<\/p>\n<p>Testing the WHIP client as the signaller can be done by setting up Janus and<br \/>\n<a href=\"https:\/\/github.com\/meetecho\/simple-whip-server\/\">https:\/\/github.com\/meetecho\/simple-whip-server\/<\/a>.<\/p>\n<ul>\n<li>\n<p>Set up a <a href=\"https:\/\/github.com\/meetecho\/janus-gateway\">Janus<\/a> instance with the videoroom plugin configured<br \/>\nto expose a room with ID 1234 (configuration in <code>janus.plugin.videoroom.jcfg<\/code>)<\/p>\n<\/li>\n<li>\n<p>Open the &lt;janus\/share\/janus\/demos\/videoroomtest.html&gt; web page, click start<br \/>\nand join the room<\/p>\n<\/li>\n<li>\n<p>Set up the <a href=\"https:\/\/github.com\/meetecho\/simple-whip-server\/\">simple whip server<\/a> as explained in its README<\/p>\n<\/li>\n<li>\n<p>Navigate to <a href=\"http:\/\/localhost:7080\/\">http:\/\/localhost:7080\/<\/a>, create an endpoint named room1234<br 
\/>\npointing to the Janus room with ID 1234<\/p>\n<\/li>\n<li>\n<p>Finally, send a stream to the endpoint with:<\/p>\n<\/li>\n<\/ul>\n<pre><code class=\"language-shell\">gst-launch-1.0 -e uridecodebin uri=file:\/\/\/home\/meh\/path\/to\/video\/file ! \\\n  videoconvert ! video\/x-raw ! queue ! \\\n  whipwebrtcsink name=ws signaller::whip-endpoint=&quot;http:\/\/127.0.0.1:7080\/whip\/endpoint\/room1234&quot;<\/code><\/pre>\n<p>You should see a second video displayed in the videoroomtest web page.<\/p>\n<h3>WHIP Server<\/h3>\n<p>The WHIP server signaller uses BaseWebRTCSrc.<\/p>\n<p>The WHIP server as the signaller can be tested in two ways.<\/p>\n<p>Note: the initial version of <code>whipserversrc<\/code> does not perform any authentication or encryption.<br \/>\nHost applications should run <code>whipserversrc<\/code> behind an HTTP(S) proxy to enforce authentication and encryption between the WHIP client and server.<\/p>\n<h4>1. Using the GStreamer element <code>whipwebrtcsink<\/code><\/h4>\n<p>a. In one terminal tab, start the WHIP server with the following command:<\/p>\n<pre><code class=\"language-shell\">RUST_BACKTRACE=full GST_DEBUG=webrtc*:6 GST_PLUGIN_PATH=target\/x86_64-unknown-linux-gnu\/debug:$GST_PLUGIN_PATH gst-launch-1.0 whipserversrc signaller::host-addr=http:\/\/127.0.0.1:8190 stun-server=&quot;stun:\/\/stun.l.google.com:19302&quot; turn-servers=&quot;\\&lt;\\&quot;turns:\/\/user1:pass1@turn.serverone.com:7806\\&quot;, \\&quot;turn:\/\/user2:pass2@turn.servertwo.com:7809\\&quot;\\&gt;&quot; ! videoconvert ! autovideosink<\/code><\/pre>\n<p>b. In a second tab, start the WHIP client, sending a test video, with the following command:<\/p>\n<pre><code class=\"language-shell\">RUST_BACKTRACE=full GST_DEBUG=webrtc*:6 GST_PLUGIN_PATH=target\/x86_64-unknown-linux-gnu\/debug:$GST_PLUGIN_PATH gst-launch-1.0 videotestsrc ! videoconvert ! video\/x-raw ! queue ! 
\\\n  whipwebrtcsink name=ws signaller::whip-endpoint=&quot;http:\/\/127.0.0.1:8190\/whip\/endpoint&quot;<\/code><\/pre>\n<h4>2. Using Meetecho's <code>simple-whip-client<\/code><\/h4>\n<p>Set up the simple whip client using the instructions in <a href=\"https:\/\/github.com\/meetecho\/simple-whip-client#readme\">https:\/\/github.com\/meetecho\/simple-whip-client#readme<\/a>.<\/p>\n<p>a. In one terminal tab, start the WHIP server with the following command:<\/p>\n<pre><code class=\"language-shell\">RUST_BACKTRACE=full GST_DEBUG=webrtc*:6 GST_PLUGIN_PATH=target\/x86_64-unknown-linux-gnu\/debug:$GST_PLUGIN_PATH gst-launch-1.0 whipserversrc signaller::host-addr=http:\/\/127.0.0.1:8190 stun-server=&quot;stun:\/\/stun.l.google.com:19302&quot; turn-servers=&quot;\\&lt;\\&quot;turns:\/\/user1:pass1@turn.serverone.com:7806\\&quot;, \\&quot;turn:\/\/user2:pass2@turn.servertwo.com:7809\\&quot;\\&gt;&quot; name=ws ! videoconvert ! autovideosink ws. ! audioconvert ! autoaudiosink<\/code><\/pre>\n<p>b. In a second tab, start the <code>simple-whip-client<\/code> with the following command:<\/p>\n<pre><code class=\"language-shell\">.\/whip-client --url http:\/\/127.0.0.1:8190\/whip\/endpoint \\\n        -A &quot;audiotestsrc is-live=true wave=red-noise ! audioconvert ! audioresample ! queue ! opusenc perfect-timestamp=true ! rtpopuspay pt=100 ssrc=1 ! queue ! application\/x-rtp,media=audio,encoding-name=OPUS,payload=100&quot; \\\n        -V &quot;videotestsrc is-live=true pattern=ball ! videoconvert ! queue ! vp8enc deadline=1 ! rtpvp8pay pt=96 ssrc=2 ! queue ! 
application\/x-rtp,media=video,encoding-name=VP8,payload=96&quot; \\\n        -S stun:\/\/stun.l.google.com:19302 \\\n        -l 7 \\\n        -n true<\/code><\/pre>\n<p>Terminating the client will close the session, and the client should receive 200 (OK) as the response to the DELETE request.<\/p>\n<h2>Using the LiveKit Signaller<\/h2>\n<p>Testing the LiveKit signaller can be done by setting up <a href=\"https:\/\/livekit.io\/\">LiveKit<\/a> and creating a room.<\/p>\n<p>You can connect either by giving the API key and secret:<\/p>\n<pre><code class=\"language-shell\">gst-launch-1.0 -e uridecodebin uri=file:\/\/\/home\/meh\/path\/to\/video\/file ! \\\n  videoconvert ! video\/x-raw ! queue ! \\\n  livekitwebrtcsink signaller::ws-url=ws:\/\/127.0.0.1:7880 signaller::api-key=devkey signaller::secret-key=secret signaller::room-name=testroom<\/code><\/pre>\n<p>Or by using a separately created authentication token:<\/p>\n<pre><code class=\"language-shell\">gst-launch-1.0 -e uridecodebin uri=file:\/\/\/home\/meh\/path\/to\/video\/file ! \\\n  videoconvert ! video\/x-raw ! queue ! \\\n  livekitwebrtcsink signaller::ws-url=ws:\/\/127.0.0.1:7880 signaller::auth-token=mygeneratedtoken signaller::room-name=testroom<\/code><\/pre>\n<p>You should then see the stream in your LiveKit room.<\/p>\n<h2>Streaming from LiveKit using the livekitwebrtcsrc element<\/h2>\n<p>First, publish a stream to the room using the following command:<\/p>\n<pre><code class=\"language-shell\">gst-launch-1.0 livekitwebrtcsink name=sink \\\n    signaller::ws-url=ws:\/\/127.0.0.1:7880 \\\n    signaller::api-key=devkey \\\n    signaller::secret-key=secret \\\n    signaller::room-name=testroom \\\n    signaller::identity=gst-producer \\\n    signaller::participant-name=gst-producer \\\n    video-caps=&#039;video\/x-h264&#039; \\\n  videotestsrc is-live=1 \\\n  ! video\/x-raw,width=640,height=360,framerate=15\/1 \\\n  ! timeoverlay ! videoconvert ! queue ! 
sink.<\/code><\/pre>\n<p>Then play back the published stream:<\/p>\n<pre><code class=\"language-shell\">gst-launch-1.0 livekitwebrtcsrc \\\n    name=src \\\n    signaller::ws-url=ws:\/\/127.0.0.1:7880 \\\n    signaller::api-key=devkey \\\n    signaller::secret-key=secret \\\n    signaller::room-name=testroom \\\n    signaller::identity=gst-consumer \\\n    signaller::participant-name=gst-consumer \\\n    signaller::producer-peer-id=gst-producer \\\n    video-codecs=&#039;&lt;H264&gt;&#039; \\\n  src. ! queue ! videoconvert ! autovideosink<\/code><\/pre>\n<h3>Auto-subscribe with livekitwebrtcsrc element<\/h3>\n<p>With the LiveKit source element, you can also subscribe to all the peers in<br \/>\nyour room, simply by not specifying any value for<br \/>\n<code>signaller::producer-peer-id<\/code>. Unwanted peers can also be ignored by supplying<br \/>\nan array of peer IDs to <code>signaller::excluded-producer-peer-ids<\/code>. Importantly,<br \/>\nit is also necessary to add sinks for all the streams in the room that the<br \/>\nsource element has subscribed to.<\/p>\n<p>First, publish a few streams using different connections:<\/p>\n<pre><code class=\"language-shell\">gst-launch-1.0 \\\n  livekitwebrtcsink name=sinka \\\n    signaller::ws-url=ws:\/\/127.0.0.1:7880 \\\n    signaller::api-key=devkey \\\n    signaller::secret-key=secret \\\n    signaller::room-name=testroom \\\n    signaller::identity=gst-producer-a \\\n    signaller::participant-name=gst-producer-a \\\n    video-caps=&#039;video\/x-vp8&#039; \\\n  livekitwebrtcsink name=sinkb \\\n    signaller::ws-url=ws:\/\/127.0.0.1:7880 \\\n    signaller::api-key=devkey \\\n    signaller::secret-key=secret \\\n    signaller::room-name=testroom \\\n    signaller::identity=gst-producer-b \\\n    signaller::participant-name=gst-producer-b \\\n    video-caps=&#039;video\/x-vp8&#039; \\\n  livekitwebrtcsink name=sinkc \\\n    signaller::ws-url=ws:\/\/127.0.0.1:7880 \\\n    signaller::api-key=devkey \\\n    
signaller::secret-key=secret \\\n    signaller::room-name=testroom \\\n    signaller::identity=gst-producer-c \\\n    signaller::participant-name=gst-producer-c \\\n    video-caps=&#039;video\/x-vp8&#039; \\\n  videotestsrc is-live=1 \\\n  ! video\/x-raw,width=640,height=360,framerate=15\/1 \\\n  ! timeoverlay ! videoconvert ! queue ! sinka. \\\n  videotestsrc pattern=ball is-live=1 \\\n  ! video\/x-raw,width=320,height=180,framerate=15\/1 \\\n  ! timeoverlay ! videoconvert ! queue ! sinkb. \\\n  videotestsrc is-live=1 \\\n  ! video\/x-raw,width=320,height=180,framerate=15\/1 \\\n  ! timeoverlay ! videoconvert ! queue ! sinkc.<\/code><\/pre>\n<p>Then watch only streams A and B by excluding peer C:<\/p>\n<pre><code class=\"language-shell\">gst-launch-1.0 livekitwebrtcsrc \\\n  name=src \\\n  signaller::ws-url=ws:\/\/127.0.0.1:7880 \\\n  signaller::api-key=devkey \\\n  signaller::secret-key=secret \\\n  signaller::room-name=testroom \\\n  signaller::identity=gst-consumer \\\n  signaller::participant-name=gst-consumer \\\n  signaller::excluded-producer-peer-ids=&#039;&lt;gst-producer-c&gt;&#039; \\\n  src. ! queue ! videoconvert ! autovideosink \\\n  src. ! queue ! videoconvert ! autovideosink<\/code><\/pre>\n","protected":false},"excerpt":{"rendered":"<p>https:\/\/gitlab.freedesktop.org\/gstreamer\/gst-plugins-rs\/-\/tree\/main\/net\/webrtc webrtcsink and webrtcsrc All-batteries included GStreamer WebRTC producer and consumer, that try their best to do The Right Thing\u2122. It also provides a flexible and all-purposes WebRTC signalling server (gst-webrtc-signalling-server) and a Javascript API (gstwebrtc-api) to produce and consume compatible WebRTC streams from a web browser. 
Use case The webrtcbin element in GStreamer is [&hellip;] <a class=\"read-more\" href=\"https:\/\/www.fanyamin.com\/wordpress\/?p=1266\" title=\"Permanent Link to: gstreamer new plugin: webrtcsink and webrtcsrc\">&rarr;Read&nbsp;more<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[5],"tags":[],"class_list":["post-1266","post","type-post","status-publish","format-standard","hentry","category-5"],"_links":{"self":[{"href":"https:\/\/www.fanyamin.com\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/1266"}],"collection":[{"href":"https:\/\/www.fanyamin.com\/wordpress\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.fanyamin.com\/wordpress\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.fanyamin.com\/wordpress\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.fanyamin.com\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=1266"}],"version-history":[{"count":2,"href":"https:\/\/www.fanyamin.com\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/1266\/revisions"}],"predecessor-version":[{"id":1272,"href":"https:\/\/www.fanyamin.com\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/1266\/revisions\/1272"}],"wp:attachment":[{"href":"https:\/\/www.fanyamin.com\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=1266"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.fanyamin.com\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=1266"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.fanyamin.com\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=1266"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}