# Membrane WebRTC Plugin

Membrane plugin for sending and receiving streams via WebRTC, based on [ex_webrtc](https://github.com/elixir-webrtc/ex_webrtc). It is part of the [Membrane Framework](https://membrane.stream).
## Installation

The package can be installed by adding `membrane_webrtc_plugin` to your list of dependencies in `mix.exs`:

```elixir
def deps do
  [
    {:membrane_webrtc_plugin, "~> 0.25.3"}
  ]
end
```
## Examples

The `examples` directory shows how to send and receive streams from a web browser. It contains the following three demos:

- `live_view` - a simple Phoenix LiveView project using `Membrane.WebRTC.Live.Player` and `Membrane.WebRTC.Live.Capture` to echo the video stream captured from the user's browser.
- `phoenix_signaling` - a simple Phoenix application that uses `Membrane.WebRTC.PhoenixSignaling` to echo the stream captured from the user's browser and sent via WebRTC. See `assets/phoenix_signaling/README.md` for details on how to run the demo.
- `webrtc_signaling` - consists of two scripts: `file_to_browser.exs` and `browser_to_file.exs`. The first one displays the stream from a fixture file in the user's browser. The latter captures the user's camera input from the browser and saves it to a file. To run one of these demos, type `elixir <script_name>` and visit `http://localhost:4000`.
## WebRTC Signaling in Phoenix projects

To establish a WebRTC connection, you have to exchange WebRTC signaling messages between the peers. In `membrane_webrtc_plugin` this can be done by the user with `Membrane.WebRTC.Signaling`, or by passing a WebSocket address to `Membrane.WebRTC.Source` or `Membrane.WebRTC.Sink`. Beyond that, there are two additional ways, dedicated to Phoenix projects:

- The first is to use `Membrane.WebRTC.PhoenixSignaling` along with `Membrane.WebRTC.PhoenixSignaling.Socket`.
- The second is to use the `Phoenix.LiveView`s `Membrane.WebRTC.Live.Player` or `Membrane.WebRTC.Live.Capture`. These modules expect a `t:Membrane.WebRTC.Signaling.t/0` as an argument and take advantage of the WebSocket already used by `Phoenix.LiveView` to exchange WebRTC signaling messages, so there is no need to add any code to handle signaling. To see the full example, visit `examples/phoenix_signaling`.
To use `Membrane.WebRTC.PhoenixSignaling`:

- Create a new socket in your application endpoint using `Membrane.WebRTC.PhoenixSignaling.Socket`, for instance at the `/signaling` path:

  ```elixir
  socket "/signaling", Membrane.WebRTC.PhoenixSignaling.Socket,
    websocket: true,
    longpoll: false
  ```
- Create a Phoenix signaling channel with the desired signaling ID and use it as a `Membrane.WebRTC.Signaling.t()` for `Membrane.WebRTC.Source`, `Membrane.WebRTC.Sink`, or Boombox:

  ```elixir
  signaling = Membrane.WebRTC.PhoenixSignaling.new("<signaling_id>")

  # use it with Membrane.WebRTC.Source:
  child(:webrtc_source, %Membrane.WebRTC.Source{signaling: signaling})
  |> ...

  # or with Membrane.WebRTC.Sink:
  ...
  |> child(:webrtc_sink, %Membrane.WebRTC.Sink{signaling: signaling})

  # or with Boombox:
  Boombox.run(
    input: {:webrtc, signaling},
    output: ...
  )
  ```
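For instance, a complete Boombox invocation could look like this (a minimal sketch, assuming Boombox is added as a dependency; the signaling ID and the output path are illustrative):

```elixir
# Illustrative values: "user_1234" as the signaling ID, "output.mp4" as the destination.
signaling = Membrane.WebRTC.PhoenixSignaling.new("user_1234")

# Receive the stream sent from the browser and record it to an MP4 file.
Boombox.run(
  input: {:webrtc, signaling},
  output: "output.mp4"
)
```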
Please note that `signaling_id` is expected to be globally unique for each WebRTC connection about to be established. You can, for instance:

- Generate a unique ID with the `:uuid` package and assign it to the connection in the page controller (see the controller sketch after this list):

  ```elixir
  unique_id = UUID.uuid4()
  render(conn, :home, layout: false, signaling_id: unique_id)
  ```
- Generate HTML based on a HEEx template, using the previously set assign:

  ```heex
  <video id="videoPlayer" controls muted autoplay signaling_id={@signaling_id}></video>
  ```
- Access it in your client code:

  ```js
  const videoPlayer = document.getElementById('videoPlayer');
  const signalingId = videoPlayer.getAttribute('signaling_id');
  ```
- Use the Phoenix Socket to exchange WebRTC signaling data:

  ```js
  let socket = new Socket("/signaling", {params: {token: window.userToken}})
  socket.connect()

  let channel = socket.channel('<signaling_id>')
  channel.join()
    .receive("ok", resp => {
      console.log("Signaling socket joined successfully", resp)
      // here you can exchange WebRTC data
    })
    .receive("error", resp => { console.log("Unable to join signaling socket", resp) })
  ```
Visit `examples/phoenix_signaling/assets/js/signaling.js` to see what the exchange of WebRTC signaling messages might look like.
## Phoenix LiveViews

`membrane_webrtc_plugin` comes with two `Phoenix.LiveView`s:

- `Membrane.WebRTC.Live.Capture` - exchanges WebRTC signaling messages between `Membrane.WebRTC.Source` and the browser. It expects the same `Membrane.WebRTC.Signaling` that has been passed to the related `Membrane.WebRTC.Source`. As a result, `Membrane.WebRTC.Source` will return the media stream captured from the browser where `Membrane.WebRTC.Live.Capture` has been rendered.
- `Membrane.WebRTC.Live.Player` - exchanges WebRTC signaling messages between `Membrane.WebRTC.Sink` and the browser. It expects the same `Membrane.WebRTC.Signaling` that has been passed to the related `Membrane.WebRTC.Sink`. As a result, `Membrane.WebRTC.Live.Player` will play the media streams passed to the related `Membrane.WebRTC.Sink`. It currently supports at most one video stream and one audio stream.
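For illustration, attaching `Membrane.WebRTC.Live.Player` in a LiveView could look roughly like this (a sketch modeled on the `examples/live_view` demo; `MyAppWeb.PlayerLive` is a placeholder name, and the exact `attach`/`live_render` signatures should be checked against the module docs):

```elixir
defmodule MyAppWeb.PlayerLive do
  use MyAppWeb, :live_view

  alias Membrane.WebRTC.Live.Player

  @impl true
  def mount(_params, _session, socket) do
    signaling = Membrane.WebRTC.Signaling.new()
    # Pass the same `signaling` to a pipeline containing Membrane.WebRTC.Sink.
    {:ok, Player.attach(socket, id: "player", signaling: signaling)}
  end

  @impl true
  def render(assigns) do
    ~H"""
    <Player.live_render socket={@socket} player_id="player" />
    """
  end
end
```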
To use the `Phoenix.LiveView`s from this repository, you have to use the related JS hooks. To do so, add the following code snippet to `assets/js/app.js`:

```js
import { createCaptureHook, createPlayerHook } from "membrane_webrtc_plugin";

let Hooks = {};
const iceServers = [{ urls: "stun:stun.l.google.com:19302" }];
Hooks.Capture = createCaptureHook(iceServers);
Hooks.Player = createPlayerHook(iceServers);
```
and add `Hooks` to the `LiveSocket` constructor. It can be done in the following way:

```js
new LiveSocket("/live", Socket, {
  params: SomeParams,
  hooks: Hooks,
});
```
To see the full usage example, go to the `examples/live_view/` directory in this repository (take a look especially at `examples/live_view/assets/js/app.js` and `examples/live_view/lib/example_project_web/live_views/echo.ex`).
## Copyright and License

Copyright 2020, Software Mansion

Licensed under the Apache License, Version 2.0