# Quick start

This guide explains a basic LiveCompositor setup: starting the compositor and configuring inputs and outputs.

## Start the compositor
- HTTP
- Membrane Framework
Start the compositor server. Check out the configuration page for available configuration options.
The following code snippets implement the `handle_init/2` or `handle_setup/2` callbacks. These are just examples; you can use any `Membrane.Pipeline` callbacks instead.
```elixir
alias Membrane.LiveCompositor

def handle_init(ctx, opts) do
  spec = [
    ...
    child(:live_compositor, %LiveCompositor{
      framerate: {30, 1}
    }),
    ...
  ]

  {[spec: spec], %{}}
end
```
## Register input stream `input_1`
- HTTP
- Membrane Framework
```http
POST: /api/input/input_1/register
Content-Type: application/json

{
  "type": "rtp_stream",
  "transport_protocol": "tcp_server",
  "port": 9001,
  "video": {
    "decoder": "ffmpeg_h264"
  }
}
```
After receiving the response, you can establish the connection and start sending the stream. Check out how to deliver input streams to learn more.

In this example we are using RTP over TCP, but it could easily be replaced by UDP.
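The register call above can be issued from any HTTP client. Here is a minimal Python sketch; the `http://localhost:8081` base URL is an assumption, so adjust it to wherever your compositor API server is listening:

```python
import json
from urllib import request

# Payload mirroring the register request above: RTP over TCP, H264 video.
INPUT_1_REGISTER = {
    "type": "rtp_stream",
    "transport_protocol": "tcp_server",
    "port": 9001,
    "video": {"decoder": "ffmpeg_h264"},
}


def register_input(api_base: str, input_id: str, payload: dict) -> bytes:
    """POST /api/input/<input_id>/register and return the raw response body."""
    req = request.Request(
        f"{api_base}/api/input/{input_id}/register",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.read()


# With a running compositor server:
# register_input("http://localhost:8081", "input_1", INPUT_1_REGISTER)
```

Once the register call succeeds, connect to port 9001 and start sending the RTP stream.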
```elixir
def handle_init(ctx, opts) do
  spec = [
    ...
    video_input_1_spec
    |> via_in(Pad.ref(:video_input, "video_input_1"))
    |> get_child(:live_compositor),
    audio_input_1_spec
    |> via_in(Pad.ref(:audio_input, "audio_input_1"))
    |> get_child(:live_compositor),
    ...
  ]

  {[spec: spec], %{}}
end
```
where `video_input_1_spec` and `audio_input_1_spec` are elements producing H264 video and Opus audio respectively.
## Register input stream `input_2`
- HTTP
- Membrane Framework
```http
POST: /api/input/input_2/register
Content-Type: application/json

{
  "type": "rtp_stream",
  "transport_protocol": "tcp_server",
  "port": 9002,
  "video": {
    "decoder": "ffmpeg_h264"
  }
}
```
After receiving the response, you can establish the connection and start sending the stream. Check out how to deliver input streams to learn more.

In this example we are using RTP over TCP, but it could easily be replaced by UDP.
```elixir
def handle_init(ctx, opts) do
  spec = [
    ...
    video_input_2_spec
    |> via_in(Pad.ref(:video_input, "video_input_2"))
    |> get_child(:live_compositor),
    audio_input_2_spec
    |> via_in(Pad.ref(:audio_input, "audio_input_2"))
    |> get_child(:live_compositor),
    ...
  ]

  {[spec: spec], %{}}
end
```
where `video_input_2_spec` and `audio_input_2_spec` are elements that produce H264 video and Opus audio respectively.
## Register output stream `output_1`

Configure it to:

- render an empty `View` component with a background color set to `#4d4d4d` (gray)
- produce silent audio
- HTTP
- Membrane Framework
```http
POST: /api/output/output_1/register
Content-Type: application/json

{
  "type": "rtp_stream",
  "transport_protocol": "tcp_server",
  "port": 9003,
  "video": {
    "resolution": { "width": 1280, "height": 720 },
    "encoder": {
      "type": "ffmpeg_h264",
      "preset": "ultrafast"
    },
    "initial": {
      "root": {
        "type": "view",
        "background_color_rgba": "#4d4d4dff"
      }
    }
  },
  "audio": {
    "encoder": {
      "type": "opus",
      "channels": "stereo"
    },
    "initial": {
      "inputs": []
    }
  }
}
```
You can configure the output framerate and the sample rate using the `LIVE_COMPOSITOR_OUTPUT_FRAMERATE` and `LIVE_COMPOSITOR_OUTPUT_SAMPLE_RATE` environment variables.
After receiving the response, you can establish the connection and start listening for the stream. Check out how to receive output streams to learn more.

In this example we are using RTP over TCP. If you prefer UDP, you need to start listening on the specified port before sending the register request to make sure you do not lose the first frames.
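As a sketch, the ordering described above could look like this in Python. The `http://localhost:8081` base URL is an assumption; with `tcp_server` the compositor is the listening side, so you register first and then connect:

```python
import json
import socket
from urllib import request

# Payload mirroring the output register request above.
OUTPUT_1_REGISTER = {
    "type": "rtp_stream",
    "transport_protocol": "tcp_server",
    "port": 9003,
    "video": {
        "resolution": {"width": 1280, "height": 720},
        "encoder": {"type": "ffmpeg_h264", "preset": "ultrafast"},
        "initial": {
            "root": {"type": "view", "background_color_rgba": "#4d4d4dff"}
        },
    },
    "audio": {
        "encoder": {"type": "opus", "channels": "stereo"},
        "initial": {"inputs": []},
    },
}


def register_output(api_base: str, output_id: str, payload: dict) -> bytes:
    """POST /api/output/<output_id>/register and return the raw response body."""
    req = request.Request(
        f"{api_base}/api/output/{output_id}/register",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.read()


# TCP: register first, then connect to start receiving the RTP stream.
# register_output("http://localhost:8081", "output_1", OUTPUT_1_REGISTER)
# sock = socket.create_connection(("localhost", 9003))
```

With UDP you would invert the order: bind the receiving socket first, then send the register request.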
```elixir
def handle_init(ctx, opts) do
  spec = [
    ...
    get_child(:live_compositor)
    |> via_out(Pad.ref(:video_output, "video_output_1"),
      options: [
        width: 1280,
        height: 720,
        encoder: %LiveCompositor.Encoder.FFmpegH264{
          preset: :ultrafast
        },
        initial: %{
          root: %{
            type: :view,
            background_color_rgba: "#4d4d4dff"
          }
        }
      ]
    )
    |> video_output_1_spec,
    get_child(:live_compositor)
    |> via_out(Pad.ref(:audio_output, "audio_output_1"),
      options: [
        encoder: %LiveCompositor.Encoder.Opus{
          channels: :stereo
        },
        initial: %{
          inputs: []
        }
      ]
    )
    |> audio_output_1_spec,
    ...
  ]

  {[spec: spec], %{}}
end
```
where `video_output_1_spec` and `audio_output_1_spec` are elements that can consume H264 video and Opus audio respectively.

You can configure the output framerate and sample rate using the `framerate` and `output_sample_rate` bin options.
The `View` component does not have any children, so on the output you should see just a blank screen of the specified color, as shown below. The `initial.inputs` list in the audio config is empty, so the output audio will be silent.
Output stream
## Update output

Configure it to:

- Show input streams `input_1` and `input_2` using the `Tiles` component.
- Mix audio from input streams `input_1` and `input_2`, where `input_1` volume is slightly lowered.
- HTTP
- Membrane Framework
```http
POST: /api/output/output_1/update
Content-Type: application/json

{
  "video": {
    "root": {
      "type": "tiles",
      "background_color_rgba": "#4d4d4dff",
      "children": [
        { "type": "input_stream", "input_id": "input_1" },
        { "type": "input_stream", "input_id": "input_2" }
      ]
    }
  },
  "audio": {
    "inputs": [
      { "input_id": "input_1", "volume": 0.9 },
      { "input_id": "input_2" }
    ]
  }
}
```
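The same update can also be sent programmatically. A Python sketch, again assuming the compositor API is reachable at `http://localhost:8081`:

```python
import json
from urllib import request

# Payload mirroring the update request above: tiled video from both inputs,
# mixed audio with input_1 slightly lowered.
OUTPUT_1_UPDATE = {
    "video": {
        "root": {
            "type": "tiles",
            "background_color_rgba": "#4d4d4dff",
            "children": [
                {"type": "input_stream", "input_id": "input_1"},
                {"type": "input_stream", "input_id": "input_2"},
            ],
        }
    },
    "audio": {
        "inputs": [
            {"input_id": "input_1", "volume": 0.9},
            {"input_id": "input_2"},
        ]
    },
}


def update_output(api_base: str, output_id: str, payload: dict) -> bytes:
    """POST /api/output/<output_id>/update and return the raw response body."""
    req = request.Request(
        f"{api_base}/api/output/{output_id}/update",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.read()


# With a running compositor server and registered streams:
# update_output("http://localhost:8081", "output_1", OUTPUT_1_UPDATE)
```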
```elixir
def handle_setup(ctx, state) do
  video_request = %LiveCompositor.Request.UpdateVideoOutput{
    output_id: "video_output_1",
    root: %{
      type: :tiles,
      children: [
        %{type: :input_stream, input_id: "input_1"},
        %{type: :input_stream, input_id: "input_2"}
      ]
    }
  }

  audio_request = %LiveCompositor.Request.UpdateAudioOutput{
    output_id: "audio_output_1",
    inputs: [
      %{input_id: "input_1", volume: 0.9},
      %{input_id: "input_2"}
    ]
  }

  events = [
    notify_child: {:live_compositor, video_request},
    notify_child: {:live_compositor, audio_request}
  ]

  {events, state}
end
```
Output stream