/** \page page_midi MIDI Support
This document explains how MIDI is implemented.
# Use Cases
## MIDI Devices Are Made Available As Processing Nodes/Ports
Applications need to be able to see a port for each stream of a
MIDI device.
## MIDI Devices Can Be Plugged and Unplugged
When devices are plugged and unplugged, the associated nodes/ports
need to be created and removed.
## Applications Can Connect To MIDI Devices
Applications can create ports that can connect to the MIDI ports
so that data can be provided to or consumed from them.
## Some MIDI Devices Are Sinks Or Sources For MIDI Data
It should be possible to create a MIDI sink or source that routes the
MIDI events to specific MIDI ports.
One example of such a sink would be in front of a software MIDI
renderer.
An example of a MIDI source would be after a virtual keyboard or
as a mix from many MIDI input devices.
## Applications Should Auto-connect To MIDI Sinks Or Sources
An application should be able to connect to a MIDI sink when
it wants to play MIDI data.
An application should be able to connect to a MIDI source when it
wants to capture MIDI data.
# Design
## SPA
MIDI devices/streams are implemented with an \ref spa_node with generic
control input and output Ports. These ports have a media type of
`"application/control"` and the data transported over these ports
are of type \ref spa_pod_sequence with the \ref spa_pod_control type set to
\ref SPA_CONTROL_Midi.
This means that every MIDI event is timestamped with its sample
offset relative to the current graph clock cycle, giving sample-accurate
MIDI events that can be aligned with the corresponding sample data.
Since the MIDI events are embedded in the generic control stream,
they can be interleaved with other control message types, such as
property updates or OSC messages.
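As an illustration, the processing code on such a port could walk the
sequence and pick out the MIDI events roughly as follows. This is a
minimal sketch (the function name and the surrounding buffer handling
are hypothetical); it only shows how the iteration macros and the
control offset relate to the description above:

```c
#include <stdint.h>
#include <stdio.h>

#include <spa/pod/iter.h>
#include <spa/control/control.h>

/* Minimal sketch: walk the control sequence found on an
 * "application/control" port and handle the MIDI events in it. */
static void handle_control_sequence(struct spa_pod_sequence *seq)
{
	struct spa_pod_control *c;

	SPA_POD_SEQUENCE_FOREACH(seq, c) {
		if (c->type != SPA_CONTROL_Midi)
			continue;	/* other control types may be interleaved */

		uint8_t *data = SPA_POD_BODY(&c->value);
		uint32_t size = SPA_POD_BODY_SIZE(&c->value);

		/* c->offset is the sample offset within the current graph cycle */
		printf("MIDI event at offset %u, %u bytes, status 0x%02x\n",
		       c->offset, size, size > 0 ? data[0] : 0u);
	}
}
```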
## The PipeWire Daemon
Nothing special is implemented for MIDI. Format negotiation happens
on the `"application/control"` media type, and buffers are negotiated
in the same way as for any generic format.
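For example, a client that wants such a control port can announce the
media type with an ordinary format parameter. A minimal sketch, assuming
the usual \ref spa_pod_builder pattern used elsewhere in the PipeWire
examples (the helper name is hypothetical):

```c
#include <spa/param/format-utils.h>
#include <spa/pod/builder.h>

/* Minimal sketch: build the format pod announcing an
 * "application/control" port. */
static const struct spa_pod *make_control_format(struct spa_pod_builder *b)
{
	return spa_pod_builder_add_object(b,
		SPA_TYPE_OBJECT_Format, SPA_PARAM_EnumFormat,
		SPA_FORMAT_mediaType,    SPA_POD_Id(SPA_MEDIA_TYPE_application),
		SPA_FORMAT_mediaSubtype, SPA_POD_Id(SPA_MEDIA_SUBTYPE_control));
}
```

The resulting pod can then be passed as a format parameter when
connecting a stream or port, just like an audio or video format.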
## The Session Manager
The session manager needs to create the MIDI nodes/ports for the available
devices.
This can be done either with a single node that exposes a port per
device/stream, or with separate nodes created by a MIDI device monitor.
The session manager needs to be aware of the various MIDI sinks and sources
in order to route MIDI streams to them from applications that want this.
# Implementation
## PipeWire Media Session
PipeWire media session uses the \ref SPA_NAME_API_ALSA_SEQ_BRIDGE plugin for
its MIDI features. This creates a single SPA Node with a port per
MIDI client/stream.
The media session will check the permissions on `/dev/snd/seq` before
attempting to create this node. It will also use inotify to wait
until the sequencer device node is accessible.
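The idea of that check is plain POSIX; a hypothetical sketch is shown
below (the real media-session code integrates the inotify descriptor
into its event loop rather than blocking):

```c
#include <sys/inotify.h>
#include <unistd.h>

/* Hypothetical sketch: block until /dev/snd/seq is accessible, watching
 * /dev/snd with inotify so we are woken up when permissions change. */
static int wait_for_seq_device(void)
{
	char buf[4096];
	int fd;

	if (access("/dev/snd/seq", R_OK | W_OK) == 0)
		return 0;	/* node can be created right away */

	if ((fd = inotify_init1(IN_CLOEXEC)) < 0)
		return -1;
	inotify_add_watch(fd, "/dev/snd", IN_CREATE | IN_ATTRIB);

	while (access("/dev/snd/seq", R_OK | W_OK) != 0) {
		if (read(fd, buf, sizeof(buf)) < 0)	/* blocks until /dev/snd changes */
			break;
	}
	close(fd);

	return access("/dev/snd/seq", R_OK | W_OK) == 0 ? 0 : -1;
}
```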
## JACK
JACK assumes all `"application/control"` ports are MIDI ports.
The control messages are converted to the JACK event format by
selecting only the \ref SPA_CONTROL_Midi controls. On output ports, the JACK
event stream is converted back to control messages in a similar way.
There is a one-to-one mapping between JACK events and control
messages, so there is no information loss and no need for complicated
conversions.
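Conceptually, the conversion from a control sequence to a JACK MIDI
buffer looks like the sketch below. It is expressed with the public
JACK MIDI API rather than the pipewire-jack internals, and the function
name is hypothetical:

```c
#include <jack/midiport.h>

#include <spa/pod/iter.h>
#include <spa/control/control.h>

/* Hypothetical sketch: copy the MIDI controls of a spa control sequence
 * into a JACK MIDI port buffer; control offsets map directly to frame times. */
static void sequence_to_jack(struct spa_pod_sequence *seq, void *jack_buffer)
{
	struct spa_pod_control *c;

	jack_midi_clear_buffer(jack_buffer);

	SPA_POD_SEQUENCE_FOREACH(seq, c) {
		if (c->type != SPA_CONTROL_Midi)
			continue;	/* only MIDI controls become JACK events */
		jack_midi_event_write(jack_buffer, c->offset,
				SPA_POD_BODY(&c->value),
				SPA_POD_BODY_SIZE(&c->value));
	}
}
```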
*/