Deploy behind HAProxy
=====================

This guide demonstrates a way to load balance connections across multiple
websockets server processes running on the same machine with HAProxy_.

We'll run server processes with Supervisor as described in :doc:`this guide
<supervisor>`.

.. _HAProxy: https://www.haproxy.org/
Run server processes
--------------------

Save this app to ``app.py``:

.. literalinclude:: ../../example/deployment/haproxy/app.py
    :emphasize-lines: 24

Each server process listens on a different port by extracting an incremental
index from an environment variable set by Supervisor.
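
The port selection can be sketched as follows. This is an illustration, not
the example file itself: it assumes Supervisor's default behavior of exporting
each process's name in the ``SUPERVISOR_PROCESS_NAME`` environment variable,
and the helper name ``pick_port`` is made up for this sketch:

.. code-block:: python

    import os

    def pick_port(base=8000):
        # Supervisor names the processes "websockets_00", "websockets_01", ...
        # and exposes that name in SUPERVISOR_PROCESS_NAME; the trailing
        # number becomes an offset from the base port.
        name = os.environ.get("SUPERVISOR_PROCESS_NAME", "websockets_00")
        return base + int(name.rsplit("_", 1)[-1])

    os.environ["SUPERVISOR_PROCESS_NAME"] = "websockets_02"
    print(pick_port())  # -> 8002

Each process then passes its own port to the server, so the four instances
never conflict with each other.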

Save this configuration to ``supervisord.conf``:

.. literalinclude:: ../../example/deployment/haproxy/supervisord.conf

This configuration runs four instances of the app.
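
If you don't have the example file at hand, the configuration looks roughly
like this sketch (the program name ``websockets`` and the exact option values
are assumptions, not the file shipped with the example):

.. code-block:: ini

    [supervisord]

    [program:websockets]
    command = python app.py
    ; Name processes websockets_00 ... websockets_03 so each one
    ; can derive a distinct port from its own name.
    process_name = %(program_name)s_%(process_num)02d
    numprocs = 4
    autorestart = true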

Install Supervisor and run it:

.. code-block:: console

    $ supervisord -c supervisord.conf -n

Configure and run HAProxy
-------------------------

Here's a simple HAProxy configuration to load balance connections across four
processes:

.. literalinclude:: ../../example/deployment/haproxy/haproxy.cfg

In the backend configuration, we set the load balancing method to
``leastconn`` in order to balance the number of active connections across
servers. This is best for long-running connections.
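
Such a configuration might look like the following sketch. The frontend port
8080 matches the connection test below, while the backend ports 8000–8003 and
the timeout values are assumptions made for this illustration:

.. code-block:: none

    defaults
        mode http
        timeout connect 10s
        timeout client 30s
        timeout server 30s

    frontend websocket
        bind localhost:8080
        default_backend websocket

    backend websocket
        # Route each new connection to the server with the fewest
        # active connections -- suitable for long-lived WebSockets.
        balance leastconn
        server websockets_00 localhost:8000
        server websockets_01 localhost:8001
        server websockets_02 localhost:8002
        server websockets_03 localhost:8003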

Save the configuration to ``haproxy.cfg``, install HAProxy, and run it:

.. code-block:: console

    $ haproxy -f haproxy.cfg

You can confirm that HAProxy proxies connections properly:

.. code-block:: console

    $ PYTHONPATH=src python -m websockets ws://localhost:8080/
    Connected to ws://localhost:8080/.
    > Hello!
    < Hello!
    Connection closed: 1000 (OK).