The files in this subdirectory are used to help measure the performance
of the SQLite JSON functions, especially in relation to handling large
JSON inputs.
# 1.0 Prerequisites
* Standard SQLite build environment (SQLite source tree, compiler, make, etc.)
* Valgrind
* Fossil (only the "fossil xdiff" command is used by this procedure)
* tclsh
# 2.0 Setup
* Run: "`tclsh json-generator.tcl | sqlite3 json100mb.db`" to create
the 100 megabyte test database. Do this so that the "json100mb.db"
file lands in the directory from which you will run tests, not in
the test/json subdirectory of the source tree.
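  (The generated database consists of a single table, "data1", whose "x"
  column holds the JSON text; the queries in the steps below assume this
  schema.)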
* Make a copy of "json100mb.db" named "jsonb100mb.db" (the same name,
  but with the prefix changed from "json" to "jsonb").
* Bring up jsonb100mb.db in the sqlite3 command-line shell.
  Convert all of the content into JSONB using commands like these:
> UPDATE data1 SET x=jsonb(x);
> VACUUM;
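  The VACUUM rebuilds the database file so that space freed by the
  conversion to JSONB is reclaimed.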
* Build the baseline sqlite3.c file with sqlite3.h and shell.c.
> make clean sqlite3.c
* Run "`sh json-speed-check.sh trunk`". This creates the baseline
profile in "jout-trunk.txt" for the preformance test using text JSON.
* Run "`sh json-speed-check.sh trunk --jsonb`". This creates the
baseline profile in "joutb-trunk.txt" for the performance test
  for processing JSONB.
* (Optional) Verify that the json100mb.db database really does contain
approximately 100MB of JSON content by running:
> SELECT sum(length(x)) FROM data1;
> SELECT * FROM data1 WHERE NOT json_valid(x);
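  The first query should report a total near 100,000,000 bytes; the second
  should return no rows, since every row is expected to hold valid JSON.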
# 3.0 Testing
* Build the sqlite3.c (with sqlite3.h and shell.c) to be tested.
* Run "`sh json-speed-check.sh x1`". The profile output will appear
in jout-x1.txt. Substitute any label you want in place of "x1".
* Run "`sh json-speed-check.sh x1 --jsonb`". The profile output will appear
in joutb-x1.txt. Substitute any label you want in place of "x1".
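  The profiles are plain text, so any diff tool can compare a test profile
  against its baseline; for example, using the "fossil xdiff" command from
  the prerequisites:
> fossil xdiff jout-trunk.txt jout-x1.txt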
* Run the script shown below in the CLI. The script scans the 100MB
  database 25 times, or about 2500 MB of JSON in total, so dividing
  2500 by the real elapsed time from this test gives an estimate of
  the number of MB/s that the JSON parser is able to process.
> .open json100mb.db
> .timer on
> WITH RECURSIVE c(n) AS (VALUES(1) UNION ALL SELECT n+1 FROM c WHERE n<25)
> SELECT sum(json_valid(x)) FROM c, data1;
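  For example, if ".timer on" reports a real elapsed time of 10.0 seconds,
  the estimated parsing rate is 2500/10.0 = 250 MB/s.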