Hi,
I am running the FoundationDB synthetic random read/write benchmark, and I am having some trouble understanding the results. The output below is from one of the client processes.
startingConfiguration: start
setting up test (RandomReadWriteTest)…
running test (RandomReadWriteTest)…
RandomReadWriteTest complete
checking test (RandomReadWriteTest)…
fetching metrics (RandomReadWriteTest)…
Metric (0, 0): Measured Duration, 300.000000, 300
Metric (0, 1): Transactions/sec, 341.896667, 342
Metric (0, 2): Operations/sec, 3452.866667, 3.45e+03
Metric (0, 3): A Transactions, 92399.000000, 92399
Metric (0, 4): B Transactions, 10170.000000, 10170
Metric (0, 5): Retries, 523.000000, 523
Metric (0, 6): Mean load time (seconds), 2.653203, 2.65
Metric (0, 7): Read rows, 101700.000000, 1.02e+05
Metric (0, 8): Write rows, 934160.000000, 9.34e+05
Metric (0, 9): Mean Latency (ms), 579.081821, 579
Metric (0, 10): Median Latency (ms, averaged), 475.733280, 476
Metric (0, 11): 90% Latency (ms, averaged), 1270.120382, 1.27e+03
Metric (0, 12): 98% Latency (ms, averaged), 1633.173466, 1.63e+03
Metric (0, 13): Max Latency (ms, averaged), 4480.281830, 4.48e+03
Metric (0, 14): Mean Row Read Latency (ms), 1.397406, 1.4
Metric (0, 15): Median Row Read Latency (ms, averaged), 1.105785, 1.11
Metric (0, 16): Max Row Read Latency (ms, averaged), 123.408794, 123
Metric (0, 17): Mean Total Read Latency (ms), 0.355296, 0.355
Metric (0, 18): Median Total Read Latency (ms, averaged), 0.000000, 0
Metric (0, 19): Max Total Latency (ms, averaged), 123.408794, 123
Metric (0, 20): Mean GRV Latency (ms), 561.424525, 561
Metric (0, 21): Median GRV Latency (ms, averaged), 458.384752, 458
Metric (0, 22): Max GRV Latency (ms, averaged), 1939.560652, 1.94e+03
Metric (0, 23): Mean Commit Latency (ms), 6.001462, 6
Metric (0, 24): Median Commit Latency (ms, averaged), 3.398657, 3.4
Metric (0, 25): Max Commit Latency (ms, averaged), 149.589062, 150
Metric (0, 26): Read rows/sec, 339.000000, 339
Metric (0, 27): Write rows/sec, 3113.866667, 3.11e+03
Metric (0, 28): Bytes read/sec, 178992.000000, 1.79e+05
Metric (0, 29): Bytes written/sec, 1644121.600000, 1.64e+06
I want to know what Metric (0, 9): Mean Latency (ms), 579.081821, 579 represents here. The mean total read latency and the mean commit latency both look fine, but the mean latency is more than 0.5 seconds, which is quite a lot. Could you kindly explain how the mean latency is calculated in the benchmark?
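From the numbers above, my guess is that Mean Latency is measured over the whole transaction, from get-read-version through commit, rather than over any single operation. As a rough sanity check under that assumption: Mean GRV Latency + Mean Total Read Latency + Mean Commit Latency = 561.4 + 0.36 + 6.0 ≈ 568 ms, which is close to the 579 ms Mean Latency reported, so the GRV latency would account for almost all of it. Is that the right way to read this metric?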