```
./run.sh [sr|pp|all] [--local-only] [osu|taiko|catch|mania] [osu|taiko|catch|mania]
         ^^^^^^^^^^^ ^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
          arg:mode   opt:local-only          arg:ruleset (one-or-more)
```
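A minimal sketch of how `run.sh` might parse these arguments (variable names and validation are illustrative assumptions, not the actual script):

```bash
#!/usr/bin/env bash
set -euo pipefail

if [[ $# -lt 2 ]]; then
    echo "usage: ./run.sh [sr|pp|all] [--local-only] [osu|taiko|catch|mania]..." >&2
    exit 1
fi

# First positional argument: the mode (sr, pp or all).
MODE="$1"; shift

LOCAL_ONLY=0
RULESETS=()

# The --local-only flag and one or more rulesets may follow in any order.
for arg in "$@"; do
    case "$arg" in
        --local-only) LOCAL_ONLY=1 ;;
        osu|taiko|catch|mania) RULESETS+=("$arg") ;;
        *) echo "unknown argument: $arg" >&2; exit 1 ;;
    esac
done

[[ "$MODE" =~ ^(sr|pp|all)$ ]] || { echo "mode must be sr, pp or all" >&2; exit 1; }
(( ${#RULESETS[@]} > 0 )) || { echo "at least one ruleset is required" >&2; exit 1; }
```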
For each ruleset, the tool generates a set of .csv outputs from the database via the queries described at the bottom of this document:
- If the `mode` arg is `sr` or `all`:
  - `[ruleset]_sr_all_gains.csv` - Star rating diffs for all mods sorted by gains.
  - `[ruleset]_sr_all_losses.csv` - Star rating diffs for all mods sorted by losses.
  - `[ruleset]_sr_nomod_gains.csv` - Star rating diffs for no-mod sorted by gains.
  - `[ruleset]_sr_nomod_losses.csv` - Star rating diffs for no-mod sorted by losses.
- If the `mode` arg is `pp` or `all`:
  - `[ruleset]_pp_all_gains.csv` - PP diffs for all mods sorted by gains.
  - `[ruleset]_pp_all_losses.csv` - PP diffs for all mods sorted by losses.
  - `[ruleset]_pp_nomod_gains.csv` - PP diffs for no-mod sorted by gains.
  - `[ruleset]_pp_nomod_losses.csv` - PP diffs for no-mod sorted by losses.
The user should additionally be able to access the database themselves, without relying on the generated .csv outputs.
A cloned repository should primarily consist of three submodules and a run script:
```
top_level
|   /osu                        (submodule -> https://github.com/ppy/osu)
|   /osu-performance            (submodule -> https://github.com/ppy/osu-performance)
|   /osu-difficulty-calculator  (submodule -> https://github.com/ppy/osu-difficulty-calculator)
|   run.sh
```
To use this tool, users should navigate into each of the submodules and check out the appropriate commits. For example, a change to the performance calculators in the `osu` submodule will require a corresponding change in the `osu-performance` submodule. This is because the tool should be usable to test changes before they're merged.
- Set up a MySQL database.
- Download, extract, and import data from https://data.ppy.sh containing the top 10000 scores into two databases.

  ```bash
  curl https://data.ppy.sh/ | grep performance_${{ matrix.ruleset.name }}_top_1000 | tail -1 | awk -F "\"" '{print $2}' | sed 's/\.tar\.bz2//g'
  ```
- Call the databases `osu_master` and `osu_local` respectively.
  - This should be included in the database setup image.
- Download and extract beatmap data from https://data.ppy.sh to `./beatmaps`.

  ```bash
  curl https://data.ppy.sh/ | grep osu_files | tail -1 | awk -F "\"" '{print $2}' | sed 's/\.tar\.bz2//g'
  ```

  - This should persist to the checkout rather than be included in a docker image (it acts as a cache and is mutable between runs).
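A rough sketch of the download/import steps above, using `osu` as the example ruleset (the dump names matched by the greps, the archive layouts, and passwordless root mysql access are assumptions):

```bash
# Locate the latest top-scores dump for the osu ruleset on data.ppy.sh.
SCORES=$(curl -s https://data.ppy.sh/ | grep performance_osu_top_1000 | tail -1 \
    | awk -F "\"" '{print $2}' | sed 's/\.tar\.bz2//g')
curl -sO "https://data.ppy.sh/${SCORES}.tar.bz2"
tar -xjf "${SCORES}.tar.bz2"

# Import the same dump into both databases so master and local start out identical.
# (Assumes the archive extracts to a directory of .sql files named after the dump.)
for DB in osu_master osu_local; do
    mysql -u root -e "CREATE DATABASE IF NOT EXISTS ${DB}"
    cat "${SCORES}"/*.sql | mysql -u root "${DB}"
done

# Fetch the latest beatmap (.osu file) dump into ./beatmaps, which persists between runs.
# (Assumes a single top-level directory inside the archive.)
OSU_FILES=$(curl -s https://data.ppy.sh/ | grep osu_files | tail -1 \
    | awk -F "\"" '{print $2}' | sed 's/\.tar\.bz2//g')
curl -sO "https://data.ppy.sh/${OSU_FILES}.tar.bz2"
mkdir -p beatmaps
tar -xjf "${OSU_FILES}.tar.bz2" -C beatmaps --strip-components=1
```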
Now we'll run `osu-difficulty-calculator` and `osu-performance` in sequence. Each should be run in its own docker container, so as to isolate the build toolchains from the host system.
It's important to note which database each process runs against:

- We always need to run the local checkouts from the structure above against the `osu_local` database.
- If the `--local-only` option is not set, we also need to check out the HEADs of `ppy/osu`, `ppy/osu-difficulty-calculator`, and `ppy/osu-performance`, and repeat the process from those checkouts against the `osu_master` database.
  - This should be done by cloning the repositories into a temporary directory (see the sketch below).
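When `--local-only` is not set, the master-side checkouts could be prepared along these lines (a sketch; shallow clones and the temporary directory handling are assumptions):

```bash
# Clone upstream HEADs into a temporary directory; these runs target osu_master.
MASTER_DIR=$(mktemp -d)
for REPO in osu osu-difficulty-calculator osu-performance; do
    git clone --depth 1 "https://github.com/ppy/${REPO}" "${MASTER_DIR}/${REPO}"
done
```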
Both calculators use a shared `CONCURRENCY` setting (the example usage below raises it to 16):

```bash
CONCURRENCY=4
```
For `osu-difficulty-calculator`:

- Pass `./beatmaps` as a volume.
- Run `./UseLocalOsu.sh`.
- Set environment variables:

  ```
  DB_HOST={docker database}
  DB_NAME=osu_local
  BEATMAPS_PATH={beatmaps volume path}
  SAVE_DOWNLOADED=1
  ALLOW_DOWNLOAD=1
  ```

- Map each ruleset to its respective ID (`osu:0`, `taiko:1`, `catch:2`, `mania:3`).
- Run `dotnet run -c:Release -- all -ac -c {CONCURRENCY}`.
  - Add options to the run string for all relevant ruleset IDs as `-m [ruleset_id]`.
  - If possible, add a global environment variable like `INCLUDE_CONVERTS` to enable/disable the `-ac` option.
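Putting the `osu-difficulty-calculator` steps above together, a rough sketch of what the containerised invocation could look like (the docker network, the `db` hostname, the image tag and the mount paths are illustrative assumptions, not the tool's actual configuration):

```bash
# Assumes a MySQL container named "db" is already running on a user-defined docker
# network called "diffcalc", and that we are at the top level of the checkout.
# The exact project directory to run inside the repository is glossed over here.
docker run --rm \
    --network diffcalc \
    -v "$(pwd):/checkout" \
    -w /checkout/osu-difficulty-calculator \
    -e DB_HOST=db \
    -e DB_NAME=osu_local \
    -e BEATMAPS_PATH=/checkout/beatmaps \
    -e SAVE_DOWNLOADED=1 \
    -e ALLOW_DOWNLOAD=1 \
    mcr.microsoft.com/dotnet/sdk:8.0 \
    bash -c "./UseLocalOsu.sh && dotnet run -c:Release -- all -ac -c ${CONCURRENCY:-4} -m 0 -m 2"
```

Here `-m 0 -m 2` selects the osu and catch rulesets per the ID mapping above; the master-side run would differ only in the checkout being mounted and in using `DB_NAME=osu_master`.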
For `osu-performance`:

- Apply the `osu-performance` patch listed at the bottom of this document.
- Build:

  ```bash
  cd build && cmake .. && make -j && cd ../bin
  ```

- Update the contents of `config.json` as follows:

  ```json
  {
      "mysql.master.host"     : "{docker database}",
      "mysql.master.port"     : 3306,
      "mysql.master.username" : "root",
      "mysql.master.password" : "",
      "mysql.master.database" : "osu_local"
  }
  ```
- Once per ruleset name, run `./osu-performance all -t {CONCURRENCY} -m [ruleset]`.
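A similarly hedged sketch of the `osu-performance` steps, run inside its own container from the `osu-performance` checkout (the patch filename and the `db` hostname are assumptions; the real `config.json` may contain additional keys, so merge rather than overwrite in practice):

```bash
# Apply the patch listed at the bottom of this document (saved here as score-queue.patch).
git apply ../score-queue.patch

# Build, then move into the output directory.
cd build && cmake .. && make -j && cd ../bin

# Point the processor at the dockerised database (values per the config.json snippet above).
# Shown as a full overwrite for brevity.
cat > config.json <<'EOF'
{
    "mysql.master.host"     : "db",
    "mysql.master.port"     : 3306,
    "mysql.master.username" : "root",
    "mysql.master.password" : "",
    "mysql.master.database" : "osu_local"
}
EOF

# Run once per requested ruleset.
for RULESET in osu catch; do
    ./osu-performance all -t "${CONCURRENCY:-4}" -m "${RULESET}"
done
```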
Generate outputs by running the SQL queries listed at the bottom of this document.
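One way the .csv files could be produced from those queries (a sketch; the query file layout and database host are assumptions, and the naive tab-to-comma conversion ignores quoting of fields such as `filename`):

```bash
# Run a saved query in batch mode and convert MySQL's tab-separated output to .csv.
# Per-file variants differ only in the mods filter and the sort direction (see the
# comments inside the queries below).
mysql -h db -u root --batch < queries/osu_sr_all_gains.sql \
    | tr '\t' ',' > osu_sr_all_gains.csv
```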
Example usage:

```bash
# I want to test the changes out and generate a full overview of SR + PP changes made in the osu and catch rulesets.

# First I'll clone the tester repository.
git clone --recurse-submodules ppy/diffcalc-tester
cd diffcalc-tester

# Check out the relevant commits containing the changes.
cd osu
git remote add smoogipoo https://github.com/smoogipoo/osu
git fetch smoogipoo && git checkout XXX

cd ../osu-difficulty-calculator
git remote add smoogipoo https://github.com/smoogipoo/osu-difficulty-calculator
git fetch smoogipoo && git checkout YYY

cd ../osu-performance
git remote add smoogipoo https://github.com/smoogipoo/osu-performance
git fetch smoogipoo && git checkout ZZZ

# A bit of setup. Since I have a beefy PC, I can dedicate 16 cores.
export CONCURRENCY=16

# Run the process.
cd ..
./run.sh all osu catch

# Wait for the process to complete (runs diffcalc, ppcalc, generates .csv data)...

# Upload .csv data to Google Sheets, or add as artifacts in a GH Actions workflow, or output to the command line.
# Can be a separate user step, depending on usage scenario. Most common for me will be the Google Sheets path.

# New changes have been made. I need to re-run and generate more data!
# Since the DB is already in the correct state, I'll run only the local checkouts.
./run.sh all --local-only osu catch

# And once again do stuff with the .csv data.
```
The star rating (SR) diff query:

```sql
SELECT
m.beatmap_id,
m.mods,
b.filename,
m.diff_unified as 'sr_master',
p.diff_unified as 'sr_pr',
(p.diff_unified - m.diff_unified) as 'diff',
(p.diff_unified / m.diff_unified - 1) as 'diff%'
FROM osu_master.osu_beatmap_difficulty m
JOIN osu_local.osu_beatmap_difficulty p
ON m.beatmap_id = p.beatmap_id
AND m.mode = p.mode
AND m.mods = p.mods
JOIN osu_local.osu_beatmaps b
ON b.beatmap_id = p.beatmap_id
WHERE
# Comment/uncomment the following line to generate combined or nomod-only diffs.
m.mods = 0 AND
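# Optionally restrict to a single ruleset by also filtering on beatmap mode,
# using the ruleset IDs mapped above (osu:0, taiko:1, catch:2, mania:3), e.g.:
# m.mode = 0 AND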
abs(m.diff_unified - p.diff_unified) > 0.1
ORDER BY p.diff_unified - m.diff_unified
# If sort-by-gains, use DESC. If sort-by-losses, use ASC.
DESC
LIMIT 10000;
```
The PP diff query (shown here for the catch ruleset):

```sql
SELECT
m.score_id,
m.beatmap_id,
m.enabled_mods,
b.filename,
m.pp as 'pp_master',
p.pp as 'pp_pr',
(p.pp - m.pp) as 'diff',
(p.pp / m.pp - 1) as 'diff%'
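# Note: the high-score table name varies per ruleset
# (osu_scores_high, osu_scores_taiko_high, osu_scores_fruits_high, osu_scores_mania_high).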
FROM osu_master.osu_scores_fruits_high m
JOIN osu_local.osu_scores_fruits_high p
ON m.score_id = p.score_id
JOIN osu_local.osu_beatmaps b
ON b.beatmap_id = p.beatmap_id
WHERE
# Comment/uncomment the following line to generate combined or nomod-only diffs.
m.enabled_mods = 0 AND
abs(m.pp - p.pp) > 0.1
ORDER BY p.pp - m.pp
# If sort-by-gains, use DESC. If sort-by-losses, use ASC.
DESC
LIMIT 10000;
```
The `osu-performance` patch:

```diff
diff --git a/src/performance/Score.cpp b/src/performance/Score.cpp
index b2d4916..3bd16f6 100644
--- a/src/performance/Score.cpp
+++ b/src/performance/Score.cpp
@@ -48,7 +48,7 @@ void Score::AppendToUpdateBatch(UpdateBatch& batch) const
         _scoreId
     ));
-    batch.AppendAndCommitNonThreadsafe(StrFormat("UPDATE `score_process_queue` SET `status` = 1 WHERE `mode` = {0} AND `score_id` = {1};", static_cast<int>(_mode), _scoreId));
+    //batch.AppendAndCommitNonThreadsafe(StrFormat("UPDATE `score_process_queue` SET `status` = 1 WHERE `mode` = {0} AND `score_id` = {1};", static_cast<int>(_mode), _scoreId));
 }
 PP_NAMESPACE_END
```