The Pyston project has published the set of macro benchmarks it uses. We can work on incorporating those into the pyperformance suite.
TODO:
Steps to incorporate Pyston benchmarks on a benchmarking host:
- clone the Pyston benchmarks repo
- create a manifest file with just an "includes" section, containing references to the manifests you're combining
- add a "manifest=" entry to the config file you're passing to
  pyperformance compile ...
- proceed as usual
More concretely:
#git clone --branch benchmark-management https://github.com/ericsnowcurrently/python-performance pyperformance
#git clone --branch pyperformance https://github.com/ericsnowcurrently/pyston-macrobenchmarks
git clone https://github.com/pyston/python-macrobenchmarks pyston-macrobenchmarks
cat > MANIFEST.combined << EOF
[includes]
<default>
pyston-macrobenchmarks/benchmarks/MANIFEST
EOF
cat > ./cpython-perf.ini << EOF
[config]
...
[scm]
...
[compile]
...
[run_benchmark]
manifest = MANIFEST.combined
...
EOF
PYTHONPATH=./pyperformance python3 -m pyperformance compile ./cpython-perf.ini main
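Before kicking off a compile run, it can be worth sanity-checking that the combined manifest came out as intended. The helper below is a minimal sketch (hypothetical, not part of pyperformance) that assumes the simple line-based format shown above: an "[includes]" header followed by one entry per line:

```python
# Minimal sanity check (hypothetical helper, not part of pyperformance)
# for the combined manifest created above. Assumes the simple format
# shown earlier: an "[includes]" header followed by one entry per line.

def read_includes(path):
    """Return the entries listed under the [includes] section of a manifest."""
    entries = []
    in_includes = False
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if not line:
                continue
            if line.startswith("[") and line.endswith("]"):
                # Track which section we are in; only collect from [includes].
                in_includes = (line == "[includes]")
                continue
            if in_includes:
                entries.append(line)
    return entries

# Recreate the manifest from the steps above so this sketch is self-contained.
with open("MANIFEST.combined", "w") as f:
    f.write("[includes]\n<default>\npyston-macrobenchmarks/benchmarks/MANIFEST\n")

entries = read_includes("MANIFEST.combined")
assert "<default>" in entries  # default pyperformance suite still included
assert "pyston-macrobenchmarks/benchmarks/MANIFEST" in entries
```

If either assertion fails, the compile step would silently run against the wrong benchmark set, so catching it here is cheap insurance.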