Alpha: ZealPHP is early-stage and under active development. APIs may change between minor versions until v1.0. Feedback and bug reports are welcome on GitHub.

Benchmarks

Real machine, full methodology, every CSV linked. Reproduce the numbers yourself before quoting them.

Setup  |  AMD Ryzen 9 7900X (12 cores) · 24 GB RAM · Ubuntu 22.04 (Docker) · PHP 8.3.31 · OpenSwoole 26.2.0 · Node.js 24.11.1 · ab -n 50000 -c 200 -k -l · 4 workers, each runtime tested alone
117k req/s text · avg 1.7 ms
106k req/s JSON · avg 1.9 ms
50k req/s template · avg 4.0 ms
0 failures / 150k reqs

Three findings worth highlighting

1. OpenSwoole's raw HTTP outperforms Node's

Before any framework or middleware loads — bare HTTP server, single handler returning text/JSON:

Runtime          | /raw/bench (text) | /json
OpenSwoole raw   | 142,170 req/s     | 137,535 req/s
Node.js raw http | 129,091 req/s     | 131,513 req/s
Delta            | +10.1%            | +4.6%

Counter-intuitive given the "PHP is slow" prior, but both HTTP servers are native code under the hood: Node's is built on C/C++ (libuv and llhttp), and OpenSwoole is a C++ extension to PHP. They meet head-to-head, and OpenSwoole is fractionally faster on this workload.

2. Framework efficiency: ZealPHP retains 82%, Express retains 15%

The same workload through a full framework with CORS + ETag + sessions + routing + middleware:

Stack                | Raw runtime | Full framework | Retention
ZealPHP / OpenSwoole | 142,170     | 116,851        | 82%
Express / Node.js    | 129,091     | 19,994         | 15%

This is the actual answer to "why does ZealPHP beat Express by 5×". It's not raw speed; it's that each layer added by the framework costs ZealPHP much less throughput than the equivalent layer costs Express.
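
Retention here is simply framework throughput divided by raw-runtime throughput. A quick sketch to recompute it from the figures above (the helper name is ours, not part of the repo):

```shell
# retention: percentage of raw-runtime throughput a framework keeps.
retention() {
  awk -v fw="$1" -v raw="$2" 'BEGIN { printf "%.0f%%\n", 100 * fw / raw }'
}

retention 116851 142170   # ZealPHP on OpenSwoole -> 82%
retention 19994 129091    # Express on Node.js    -> 15%
```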

3. PHP with full middleware reaches 91% of bare Node http

Compose findings #1 and #2 — ZealPHP runs on a faster runtime AND keeps more of that runtime under middleware load. Net result, "PHP with everything turned on" vs "Node with nothing":

Comparison                      | Text    | JSON
ZealPHP full PSR-15             | 116,851 | 105,681
Node.js raw http (no framework) | 129,091 | 131,513
ZealPHP retains                 | 91%     | 80%

Honest framing: ZealPHP doesn't beat hand-rolled Node http. But it gets within 10–20% of it while serving a full PSR-15 middleware stack with sessions, ETag, and reflection-based routing — features bare Node http doesn't have.

Sequential head-to-head — same workload, every stack

Each runtime gets the full 12-core machine in isolation; we don't run them concurrently because that measures the scheduler instead of the framework. ab -n 50000 -c 200 -k -l, warmed up first.
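
The figures below are read off ab's plain-text summary. A minimal awk sketch of that extraction (the helper name is illustrative; field positions match ApacheBench 2.3 output):

```shell
# ab_summary: reduce an ab report on stdin to "req/s · mean ms · failures".
# Illustrative helper, not part of the repo's scripts.
ab_summary() {
  awk '
    /^Requests per second/            { rps  = $4 }  # "Requests per second:  116851.23 [#/sec] (mean)"
    /^Time per request/ && /\(mean\)/ { ms   = $4 }  # first "(mean)" line, not "across all"
    /^Failed requests/                { fail = $3 }
    END { printf "%s req/s · %s ms mean · %s failures\n", rps, ms, fail }
  '
}
```

Usage: `ab -n 50000 -c 200 -k -l http://127.0.0.1:8080/json | ab_summary`.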

Framework | Raw text (/raw/bench) | JSON (/json) | Template (/bench/template)

Runtime (no framework, no middleware)
OpenSwoole raw   | 142,170 | 137,535 |
Node.js raw http | 129,091 | 131,513 |

Full framework (CORS + ETag + sessions + routing + templates)
ZealPHP built-in PSR-15 stack | 116,851 | 105,681 | 49,863
Express.js + cors + etag + express-session + session-file-store + ejs + body-parser | 19,994 | 21,741 | 12,470 (EJS)
ZealPHP vs Express | +484% (5.8×) | +386% (4.9×) | +299% (4.0×)

Other PHP frameworks (community benchmarks, similar workload class)
Slim 4     | ~4,000 req/s
Symfony 7  | ~2,000 req/s
Laravel 11 | ~500 req/s

vs Laravel 11: ~210× · vs Symfony 7: ~55× · vs Slim 4: ~28×

Concurrency sweep — ZealPHP solo across c = 1…1000

Same 4 workers, varying simultaneous connections. Shows where each endpoint saturates, how tail latency degrades, and whether throughput holds at heavy load.

/raw/bench — lean runtime, no demo middleware

c    | req/s   | avg ms | p90 ms | p99 ms | failures
1    | 3,883   | 0.26   | 0      | 0      | 0
10   | 30,501  | 0.33   | 0      | 1      | 0
50   | 94,888  | 0.53   | 1      | 3      | 0
100  | 110,964 | 0.90   | 1      | 6      | 0
200  | 102,156 | 1.96   | 3      | 9      | 0
500  | 100,363 | 4.98   | 8      | 20     | 0
1000 | 85,001  | 11.77  | 19     | 33     | 0

/json — full PSR-15 stack (CORS · ETag · Range · sessions · reflection-injected handler)

c    | req/s   | avg ms | p90 ms | p99 ms | failures
1    | 4,173   | 0.24   | 0      | 0      | 0
10   | 30,840  | 0.32   | 0      | 1      | 0
50   | 105,868 | 0.47   | 1      | 4      | 0
100  | 108,086 | 0.93   | 1      | 6      | 0
200  | 93,733  | 2.13   | 3      | 9      | 0
500  | 95,526  | 5.23   | 8      | 19     | 0
1000 | 77,761  | 12.86  | 19     | 81     | 0

Peak throughput lands at c = 100 and is sustained well past it: at c = 1000 both endpoints still deliver roughly three-quarters of their peak with zero failures — the framework degrades gracefully rather than falling over.
Low-concurrency throughput (c = 1, c = 10) is bounded by Docker localhost-network round-trip latency, not framework cost. Run on bare metal to see higher c = 1 numbers; the c ≥ 50 figures are unaffected.

Raw CSVs: /raw/bench · /json
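
The per-c CSVs come from scripts/bench.sh. As a hedged sketch of what such a sweep amounts to (the loop, CSV layout, and helper names here are illustrative stand-ins, not the script's actual internals), it is one ab run per concurrency level:

```shell
# One ab run per concurrency level; each run contributes one CSV row.
# URL, output path, and parse_rps are illustrative, not repo internals.
parse_rps() { awk '/^Requests per second/ { print $4 }'; }

run_sweep() {
  url=$1; out=$2
  echo "c,req_per_s" > "$out"
  for c in 1 10 50 100 200 500 1000; do
    rps=$(ab -n 50000 -c "$c" -k -l "$url" | parse_rps)
    echo "$c,$rps" >> "$out"
  done
}

# run_sweep http://127.0.0.1:8080/raw/bench sweep.csv
```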

Reproduce on your own machine

Numbers are hardware- and OS-bound. Published figures are a starting point, not a contract. Three harnesses ship with the repo; pick the one that matches the claim you want to verify.

One-line install (Ubuntu/Debian)

Takes a fresh box to a bench-ready clone — installs PHP 8.3, OpenSwoole, uopz, composer, wrk, ab, then clones sibidharan/zealphp to ~/zealphp and runs composer install:

bench-install.sh — root-required, single command
curl -fsSL https://php.zeal.ninja/bench-install.sh | sudo bash
# Prints the next-step bench command when it finishes.

Inspect before piping to sudo: curl -fsSL https://php.zeal.ninja/bench-install.sh | less

Manual install (macOS / inspect-friendly)

macOS (Homebrew)
brew install wrk php composer node
pecl install openswoole uopz
git clone https://github.com/sibidharan/zealphp && cd zealphp && composer install
Linux apt (manual equivalent of the one-liner)
curl -fsSL https://php.zeal.ninja/install.sh | sudo bash   # PHP + openswoole + uopz + composer
sudo apt install -y wrk apache2-utils git
git clone https://github.com/sibidharan/zealphp && cd zealphp && composer install

Verify extensions loaded: php -m | grep -E 'openswoole|uopz'

Recipe 1 — single-stack concurrency sweep (matches the tables above)

scripts/bench.sh
scripts/bench.sh --tool ab --requests 50000 \
                 --workers 4 --threads 4 --task-workers 0 \
                 --paths /raw/bench,/json --p1000
# Output: bench/results/zealphp-<timestamp>.csv + per-c raw logs

Recipe 2 — ZealPHP vs raw Node (matches the head-to-head table)

scripts/bench_compare.sh
scripts/bench_compare.sh --workers 4 --threads 4 --p1000 --duration 30s
# Or via Docker so versions don't matter:
mkdir -p bench/results && docker compose run --rm --build compare

Recipe 3 — 3-way with sample-to-sample variance (autocannon)

A single 30s run can hide 10–15% per-sample swings on noisy hardware. This harness runs 10 short samples per stack spread over time and reports mean ± stddev so you can see how stable each stack is.
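
The mean ± stddev aggregation described here can be sketched in a few lines of awk (the helper name and the sample req/s values below are made up for illustration):

```shell
# mean_stddev: mean and sample standard deviation over one number per line.
# Illustrative of the harness's reporting, not its actual code.
mean_stddev() {
  awk '{ n++; s += $1; ss += $1 * $1 }
       END {
         m  = s / n
         sd = sqrt((ss - n * m * m) / (n - 1))   # sample (n-1) stddev
         printf "%.1f ± %.1f req/s\n", m, sd
       }'
}

printf '%s\n' 19800 20100 19950 20300 19700 | mean_stddev
```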

bench/compare-3way/run.sh
cd /tmp && npm install autocannon express   # one-off
./bench/compare-3way/run.sh                 # ~10 min

Methodology

Field          | Value
Machine        | AMD Ryzen 9 7900X · 12 cores · 24 GB RAM
OS             | Ubuntu 22.04.4 LTS
Runtime        | Docker container · native Linux · near-zero virtualization overhead
PHP            | 8.3.31 (cli, NTS)
OpenSwoole     | 26.2.0
Node.js        | 24.11.1
Benchmark tool | ApacheBench 2.3 (ab -n 50000 -c <c> -k -l)
HTTP workers   | 4 (deliberate — keeps the result comparable to typical mid-tier app server sizing)
Task workers   | 0
Warmup         | 5s per path/runtime before measurement
Sample size    | 50,000 requests per concurrency level
Sweep          | c = 1, 10, 50, 100, 200, 500, 1000
Method         | Each runtime tested alone with full machine resources — never simultaneously

Endpoints under test

Path            | Returns                        | What it exercises
/raw/bench      | plain text (~20 bytes)         | Bare framework dispatch path with no demo middleware. Routing only.
/json           | JSON of G::instance()->session | Full PSR-15 stack — CORS · ETag · Range · Compression · coroutine-safe sessions · reflection-injected handler · auto-JSON.
/bench/template | ~6 KB HTML                     | Same as /json + template rendering with App::render().

Caveats — read before quoting

  • Single-machine numbers. Your hardware, OS limits, payload size, and middleware set will move these. Quote your own measurements.
  • Docker localhost RTT. c = 1 and c = 10 throughput is bounded by Docker's localhost networking overhead, not framework cost. Bare-metal runs typically post c = 1 closer to 15k-20k req/s.
  • 4 workers ≈ 4 cores. Deliberate baseline. The framework is multi-process; doubling workers on a wider machine scales further until you saturate I/O or coroutine context-switching.
  • Express comparison is fair. Express runs with cors + etag + express-session + session-file-store + ejs + body-parser — middleware roughly equivalent to ZealPHP's built-in PSR-15 stack. We're not comparing bare Express to full-stack ZealPHP.
  • "Other PHP frameworks" numbers are community benchmarks, not measured on this box. They're rough orders of magnitude; we don't claim 1.0% precision.

Source: PERF.md · Raw CSVs: bench/results/ryzen-sweep/ · Scripts: scripts/ · bench/compare-3way/