Measure JSON: bytes, minified vs pretty, gzip estimate, depth, and the heaviest keys.
About JSON Size Analyzer
JSON Size Analyzer breaks down the anatomy of a JSON payload: raw byte size, minified size, estimated gzip size, total node count, maximum depth, and a ranked breakdown of the heaviest top-level keys by share of the payload. Use it to spot payload bloat before shipping an API response, or to decide what to paginate, compress, or move out-of-band.
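The heaviest-keys idea is easy to reproduce outside the tool. Below is a minimal Python sketch (the function name and the exact sizing rule are my assumptions, not the tool's implementation): it sizes each top-level key's minified serialization and ranks keys by share of the whole payload.

```python
import json

def heaviest_keys(text):
    """Rank top-level keys by minified byte size and share of the payload.

    An illustrative sketch: sizes each value with json.dumps using compact
    separators, which approximates 'share of the minified payload'.
    """
    data = json.loads(text)
    total = len(json.dumps(data, separators=(",", ":")).encode("utf-8"))
    sizes = {
        key: len(json.dumps(value, separators=(",", ":")).encode("utf-8"))
        for key, value in data.items()
    }
    # Largest first: (key, bytes, fraction of total payload)
    return sorted(
        ((k, s, s / total) for k, s in sizes.items()),
        key=lambda item: item[1],
        reverse=True,
    )

for key, size, share in heaviest_keys('{"users":[1,2,3,4,5],"meta":{"page":1}}'):
    print(f"{key}: {size} B ({share:.0%})")
```

Note the shares do not sum to 100%: the braces, commas, and key names of the enclosing object take up the remainder.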
How it Works
1. Paste or upload a JSON document.
2. The analyzer parses the structure in your browser; nothing is uploaded.
3. Review the headline metrics (raw bytes, minified size, gzip estimate, depth, node count) and the heaviest-keys table.
4. Use the findings to decide what to paginate, compress, or strip before serialising.
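The headline metrics can be approximated in a few lines of Python. This is a sketch under my own assumptions (compact separators for minification, stdlib gzip at level 6 as the compression stand-in), not the tool's browser-side code:

```python
import gzip
import json

def analyze(text):
    """Compute raw/minified/gzip byte sizes, node count, and max depth."""
    data = json.loads(text)
    minified = json.dumps(data, separators=(",", ":"), ensure_ascii=False)

    def walk(value, depth=1):
        # Returns (node_count, max_depth) for the subtree rooted at `value`.
        if isinstance(value, dict):
            children = [walk(v, depth + 1) for v in value.values()]
        elif isinstance(value, list):
            children = [walk(v, depth + 1) for v in value]
        else:
            return 1, depth  # a primitive is one node
        nodes = 1 + sum(n for n, _ in children)
        max_depth = max((d for _, d in children), default=depth)
        return nodes, max_depth

    nodes, depth = walk(data)
    return {
        "raw_bytes": len(text.encode("utf-8")),
        "minified_bytes": len(minified.encode("utf-8")),
        "gzip_bytes": len(gzip.compress(minified.encode("utf-8"), compresslevel=6)),
        "nodes": nodes,
        "max_depth": depth,
    }

print(analyze('{"a": [1, 2]}'))
```

The gzip figure here compresses the minified form, which is the number you usually care about for an API response served with `Content-Encoding: gzip`.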
Frequently Asked Questions
How accurate is the gzip estimate?
The gzip size is produced by the browser's native CompressionStream API where available, so it's a real gzip number (zlib's default level, roughly 6), not a heuristic. On browsers without CompressionStream the field shows 'Unavailable' instead of a fake value.
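For a rough offline equivalent, Python's stdlib `gzip` at `compresslevel=6` approximates what a default-level gzip stream produces. A hedged sketch (the payload shape is an invented example):

```python
import gzip
import json

# An invented, repetitive payload: compresses well, like most real API responses.
payload = json.dumps(
    {"items": [{"id": i, "name": f"user-{i}"} for i in range(200)]},
    separators=(",", ":"),
).encode("utf-8")

# Level 6 is zlib's default and close to what gzip implementations emit by default.
estimate = len(gzip.compress(payload, compresslevel=6))
print(f"{len(payload)} B raw -> {estimate} B gzipped")
```

Exact byte counts will differ slightly between zlib builds, so treat any such number as an estimate within a few percent, not a contract.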
What counts as a 'node'?
Every object, array, and primitive value is one node. A three-field object whose values are primitives counts as four nodes (the object plus the three values).
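The counting rule above is a five-line recursion. A sketch (function name is mine) that reproduces the four-node example:

```python
import json

def count_nodes(value):
    """Count nodes: every object, array, and primitive is one node."""
    if isinstance(value, dict):
        return 1 + sum(count_nodes(v) for v in value.values())
    if isinstance(value, list):
        return 1 + sum(count_nodes(v) for v in value)
    return 1  # string, number, bool, or null

# The object itself plus its three primitive values: 4 nodes.
print(count_nodes(json.loads('{"a": 1, "b": 2, "c": 3}')))
```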
Why is my minified size bigger than the raw?
Usually it isn't: if your 'raw' input already lacks whitespace between tokens, minified ≈ raw. When the minified size is genuinely larger, the re-serializer encoded some characters differently than the input did — e.g. raw UTF-8 characters in the input were re-emitted as longer \u escape sequences.
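This effect is easy to demonstrate with Python's `json` module, whose default `ensure_ascii=True` escapes non-ASCII characters on output (an illustration of the mechanism, not the analyzer's exact serializer):

```python
import json

raw = '{"a":"é"}'  # already minified; é stored as two raw UTF-8 bytes

# Default ensure_ascii=True re-emits é as the six-byte sequence \u00e9.
minified = json.dumps(json.loads(raw), separators=(",", ":"))

print(len(raw.encode("utf-8")))       # 10 bytes
print(len(minified.encode("utf-8")))  # 14 bytes: "minified" is larger
```

Serializing with `ensure_ascii=False` instead keeps the raw UTF-8 bytes and avoids the growth.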