Mathematics · Physics · History

The Pizza Problem That Unlocks the Secret of Falling Objects

How counting pizza slices, a 350-year-old trick with differences, and Galileo's ramps all tell the same beautiful story

You're at a party. Someone hands you a pizza and a knife. One straight cut and you've got 2 pieces. Another cut — if it crosses the first — gives you 4. A third, crossing both previous cuts, yields 7. Your friend bets you can't get more than 10 pieces from four cuts. Should you take the bet?

This humble question — what is the maximum number of regions created by n straight cuts across a disk? — is a gateway into one of the most elegant intersections of combinatorics, numerical analysis, and physics. Its answer connects a technique Newton refined in the 17th century to the very law Galileo discovered rolling balls down ramps in Padua. The thread that ties them together is surprisingly simple: the pattern hiding inside differences.

I. The Pizza Problem

Combinatorics

Let's set up the rules precisely. Each cut is a full straight line across the pizza (not a chord ending early). To maximize pieces, every new line must cross all the previous ones, and no three lines may meet at a single point. Under these conditions, what is the maximum number of regions R(n) for n cuts?

Try it yourself below. Add cuts one at a time and watch the regions multiply. Every region is colored distinctly so you can count them.

Interactive: Pizza Slice Simulator


Each new line is arranged so that every pair of lines intersects inside the circle, with no three lines meeting at a single point — guaranteeing the maximum number of regions. Every region is flood-filled with a distinct color so you can verify the count.

Count carefully and you'll build up this table:

Cuts (n)      0   1   2   3   4    5    6
Regions R(n)  1   2   4   7   11   16   22

The sequence 1, 2, 4, 7, 11, 16, 22 … doesn't jump out as an obvious formula. The differences between terms are 1, 2, 3, 4, 5, 6 — the natural numbers themselves. And the differences of those differences are all 1. One more step and we hit all zeros. This is the telltale signature of a quadratic — a polynomial of degree 2. But how do we extract the exact formula?

Enter a technique that Newton himself championed.

• • •

II. Newton's Divided Differences

Numerical Analysis

A Brief History of Divided Differences

Isaac Newton
Isaac Newton (1643–1727)

The story of divided differences weaves through some of the greatest names in mathematics. Henry Briggs, who computed the first base-10 logarithm tables in the 1620s, was arguably the first to use finite differences systematically — he needed them to interpolate between computed values. Isaac Newton, in his Principia (1687) and unpublished manuscripts from the 1670s, formalized the forward difference interpolation formula that now bears his name. He used it not for abstract theory but for practical astronomy — predicting comet trajectories from sparse observational data.

What many don't know is that James Gregory, the Scottish mathematician, independently discovered the same interpolation formula around 1670, and the formula is sometimes called the Newton–Gregory formula.

James Gregory
James Gregory (1638–1675)
Later, Brook Taylor (of Taylor series fame) extended these ideas, recognizing finite differences as a discrete analogue of differential calculus — a connection that Leonhard Euler and George Boole would later develop into a full-blown "calculus of finite differences."

George Boole
George Boole (1815–1864)

Boole, in his 1860 treatise A Treatise on the Calculus of Finite Differences, gave the subject its definitive systematic treatment — unifying the scattered techniques of his predecessors into a coherent theory. His work showed that finite differences obey algebraic laws strikingly parallel to those of ordinary calculus, an insight that continues to influence numerical analysis and combinatorics to this day.

Charles Babbage
Charles Babbage (1791–1871)

The technique proved indispensable to human computers — people, not machines — who spent careers building navigation tables, actuarial tables, and ballistic charts. Charles Babbage designed his famous Difference Engine (1822) specifically to automate this method, making it arguably the philosophical ancestor of the modern computer.

The method of divided differences is one of the oldest and most powerful tools in numerical mathematics. The idea is deceptively simple: given a table of data, compute successive "differences" between consecutive values. If those differences eventually become constant (or zero), you've discovered the polynomial hiding in your data — and you can reconstruct it exactly.

The Forward Difference Table

For equally spaced data we use forward differences, denoted by the Greek letter delta (Δ). The first forward difference is Δf(n) = f(n+1) − f(n). The second difference is Δ²f(n) = Δf(n+1) − Δf(n), and so on. Let's apply it to our pizza data:

Forward Difference Table for Pizza Regions

n    R(n)   Δ¹   Δ²   Δ³
0     1      1    1    0
1     2      2    1    0
2     4      3    1    0
3     7      4    1    0
4    11      5    1
5    16      6
6    22

The top entries of each column — 1, 1, 1 — are the key values we need. The third differences all vanish, proving the data is exactly quadratic.

The third differences are all zero. This is the crucial signal. A fundamental theorem in finite calculus states: if the k-th forward differences of a sequence are all zero, then the sequence is generated by a polynomial of degree at most k − 1. Since Δ³ = 0, our pizza function R(n) is a polynomial of degree at most 2 — a quadratic.
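The theorem is easy to check numerically. Here is a minimal sketch in Python (the helper name `difference_rows` is ours, not from any library) that differences a sequence until a row of zeros appears:

```python
# A sketch: compute successive forward-difference rows of a sequence
# until a row of all zeros appears.
def difference_rows(seq):
    rows = [list(seq)]
    while any(rows[-1]):                     # stop once a row is all zeros
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    return rows

regions = [1, 2, 4, 7, 11, 16, 22]           # the pizza sequence
for row in difference_rows(regions):
    print(row)
```

Fed the pizza data, it prints the rows 1, 2, 3, 4, 5, 6, then 1, 1, 1, 1, 1, then all zeros: second differences constant, third differences zero, hence a quadratic.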

Newton's Forward Difference Formula

Newton's interpolation formula lets us reconstruct the polynomial using the top row of the difference table — the highlighted values 1, 1, 1 from our forward difference table above. For equally spaced points starting at n = 0:

R(n) = Δ⁰R(0) · C(n,0) + Δ¹R(0) · C(n,1) + Δ²R(0) · C(n,2)
R(n) = 1 · 1 + 1 · n + 1 · n(n−1)/2

R(n) = 1 + n + n(n−1)/2 = (n² + n + 2) / 2

Here C(n,k) denotes the binomial coefficient "n choose k." Plugging in n = 4 gives R(4) = (16 + 4 + 2)/2 = 11. Your friend was wrong — and you should have taken the bet.
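The whole reconstruction fits in a few lines of Python. A sketch (`newton_forward` is our own illustrative name; `math.comb` supplies the binomial coefficient C(n, k)):

```python
from math import comb

# Sketch of Newton's forward-difference formula: rebuild R(n) from the
# top entries of the difference table using binomial coefficients.
def newton_forward(top_row, n):
    # top_row holds R(0), Δ¹R(0), Δ²R(0), ... read off the table's top edge
    return sum(d * comb(n, k) for k, d in enumerate(top_row))

top = [1, 1, 1]                                  # the table's top entries
print([newton_forward(top, n) for n in range(7)])
# → [1, 2, 4, 7, 11, 16, 22], and newton_forward(top, 4) == 11
```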

But this formula does more than answer a party trick. It demonstrates a profound principle: when differences stabilize, pattern becomes law. And this exact same principle was waiting, centuries earlier, in a physics laboratory in Italy.

• • •

III. Galileo's Tower and the Secret of Falling

Physics
Galileo Galilei
Galileo Galilei (1564–1642)

In the 1600s, Galileo Galilei set out to understand the most basic motion in nature: the fall of a heavy body. Legend has it he dropped objects from the Leaning Tower of Pisa; what's certain is that he meticulously timed falling objects and discovered a stunning regularity.

What Galileo recorded was distance versus time. He found that the distances traveled in successive equal time intervals followed the pattern of odd numbers: 1, 3, 5, 7, 9 … And the cumulative distances — 1, 4, 9, 16, 25 — were perfect squares. Distance, he concluded, grows as the square of time.

But let's pretend we don't know this. Let's play Galileo ourselves and use the tool Newton would later formalize — divided differences — to extract the law of motion from raw data. Watch the ball drop and notice how the gaps between time markers grow wider — the unmistakable signature of acceleration.

Interactive: Galileo's Free-Fall Experiment

Free fall under gravity: g = 32 ft/s². Distance markers appear at each half-second. The widening gaps between marks reveal that the ball is accelerating — covering more distance in each successive interval.

Here is the data for a ball dropped from rest under gravity (g = 32 ft/s²), with distance measured in feet at half-second intervals. Since d = ½gt², we get d = 16t²:

Forward Difference Table for Free-Fall Data

Time t (s)   Distance d (ft)   Δ¹   Δ²   Δ³
0.0             0               4    8    0
0.5             4              12    8    0
1.0            16              20    8    0
1.5            36              28    8
2.0            64              36
2.5           100

Look at the structure. The first differences — 4, 12, 20, 28, 36 — grow steadily. The second differences are constant at 8, which is exactly g(Δt)² = 32 × (0.5)². The third differences are zero. The same signal we found in the pizza problem: a quadratic hiding in the data.

But there's a beautiful pattern in the raw distances too. The values 0, 4, 16, 36, 64, 100, divided by 4, give 0, 1, 4, 9, 16, 25 — perfect squares. And the ratios between successive intervals follow 1 : 3 : 5 : 7 : 9 — the odd numbers that Galileo himself marveled at. Distance grows as the square of time, and the constant second difference is the numerical fingerprint of uniform acceleration.

Cumulative distances:   0,   4,   16,   36,   64,   100
Divide by 4:   0,   1,   4,   9,   16,   25 = perfect squares

Ratios of successive intervals:   1 : 3 : 5 : 7 : 9 = odd numbers

Constant Δ²  ⟹  constant acceleration  ⟹  d ∝ t²

"The distances traversed during equal intervals of time by a body falling from rest stand to one another in the same ratio as the odd numbers beginning with unity." — Galileo Galilei, Discourses and Mathematical Demonstrations Relating to Two New Sciences (1638)

Galileo stated this as an empirical law. Newton's divided differences give us the machinery to derive it from raw data — no physics intuition required. Feed a table of numbers, compute the differences, and the polynomial writes itself.
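Galileo's table yields to the same few lines. A minimal sketch, using the d = 16t² data from the table above (`k_poly` is an illustrative name, not a library function):

```python
from math import comb

# Play Galileo with Newton's machinery: difference the free-fall table
# and read the quadratic law off the top of the difference table.
dists = [0, 4, 16, 36, 64, 100]                 # feet, at t = 0.0, 0.5, ..., 2.5 s

d1 = [b - a for a, b in zip(dists, dists[1:])]  # first differences
d2 = [b - a for a, b in zip(d1, d1[1:])]        # second differences
print(d1)  # [4, 12, 20, 28, 36] -- 4 times the odd numbers 1, 3, 5, 7, 9
print(d2)  # [8, 8, 8, 8] -- constant: uniform acceleration, g*(Δt)² = 8

# Newton's formula in the step index k (so t = k/2):
# d(k) = 0·C(k,0) + 4·C(k,1) + 8·C(k,2) = 4k + 4k(k−1) = 4k², i.e. 16t²
def k_poly(k):
    return 0 * comb(k, 0) + 4 * comb(k, 1) + 8 * comb(k, 2)

print([k_poly(k) for k in range(6)])  # recovers [0, 4, 16, 36, 64, 100]
```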

• • •

IV. What the Differences Reveal

Synthesis

Step back and consider what just happened. We took two problems from entirely different worlds — one a puzzle about slicing a pizza, the other a fundamental question about how objects fall — and cracked both open with the same simple act: subtraction.

Galileo stared at distances piling up — 1, 4, 9, 16, 25 — and subtracted his way to a constant. That constant was gravity. Newton aimed the same machinery at the heavens. Babbage was so captivated he spent his fortune building a machine to do nothing but subtract — the Difference Engine, philosophical ancestor of the computer you're reading this on.

From a dinner table puzzle to the laws of the cosmos — all it took was subtraction.