In information theory, certain unresolved problems quietly refuse to go away. The binary deletion channel is one of them.
Even though it is one of the simplest objects you can imagine, a stream of bits in which each bit is independently dropped with a fixed probability d, we still have no closed-form expression for its capacity more than fifty years after Levenshtein first described it. That alone says something. In less time, entire subfields of communication theory have been opened and closed.
| Field | Details |
|---|---|
| Topic | Capacity Upper Bounds for Deletion-Type Channels |
| Field of Study | Information Theory, Coding Theory, Probability |
| Originating Concept | Introduced by Levenshtein (1965); coding theorem by Dobrushin (1967) |
| Key Channels Examined | Binary Deletion Channel (BDC), Poisson-Repeat Channel |
| Notable Constant | Golden ratio φ = (1+√5)/2 ≈ 1.618 |
| Methodology | Convex programming, real analysis, genie-aided decoder framework |
| Earlier Lower Bound | Mitzenmacher–Drinea: c₁(1−d), c₁ ≈ 0.1185 |
| New Upper Bound (d ≥ 1/2) | (1−d)·log₂ φ |
| Limiting Case Bound (d → 1) | 0.7918·(1−d) |
| Practical Relevance | DNA storage, packet networks, synchronization-error correction |
| Open Problem Status | No single-letter capacity characterization yet |
But the recent developments feel different. For years the upper bounds barely moved while the lower bounds kept improving. The trivial 1−d ceiling, borrowed from the erasure channel, sat there like an outdated piece of furniture that no one wanted to move. Perhaps the field had quietly made its peace with it. Now a wave of work built on convex programming and real analysis, a rather classical toolkit frankly, has begun producing the first nontrivial, fully explicit upper bounds for values of d well away from the simple limiting cases.
The simplicity of the headline result is striking. For deletion probability d ≥ 1/2, the capacity is at most (1−d)·log₂ φ, where φ = (1+√5)/2 is the golden ratio. The appearance of φ, a constant more often associated with sunflower spirals and Renaissance architecture than with noisy channels, has an almost playful quality.

For d < 1/2, assuming the capacity is convex as a function of d, the bound becomes 1 − d·log₂(4/φ). Neither bound is buried inside computer-assisted numerical optimization. Both are written in elementary functions, on paper.
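As a quick sanity check, the two expressions are easy to evaluate and compare against the trivial erasure ceiling and the Drinea–Mitzenmacher lower bound from the table above. A minimal sketch (the function names are mine, not the paper's):

```python
import math

PHI = (1 + math.sqrt(5)) / 2  # golden ratio, ~1.618


def upper_bound(d: float) -> float:
    """Explicit capacity upper bound for the binary deletion channel.

    For d >= 1/2 the bound is (1 - d) * log2(phi); for d < 1/2 it is
    1 - d * log2(4 / phi), which additionally assumes the capacity is
    convex in d. The two expressions agree at d = 1/2.
    """
    if d >= 0.5:
        return (1 - d) * math.log2(PHI)
    return 1 - d * math.log2(4 / PHI)


def lower_bound(d: float) -> float:
    """Drinea-Mitzenmacher lower bound, c1 * (1 - d) with c1 ~ 0.1185."""
    return 0.1185 * (1 - d)


for d in (0.3, 0.5, 0.7, 0.9):
    print(f"d={d:.1f}  lower={lower_bound(d):.4f}  "
          f"upper={upper_bound(d):.4f}  erasure={1 - d:.4f}")
```

For every d in (0, 1) the upper bound sits strictly below the old 1−d ceiling, which is exactly what makes it the first fully explicit nontrivial bound.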
This work pays equal attention to the Poisson-repeat channel, introduced by Mitzenmacher and Drinea in 2006. The framing there is genuinely surprising. The Poisson version, which replaces each bit with a Poisson-distributed number of copies, ought to be messier than the plain deletion channel. The opposite turns out to be closer to the truth. The Poisson-repeat channel is more analytically tractable, and the authors conjecture, against intuition, that a thorough understanding of it could be the key to finally cracking the deletion channel itself. It is a connection researchers believe had gone largely unnoticed.
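To make the two channel models concrete, here is a minimal simulation sketch (my own illustrative code, assuming the standard definitions; the function names are mine):

```python
import math
import random


def deletion_channel(bits, d, rng=random):
    """Each input bit is independently deleted with probability d."""
    return [b for b in bits if rng.random() >= d]


def poisson_repeat_channel(bits, lam, rng=random):
    """Each input bit is replaced by a Poisson(lam) number of copies.

    A draw of zero (probability e^-lam) deletes the bit entirely,
    which is how deletions hide inside this model.
    """
    out = []
    threshold = math.exp(-lam)
    for b in bits:
        # Knuth's method for sampling a Poisson variate.
        k, p = 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                break
            k += 1
        out.extend([b] * k)
    return out
```

Note that neither channel marks where bits were lost or duplicated; the receiver sees only a shorter or longer bit stream, which is precisely what makes synchronization errors so much harder than erasures.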
Beneath all of this is an elegant technical move. Bounding the capacity is reduced to maximizing a real-valued, single-variable, often concave function over a bounded interval. Mathematically speaking, concave functions on bounded intervals are good citizens: they can be solved analytically in special cases (d = 1/2, for example) or handed to a numerical solver. It is the kind of reformulation that makes you wonder why no one posed the question this way before, though hindsight is generous on that point.
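That core optimization really is nothing exotic. A concave function of one variable on a bounded interval can be maximized by, fittingly enough, golden-section search. Here is the standard textbook routine as a generic sketch (not the paper's actual objective; binary entropy stands in as a familiar concave example):

```python
import math


def golden_section_max(f, a, b, tol=1e-10):
    """Maximize a unimodal (e.g. concave) f on [a, b] by golden-section search.

    Each pass shrinks the bracket by a factor of 1/phi; f is re-evaluated
    every iteration for clarity rather than cached.
    """
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi ~ 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) > f(d):
            b, d = d, c          # maximum lies in [a, d]
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d          # maximum lies in [c, b]
            d = a + inv_phi * (b - a)
    return (a + b) / 2


# Binary entropy H(p) is concave on [0, 1] and maximized at p = 1/2.
def H(p):
    if p in (0, 1):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)


p_star = golden_section_max(H, 0.0, 1.0)  # converges to 0.5
```

This is exactly what "good citizens" means in practice: a one-dimensional concave objective needs no global search machinery, only a shrinking bracket.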
As this unfolds, it is hard to ignore how much of contemporary information theory still rests on careful mathematical craftsmanship rather than brute-force computation. The genie-aided decoder, in which you imagine an idealized receiver who knows precisely when entire runs have been deleted, is a decades-old trick, and it is still doing heavy lifting here. Whether these new bounds will eventually close the gap that has stood since the 1960s remains unclear. But the wall has finally moved. In this corner of the field, that counts as news.
