The Optimal Block Size


In this article I will explain how the size of Bitcoin blocks is determined in the absence of an arbitrary size limit. Oleg has already written an excellent post on this topic, but I have a few things to add and I wish to relate the discussion to the current debate over increasing the block size.

If the maximum limit on block size is removed, it is not the case that blocks will simply grow without limit. There is a natural size that is determined by the market. The block size will change until the marginal revenue of adding more transactions approaches the marginal cost; this could happen because blocks grow until that equilibrium is reached, or because the costs of mining and transmitting blocks change as a result of investment in the capital goods which make up the network, or a combination of the two.

The marginal revenue of a transaction is easy to understand—it is simply the value of the transaction fee relative to the size of the transaction. The marginal cost comes from two sources—the risk of generating an orphaned block and the bandwidth required to receive the transactions. Right now, blocks propagate at different speeds depending on how big they are. This puts a sharp economic limit on their size because it means that each additional transaction in a block increases its chances of being orphaned. If two blocks are generated at roughly the same time, the smaller one will propagate faster and therefore has a greater probability of being accepted throughout the network. This risk limits the block size.
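
To make the marginal-cost side concrete, here is a minimal sketch of the trade-off a miner faces when deciding whether one more transaction is worth including. This is a toy model of my own, not anything from the Bitcoin codebase; the linear propagation delay, the bandwidth figure, and the function names are all illustrative assumptions.

```python
# Toy model (illustrative assumptions only):
#   - propagation delay grows linearly with block size,
#   - a competing block arrives as a Poisson process with a 600 s mean interval,
#   - an orphaned block forfeits the entire reward (subsidy plus fees collected so far).
import math

BLOCK_INTERVAL = 600.0          # seconds, average time between blocks

def orphan_probability(delay_seconds: float) -> float:
    """Chance that a competing block appears while ours is still propagating."""
    return 1.0 - math.exp(-delay_seconds / BLOCK_INTERVAL)

def marginal_orphan_cost(tx_size_bytes: float,
                         relay_bytes_per_sec: float,
                         value_at_risk_btc: float) -> float:
    """Expected BTC lost by making the block tx_size_bytes larger."""
    extra_delay = tx_size_bytes / relay_bytes_per_sec
    return orphan_probability(extra_delay) * value_at_risk_btc

# Example: a 500-byte transaction, ~1 Mbit/s effective relay rate, 25 BTC at risk.
cost = marginal_orphan_cost(500, 125_000, 25.0)
print(f"break-even fee: {cost:.8f} BTC")   # include the tx only if its fee exceeds this
```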

However, there is a proposal from Gavin Andresen which would reduce the cost of block propagation from O(n) to O(1) using invertible Bloom lookup tables. Should this be implemented, the risk of an orphaned block will no longer limit the block size because large blocks will propagate at the same speed as smaller ones. Instead, the bandwidth of the network limits the size of the blocks. This, by the way, is a reason that O(1) block propagation needs to be implemented as quickly as possible: when that happens, the miners will demand to start generating larger blocks.
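
For readers who want to see why set reconciliation changes the economics, here is a minimal invertible Bloom lookup table sketch. It is only a toy built on my own assumptions (the cell layout, hash construction, and parameters are not Gavin Andresen’s proposal or any production implementation), but it shows the key property: the data a peer must receive scales with the difference between two mempools rather than with the size of the block.

```python
import hashlib

NUM_HASHES = 3
NUM_CELLS = 60          # sized for a small expected set difference

def _h(key: int, salt: int) -> int:
    digest = hashlib.sha256(f"{salt}:{key}".encode()).digest()
    return int.from_bytes(digest[:8], "big")

def _cells_for(key: int):
    # one cell in each of NUM_HASHES disjoint regions, so a key never collides with itself
    region = NUM_CELLS // NUM_HASHES
    return [i * region + _h(key, i) % region for i in range(NUM_HASHES)]

def _checksum(key: int) -> int:
    return _h(key, 999)

class IBLT:
    def __init__(self):
        self.cells = [[0, 0, 0] for _ in range(NUM_CELLS)]   # [count, key_xor, checksum_xor]

    def insert(self, key: int) -> None:
        for i in _cells_for(key):
            cell = self.cells[i]
            cell[0] += 1
            cell[1] ^= key
            cell[2] ^= _checksum(key)

    def subtract(self, other: "IBLT") -> "IBLT":
        diff = IBLT()
        for i in range(NUM_CELLS):
            a, b = self.cells[i], other.cells[i]
            diff.cells[i] = [a[0] - b[0], a[1] ^ b[1], a[2] ^ b[2]]
        return diff

    def decode(self):
        """Peel 'pure' cells to list the keys each side has that the other lacks."""
        only_mine, only_theirs = set(), set()
        progress = True
        while progress:
            progress = False
            for cell in self.cells:
                count, key_xor, chk_xor = cell
                if count in (1, -1) and _checksum(key_xor) == chk_xor:
                    (only_mine if count == 1 else only_theirs).add(key_xor)
                    for i in _cells_for(key_xor):      # remove the recovered key everywhere
                        c = self.cells[i]
                        c[0] -= count
                        c[1] ^= key_xor
                        c[2] ^= _checksum(key_xor)
                    progress = True
                    break                              # rescan with fresh cell values
        return only_mine, only_theirs

# A miner and a peer each summarize their view of pending transactions; decoding the
# difference costs space proportional to what the peer is missing, not to the hundred
# transactions they already share.
miner, peer = IBLT(), IBLT()
for txid in range(1000, 1100):      # 100 transactions both sides already have
    miner.insert(txid)
    peer.insert(txid)
miner.insert(7777)                  # one transaction only the miner knows about
print(miner.subtract(peer).decode())   # -> ({7777}, set())
```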

What determines the bandwidth of the network? Right now the Bitcoin network has a problem because nobody pays for relaying transactions other than the people running full nodes. Ultimately this will have to change, because if it does not, essential functions of the network may not be provided adequately. What we should want is for the network to set transaction fees and to adjust so that it relays only those transactions which would be profitable for miners to include in blocks. In turn, miners should be willing to pay to receive transactions which they can mine, and ordinary bitcoin users should be willing to pay to receive transactions which are likely to be included in the latest blocks. Maybe this will one day be accomplished with Bitcoin’s micropayment channels.
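
As one hedged illustration of what "relay only profitable transactions" might mean in practice, here is a hypothetical policy for a relay node: forward a transaction only if its fee rate clears a floor estimated from the cheapest transactions that recent blocks actually included. The function names and the estimator are assumptions made for this example; real node software uses different and more careful rules.

```python
from statistics import median

def fee_rate(tx):
    """tx is a dict like {"fee": satoshis, "size": bytes}."""
    return tx["fee"] / tx["size"]

def relay_floor(recent_blocks, percentile=0.10):
    """Estimate the fee rate below which recent blocks stopped bothering to include txs."""
    cheapest = []
    for block in recent_blocks:            # each block: a list of non-coinbase tx dicts
        rates = sorted(fee_rate(tx) for tx in block)
        if rates:
            cheapest.append(rates[int(len(rates) * percentile)])
    return median(cheapest) if cheapest else 0.0

def should_relay(tx, recent_blocks):
    """Forward a transaction only if relaying it is plausibly worth a miner's while."""
    return fee_rate(tx) >= relay_floor(recent_blocks)

# Example: a node seeing blocks whose cheapest included transactions paid ~10 sat/byte
# would decline to relay a new transaction offering only 2 sat/byte.
```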

This is all related to the debate about increasing the maximum block size because there is a myth, which stems from a poor understanding of the economics of block size, that increasing the maximum block size would reduce Bitcoin’s security. For example, on Bitcoin-Assets, Mircea Popescu has argued:

so average bandwidth in US (where Gavin is based) is 10Mbit. An incearse of .5 per year is like what 15 next then 22.5 after that etc etc < << in 20 years the per-block subsidy will be just about 40 bitcents. at that same time, gavin's block size will be 110 mb. So one block will fit 100x as many tx, and each solved block will yield 50x less in subsidies. that's a 5k drop over 20 years. it's high enough to kill the price, and with it mining, and with all that bring bitcoin back within the financial ability of the us, which is exactly the point of all this fucking derpage.

In light of the foregoing, it should be clear that Mircea, not Gavin, is derping here. First, increasing the maximum block size does not imply that the bandwidth the network actually uses will increase. Raising the limit only provides the option of larger blocks; it does not make them necessary. Miners will tend to allow for more transactions if it is profitable to do so, and they will never make blocks so big that they would be unlikely to propagate throughout the network.

Second, and more importantly, Mircea is thinking only of the cost of higher bandwidth, not of its effect on mining revenue or of the opportunity cost of refusing to let the block size increase. The block reward is declining regardless of what happens to the block size, so Bitcoin’s security must eventually be backed by a new source of revenue. Bitcoin cannot pay miners with new bitcoins forever; a currency that inflated indefinitely to fund mining would have no value today. That means transaction fees and the mass adoption of Bitcoin. If Bitcoin cannot handle a high transaction volume, then it is fundamentally flawed.
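
The decline of the subsidy is fixed by the protocol’s halving schedule, and it is easy to check what that schedule implies. A quick calculation (the year labels assume ten-minute blocks, so they are approximate):

```python
def subsidy_btc(height: int) -> float:
    """Block subsidy under the protocol's halving schedule (every 210,000 blocks)."""
    halvings = height // 210_000
    return 50.0 / (2 ** halvings) if halvings < 64 else 0.0

BLOCKS_PER_YEAR = 6 * 24 * 365          # ~52,560 ten-minute blocks

for year in (2015, 2025, 2035):
    height = (year - 2009) * BLOCKS_PER_YEAR
    print(year, subsidy_btc(height), "BTC per block")
# 2015: 25.0, 2025: 3.125, 2035: 0.78125, regardless of the block size
```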

It is true that an artificial limit on the block size would limit the cost of running a full node, but it would also limit the revenue available from mining. It is therefore not necessarily the case that allowing network bandwidth to increase would lead to a lower hash rate in the long run than there would have been otherwise. If the block size were limited, then not only would the number of transactions be limited, but each one would have to be very expensive in order to pay for a high hash rate. Bitcoin could not function easily as a medium of exchange under those circumstances, and it clearly has much better prospects if it can process many transactions with low fees. In the long run, it is only through mass adoption and a large volume of transactions that Bitcoin could possibly sustain a high difficulty.
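
Some back-of-the-envelope arithmetic makes the trade-off concrete. Suppose fees alone had to fund some fixed amount of mining revenue per block; the 25 BTC target and 500-byte average transaction below are illustrative assumptions, not data:

```python
def required_fee_per_tx(target_revenue_btc: float,
                        block_size_bytes: int,
                        avg_tx_bytes: int = 500) -> float:
    """Average fee each transaction must pay if fees alone fund the target revenue."""
    txs_per_block = block_size_bytes // avg_tx_bytes
    return target_revenue_btc / txs_per_block

for megabytes in (1, 20, 100):
    fee = required_fee_per_tx(25.0, megabytes * 1_000_000)
    print(f"{megabytes:>3} MB blocks: ~{fee:.6f} BTC per transaction")
# Under a 1 MB cap each transaction must pay ~0.0125 BTC to fund the same miner
# revenue that 100 MB blocks could fund at ~0.000125 BTC per transaction.
```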

Daniel Krawisz

Daniel Krawisz received his master's in physics from The University of Texas at Austin in 2010 and is now pursuing a master's in software engineering there. He has been involved in Bitcoin since 2011 and writes articles at Bitmessage: BM-NBPVwY5A26MtyfbHyh4UfA4Hn76DamAP

  • Good article. If anything, Bitcoin needs to adapt as no system is perfect. Learning from history and current systems (like VISA and MASTERCARD), we have good information as to what to expect and how to adapt.

  • onb

    Hey Dan. Nice article. FTR, that’s a slight misquote; the text after the <<< is a response to thestringpuller's text from earlier, found at . The colons throw people off :)

  • xcsler

    I’m not an economist and admit that I didn’t understand the entirety of your post. I’m still confused as to why block sizes need to be uncapped.

    From the article:
    “It is true that an artificial limit on the block size would limit the cost of running a full node, but it would also limit the revenue available from mining.”

    My understanding is that the more nodes there are, the more secure the network is. Larger blocks raise the cost of running a full node, so fewer people will run one; therefore increasing block sizes leads to a less secure network. Without maximized security Bitcoin’s value as an unforgeable ledger comes into question. This must be avoided at all costs.
    I don’t agree with the statement that an artificial limit on the block size would limit the revenue available from mining. Transaction fees could rise to whatever level is necessary to ensure that mining is profitable. Clearly this would limit on-chain transactions, but so what? Off-chain transactions could substitute for low-value transactions, with users paying fees to Bitcoin wallet providers/”banks”. These “banks” can then settle with one another by paying fees to miners for on-chain transactions. Proof of reserves and audits would keep the banks honest.
    Is my thinking flawed? Also, thanks for all the great articles and analysis. I enjoy your work.

  • Allen Piscitello

    More nodes don’t necessarily add security. Running your own node lets you verify everything yourself, more than other approaches do (SPV or outright trust of other nodes). If there are fewer nodes, it might be easier to pull off other attacks, like Sybil attacks, where you are isolated from the rest of the network.

    Daniel seems to miss the possibility that tx fees would just rise to make the scarce space of the blockchain a resource that could be “auctioned” every 10 minutes, forcing smaller value transactions off the chain.

    I have also thought of the “pay for relaying” or “pay for serving blocks” model for a node, and it would be interesting to see how it plays out. In this case, miners and originators of transactions would have an incentive to directly connect to users, or possibly make it cheaper if you dealt directly with the miners. That produces different incentives, as bigger miners might be able to operate with an economy of scale in making deals to accept transactions at a discounted rate.

    The “market” will certainly reach an equilibrium based on the inputs going into it, but it’s also important to consider that minor changes in those inputs could drive a vastly different equilibrium, and which one is better is subjective. The 1MB limit is simply a “check” that keeps the cost of operating a full node lower. Unfortunately, with the consensus nature of Bitcoin, individual nodes cannot decide to arbitrarily reject blocks they deem too big, the way miners can with fees. Lifting the limit will certainly not be that big of a deal, as miners don’t fill up blocks now for the reasons mentioned, but it is not without consequence.

    Perhaps the answer is not in fees for relaying transactions but in being able to retrieve parts of the blockchain and serve it. This encourages those with the best ways to store the data to profit while allowing those who don’t have such needs to just pay as needed.

  • xcsler

    This article seems to say that the more nodes there are the more secure the network is.

    I’m not much of a computer networking expert so I can’t argue one way or the other on the importance of nodes but it seems like most people in the space do feel that the number of nodes does correlate with security.

    Do you disagree with this assertion?

  • Allen Piscitello

    Since he doesn’t clarify what he means, it’s hard to tell. “More secure” is fairly arbitrary and can mean different things to different people. The fewer copies of the blockchain there are, the easier it is to give a false history, and the easier it is to perform a Sybil attack, but that’s not a huge factor in security. The idea that you would actually rely on checking for double-spends among unconfirmed transactions as some kind of security measure is a bit alarming to suggest, so my guess is the author has no idea what he is talking about.