
The Bitcoin Block Size Debate Has Been Going On Since 2013

Justin OConnell
Last Updated March 4, 2021 4:45 PM

As this debate on BitcoinTalk demonstrates, the current dispute over increasing Bitcoin's block size limit has been playing out publicly, and heatedly, since at least 2013. The discussion among the Bitcoin core developers runs for 26 pages; the most recent post dates to April 2013. It starts with Peter Todd, who to this day opposes increasing the block size limit in the manner proposed by Bitcoin XT.

I fear very few people understand the perverse incentives miners have with regard to blocks large enough that not all of the network can process them, in particular the way these incentives inevitably lead towards centralization. I wrote the below in terms of block size, but the idea applies equally to ideas like Gavin’s maximum block validation time concept. Either way miners, especially the largest miners, make the most profit when the blocks they produce are large enough that less than 100%, but more than 50%, of the network can process them – Peter Todd

Gavin Andresen is quick to respond:

I strongly feel that we shouldn’t aim for Bitcoin topping out as a “high power money” system that can process only 7 transactions per second.

I agree with Stephen Pair – THAT would be a highly centralized system.

Oh, sure, mining might be decentralized.  But who cares if you either have to be a gazillionaire to participate directly on the network as an ordinary transaction-creating customer, or have to have your transactions processed via some centralized, trusted, off-the-chain transaction processing service?

So, as I’ve said before:  we’re running up against the artificial 250K block size limit now, I would like to see what happens. There are lots of moving pieces here, so I don’t think ANYBODY really knows what will happen (maybe miners will collectively decide to keep the block size low, so they get more fees.  Maybe they will max it out to force out miners on slow networks.  Maybe they will keep it small so their blocks relay through slow connections faster (maybe there will be a significant fraction of mining power listening for new blocks behind tor, but blasting out new blocks not via tor)).

I think we should put users first. What do users want? They want low transaction fees and fast confirmations. Let’s design for that case, because THE USERS are who ultimately give Bitcoin value.
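For context, the "7 transactions per second" ceiling Andresen refers to is commonly derived from the 1 MB block size limit. A back-of-the-envelope sketch, assuming an average transaction of roughly 250 bytes and one block every ten minutes (the assumptions are ours, not spelled out in the thread):

```python
# Back-of-the-envelope derivation of the ~7 tx/s figure.
# Assumptions (ours, not from the thread): 1 MB block size limit,
# ~250-byte average transaction, one block every ~600 seconds.
BLOCK_SIZE_LIMIT = 1_000_000  # bytes (the 1 MB hard cap)
AVG_TX_SIZE = 250             # bytes, assumed average transaction size
BLOCK_INTERVAL = 600          # seconds, average time between blocks

txs_per_block = BLOCK_SIZE_LIMIT // AVG_TX_SIZE  # 4,000 transactions
tps = txs_per_block / BLOCK_INTERVAL             # ~6.7, rounded up to "7"

print(f"~{txs_per_block:,} txs per block -> ~{tps:.1f} tx/s")
```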

Mike Hearn and Andresen were already in agreement:

I agree with Gavin, and I don’t understand what outcome you’re arguing for.

You want to keep the block size limit so Dave can mine off a GPRS connection forever? Why should I care about Dave? The other miners will make larger blocks than he can handle and he’ll have to stop mining and switch to an SPV client. Sucks to be him.

Your belief we have to have some hard cap on the N in O(N) doesn’t ring true to me. Demand for transactions isn’t actually infinite. There is some point at which Bitcoin may only grow very slowly if at all (and is outpaced by hardware improvements).

Likewise, miners have all kinds of perverse incentives in theory that don’t seem to happen in practice. Like, why do miners include any transactions at all? They can minimize their costs by not doing so. Yet, transactions confirm. You really can’t prove anything about miners’ behaviour, just guess at what some of them might do.

I don’t personally have any interest in working on a system that boils down to a complicated and expensive replacement for wire transfers. And I suspect many other developers, including Gavin, don’t either. If Gavin decides to lift the cap, I guess you and Gregory could create a separate alt-coin that has hard block size caps  and see how things play out over the long term.

Hearn also wants to highlight what everyone can agree upon:

We’re all keen to see efficient protocols built on top of Bitcoin for things like micropayment channels (which allow lots of fast repetitive satoshi-sized payments without impacting the block chain), or trusted computing (which allows offline transactions to be carried around in long chains until final resolution). Also the payment protocol should eliminate the most absurd abuses of micropayments like SD’s messaging system. These things fall into the class of “no brainers” and were discussed for a long time already.

Other more exotic ideas like Ripple-style networks of payment routers using contracts don’t seem against the spirit of Bitcoin if they keep the low trust aspects of the system.

At the same time, as evidenced by the disagreement on this thread, there are too many unknown variables for us to figure out what will happen ahead of time. The only way to really find out is to try it and see what happens. If Bitcoin does fail to scale then the end result will be a smaller number of full nodes but lots of people using the system – this is still better than Bitcoin being deliberately crippled so it never gets popular because even if the number of full nodes collapses down to less than 1000, unknown future advances in technology might make it cheap enough for everyone to run a full node again. In the absence of a hard-coded limit the number of full nodes can flex up and down as supply and demand change. But with a hard-coded limit Bitcoin will fail to achieve popularity amongst ordinary people and will eventually be forgotten.

A constant theme among all three lead programmers is uncertainty. Nobody knows what will happen when the Bitcoin community crosses the bridge it faces today: whether or not to increase the block size limit. As some commenters on Reddit note, the BitcoinTalk discussion dating back to 2013 is among the most informative material available on the subject.

Also read: Is There A Privacy Backdoor in XT?

Hearn and Andresen seem convinced they know what is best for users: namely, low transaction fees and fast confirmations. Here are some other highlights from the discussion. Peter Todd’s ideas, meanwhile, are harder for Bitcoin’s user base to relate to.

Peter Todd:

A system where anyone can get a transaction confirmed by paying the appropriate fee? A fee that would be about $20 for a typical transaction even if $10 million a day, or $3.65 billion a year, goes to miners keeping the network secure for everyone? I’d be very happy to be able to wire money anywhere in the world, completely free from central control, for only $20.
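The arithmetic behind that $20 figure is straightforward to reconstruct. A rough sketch, using the ~7 tx/s ceiling discussed above (the throughput assumption is ours, not spelled out in Todd's post):

```python
# Rough reconstruction of Todd's ~$20-per-transaction figure.
# Assumption (ours): the network runs at its ~7 tx/s ceiling.
TPS = 7                            # transactions per second, assumed ceiling
SECONDS_PER_DAY = 86_400
DAILY_MINER_REVENUE = 10_000_000   # dollars per day, from Todd's quote

daily_txs = TPS * SECONDS_PER_DAY              # 604,800 transactions/day
fee_per_tx = DAILY_MINER_REVENUE / daily_txs   # ~$16.53 at full capacity
annual_revenue = DAILY_MINER_REVENUE * 365     # $3.65 billion/year

print(f"${fee_per_tx:.2f} per tx; ${annual_revenue / 1e9:.2f}B/year to miners")
```

Running below that ceiling would push the per-transaction figure higher, which is presumably how Todd arrives at roughly $20.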

Mike Hearn:

I don’t personally have any interest in working on a system that boils down to a complicated and expensive replacement for wire transfers. And I suspect many other developers, including Gavin, don’t either. If Gavin decides to lift the cap, I guess you and Gregory could create a separate alt-coin that has hard block size caps and see how things play out over the long term.

Peter Todd:

If there was consensus to, say, raise the limit to 100MiB that’s something I could be convinced of. But only if raising the limit is not something that happens automatically under miner control, nor if the limit is going to just be raised year after year.

Gavin Andresen:

Yes, it is a fact of life that if you have a system where people are competing, the people who are less efficient will be driven out of business. So there will be fewer people in that business. You seem to be saying that we should subsidize inefficient miners by limiting the block size, therefore driving up fees and making users pay for their inefficiency.

Many commenters found Todd’s idea of $20 transaction fees distasteful and not in Bitcoin’s best interest. One thing is for sure: the debate rages on. The main difference today is that Bitcoiners can now vote with their Internet connections: to download XT or not to download XT?

Featured image from Shutterstock.