The first thing you need to understand is that a hard fork isn't just a matter of getting a majority of miners... it's got to be pretty much everybody. Otherwise, you will have a blockchain split, with two different user groups both wanting to call their blockchain "bitcoin". Unspent outputs at the time of the fork can be spent once on each new chain. Mass confusion.

The only way to do it is to get most major clients to accept larger blocks AFTER a future specified date.

That way, once, say, "2017 Dec 31" rolls around, 90% of BTC users will all accept larger blocks at the same time and confusion will be minimal.

This honestly isn't that hard: just get the MyWallet, Armory, Satoshi client, and Electrum programmers/distributors to agree on a date (say "2020") and a new limit (say "100mb") and you're done. I don't see why that many people would reject this change, and as such the new standard should be rolled out well before it takes effect.

Miners are irrelevant in this, but they should welcome the change; BTC will never grow if normal people can't use it. We would be right back to trusting banks, only backed by BTC instead of gold this time. We all know what a wild success THAT has been!

Currently (Feb 2013), we have about 50,000 tx per day, or 0.579 tx per second (tps), or 347 tx per block (tpb). We are paying miners 25 BTC per block, about $500 per block at the current rate. If Bitcoin reaches VISA scale, it has to handle 4,000 tps, or 2,400,000 tpb, about 6,916x the current volume. To keep mining profitable, we may need to pay $50,000 per block (to cover electricity, hard drive space, bandwidth, and CPU time for ECDSA). As the block reward will eventually go to zero, this $50,000 has to be covered by fees. Since we would have 2,400,000 tpb, each tx would pay about $0.021: not bad when you can send an unlimited amount of money anywhere in the world in no time.
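The arithmetic above can be checked with a short script; every input is the post's own assumption (50,000 tx/day, a $50,000/block security budget, 4,000 tps for VISA scale), not live data:

```python
# Back-of-envelope check of the figures above; all inputs are the
# post's own Feb-2013 assumptions, not measurements.
TX_PER_DAY = 50_000
BLOCKS_PER_DAY = 144            # one block every ~10 minutes
VISA_TPS = 4_000                # assumed VISA-scale throughput
TARGET_USD_PER_BLOCK = 50_000   # assumed per-block security budget

tps = TX_PER_DAY / (24 * 60 * 60)         # ~0.579 tx/s today
tpb = round(TX_PER_DAY / BLOCKS_PER_DAY)  # ~347 tx/block today
visa_tpb = VISA_TPS * 600                 # 2,400,000 tx/block at VISA scale
growth = round(visa_tpb / tpb)            # ~6,916x current volume
fee_per_tx = TARGET_USD_PER_BLOCK / visa_tpb  # ~$0.021 per tx

print(round(tps, 3), tpb, growth, round(fee_per_tx, 3))
```
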

That means mining is profitable even without any block size limit.

On the other hand, the 1MB constraint will certainly kill bitcoin. 1MB is only about 2500 tpb, or 0.1% of VISA scale. We are already at 13.9% of this limit. If we don't act before the problem arises, people will start migrating to alt-coins.

In 10 years, a modern smartphone/computer will be able to run a full processing node if the Max_Block_Size remains 1MB. This is a clear economic benefit for Bitcoin: decentralized verification. In fact, in a few years virtually every computer will be able to process the entire blockchain without issue, making Bitcoin extremely unique in the realm of payments.

What is the use of having portable devices act as full nodes if you can't (because of fees) use bitcoin for purchasing anything smaller than a house? As I see it, your argument is not valid. With the 1MB block size limit, even if Bitcoin remains a relatively small niche currency, the limit will act as a hard constraint on the potential utility of the currency. Of course, once we start hitting the limit, it will hurt Bitcoin's public image so much that it's conceivable enough people will move away from Bitcoin that we get a few more years of time to fix the issue.

Let's not get ahead of ourselves here. I expect that we will have a multi-layered system, with the vast majority of transactions being made off-chain.

Additionally, who is to say that one wouldn't want to verify their house transaction with a smartphone?

People completely misjudge the free market. If you have to use alt-chains because the fees are so high, well, isn't that already a success for Bitcoin? At no stage has Bitcoin failed then.

The argument for larger blocks is VALID for a protocol that isn't Bitcoin. However, it is a catch-22 for Bitcoin. It only becomes a problem if Bitcoin is a success. If Bitcoin is a success, by definition it isn't a problem.

Let's assume a miner with moderate hashing power can mine 1 in 10,000 blocks (i.e. one block in about 10 weeks). With $50,000/block in fees, his expected revenue is about $5 per block.

2,400,000 tpb (VISA scale) means about 1 GB/block. Currently a 2,000 GB drive costs about $100, or $0.05/GB. Therefore, the hard drive cost is only 1% of his mining income. It's negligible (and hard drives will be much cheaper in the future).

A quad-core Intel Core i7 is able to handle 4,000 tps (https://en.bitcoin.it/wiki/Scalability#CPU) at 77 W. Assuming $0.15/kWh, that costs about $0.012/h, or $0.002/block. Even if energy were 10x more expensive in the future, it's still negligible (and CPUs will be much more efficient in the future).

1 GB/block needs a bandwidth of 4.3 TB/month. Including all overhead it may take 10 TB/month, which may currently cost about $300/month for a dedicated server in a datacentre. That is $300/(30*24*6) = $0.069/block. Again, it is negligible compared with the $5/block reward.

He will still earn $5 - 0.05 - 0.002 - 0.069 = $4.879/block after deducting the hard drive, CPU, and bandwidth costs. That is $29/hr, or about $21,077/month, a ridiculous amount given he only owns 0.01% of the total hashing power. He still needs to pay the electricity bill for the mining equipment. That is hard to estimate, but even if he spends 90% of his earnings on electricity, he will still earn about $2,107/month.
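As a sanity check, here is the same cost arithmetic in one place; every price (disk, CPU, bandwidth, the $50,000/block fee pool) is the post's Feb-2013 assumption, not current data:

```python
# Redoing the cost estimate above for a miner with 0.01% of total
# hashpower; all prices are the post's assumptions.
REWARD_USD_PER_BLOCK = 50_000   # fee income per block at VISA scale
HASH_SHARE = 1 / 10_000         # mines 1 block in 10,000 (~10 weeks)
DISK_USD_PER_BLOCK = 0.05       # 1 GB/block at $0.05/GB
CPU_USD_PER_BLOCK = 0.002       # 77 W i7 at $0.15/kWh, per 10-min block
BANDWIDTH_USD_PER_MONTH = 300   # ~10 TB/month dedicated server
BLOCKS_PER_MONTH = 30 * 24 * 6  # 4,320 blocks

expected_income = REWARD_USD_PER_BLOCK * HASH_SHARE    # $5 per block
bandwidth = BANDWIDTH_USD_PER_MONTH / BLOCKS_PER_MONTH # ~$0.069 per block
costs = DISK_USD_PER_BLOCK + CPU_USD_PER_BLOCK + bandwidth
net_per_block = expected_income - costs                # ~$4.879 per block
net_per_month = net_per_block * BLOCKS_PER_MONTH       # ~$21,000 per month
print(round(net_per_block, 3), round(net_per_month))
```
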

Set up your own Electrum server on your computer at home and verify your house transaction with a smartphone through it. That way you don't need to trust a third party.

A smartphone was never designed to run a bitcoin full node.

Moreover, the sentence "It only becomes a problem if Bitcoin is a success. If Bitcoin is a success, by definition it isn't a problem." is self-contradictory.

So, full nodes act as banks and issue Bitcoin-denominated instruments to their clients. Maybe the clients do not even have to trust the banks, thanks to some kind of cryptographic magic. Because of the economic scale of the transactions, only big companies and financial institutions have any reason to actually make a Bitcoin transaction, and these kinds of actors can run any kind of full node anyway. The clients can run something similar to what is outlined in this thread: https://bitcointalk.org/index.php?topic=88208.0

Actually, that thread outlines the way that future PCs (if not smartphones) could conceivably run a full node (or "almost-full" node) even with no limit / floating limit.

I just can't see why this artificial limit that was intended as temporary from the start should be accepted as an immutable part of the protocol.

There is going to be a hard fork in any case, likely sooner rather than later. It should be planned beforehand, if we care about Bitcoin at all.

There are many merits to etotheipi's writing, but what he proposes massively _increases_ the IO and computational cost of running a full node (or a fully validating but historyless node) over a plain committed UTXO set for validation. The increased node burden is one of the biggest arguments against what he's proposing, and I suspect it will ultimately doom the proposal.

I have seen nothing proposed except Moore's law that would permit full validation on "desktop" systems with gigabyte blocks.

Quote

I just can't see why this artificial limit that was intended as temporary from the start should be accepted as an immutable part of the protocol.

There are plenty of soft limits in bitcoin (like the 500k soft limit for maximum block size). The 1MB limit is not soft. I'm not aware of any evidence to suggest that it was temporary from the start, and absent it I would not have spent a dollar of my time on Bitcoin: without some answer to how the system remains decentralized with enormous blocks, and how miners will be paid to provide security without blockspace scarcity or cartelization, the whole idea is horribly flawed. I also don't think a network rule should be a suicide pact; my argument for the correctness of limiting the size has nothing to do with the way it always was, but that doesn't excuse being inaccurate about the history.

I think we should let miners decide on the maximum size of blocks that they'll build on.

How difficult would it be to implement on bitcoind right now, without touching the 1 MB hard limit? I mean the multiple-limits-and-tolerance-levels idea.

Miners using bitcoind would be able to set a list of value pairs in the config file. One value would be a size limit; the other, the number of blocks longer a chain breaking that limit would have to be in order for you to accept building on top of it. Do you see what I'm saying? That could be done right now on bitcoind, with the sole condition that anything above 1 MB will be rejected no matter what.

Couldn't the limit be adjusted with every difficulty change so that it is approximately in line with the demand of the previous difficulty period? If the block size were capped near the transaction volume ceiling, there would still be incentive to include mining fees while never running the risk of running out of block space.
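As a rough sketch of what such a retarget-coupled limit might look like (the 10% headroom factor and the 1 MB floor are illustrative assumptions, not part of the proposal):

```python
# One possible reading of the retarget-based idea above: every
# difficulty period, derive the next cap from the previous period's
# average block size, with headroom so space stays slightly scarce.
RETARGET_INTERVAL = 2016   # blocks per difficulty period
FLOOR_BYTES = 1_000_000    # never drop below today's 1 MB (assumption)
HEADROOM = 1.1             # cap sits 10% above recent demand (assumption)

def next_max_block_size(prev_period_sizes):
    """prev_period_sizes: byte sizes of the last difficulty period's blocks."""
    avg = sum(prev_period_sizes) / len(prev_period_sizes)
    return max(FLOOR_BYTES, int(avg * HEADROOM))
```

If the previous period averaged 1.5 MB blocks, the next cap would be about 1.65 MB; a quiet period falls back to the 1 MB floor, so demand feedback never removes scarcity entirely.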

Read my calculation above. With each transaction paying just $0.021 in fees, we have more than enough money to pay miners to handle 4,000 transactions per second.

Once there are 25 BTC of fees per block to replace the mining reward, instead of today's ~0.25 BTC, paid by transactions that have to pay to get into a block in a reasonable amount of time, then it might be time to consider a larger block size. Until then it needs to stay scarce. I have 2.5GB of other people's gambling on my hard drive, because it's cheap.

If a block size fork is attempted, it is critical to minimize disruption to the network. Setting it up well in advance based on block number is OK, but that lacks any kind of feedback mechanism. I think something like

Code:

if (block_number > 300000) and (previous 100 blocks all have version > 2):
    go ahead and raise MAX_BLOCK_SIZE

Maybe 100 isn't enough, but if all of the blocks in a fairly long sequence have been published by miners who have upgraded, that's a good indication that a very large supermajority of the network has switched over. I remember reading something like this in the qt-client documentation (version 1 -> 2?) but can't seem to find it.

Alternatively, instead of just relying on block header versions, also look at the transaction data format version (the first 4 bytes of a tx message header). Looking at the protocol, it seems that every tx published in the block will also have that version field, so we could even say "no more than 1% of all transactions in the last 1000 blocks are of version 2" means it's OK to switch to version 3.

This has the disadvantage of possibly taking forever if there are even a few holdouts (da2ce7? ), but my thinking is that agreement and avoiding a split blockchain is of primary importance and a block size change should only happen if it's almost unanimous. Granted, "almost" is ambiguous: 95%? 99%? Something like that though. So that anyone who hasn't upgraded for a long time, and somehow ignored all the advisories would just see blocks stop coming in.
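The supermajority trigger could be sketched like this; the 95% threshold and 1,000-block window are the hypothetical numbers from the discussion above, and the function name is mine:

```python
# Sketch of the "upgrade only on near-unanimity" trigger described
# above; threshold and window are the post's hypothetical numbers.
WINDOW = 1000        # look back this many blocks
THRESHOLD = 0.95     # "almost unanimous"
NEW_VERSION = 3      # version that signals the new rules

def new_rules_active(recent_versions):
    """True once >=95% of the last 1,000 block versions signal the upgrade."""
    window = recent_versions[-WINDOW:]
    if len(window) < WINDOW:
        return False  # not enough history yet
    upgraded = sum(1 for v in window if v >= NEW_VERSION)
    return upgraded / WINDOW >= THRESHOLD
```

A few holdouts keep the ratio below threshold indefinitely, which is exactly the "possibly taking forever" property noted above.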

Idea 2: Measuring the "Unconfirmable Transaction Ratio"

I agree with gmaxwell that an unlimited max block size could, long term, mean disaster. While we have the 25 BTC reward coming in now, I think competition for block space will more securely incentivize mining once the block reward incentive has diminished. So basically, blocks should be full. In a bitcoin network 10 years down the road, the max_block_size should be a limitation that we're hitting on basically every block, so that fees actually mean something. Let's say there are 5 MB of potential transactions that want to get published, and only 1 MB can be due to the size limit. You could then say there's a 20% block inclusion rate, in that 20% of the outstanding unconfirmed transactions made it into the current block.

I realize this is a big oversimplification and you would need to more clearly define what constitutes that 5MB "potential" pool. Basically you want a nice number of how much WOULD be confirmed, except can't be due to space constraints. Every miner would report a different ratio given their inclusion criteria. But this ratio seems like an important aspect of a healthy late-stage network. (By late-stage I mean most of the coins have been mined) Some feedback toward maintaining this ratio would seem to alleviate worries about mining incentives.

Which leads to:

Idea 3: Fee / reward ratio block sizing.

This may have been previously proposed as it is fairly simple. (Sorry if it has; I haven't seen it but there may be threads I haven't read.)

What if you said:

Code:

MAX_BLOCK_SIZE = 1MB + ((total_block_fees / block_reward)*1MB)

so that the block size scales up with fees as a multiple of the reward. Right now, if you wanted a 2 MB block, you would need 25 BTC of total fees in that block. If you wanted a 10 MB block, that's 225 BTC in fees.

In 4 years, when the reward is 12.5 BTC, 250 BTC in fees will allow for a 21 MB block. It's nice and simple and seems to address many of the concerns raised here. It does not remove the freedom for miners to decide on fees -- blocks under 1MB have the same fee rules. Other nodes will recognize a multiple-megabyte block as valid if the block had tx fees in excess of the reward (indicative of a high unconfirmable transaction ratio).

The problem with this is that it doesn't work long term, because the reward goes to zero. So maybe put a "REAL" max size at 1GB or something, as ugly as that is. Short/medium term, though, it seems like it would work. You may get an exponentially growing max block size, but it's really slow (doubles every few years). One problem I can think of is an attacker including huge transaction fees just to bloat the block chain, but that would be a very expensive attack. Even if the attacker controlled his own miners, there's a high risk he wouldn't mine his own high-fee transaction.
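For concreteness, here is the fee-scaled formula with the suggested "REAL" ceiling bolted on. Note that, because the 1 MB base is always added, the formula actually yields 11 MB (not 10 MB) for 250 BTC of fees at a 25 BTC reward; the function name is illustrative:

```python
# MAX_BLOCK_SIZE = 1MB + (total_block_fees / block_reward) * 1MB,
# capped at the post's suggested 1 GB hard ceiling.
BASE_BYTES = 1_000_000        # the existing 1 MB limit
HARD_CEILING = 1_000_000_000  # the post's suggested "REAL" 1 GB cap

def max_block_size(total_block_fees_btc, block_reward_btc):
    scaled = BASE_BYTES + int(
        (total_block_fees_btc / block_reward_btc) * BASE_BYTES)
    return min(scaled, HARD_CEILING)
```

So 25 BTC of fees at today's 25 BTC reward allows a 2 MB block, and once the reward is small the ceiling, not the ratio, does the limiting.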

Please let me know what you think of these ideas, not because I think we need to implement them now, but because I think thorough discussion of the issue can be quite useful for the time when / if the block size changes.

Wait, no, I spoke too soon. The fee/reward ratio is a bit too simplistic. An attacker could publish one of those botnet-type blocks with 0 real transactions, but instead fill the block with spam transactions that were never actually sent through the network, with all inputs and outputs controlled by the attacker. Since the attacker also mines the block, he gets the large fee back. This would allow an attacker to publish oversized spam blocks where the size is only limited by the number of bitcoins the attacker controls, and it doesn't cost the attacker anything. In fact he gets 25 BTC with each successful attack. So an attacker controlling 1,000 BTC could force a 41 MB spam block into the blockchain whenever he mines a block.

Not the end of the world, but ugly. There are probably other holes in the idea too. Anyway, I'm just suggesting that something akin to a (total fee / block reward) calculation may be useful. Not sure how you'd filter out spammers with lots of bitcoins. And filtering out spammers was the whole point (at least according to Satoshi's comments) of the initial 1MB limit.

I'll keep pondering this, though I guess it's more about WHAT the fork might be, rather than HOW (or IF) to do it.

Without a sharp constraint on the maximum blocksize there is currently _no_ rational reason to believe that Bitcoin would be secure at all once the subsidy goes down.

Can you walk me through the reasoning that you used to conclude that bitcoin will remain more secure if it's limited to a fixed number of transactions per block?

Are you suggesting more miners will compete for the fees generated by 7 transactions per second than will compete for the fees generated by 4000 transactions per second?

If the way to maximize fee revenue is to limit the transaction rate, why do Visa, Mastercard, and every business that's trying to maximize its revenue process so many of them?

If limiting the allowed number of transactions doesn't maximize revenue for any other transaction processing network, why would it work for Bitcoin?

If artificially limiting the number of transactions reduces potential revenue, how does that not result in fewer miners, and therefore more centralization?

In what scenario does your proposed solution not result in the exact opposite of what you claim to be your desired outcome?

If space in a block is not a limited resource then miners won't be able to charge for it, mining revenue will drop as the subsidy drops and attacks will become more profitable relative to honest mining.

That's why transaction space should IMO be scarce. But not hard limited.

A hard limit cap will just make transactions impossible at a certain point, no matter how high the fees paid. If we had 1 million legit transactions a day, with the 1 MB limit some 400k of them would never be confirmed, no matter the fees.

An algorithm that adjusts the max block size so that transaction space remains scarce, while ensuring all transactions can eventually be put into the blockchain, is IMO a reasonable solution.

Ensuring fees would always have to be paid for fast transactions, but also ensuring every transaction has a chance to get confirmed.

How many businesses can you name that maximize their profitability by restricting the number of customers they serve?

If it really worked like that, then why stop at 1 MB? Limit block sizes to a single transaction and all the miners would be rich beyond measure! That would certainly make things more decentralized because miners all over the world would invest in hardware to collect the massive fee that one lucky person per block will be willing to pay.

Why stop there? I'm going to start a car dealership and decide to only sell 10 cars per year. Because I've made the number of cars I sell a limited resource I'll be able to charge more for them, right?

Then I'll open a restaurant called "House of String-Pushing" that only serves regular food but only lets in 3 customers at a time.

I have 2.5GB of other people's gambling on my hard drive, because it's cheap.

Snap! Me too.

Zero-fee transactions are an overhead to bitcoin. One benefit of them might be to encourage take-up by new users maintaining momentum of growth. If they need to be discouraged, then agreed, it could be done by using the max block size limit.

An initial split ensuring "high or reasonable" fee transactions get processed into the blockchain within an average of 10 minutes, and "low or zero" fee transactions get processed within an average of 20 minutes might be the way to go.

Consider the pool of unprocessed transactions:

Each transaction has a fee in BTC and an origination time. If the transaction pool is sorted by non-zero fee size then: fm = median (middle) fee value.

The block size limit is then dynamically calculated to accommodate all transactions with a fee value > fm plus all the remaining transactions with an origination time > 10 minutes ago. If a large source of zero fee transactions tried to get around this by putting, say, a 17 satoshi fee on all its transactions then fm would likely be 17 satoshis, and these would still get delayed. A block limit of 10x the average block size during the previous difficulty period is also a desirable safeguard.
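A minimal sketch of that selection rule, assuming the pool is a list of (fee, size, received-time) tuples; the tuple shape and function name are illustrative choices, not the poster's spec:

```python
import statistics
import time

# Sketch of the median-fee rule above: make room for every tx paying
# more than the median non-zero fee (fm), plus every tx that has
# already waited longer than 10 minutes.
def dynamic_block_limit(pool, now=None):
    """pool: list of (fee_satoshi, size_bytes, received_unix) tuples."""
    now = time.time() if now is None else now
    nonzero_fees = [fee for fee, _, _ in pool if fee > 0]
    fm = statistics.median(nonzero_fees) if nonzero_fees else 0
    return sum(size for fee, size, received in pool
               if fee > fm or now - received > 600)
```

Because the comparison is strictly greater than fm, a spammer stamping every transaction with the same token fee (say 17 satoshis) just raises the median to that value, and those transactions still wait out the 10 minutes, as described above.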

The public would learn that low or zero fee transactions take twice as long to obtain confirmation. It then opens the door for further granularity, where the lower half (or more) of the pool is divided 3, 4, or 5 times, such that very low-fee transactions take half an hour and zero-fee transactions take an average of an hour. The public will accept that as normal. Miners would reap the benefits of a block-limit-enforced fee incentive system.