Read this thread. I think many underestimate the ways Bitcoin blockspace supports more transactions today than it did in 2017.
- simply more space (2-3x more, thanks to SegWit)
- batching + SegWit outputs more common
- fee estimation is better (no runaway feedback loop of overbidding)
- Optech bridges industry & Core https://twitter.com/ziggamon/status/1264233417362878466
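The space savings from batching can be illustrated with a rough back-of-the-envelope model. The vbyte sizes below (native-SegWit input/output sizes and fixed per-transaction overhead) are approximations assumed for illustration, not exact consensus figures:

```python
# Rough vbyte model for native-SegWit (P2WPKH) transactions.
# All constants are approximations for illustration only.
OVERHEAD = 10.5   # version, locktime, counts, segwit marker/flag (approx)
P2WPKH_IN = 68    # one native-segwit input (approx vbytes)
P2WPKH_OUT = 31   # one native-segwit output (approx vbytes)

def tx_vbytes(n_inputs: int, n_outputs: int) -> float:
    """Approximate virtual size of a P2WPKH transaction."""
    return OVERHEAD + n_inputs * P2WPKH_IN + n_outputs * P2WPKH_OUT

# 100 separate withdrawals: each spends 1 input, pays 1 user + 1 change output.
separate = 100 * tx_vbytes(1, 2)

# One batched withdrawal: 1 input, 100 user payments + 1 change output.
batched = tx_vbytes(1, 101)

print(f"separate: {separate:.0f} vbytes, batched: {batched:.0f} vbytes")
print(f"batched uses ~{batched / separate:.0%} of the space")
```

Under these assumed sizes, batching 100 withdrawals into one transaction uses well under a third of the blockspace of 100 individual transactions, which is why exchange batching matters so much for aggregate demand.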
It's unlikely we get a 2017-style 'fee crisis' anytime soon. Industry is using blockspace more efficiently, there's more of it, and new spillways exist to absorb demand if it does get extreme.
To dive deeper into these issues:
- Bitcoin Optech, which communicates best practices to exchanges: https://bitcoinops.org/
- @0xB10C's transactionfee.info, which has great data on usage modes: https://transactionfee.info/
- @0xB10C's amazing mempool observer: https://mempool.observer/monitor
This is how scaling works:
- Institutional credibility (don't f**k with the base layer)
- Big blockspace consumers gradually optimize their usage
- The dollar value of transactions packed into a single byte increases steadily – "economic density". No need to nuke node operators with bigger blocks.
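The "economic density" idea above can be made concrete with a toy calculation. All numbers here are made-up placeholders, not real chain data:

```python
# Toy illustration of "economic density": value settled per byte of blockspace.
# The figures below are invented placeholders, not measured chain data.

def economic_density(value_settled_usd: float, block_bytes: int) -> float:
    """USD of transaction value carried per byte of a block."""
    return value_settled_usd / block_bytes

BLOCK_BYTES = 1_300_000  # assumed typical post-SegWit block size in bytes

# Same-size block, but batched / higher-value usage packs in more value:
unbatched = economic_density(5_000_000, BLOCK_BYTES)    # many small txns
batched = economic_density(50_000_000, BLOCK_BYTES)     # batched, larger payments

print(f"unbatched: ${unbatched:.2f}/byte, batched: ${batched:.2f}/byte")
```

The point of the metric: the base layer scales not by making blocks bigger (raising costs for node operators) but by each byte carrying more economic weight.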
Without fee pressure, the chain gets packed with junk data (see here: https://www.coindesk.com/how-blockchains-become-great-big-garbage-patches-for-data), and big blockspace consumers have no incentive to economize on their usage.