A baker creating a block currently has to optimize against two limits: the block size limit and the gas limit. While heuristics for this kind of multidimensional knapsack problem are reasonably good, having two constraints complicates matters.
For instance, if gas were the only metric, a baker could sort transactions by fee/gas ratio and include them in that order, which is a reasonably good heuristic. With two constraints, this becomes much more complicated.
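To make the single-constraint case concrete, here is a minimal sketch of that greedy fee/gas heuristic. The operation record, its field names, and the gas limit value are hypothetical placeholders for illustration, not the actual shell or protocol data structures.

```ocaml
(* Hypothetical operation: only the fee offered and the gas it consumes. *)
type operation = { fee : int; gas : int }

(* Sort candidate operations by fee/gas ratio (descending) and include
   them greedily while the remaining gas budget allows. *)
let select_operations ~gas_limit (pending : operation list) : operation list =
  let density op = float_of_int op.fee /. float_of_int op.gas in
  let sorted =
    List.sort (fun a b -> compare (density b) (density a)) pending
  in
  let _, selected =
    List.fold_left
      (fun (remaining, acc) op ->
        if op.gas <= remaining then (remaining - op.gas, op :: acc)
        else (remaining, acc))
      (gas_limit, []) sorted
  in
  List.rev selected

let () =
  let pending =
    [ { fee = 1000; gas = 500 }; { fee = 300; gas = 100 };
      { fee = 800; gas = 900 } ]
  in
  let block = select_operations ~gas_limit:1000 pending in
  Printf.printf "selected %d operations\n" (List.length block)
```

With a second, independent size constraint, this simple ordering no longer yields a near-optimal packing, which is the complication described above.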
The reason for a block size limit is that a block takes time to download and takes up storage space. The download-time constraint can actually be subsumed into gas by determining the gas cost equivalent of downloading the block over some baseline Internet connection. This doesn't fully capture the cost of storing historical blocks, but storage is typically not the limiting factor.
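As a rough illustration of folding download time into gas, here is a small sketch. The baseline connection speed and the gas-per-second scale below are made-up placeholder numbers, not protocol constants.

```ocaml
(* Illustrative placeholders: a baseline link speed and a gas/time scale. *)
let baseline_bytes_per_second = 1_000_000.  (* assumed 1 MB/s baseline link *)
let gas_per_second = 1_000_000.             (* assumed gas-per-second scale *)

(* Gas charged per byte of operation data, derived from the baseline link. *)
let gas_per_byte = gas_per_second /. baseline_bytes_per_second

(* Total gas attributed to an operation: its execution gas plus the gas
   equivalent of downloading its serialized bytes. *)
let total_gas ~execution_gas ~byte_size =
  execution_gas +. gas_per_byte *. float_of_int byte_size

let () =
  (* e.g. a 1 kB operation that costs 10_000 gas to execute *)
  Printf.printf "total gas: %.0f\n"
    (total_gas ~execution_gas:10_000. ~byte_size:1_000)
```

Under such a scheme, an operation's byte size simply contributes to its gas, so the baker is back to optimizing against a single limit.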
Note that the size limit per operation, which does not affect bakers directly, can be kept in place.
Having conducted literally hundreds of 1-1 onboarding sessions in the last few months, I not only agree but can add that this has been one of the most frequently expressed points of friction and confusion for newcomers.
That’s exactly it. Technically, I would actually keep the size limit per block but set its value so that gas is the limiting factor. This may be important so that the shell knows the maximum size a block can have.
This proposal looks reasonable. Indeed, I also think that maintaining a clear upper bound is necessary, but bumping it shouldn’t cause real issues. I’m unclear on the specifics of the gas cost estimation and simulation, but we may need to go through it to make sure we don’t miss anything (e.g. (de)serialization, storage on disk, etc.).