Statistical Multiplexing

And other Fractional F&*)ery

Image Source: Edge2Edge Media on Unsplash

Statistical multiplexing is the art of using or selling more than you actually have.

Famous examples include the Internet.

Cloud computing has been another one over the last couple of decades, and the drive from applications to VMs, to containers, and to serverless (function as a service) is all about achieving increasing statistical gains. The smaller the unit of scheduling, the better the chance of statistical gain: billing for more than the actual demand and/or maximizing the usage of a resource.
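A rough sketch of the statistical gain (all numbers hypothetical): sell 100 subscribers a 10 Mb/s plan each, assume each is actively drawing its full rate only about 5% of the time, and provision only a tenth of what was sold.

```python
import random

random.seed(42)

# Hypothetical figures, purely for illustration.
SUBSCRIBERS = 100
PLAN_MBPS = 10
ACTIVITY = 0.05          # assumed probability a subscriber is active at any instant
PROVISIONED_MBPS = 100   # 10:1 oversubscription ratio against the 1000 Mb/s sold

def demand_sample():
    """Total instantaneous demand across all subscribers."""
    return sum(PLAN_MBPS for _ in range(SUBSCRIBERS) if random.random() < ACTIVITY)

samples = [demand_sample() for _ in range(10_000)]
mean_demand = sum(samples) / len(samples)
congested = sum(1 for d in samples if d > PROVISIONED_MBPS)

print(f"sold capacity:        {SUBSCRIBERS * PLAN_MBPS} Mb/s")
print(f"provisioned capacity: {PROVISIONED_MBPS} Mb/s")
print(f"mean demand:          {mean_demand:.1f} Mb/s")
print(f"congested intervals:  {congested / len(samples):.2%}")
```

The provider sold 1000 Mb/s against 100 Mb/s of real capacity, yet demand exceeds capacity only a small fraction of the time. That small fraction is the corner case this post comes back to: when everyone shows up at once.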

Another similar concept is fractional-reserve banking. The great philosophical shift in the banking system over the last hundred years was from the gold standard - a mindset that currency must be backed by hard assets - to fractional-reserve banking - issuing loans for a greater amount than the deposits actually being held. It rests on the statistical assumption that not everyone is going to want their deposits at the same time. Some people credit this shift with enormous wealth creation; others blame it for inflation.
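The arithmetic behind that statistical assumption can be sketched with a hypothetical 10% reserve requirement: each deposit is partly re-lent, re-deposited, and re-lent again, so the system carries far more deposits than base money.

```python
# Hypothetical reserve ratio, purely for illustration.
RESERVE_RATIO = 0.10
initial_deposit = 1_000.0

total_deposits = 0.0
deposit = initial_deposit
for _ in range(100):                # iterate the lend / re-deposit cycle
    total_deposits += deposit
    deposit *= (1 - RESERVE_RATIO)  # bank keeps 10%, lends out the rest

# Geometric series: total approaches initial_deposit / RESERVE_RATIO
print(f"base money:     {initial_deposit:,.0f}")
print(f"total deposits: {total_deposits:,.0f}")  # approaches 10,000
print(f"multiplier:     {total_deposits / initial_deposit:.2f}")
```

Ten times the deposits against the same base money - and the whole structure holds only while withdrawals stay statistically quiet.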

Buying securities on margin is buying more than you can actually afford by financing the portion you do not have the money for. This amplifies both gains and losses.
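A worked example of that amplification (hypothetical figures): put up $10,000 of your own cash, borrow another $10,000, and hold a $20,000 position at 2x leverage.

```python
# Hypothetical 2x-leverage margin position, purely for illustration.
cash = 10_000.0       # your own money
borrowed = 10_000.0   # financed on margin
position = cash + borrowed

def equity_after(move_pct: float) -> float:
    """Your remaining equity after the position moves; the loan is still owed in full."""
    return position * (1 + move_pct) - borrowed

# At 2x leverage, every move in the stock is doubled in your equity.
for move in (0.10, -0.10):
    change = (equity_after(move) - cash) / cash
    print(f"stock moves {move:+.0%} -> your equity changes {change:+.0%}")
```

A 10% move in the stock becomes a 20% move in your equity, in either direction, because the borrowed half of the position participates in the move but the loan does not shrink.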

Naked shorting is the practice of shorting shares that have not been verified to exist. Theoretically illegal, it may still occur because of loopholes. Some believe it is beneficial, aiding in price discovery and allowing additional participants in a share.

When Robinhood prevented customers from buying GameStop, they essentially said their capacity was oversubscribed: they could no longer cover their customers' margin positions - customers buying more shares than they actually had the financial capacity to pay for - which created a potential obligation for Robinhood should those customers fail to meet their obligations.

Should Robinhood have also stopped the selling of GameStop shares? They did not need to do that to address their oversubscription problem. Many have asserted they had no legal obligation to, and other brokers took similar actions. Would it have been wise, from a brand perspective, to also stop selling? Well, I'll let brand experts fight over that one, but the intuitive answer is yes. Though they are now finally getting their story out, the initial brand damage is likely significant.

Statistical multiplexing creates enormous leverage and is perhaps the single most significant reason why the Internet created so much economic activity compared to previous approaches to networking. There are downsides and corner cases though, and when everyone wants the capacity they believe they are entitled to, all at once, problems occur.

This is why oversubscription ratios are of such interest in the industry; arguably, so are QoS, traffic engineering, load balancing, packet sizes, packet marking, and various other aspects of networking.

Often, traffic management in a network focuses on the subscribers who are the top consumers of bandwidth - the outliers. We may, in a sense, have seen some of the same in financial networks this week.

There have to be consumption management mechanisms, and the greater the oversubscription, the more important these mechanisms become.
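One of the most common consumption-management mechanisms in packet networks is the token bucket, which admits a bounded burst and then holds consumers to a sustained rate. A minimal sketch, with hypothetical rate and burst numbers:

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter. Tokens refill at `rate` per second
    up to `capacity`; each request spends one token or is refused."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        now = time.monotonic()
        # Refill tokens for the time elapsed since the last check.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False

# Hypothetical numbers: sustain 5 requests/s, allow bursts of 10.
bucket = TokenBucket(rate=5.0, capacity=10.0)
admitted = sum(1 for _ in range(100) if bucket.allow())
print(f"admitted {admitted} of 100 back-to-back requests")  # roughly the burst capacity
```

The burst gets through, the flood does not - which is exactly the shape of the trade-off: keep the statistical gain in the common case, bound the damage when everyone arrives at once.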

While control plane issues are often top of mind, and can on occasion go sideways, the fundamental economics and performance of a network are, day in and day out, about traffic management - whether a conservative approach to overprovisioning or an aggressive approach to oversubscribing.

Networks of all types, whether packet networks or financial networks, must invest in engineering that realizes reasonable performance in the presence of statistical multiplexing, fractional f&*)ery, or people selling things that don't exist or that they don't have the money for. This engineering is essential to realizing the benefits of statistical gains while mitigating the downsides, and reducing the outage time caused by everyone wanting their piece of the pie at the same time.