Just ran the same WebSocket load tests with #SiteJS on a 2 vCPU VPS with 2GB RAM and it started dropping messages (at 1 message/sec per connection) at around 10K concurrent connections.
I’m more than happy with that as a starting point. (That’s not what we need to optimise. For context, I have 7.24K followers on the fediverse after 3 years. We won’t start hitting those limits on the #SmallWeb for a while.) ;)
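(For anyone curious what a test like that involves, here’s a minimal sketch in TypeScript using the `ws` package. The endpoint, connection count, and reporting interval are placeholders, not the exact harness behind the numbers above.)

```typescript
// Minimal sketch of a WebSocket load test: open many concurrent connections,
// have each send one message per second, and watch the gap between sent and
// echoed messages. The client machine needs its file-descriptor limit raised
// to hold this many sockets open.
import WebSocket from "ws";

const URL = "wss://example.small-web.org/echo"; // hypothetical echo endpoint
const CONNECTIONS = 10_000;
const SEND_INTERVAL_MS = 1_000; // 1 message/sec per connection

let sent = 0;
let received = 0;
let errors = 0;

for (let i = 0; i < CONNECTIONS; i++) {
  // Stagger connection setup (~1 ms apart) so we don't stampede the server.
  setTimeout(() => {
    const ws = new WebSocket(URL);
    ws.on("open", () => {
      setInterval(() => {
        ws.send(`ping ${i}`);
        sent++;
      }, SEND_INTERVAL_MS);
    });
    ws.on("message", () => { received++; });
    ws.on("error", () => { errors++; });
  }, i);
}

// A growing sent/received gap is the "dropping messages" signal.
setInterval(() => {
  console.log(`sent=${sent} received=${received} errors=${errors} gap=${sent - received}`);
}, 5_000);
```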
Videos, etc., at: https://twitter.com/aral/status/1345734023658745866
Note: I posted those videos on Twitter because my personal Mastodon instance wouldn’t let me post them; they were too large (my instance-of-one is hosted on @mastohost and they have limits, which is understandable; they’re not a multi-billion-dollar company).
For longer videos, etc., I use our Vimeo account.
@aral Obviously, allowing concurrent streaming of large videos is expensive and would increase the cost for someone running a Mastodon instance.
I do think these limits make sense. Not sure if the values are ideal, but there should be a small limit to make it possible to run a Mastodon instance on a low-cost setup.
@aral It would make sense, at least for uploads done via Mastodon’s web interface. Other software could probably manage that by compressing client-side.
Having said that, I have spent so many hours dealing with ffmpeg issues...
@aral Have you tried using the ImagePipe app on Android? It compresses images automatically.
@aral @mastohost We did historically raise the limits when a new generation of iPhones dropped, because ideally mobile videos should just work. But it’s complicated: the bigger they are, the more resources it takes to process them server-side.
@SuperDicq @aral @mastohost Probably not in the browser, though native apps should be able to.
@mastohost Don’t disagree. I do wonder, however, if more processing could be done server-side so that a sub-30-second video captured on a modern mobile phone fits the limits (the issue is that modern phones have stellar resolution, so the footage would need to be resampled/recompressed).
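(To make that concrete, here’s a rough sketch of what such a server-side resample/recompress step could look like, shelling out to ffmpeg from TypeScript. The 720p target, CRF value, and file names are illustrative placeholders, not Mastodon’s actual media pipeline or settings.)

```typescript
// Sketch of the suggested server-side step: downscale and re-encode an
// incoming phone video so it fits an instance's upload limit.
import { execFileSync } from "node:child_process";
import { statSync } from "node:fs";

function recompress(input: string, output: string): void {
  execFileSync("ffmpeg", [
    "-i", input,
    "-vf", "scale=-2:720",                  // resample 4K phone footage down to 720p
    "-c:v", "libx264", "-crf", "28", "-preset", "veryfast",
    "-c:a", "aac", "-b:a", "128k",
    "-y", output,
  ]);
  const mb = statSync(output).size / (1024 * 1024);
  console.log(`re-encoded ${input} -> ${output} (${mb.toFixed(1)} MB)`);
}

// Hypothetical file names for illustration.
recompress("phone-clip.mov", "upload-ready.mp4");
```

Capping resolution rather than duration keeps short, high-resolution phone clips within a small file-size budget without touching their length.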