Just ran the same WebSocket load tests with #SiteJS on a 2 vCPU VPS with 2GB RAM and it started dropping messages (1 message/sec) at around 10K concurrent connections.
I’m more than happy with that as a starting point. (That’s not what we need to optimise. For context, I have 7.24K followers on the fediverse after 3 years. We won’t start hitting those limits on the #SmallWeb for a while.) ;)
Videos, etc., at: https://twitter.com/aral/status/1345734023658745866
@aral The limits are set by Mastodon https://github.com/tootsuite/mastodon/blob/master/app/models/media_attachment.rb#L153-L157
I believe the reasoning for this is to prevent one instance from allowing 1GB videos while another allows only 5MB. Besides creating "special" instances, media would not fully federate between an instance that allows large videos and one that only allows small ones.
This way every Mastodon instance follows the same limits (as long as the admin didn't hardcode a change to those values).
@aral Obviously, allowing concurrent streaming of large videos is expensive and would drive up the cost for someone running a Mastodon instance.
I do think these limits make sense. Not sure if the values are ideal, but there should be a small limit to make it possible to run a Mastodon instance on a low-cost setup.
@mastohost Don’t disagree. I do wonder, however, if more processing could be done server side so that a < 30 second video captured on a modern mobile phone fits the limits (the issue is that modern phones have stellar resolution so they’d need to be resampled/recompressed).
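A quick back-of-the-envelope check on that point: the total bitrate budget for a clip is simply the size limit divided by its duration. The sketch below assumes a 40 MB video limit (the figure in the media_attachment.rb file linked above; treat it as an assumption, since the values may have changed) and a 30-second clip:

```python
# Back-of-the-envelope: what average video bitrate fits a clip under an
# upload size limit? Assumes a 40 MB limit and a 30-second clip, with a
# fixed 128 kbit/s reserved for audio. These figures are assumptions.

def video_bitrate_budget(limit_bytes, duration_s, audio_kbps=128):
    """Return the max average video bitrate (kbit/s) that fits the limit."""
    total_kbps = limit_bytes * 8 / 1000 / duration_s
    return total_kbps - audio_kbps

budget = video_bitrate_budget(40 * 1000 * 1000, 30)
print(round(budget))  # → 10539 kbit/s
```

Roughly 10 Mbit/s of headroom is far more than a 720p H.264 encode typically needs, which suggests a short phone clip would comfortably fit the limit once resampled server-side; the bottleneck is the phone's original high-resolution encode, not the limit itself.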
@aral It would make sense, at least for uploads done via the web interface of Mastodon. Other software could probably handle that compression client-side.
Having said that, I have spent so many hours dealing with ffmpeg issues...