Okay, here's what I'm going to do, or at least try!

I will set up another one or two servers for Sidekiq. (Sidekiq is the part of Mastodon that handles the jobs, think of all the actions in the network.)

Currently we are about 100k jobs behind and I'm afraid one server just won't do…

If anyone has any experience with this, could you give me a few tips? I know I'll get it fixed, but with some tips it could go a lot faster!

Thank you!

@stux An extremely noob question - what exactly are these jobs? Is something being triggered on the server every time traffic comes in?

@Psyborg There are no noob questions, it's good you ask!

Every action on the platform, like a follow, a toot, or a boost for example, is a 'job' for Sidekiq, the software that handles these things. 🙂

@stux Like via an API? I know there are no noob questions, and thank you so much for not making fun! I'm a C dev myself and always on the lookout for new things to absorb. Is there a Sidekiq manual I can look at so I don't have to bug you?

@Psyborg Check out this awesome documentation! I think it has everything you want to know 🙂
docs.joinmastodon.org/

@stux Thanks much! Another question - do you own this instance?

@Psyborg I do not own the instance that you are on; that would be @Gargron.

@stux @Gargron Ahh yes. So then how is it that you seem to be maintaining the jobs? Oh, is that for a client you designed?

@Psyborg @stux Sidekiq jobs = background task processing. Can’t do slow things in the HTTP request/response cycle.
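
A minimal sketch of that idea, using Sidekiq's generic Ruby API (the class name and argument here are made up for illustration, not Mastodon's actual workers):

    require 'sidekiq'

    class DeliverStatusWorker
      include Sidekiq::Worker

      # runs later on a Sidekiq thread, outside the web request
      def perform(status_id)
        # slow work such as fan-out and federation would happen here
      end
    end

    # the web request only enqueues the job and returns immediately
    DeliverStatusWorker.perform_async(123)

The web process just pushes a small payload into Redis; a Sidekiq process picks it up later and does the heavy lifting.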

@Gargron @Psyborg

Is there any guide on how to set up Sidekiq, streaming, and Puma on different servers? I get most things, but I want to be sure.

@stux @Psyborg Well, it’s the same setup as usual but you only enable one type of service, and instead of installing new databases you just make it connect to the existing ones.

@Gargron @Psyborg Thanks for the clarification! So, for example: create a new Sidekiq server, copy the env, but point the hosts to the ‘main’ server for Postgres and Redis? What about opening ports on both servers? Would this be secure?
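
A minimal sketch of that arrangement, assuming the usual Mastodon .env.production variables (the addresses are placeholders): the extra box gets a copy of the env file, with the database and Redis hosts pointed at the machine that already runs them, and only the Sidekiq service enabled there.

    # on the new Sidekiq-only server, in .env.production
    DB_HOST=10.0.0.10      # existing Postgres server (placeholder address)
    DB_PORT=5432
    REDIS_HOST=10.0.0.10   # existing Redis server (placeholder address)
    REDIS_PORT=6379

Postgres and Redis then need to accept connections from that box, which is exactly the port/firewall question above.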


@stux @Gargron @Psyborg Before scaling to multiple servers, make sure you have properly configured the number of workers and threads to handle the load. A different server will obviously add some latency.
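
A minimal sketch of the knobs that advice refers to, with example numbers rather than recommendations:

    # Puma (web), in .env.production
    WEB_CONCURRENCY=2    # Puma worker processes (example value)
    MAX_THREADS=5        # threads per Puma worker (example value)

    # Sidekiq threads, also in .env.production
    DB_POOL=25           # database connections; should cover the Sidekiq concurrency

    # Sidekiq concurrency itself is the -c flag on the command that starts it
    bundle exec sidekiq -c 25

How high those numbers can safely go depends on the CPU and on how many connections the Postgres server allows.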
