IMO one of the first steps when building tools and networks that aren't for-profit is to consider how the goal of profit (or perceived value) shapes how people are encouraged to behave. Because the popular platforms are commercial, expected behavior is largely informed by the pursuit of profit and the perception of value.
For example, maximum "engagement" is always desirable to corporations. When your goals and desires are different, the emphasis shifts, at least partly, toward the *quality* of interaction rather than sheer volume.
So it may seem counterintuitive to omit or modify expected features, but that's because the dynamics you want are different from what people have been conditioned to expect, regardless of whether what they've been conditioned to expect is good for them, or even remotely sensible by the standard of "is this a good way to do things?"
Whatever the cause, malice, disconnection, the desire for social control, there's a point where platforms like Twitter, FB, etc. shift toward a set of goals that have nothing to do with the quality of the user experience. We get conditioned to those environments because they're what we use. This is a long and rambling way of saying that imitation may be a form of flattery, but it shouldn't be the opening move when building alternatives.