Principles of UI, A Thread:
1. natural mapping
2. visibility of system state
3. discoverability
4. constraints and affordances
5. habits and spatial memory
6. locus of attention
7. no modes
8. fast feedback
9. do not cause harm to a user's data or through inaction allow user data to come to harm
10. prefer undo to confirmation boxes. For actions that can't be undone, force a "cooling off" period of at least 30 seconds (see the sketch just below).
11. measure using Fitts's Law, Hick's Law, GOMS, etc., but always test with real users.
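
As a rough illustration of item 10, here is a minimal TypeScript sketch of a cooling-off period: the destructive action is scheduled rather than run immediately, and a visible countdown doubles as the undo affordance. The names here (executeIrreversible, onCountdown) are hypothetical, not from any particular toolkit.

```typescript
const COOLING_OFF_MS = 30_000; // at least 30 seconds, per item 10

// Schedules an irreversible action and returns a cancel function.
// The caller shows a persistent "Deleting in Ns... Undo" banner instead of a confirm box.
function executeIrreversible(
  action: () => void,
  onCountdown: (msLeft: number) => void,
): () => void {
  const deadline = Date.now() + COOLING_OFF_MS;
  const tick = setInterval(() => onCountdown(Math.max(0, deadline - Date.now())), 1_000);
  const timer = setTimeout(() => {
    clearInterval(tick);
    action();
  }, COOLING_OFF_MS);
  return () => {
    clearTimeout(timer); // "Undo": nothing has actually been destroyed yet
    clearInterval(tick);
  };
}

// Hypothetical usage:
const undo = executeIrreversible(
  () => console.log("account deleted"),
  msLeft => console.log(`deleting in ${Math.ceil(msLeft / 1000)}s, press Undo to cancel`),
);
// undo(); // called if the user clicks "Undo" during the cooling-off window
```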

12. don't assume that your skills or knowledge of computers as a designer or programmer in any way resemble the skills or knowledge of your users.

13. Consider the natural order of tasks in a flow of thought. Verb-Noun vs. Noun-Verb. Dependency->Dependants vs. Dependants->Dependencies.

14. Instead of having noob mode and advanced mode, use visual and logical hierarchies to organise functions by importance.

15. Everything is an interface, the world, learning new things, even perception itself

16. Consider the psychology of panic. Panic kills scuba divers, panic kills pilots, panic kills soldiers, panic loses tennis matches. Panic leads to stupid mistakes on a computer.
more at: asktog.com/columns/066Panic!.h

17. Consider the 3 important limits of your user's patience:
0.1 second, 1 second, 10 seconds

nngroup.com/articles/response-
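
To make item 17 concrete, here is a small sketch of how those limits might drive feedback, assuming hypothetical UI hooks (showSpinner, showProgressBar, hide): below roughly 0.1 s no feedback is needed, past roughly 1 s a busy indicator keeps the flow of thought, and anything approaching 10 s needs progress and ideally a cancel button.

```typescript
// Hypothetical feedback escalation based on the 0.1 s / 1 s / 10 s limits.
async function runWithFeedback<T>(
  task: Promise<T>,
  ui: { showSpinner: () => void; showProgressBar: () => void; hide: () => void },
): Promise<T> {
  // Under ~0.1 s the result appears before any indicator is worth showing.
  const spinner = setTimeout(ui.showSpinner, 1_000);       // ~1 s: flow-of-thought limit
  const progress = setTimeout(ui.showProgressBar, 10_000); // ~10 s: attention limit
  try {
    return await task;
  } finally {
    clearTimeout(spinner);
    clearTimeout(progress);
    ui.hide();
  }
}
```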

18. An interface whose human factors are well considered, but looks like butt, still trumps an interface that looks slick but is terrible to use. An interface that is well considered AND looks good trumps both, and is perceived by users to work better than the same exact interface with an ugly design.

19. Don't force the user to remember things if you can help it. Humans are really bad at remembering things. This includes passwords, sms codes, sums, function names, and so on. My own personal philosophy is to consider humans a part of your system, and design around our shortcomings instead of thinking of users as adversaries. Software should serve humans, humans shouldn't serve software.

20. Some Sources:
Donald Norman
Jef Raskin
Jakob Nielsen
Bruce "Tog" Tognazzini

I recommend all the talks by Alan Kay and Bret Victor, here's two:

Doing with Images Makes Symbols
youtube.com/watch?v=p2LZLYcu_J
The Future Of Programming
youtube.com/watch?v=8pTEmbeENF

The first 8 items of this thread are extremely terse, to the point of being meaningless on their own. Please use them as search terms, or ask me to expand on them when my dog isn't barking at me to go to bed.

21. Gall’s Law:
A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system.

22. show, don’t tell. lengthy tutorials and “protips” forced on the user at app start usually do nothing other than get in the way of the user’s task. if you want to teach the user about a feature, include easy to find examples.

23. don’t interrupt flow of thought. if a user is opening an application, they usually have some specific task to complete. nagging them at this point in time about software updates or handy tips is very user hostile.

24. many jokes are made about the “save” icon looking like a floppy disk. it’s very appropriate, since the command as a concept is built around the technological limits of floppy disks, limits that are comically irrelevant in the 21st century. drag your app out of the 1980s and implement autosave and version control already.
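
A minimal sketch of what item 24 asks for, assuming a hypothetical writeToDisk persistence function: every edit schedules a save, debounced so the file is written shortly after the user stops typing, and each write could also append a snapshot to a version history.

```typescript
// Debounced autosave: no explicit "save" gesture, so nothing to forget.
function makeAutosaver(writeToDisk: (doc: string) => Promise<void>, idleMs = 2_000) {
  let pending: ReturnType<typeof setTimeout> | undefined;
  return (doc: string) => {
    if (pending !== undefined) clearTimeout(pending);
    pending = setTimeout(() => {
      void writeToDisk(doc); // could also push `doc` onto a version history here
    }, idleMs);
  };
}

// Hypothetical usage: call on every edit; the document is persisted ~2 s after the last keystroke.
const autosave = makeAutosaver(async doc => { /* write doc somewhere durable */ });
autosave("draft text");
```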

25. consistency consistently consistent. there are few things more fun than designing your own custom ui widget toolkit, css framework, or interaction paradigm. however, please strongly consider *not* doing this. custom UI is like ugly baby photos. instead, stick as closely as you can to the HIG and conventions of the platform you are on, so users can use what they’ve already learned about where things usually are, and what the fuck the weird molecule icon does.

26. try to imagine ways to use your shiny new software to abuse, harass, stalk, or spy on people, especially vulnerable people. ask a diverse range of people to do the same.
then fix it so you can’t. if you cannot figure out how to do your special software thing without opening vulnerable people to abuse, consider not making it available to anyone.

27. UX is ergonomics of the mind (and also body). Where traditional ergonomics considers the physical abilities and limits of a human body, UX considers the limits of the human mind: attention, memory, response time, coordination, emotions, patience, stamina, knowledge, subconscious, and so on. If you ever find a UX practitioner sacrificing accessibility on the altar of so-called “good experiences”, you are dealing with incompetence.

expanding on 1. Natural Mapping:
user interfaces typically “map” to the system they control, each button and dial corresponding to some element of the system. Natural mapping is when the interface forms an obvious spatial relationship to the system, such as 4 stovetop dials that are in the same arrangement as the stovetops. the anti-pattern is arranging controls in an arbitrary order with no spatial correspondence to the system.

2. Visibility of System State:
Software typically has state (to state the obvious), such as “where” you are in the software’s menu system, what “mode” you are currently in, whether your work is safely stored on disk or has “unsaved changes”, what stage of a process you are up to and how many steps are left. Failure to effectively communicate system state to the user is inviting them to get lost and make mistakes. counterexamples: setting the time on a digital wrist watch, programming a VCR.
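
As one small, browser-flavoured illustration (not from any of the sources above, just an assumed convention): keep a visible “unsaved changes” marker, and don’t let that state vanish silently with the window.

```typescript
// A visible "unsaved changes" cue, plus a guard against silently losing that state.
let dirty = false;

function markEdited(): void {
  dirty = true;
  document.title = "• Untitled"; // leading dot: work is not yet on disk
}

function markSaved(): void {
  dirty = false;
  document.title = "Untitled";
}

window.addEventListener("beforeunload", e => {
  if (dirty) e.preventDefault(); // browsers then ask before closing the tab
});
```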

3. Discoverability
this is about making the possible actions in a system visible, or, if not immediately visible, making the mechanism of their discovery visible and consistent. For instance, the menu items in a GUI system are discoverable; the available commands in a unix system are not. the opposite of this principle is “hidden interface”, and examples of hidden interface are rife in iOS: tapping the top of the screen for “scroll to top”, shake to undo, swipe from the edge for browser back, etc.

4. Constraints and Affordances.
A constraint is something that is not possible in a system. an affordance is something that is possible to do. which is which should be communicated clearly, and the nature of this communication breaks down into three subcategories:
a. physical:
visually obvious from the shape of objects in a system. two lego bricks can only snap together in a limited number of ways.
b. logical: what’s possible or not makes sense logically, e.g. color coding.
c. cultural: learned conventions, e.g. red means stop.

constraints and affordances are at the heart of the “flat design” vs. “skeuomorphism” debate. the benefit of skeuomorphic interfaces is that replicating the look of real-world objects, like buttons, provides a natural way to communicate interactions. where skeuomorphism went wrong is communicating false affordances: a detail in the iOS 6 calendar app hinting that pages could be torn out, when no interaction supported it.
flat design throws the baby out with the bathwater. we still need real buttons.

5. Habits and Spatial Memory
this is mostly about not arbitrarily moving buttons around in an interface. people are creatures of habit, and if you fundamentally change the method of performing a task for no good reason, it’s not a “UI revamp”, it’s pointlessly frustrating your existing users.
for spatial memory, millions of years of evolution have left us with mental machinery for remembering exactly *where* something is physically. you can take advantage of this in UI with persistence of space.

an example of this persistence of space concept is the meticulous way some people curate their phone’s launch screens. even better would be if iOS allowed a different wallpaper for each page, and allowed icon grids to have gaps anywhere instead of forcing icons to sort left to right, top to bottom. the different look of each screen could then be very personal and memorable. Finding an app would then be a matter of finding the page with the right color and shape.

6. Locus of Attention
this is a recognition of the fact that human consciousness is single-threaded: while parallel processes permit us to do things like walk and chew gum at the same time, there is only one thread of processing that represents our conscious awareness. therefore, interfaces that expect our attention to be fully present in the status bar, the cursor, the flashing banner ad, the address bar, the lock icon, the autoplaying video and the notifications, all at once, are misguided.

7. No Modes
A Gesture is an action (a keystroke, a mouse move) expected to result in some effect (a letter being added to a document, a cursor moving).
A mode changes the effects associated with some or all gestures. caps lock is a mode. “apps” are modes. Modes are bad if they result in modal error: the unawareness that a mode has been activated, resulting in unexpected effects, and possibly unawareness that it *is* a mode, or how to get out of it. Vim is a prime offender. so are modern TVs.

modes are typically employed when the number of functions in a system far exceeds the number of available external controls. this can happen either as a result of featuritis, or an apple-esque fetish for small numbers of buttons.
suggested remedies include quasimodes like the shift key, that activate a mode only while a button is being held down. another approach is developing composable UI conventions like GUI menus, or search, that can scale without modes.
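
A tiny sketch of the quasimode idea, using a common but purely illustrative example: holding Space turns a drag into a pan. The mode exists only while the key is physically held, so releasing it can never leave you stuck in a mode you don’t know about.

```typescript
// Quasimode: a mode that exists only while a key is held down.
let panQuasimode = false;

window.addEventListener("keydown", e => {
  if (e.code === "Space") panQuasimode = true;  // mode on only while Space is held
});
window.addEventListener("keyup", e => {
  if (e.code === "Space") panQuasimode = false; // released: the mode ends by itself
});

function onDrag(dx: number, dy: number): void {
  if (panQuasimode) {
    // pan the canvas by (dx, dy)
  } else {
    // normal drag: select or move objects
  }
}
```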

another way of looking at this is examining how much context a user needs to understand what effect a gesture will have, and how effectively that context is being communicated. Can I write a step-by-step guide to doing a task on a computer, for a computer novice, that doesn’t include first determining where in the operating system you are, whether the correct application is open, and figuring out which of the many methods of getting into that application is applicable in that situation?
no.

this is what was nice about the “home” button on iphones: it doesn’t matter where you are in the system, there’s a physical hardware clicky button that will always bring you back to the start, and it cannot be overridden by third-party software.
apple ruined it with the iphone X swipey home gesture. not only is it hidden interface, but it’s modal now: which edge you swipe depends on the orientation sensor, and is sometimes, but not always, visually indicated by a line that is maybe correct.

@nindokag thanks. it’s basically my Cliff’s Notes from doing a lot of reading. hmm, I try my best to give credit where credit is due. hmm, #17. is an expansion of #8. oops. I should expand on the first 7 as well.

@zensaiyuki Can you expand on what you don't like about modern TVs, so I can avoid replicating it in my 3rd browser?

For the record I don't personally have one, and when my family give me the remote to theirs I typically go to the browser or a USB stick I've just plugged in. Hence why I want to make my own smart TV browser.

I think you can guess I don't like the distribution channels for mainstream entertainment...

@alcinnz it’s not a matter of personal “like” or “dislike”, it’s watching my grandmother’s personal triumph over finally mastering the ability to control one. On the road toward that, a common episode was accidentally pressing the wrong button, which put the tv into some mode, leaving her with no frame of reference for how to get out of it. enough episodes like this can make people afraid to touch the remote control at all.

@alcinnz i mean, if you’ve never had a problem figuring these things out it can be very hard to empathise, but remote controls have a *lot* of cryptically labelled buttons. if you don’t already know what every one of them does, it’s impossible to know which to press in any of dozens of possible situations. it’s a bit of a bandaid, but one of the best things apple phones used to have was a “home” button, which would always do the same exact thing no matter when you pressed it. pissed it’s gone.

@zensaiyuki I think it's really important to consider your audience. Not all of these principles apply to all software/people; for instance, tools made for text editing (due to the fact that editing and composing typically have different mental models) and graphic design (e.g. Blender, due to the sheer amount of different operations necessary) often benefit from modality. The fact that you said "Vim is a prime offender" while Vim-like modes are baked into almost every editor/IDE by popular demand highlights this.

There should be a zeroth rule: "Frequently scope out your target user-base and use-cases. Understand what unlisted rules apply to them, and what listed rules don't". These things might change over time; this can be a good or bad thing to be embraced or curtailed, hence the word "Frequently".

I do agree that modes need to be obvious to the user. In Neovim (or Vim with the right default settings), for instance, the cursor appears as a line rather than a block when in Insert mode to minimise this error; the mode can also be shown in the statusline.

I think that too much software is made for "everyone"; it'd be better for developers to be able to separate users' strong/weak preferences and absolute needs (A11y is an absolute need; the human body doesn't support many upgrades). Devs can try to cater to all the absolute needs possible, and target only a subset of users with particular preferences.

@Seirdy i accept no defense of vim. some people might prefer driving a car blind and using only a radar screen because it makes them feel like a powerful ship captain. however, we live in a society. intentionally crippling yourself (and others) with inferior senses for a preference that makes you *feel* powerful is not a very socially responsible behavior.

@zensaiyuki A critical part of design is empathy. Users aren't all the same, and condescendingly asserting that what works best for them is just a childish ego-booster without understanding where they're coming from doesn't sound like a productive course of action. I say this as someone who shares this problem.

I tend to lose my train of thought easily. Time is not a fungible commodity; the seconds I have as I need to get an idea down are more valuable to me than the minutes I spend learning how to use my tools each week.

I'm not interested in feeling powerful, or using whatever new thing someone else thinks is best for me. If that were the case, I'd use something like a sci-fi interface. I'm interested in something that lets me edit at the speed of thought on a very low-end device. I've used Emacs, Atom, VSCode, IntelliJ, Gedit, Leafpad, Kate and Kdevelop, Anjuta, Geany, Spyder, and ScITE for extended periods of time over the past seven years. The modal keyboard-driven editing in Vi (and for certain tasks, /bin/ed) made me *less* crippled.

I can personally guarantee that I am not the only one who has come to this conclusion. I also know that vim bindings aren't for everyone; the investment it takes to learn them isn't always worth it.

@zensaiyuki I feel like there could be a lot of benefit in more seamlessly integrating the terminal and GUI interfaces to the OS. What's lacking is a one-to-one correspondence between terminal and GUI behavior. Another thing is that the terminal itself could be designed with menus to perform actions, which it then automatically populates as a text command. E.g. you go to the terminal and choose copy, it pops up a box asking for source and destination, and autopopulates the terminal with e.g. (cont)

@zensaiyuki 'cp someFile someNewFile', then shows the user the copied file in the destination. Or something like that. It could be integrated into the file manager or something.
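
A minimal sketch of that idea, assuming Node's readline and child_process; the menu flow itself is hypothetical, but the point is that the GUI-style prompt still produces a visible, ordinary shell command before anything runs.

```typescript
import { createInterface } from "node:readline/promises";
import { execFileSync } from "node:child_process";

// Prompt for the pieces of a copy, show the composed command, then run it.
async function menuCopy(): Promise<void> {
  const rl = createInterface({ input: process.stdin, output: process.stdout });
  const source = await rl.question("Copy which file? ");
  const destination = await rl.question("Copy it to where? ");
  rl.close();

  const argv = ["cp", source, destination];
  console.log(`About to run: ${argv.join(" ")}`); // the GUI action stays visible as text
  execFileSync(argv[0], argv.slice(1), { stdio: "inherit" });
}

void menuCopy();
```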

@unspeakablehorror there’s a very nice model (yet to be fully implemented by anyone) for integrating gui and cli interfaces described in Jef Raskin’s “The Humane Interface”, one of my sources for this thread. tiny parts of the idea have been integrated into OS X Spotlight.

@unspeakablehorror his son Aza Raskin implemented other parts in Enso and Ubiquity, two now abandoned projects.

@unspeakablehorror a slightly different part of the idea can be seen in interactive “notebook” systems like Mathematica and Jupyter.

@zensaiyuki Just seeing this again, and it is an excellent and illuminating summary.

Yes, this.

UI/UX

@zensaiyuki I often encounter such things when dealing with IRC advocates. IRC — and by extension their defense of it — can be so blatantly harmful to newcomers…

@KekunPlazas that’s a good point that could explode out into 1000 points about community management and design, which i wouldn’t want to make because, while i know some things, i wouldn’t want to present myself as an expert.

@zensaiyuki You know who I'd complain about here?

Thankfully the W3C are trying to improve... But I'm significantly more willing to break backwards compatibility for the sake of privacy & security.

@zensaiyuki Also I did have to defend this UX principle. Sure there's tradeoffs to consider and we might miss things, but fundamentally too many (mainstream?) developers don't even consider it. We tend to be too optimistic!

@alcinnz there’s an episode of Star Trek: The Next Generation where they unfreeze a bunch of cryogenically frozen people from the 20th century. One of them goes straight to treating the ship’s intercom as something like room service or a flight attendant request. The captain gets irritated, and the 20th century guy asks “if you didn’t want me to abuse the system, why didn’t you put access controls in place?”

captain says “because in the 23rd century, everyone knows not to abuse the system.”

@alcinnz i think about this a lot with relation to things like CAPTCHA. it’s gonna be an eternal cat and mouse game with bots until the day that there’s no economic incentive to defeat CAPTCHA, and the only way to get there is a radical overhaul of the global culture.

@alcinnz on the other hand, this approach is comically naive for a ship navigating unexplored and potentially hostile territories. the ship’s computer being hijacked because the captain’s admin password was “passw0rd” seems to be a regular plot device.

@alcinnz like the stories i’ve heard about former soviet states: after the fall of communism, lots of people just floundered, literally having no idea how to do cutthroat, dog-eat-dog capitalism.

@zensaiyuki there is a cool dutch word for this which we repeatedly used in my design ethics class: "hufterproof" which means something like asshole/jerk/vandal-proof.

Every session we would think of ways to abuse our design and then think of changes to prevent them.

@zensaiyuki I do try to adhere to this for Odysseus (in bridging between an elementary OS & Web experiences), but for my other browser(s) I'm guilty of all three.

But really that's in the name of giving webdevs something even cooler to play with: Building webpages that work great on absolutely any device and/or OS!

@alcinnz i wouldn’t begrudge anyone their fun or their experiments trying to push the state of UI forward. brace for failure though. hopefully the “ugly baby photos” metaphor makes sense. years ago I took a photography class that forbade pet photos. Your pets always look beautiful to you, but that’s not enough to make a photo that’s interesting for everyone else to look at.

@zensaiyuki Well, success to me looks like webdevs building more accessible pages. For it to be feasible for others to build their own, simpler, browser engines to display those pages. And for The Web to better protect users' privacy!

What I really care about are the deeper architectural issues, which I may still fail at.

@alcinnz that’s fine, my comment wasn’t personal. you’re talking to someone who designs widget toolkits for fun and has a list of UI guidelines pinned to their social media. I’d be a hypocrite to dump on you for designing widgets for fun.

@alcinnz and well, more broadly, you’d have to be fairly resilient and inattentive to not have been occasionally annoyed by some electron app or cross-platform ui toolkit completely breaking the conventions of your os, perhaps even having non-standard keyboard shortcuts. or the morass of the linux/unix world where you’d be hard pressed to find any standards whatsoever. the trouble with quitting vim, for instance, isn’t that it’s hard to learn, it’s that it’s not just control+c or control+d.

@alcinnz ah! i just remembered the ui principle i was gonna post the other day and forgot. #32.

@zensaiyuki Yes, in discussing apps you will quickly find me saying that I highly prefer installing them from the elementary AppCenter. They do a great job enforcing standards (though there's an "uncurated" section)!

And how much I love & hate The Web is a huge motivation for me to write software!

Talking of which, someone please design elementary OS a nice instant messenger to tempt them off of Slack!

Oh, and you've tempted me to ask about your widget toolkits?

@alcinnz well, if they were in a stage I could talk about, I would. at this point it’s nothing more than a couple vague ideas and piles and piles of brain dumps in assorted notes applications.

@zensaiyuki And, custom widgets and interfaces are usually unusable for blind people, whose screen readers have to figure out what's going on. Do stick to the platform's own toolkits!

@zensaiyuki I wonder if the argument changes if you imagine the word "save" being replaced with the word "commit" (as in version control, or committing a transaction).

Seems to me that "saving" isn't as much a technological limitation as it is separation between transient and persistent states. It makes sense in many a context, for the same reason you don't want your project to be rebuilt on every keypress, or your friend's IM displaying your message letter by letter as you type it.

@temporal the “argument” is just a reiteration of “visibility of system state” and “don’t cause a user’s data to come to harm”. the problem with “save” is the mistakes the basic design causes, which lead to data loss. while what you’re saying has some sense to it, it’s not the word “save” that is the problem. it’s believing you’ve pressed it and that the save operation completed when it hasn’t.

@temporal and well, i myself have made that mistake plenty of times with git. or times i’ve committed but didn’t “push”.
