If you ever had to teach anyone anything, properly teach, you would know it’s a myth. It’s self-explanatory to you because you’re already familiar with the logic, language, and conventions. I’m guessing you grew up with all that from childhood, you just forgot how you had to learn it, and now you assume this knowledge never needed to be taught. You think a cog is a universally understood symbol for settings because you always had it in front of you. Just like a lot of people think (or thought) that a 3.5″ floppy is a universally understood icon for “save”, while people who grew up more recently have no idea what I am talking about.
And then you assume that you are the average person, and start measuring everyone by this mark.
But if several years of teaching people of different skills, motivations, and ages how to work with computers taught me anything, it’s that there is no universal language, there is not and cannot be anything self-explanatory, and the “intuitive interface” is a myth perpetuated by people who never used anything other than the one OS they grew up with. There is no amount of skeuomorphism you can employ that doesn’t require at least some amount of learning.
And when it comes to learning, let me tell you, there is nothing more straightforward to teach than “you type words and then read what the computer types back.”
And if several years of tech support taught me anything, it’s that when a regular person who doesn’t care about computers runs into a problem, they don’t have an inherently better time fixing it with a GUI. Never, not at all, not in a million years. I, however, always have a far easier time helping them if it’s Linux and I can tell them what to type and they can read me the response. This is actually true even when people are good with computers and know their OS.
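A hypothetical exchange over the phone might look something like this (the specific commands and output here are made up for illustration; the point is the interaction model, not these particular commands):

```
# Over the phone: "type this, then read me the last line it prints"
$ ping -c 3 1.1.1.1
--- 1.1.1.1 ping statistics ---        # (earlier output trimmed)
3 packets transmitted, 3 received, 0% packet loss, time 2004ms

# "okay, now this one, and read me what's in the STATE column"
$ nmcli device status
DEVICE  TYPE      STATE        CONNECTION
wlan0   wifi      disconnected --
lo      loopback  unmanaged    --
```

There’s nothing to describe visually and nothing to hunt for on screen; the person just types and reads back, which is exactly why it’s easier to support remotely.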
It’s self-explanatory to you because you’re already familiar with the logic, language, and conventions. I’m guessing you grew up with all that from childhood…
This argument can be used as a reason to implement GUIs.
If we wish to market to an audience that has had some basic experience with using Windows and Mac, we can skip some of the reteaching by implementing familiar GUIs.
Most people didn’t grow up with Windows or Mac; that was a blip in time. Most people grew up with a phone. When it comes to a PC they’re a blank slate: they have about as much familiarity with the idea of a Windows start menu as with a Linux console. That is to say, they saw it in a movie.
Most people do know how to use a computer, though. Windows and macOS have been around for a very long time by now, and neither has required you to touch the CLI for anything but very extreme cases in more than 25 years. You’re not starting with a blank slate. They know how a GUI is supposed to work. It is self-explanatory to them. Shoving them towards a CLI is making them relearn stuff they already knew how to do. There’s a reason a lot of Windows migrants end up on KDE or Cinnamon: it’s familiar, it’s easy. Most people do in fact associate a cog with settings. CLIs aren’t familiar to most people and are thus a much larger hurdle.
Also, I’m not talking about fixing problems. The CLI is a perfectly valid tool to fix problems. Not everything has to be graphical. Just enough that you don’t need it unless something breaks.
That was kind of true for a brief period of time, and even then it wasn’t entirely true. Now most people first encounter a computer when they enter the workforce. They know shit about shit; they never had to tinker with computers, and most of them never had one beyond some Chromebook that could render two web pages. In most cases they start from a basically blank slate.
Most people do in fact associate a cog with settings.
Most people don’t know that it’s a cog. Most people don’t know it’s a button. Most people don’t have the concept of a button in mind. Most people entering the workforce right this moment have never used a mouse to press a cog button in their life, unless they’re in IT or engineering.
Also, I’m not talking about fixing problems
This is usually when you’re kind of required to use the console on Linux; that’s why I was talking about it.
But my broader point was against the so-called intuitive, self-explanatory nature of menus you have to click through with a mouse.
Of course they know how to use a computer. They don’t know a thing about how a computer works but that doesn’t mean they can’t use it. Heck, my 8 y/o cousin can figure out how to open and play Minecraft on his tablet. No need for him to know about commands, programming languages and bits n bytes.
Most people these days know how to use their phones, at the very least, and even there cog = settings. Most people don’t know how to use a CLI or how a spreadsheet program works, but they certainly can use a browser on a computer. Which is also a form of using a computer.
And maybe they don’t explicitly know it’s a button. But they know if they tap or click on a cog it takes them to settings.
And even figuring out how a mouse works takes a few seconds if all you’ve used before was a touchscreen (or even nothing at all). There’s a reason they took off in the first place.
Although, if someone truly has never used a computer in any shape or form before (no smartphone, no tablet, not even a smart TV), you’d probably have a point that it’s not much more difficult for them to learn the common iconography than it would be to learn the CLI. But people rarely start with such a blank slate today.
Don’t get me wrong, I don’t think it’s a good thing; people are less and less tech literate these days. But my point is, tech illiteracy doesn’t mean they have never used any computer ever and don’t know what an app or settings icon is. I’d wager it’s more the other way around: people are so used to their devices working and their UIs looking pretty (and very samey) that iconography like cogs for settings is especially self-explanatory to them. It’s the same on their phone, tablet, and even TV, after all.