We learn and teach inferior personal computing practice, and most people don’t realize how much they are missing.
The vast majority of people outside of enthusiast circles have absolutely no idea what a personal computer is, how it works, what an operating system is, what it does, and how it is supposed to be used. Instead of teaching about shells, sessions, environments, file systems, protocols, standards and the Unix philosophy (things that actually make our digital world spin), we teach narrow systems of proprietary walled gardens.
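For a concrete sense of what those fundamentals look like, here is a minimal POSIX shell session touching the shell, the environment, and the file system. It's just an illustrative sketch; the variable name and file contents are made up:

```shell
# A tiny tour of shell + environment + file system basics.
echo "running under: $SHELL"       # the shell is just another program
export GREETING="hello, world"     # the environment: variables inherited by child processes
printenv GREETING                  # a child process (printenv) sees the exported variable
workdir=$(mktemp -d)               # the file system: everything lives in a tree of files
printf 'plain text, no app needed\n' > "$workdir/note.txt"
cat "$workdir/note.txt"            # a file is just bytes; no vendor lock-in required
ls -l "$workdir"                   # files have owners, permissions, sizes
```

Nothing here depends on any particular distribution or desktop; it is exactly the kind of vendor-neutral literacy the post is arguing for.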
This makes powerful personal computing seem mysterious and intimidating to regular people, so they keep opting out of open infrastructures, preferring everything to come pre-made and pre-configured for them by an exploitative corporation. This lack of education is precisely what makes us so vulnerable to tech hype cycles, software and hardware obsolescence, or just plain shitty products that would have no right to exist in a better world.
This blindness and apathy make our computing more inaccessible and less sustainable, and they make us crave things that don’t actually deserve our collective attention.
And the most frustrating thing is: proper personal computing is actually not that hard, and it has never been easier to get into, but no one cares, because getting milked for data is just too convenient for most adults.
Completely agree. Now my hot take for this thread:
If governments back in the 90s had decided from the start to ban computer hardware from being sold with pre-installed software, we wouldn’t have this problem. If everyone had to install their own operating system from scratch, which like you say isn’t hard if it’s taught, it would have killed the mystery around computing, and people would feel ownership over their computers and their computing.
I think the main issue is that learning how every single component in a computer works would take an enormous amount of time and dedication; you cannot just inspire interest in people to learn about something they are completely uninterested in.
You may see others as blind, careless individuals that want to get their data milked, but we all have to make sacrifices for convenience. We just cannot be interested in every single thing.
At a societal level, we cannot all (and shouldn’t all) know what the Unix philosophy is and what it represents for software design.
That being said, I do agree with the main point about being taught inferior PC practice. Education in the schools I attended was mostly done via rote learning, rather than explaining which problems and situations the tools we have created are meant to solve.
Given the importance of computers in our time, isn’t it only proportionally justified to spend an enormous amount of time and dedication in teaching it properly?
Only computer nerds think this way. People have a finite time and capacity for learning, and if computers can serve their needs without consuming a large fraction of that precious resource, it would be terrible to mandate such an expenditure anyway.
I wish we could all be completely educated and independent in every way that matters, but it’s not possible.
This is why people on lemmy are confused about the lack of adoption. Federation is genuinely confusing and subtle; we’re mostly just dorks with the predisposition to get it.
I too have to watch myself to keep from falling into the hole of blaming the dumbing-down of computing systems on a moral failure of users. It is not.
I might have phrased my thought too bluntly: I never intended to frame the problem as any sort of moral failure on the end users’ part. I view this as a failure of our educational institutions.
In my mind, preferring to spend time on (e.g.) MS Office in class, instead of teaching proper computer literacy, is like trying to teach meal-prep with Philips air fryers instead of teaching how to cook.
I hear you, and I too feel like it might just be my aspie-nerdiness speaking, but the same argument could be made about any subject that is considered fundamental to high-school ed. We don’t skip philosophy, sciences, languages and arts just because they seem less applicable than math or econ, or because “it’s impossible to learn everything”.
Our civ made progress by inventing a fundamentally new tech that is accessible to everyone and now underpins everything. Letting people acquire the basic literacy needed to interface with this tech sustainably is the bare minimum we should be doing. I am not talking about turning kids into cyber wizards - just getting their computing up to a level that allows them to make relevant, informed choices.
I’m totally with you. I just think the level of informed choices that we nerds seek will not be attainable through a reasonable gen ed curriculum. It would be an improvement, though!
I was playing devil’s advocate to a degree there, because I was interested in how the person I replied to would respond.
I don’t think it needs to be as intensive as that; a small amount of education would go a long way. Teaching school classes how to install an operating system on a blank machine, say, as a basic entry point - that would do wonders for building a basic appreciation of ownership over computing.
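As a sketch of what that entry-point lesson might cover: preparing install media is essentially one copy command. The demo below copies a stand-in file instead of a real ISO and USB device, so it is safe to run anywhere; the device path in the comment is a placeholder, not a real target.

```shell
# Real-world form (destructive! /dev/sdX is a placeholder for the USB device):
#   sudo dd if=distro.iso of=/dev/sdX bs=4M status=progress
# Safe classroom demo: the same dd copy, but file-to-file.
printf 'pretend-installer-image' > /tmp/demo.iso       # stand-in for a real ISO
dd if=/tmp/demo.iso of=/tmp/demo-usb.img bs=4k 2>/dev/null
cmp -s /tmp/demo.iso /tmp/demo-usb.img && echo "byte-for-byte copy OK"
```

The point of the exercise is demystification: the "magic" of making a bootable stick is just a byte-for-byte copy, which students can verify themselves with `cmp`.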
I think the other user already replied with what I would have said: we have a finite amount of time, and we are seeing things from a computer-centric perspective.
I do agree that computer literacy is incredibly important, and people should have the means to know how to properly operate the things they use on a daily basis. But we could make the exact same argument about a myriad of other things - take, for example, interpersonal skills, or even emotions. We barely go over them in most educational systems, and something as simple as communication is one of the biggest bottlenecks you can find at work. I’ve personally seen big projects go down in a big ball of fire all because people miscommunicated or because someone couldn’t control their emotions.
As a TL;DR, we have more pressing issues as a society.
Hopefully we can continue moving forward as a society, though, and get better education in more aspects. I’ve been a teacher in the past, and I can tell you that some students are really hungry for knowledge. So not all hope is lost in that sense.
There is a middle ground for sure. Installing an OS sounds like a solid unit in such a curriculum.
How does one learn this? The way it’s taught now, people don’t know what they don’t know. What are good starting resources?
I am not a professional educator, but in general I think it is worth starting with basic computer literacy: identifying the parts of a PC, being able to explain their overall functions, the difference between hardware and software, and what kinds of software a computer can run (firmware, operating systems, user utilities, etc.). This would also be a perfect time to develop practical skills, e.g. (assuming you are a normatively-abled person) learning to touch-type and performing basic electronics maintenance, like opening your machine up to clean it and replace old thermal compound.
After that, taking something like “Operating systems fundamentals” on Coursera would be a great way to continue.
It really depends on your goals, resources and personal traits, as well as how much time and energy you can spare and how you like to learn. You can sacrifice an old machine, boot Ubuntu and break it a bunch of times. You can learn how to use virtualization and try a new thing every evening. You can get into ricing and redesign your entire OS GUI to your liking. You can get a single-board computer like a Raspberry Pi and try out home automation.