I am trying to argue for getting a Unix/Linux course back into the computer science curriculum at my university, even as an optional course. ”Jimi really thinks or wants so” isn’t a good argument, so what pedagogical reasons are there, or anything else I could use to convince my university? For me Linux is so obvious that I’m having a hard time finding objective reasons.
I would really appreciate any help, and if you don’t know, I would really appreciate it if you decided to boost this.
So far I have heard that ”it isn’t necessary to know Linux if you are just a programmer”; that is the response I got last time I asked. But now they are designing a new curriculum, so there is a real possibility of getting it back.
My computer science student organization has allowed me to use their official channels for this, so it seems even more possible to achieve something.
Thank you everyone for your insanely awesome support, without it I wouldn't have been able to write the comment! 🤩 The message was written in Finnish, so I used machine translation and light proofreading to add the results in English here:
With your comments and help we managed to convince Finnish student union representatives, so that SYL (the National Union of University Students in Finland) will, as an organization, study the concepts of free software!
I am very hopeful that they will find it beneficial to students and begin to promote the practices to other organizations, like the Ministry of Education and Culture, which they meet with regularly.
@jimbo Nowadays people use containers everywhere to deploy software. Containers are Linux. If you want to do more than write software in an IDE, you'll need some basic Linux skills no matter what.
Same for CI/CD pipelines; these all run on or in Linux. If you want to do anything on IoT devices you'll most likely hit Linux. Basically anything that isn't your Windows desktop will force you to do some Linux.
@jimbo Then there is the point of using Linux for learning by example. If people want to learn how to code any kind of application, base it on a Linux system, download the corresponding source code for all the libraries, and show them how to dive into the code down to the syscall level and maybe beyond.
Take Raspberry Pis and do projects based on them. Take an OpenStack environment and let them build HA clusters. All stuff we did at my university.
@jimbo And of course learn the community aspects of things. Show projects, show how to work on and with projects. There is nothing that gets you into a job more easily than showing that you can maintain (large) free software projects.
And finally, from a pedagogical point of view it makes sense to teach concepts instead of products, and since Microsoft, Google, Apple and friends love to teach products, using Linux as the less familiar platform helps to focus on the concepts.
@jimbo You still have to learn git and version control, don't you?
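And git is most at home in a terminal anyway. A minimal sketch of the basic workflow (the repo name, file, and commit message are made-up placeholders; the `-c` identity flags are just there so the commit works on an unconfigured machine):

```shell
#!/bin/sh
# A minimal git session, entirely from the command line
mkdir demo-repo && cd demo-repo
git init -q
echo "hello" > README
git add README
# -c sets a throwaway identity for this one command
git -c user.name=student -c user.email=student@example.org \
    commit -q -m "first commit"
git log --oneline   # one line per commit: abbreviated hash + message
```

The same commands work identically over SSH on a server, which is exactly where the terminal habit pays off.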
How about CPU, memory, or I/O (hardware optimisation)? And networking?
All of this is easier to learn on Linux (mainly because it's open).
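As a small illustration of that openness: on Linux, CPU, memory, and load information is just plain text under /proc, readable with ordinary commands (the paths are standard, but exact field names can vary a bit between kernels and architectures):

```shell
#!/bin/sh
# The kernel exposes hardware and runtime state as text files under /proc
grep -m1 'model name' /proc/cpuinfo   # CPU model string (on most x86 systems)
grep MemTotal /proc/meminfo           # total RAM in kB
cat /proc/loadavg                     # 1/5/15-minute load averages
```

No special tools or APIs needed; students can poke at the running kernel with `cat` and `grep` alone.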
An OS is a tool like any other. Sure, you don't *need* to know anything about it to make a good solution to a problem, but it helps a ton to know the tools you are working with.
You can apply the same logic to practically any course (why even bother with programming languages if assembly is Turing-complete?), but we consider those to be of great value, even essential parts of the engineer's toolbox.
By learning Linux, you learn the terminal and its commands. Even if your company later uses Windows or even Mac, that will still be very useful. Learn vim or emacs too.
How do you find which process is listening on a given port? How do you change all occurrences of a word in every .csv file?
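To make those two questions concrete, here is a sketch of the usual terminal answers (port 8080, the words "foo"/"bar", and the file name are placeholders; `ss` comes with iproute2 and the `-i` flag assumes GNU sed):

```shell
#!/bin/sh
# Which process is listening on TCP port 8080?
# -l listening sockets, -t TCP, -n numeric ports, -p owning process
ss -ltnp | grep ':8080' || echo "nothing listening on 8080"

# Change every occurrence of "foo" to "bar" in all .csv files here
printf 'foo,1\nfoo,2\n' > demo.csv   # make a sample file to work on
sed -i 's/foo/bar/g' ./*.csv         # -i edits the files in place
cat demo.csv                         # prints: bar,1 / bar,2
```

Two one-liners instead of clicking through dialogs; that is the kind of fluency the course would teach.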
Beyond learning commands, it teaches a way of thinking.
Also... it's free as in speech, free as in no ads, and free as in no Office 365 crap.
@jimbo The overwhelming majority of servers run Linux, and even if you're 'just a programmer', it's good to know Linux so you can make your software Linux-compatible.
All the code is open to be dug into if you so choose: no _magic black boxes_ that have to be taken on trust. Studying the code is an education in itself, and it is free to download.
It beat out very well-funded competition in the form of Microsoft Windows to utterly dominate _the cloud_ and the server domain.
It (admittedly in the form of Android, and alongside iOS) prevented MS from gaining a foothold on smartphones, proving again its superiority to Windows.
In recent years the percentage of programmers who develop on Linux has grown massively; by some estimates over 50% of programmers use it as their main dev platform.
This could be the base that finally undermines the Windows monopoly on the desktop, so time devoted to learning it is unlikely to be wasted.
@jimbo Yeah, I'd have to say its ubiquity on servers and IoT/embedded devices is the strongest argument.
@jimbo The obvious one is that Linux is everywhere when you do computer science as an objective and not just as a tool (development, system administration, networking…).
Even when using Windows you’ll access remote servers, use Linux virtual machines, or deploy Docker images. I think there are only two ways to almost never touch Linux: developing for industry, or video games. And even in those areas this is less and less true.
So even without using it every day, you have to know the basics.
In the computer science curriculum I followed, we learned to use Linux in the first months, and we had to use it in other courses. We were allowed to go back to Windows only after the first semester, and only a few of us did, because by then it was obvious to most of us that Linux was more practical.
@jimbo It's an industry standard for everything but the desktop OS, so everything will be targeted at it. It's growing in the desktop space. After learning about it, digging into the source could be valuable. GCC, git, and other dev tools are native to it. It's easier to hack on. It's actually viable to use only a command line, since bash etc. are so much better than cmd or PowerShell. It's essentially a requirement for DevOps.
@jimbo It's simple. What student would spend four years chasing a bachelor's degree, only to not get hired because "ugh, they graduated from THAT university; none of the hires from there know how to do anything beyond clicking GUI buttons on Windows"? That ruins the school's reputation, and then no one bothers spending their money on the CS courses at that school, because they know they won't find a decent job.
@jimbo All HPC is Linux, so anyone who might end up in a technical position in scientific computing is going to need solid Linux skills.
@jimbo Accessibility to more open source software, and license cost reductions if more people use Linux rather than Windows or Mac.
Since Linux is open source, you can see how the OS works.