• 0 Posts
  • 74 Comments
Joined 2 years ago
Cake day: July 2nd, 2023

  • Nobara: Has all the gaming features I want on my gaming PC (like gamescope) and is HTPC-capable. Also, it’s based on Fedora, which I’m familiar with.

    Fedora: I like GNOME, and it’s always fairly up to date and rock solid. Great on my laptop.

    Have considered switching to openSUSE though. It’s German (as am I), it was the first Linux distro I ever used (on my granddad’s PC, more than a decade ago) and I’ve heard a lot of good things about Tumbleweed.


  • That’s because our eyes adapt to different colour temperatures all the time during the day. A tungsten light bulb gives off very warm (orange) light, while daylight is much cooler (blue), for example, yet white always looks white to us. This happens automatically and subconsciously.

    If you close one eye for a little while though, it “resets” back to its default colour temperature. After opening it again, it’ll take a little while for it to start compensating towards the correct white point again, and thus you’ll see slightly different hues with each eye for a little while.

    The effect is exaggerated a lot if you close one eye and then look at a bright monochromatic image with the other one (like a bright red image on your phone, held close to your face).

    Or, of course, if you wear anaglyphic 3D glasses (the red/green or red/cyan kind) for a while: one eye will try to compensate for the red as much as it can, while the other will try to compensate for the green/cyan as much as possible. The result is that everything looks much cooler to the eye that was behind the red glass after you take the glasses off, and much warmer to the eye that was behind the green/cyan glass.

    Generally that effect balances itself out again after a little while, except for very slight variances of course. Our eyes and brains are far from perfect. (There’s a rough sketch of that per-eye compensation below.)
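    Just to make the mechanism a bit more concrete, here’s a minimal sketch in Python of that per-eye compensation, in the spirit of von Kries-style chromatic adaptation (per-channel gain scaling). It’s only an illustration, not a model of the eye, and all the numbers (the “tungsten” and “red flood” colours, the grey patch) are made up for the example.

    ```python
    # Rough illustration of per-eye white-point compensation (von Kries-style
    # per-channel scaling). Purely illustrative; the RGB values are invented
    # and outputs are not clamped, only their relative balance matters.

    def adaptation_gains(rgb_seen_as_white):
        """Per-channel gains that would map the adapting light to neutral."""
        r, g, b = rgb_seen_as_white
        m = max(r, g, b)
        return (m / r, m / g, m / b)

    def apply_gains(rgb, gains):
        return tuple(round(c * k) for c, k in zip(rgb, gains))

    tungsten = (255, 180, 110)   # warm, orange-ish light one eye adapted to
    red_flood = (255, 60, 60)    # bright red image the other eye stared at

    gains_warm_eye = adaptation_gains(tungsten)
    gains_red_eye = adaptation_gains(red_flood)

    # The same neutral grey patch now looks different through each eye for a while:
    grey = (128, 128, 128)
    print("eye adapted to tungsten sees:", apply_gains(grey, gains_warm_eye))  # pushed towards blue (cooler)
    print("eye adapted to red sees:     ", apply_gains(grey, gains_red_eye))   # red suppressed, i.e. cyan-ish
    ```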


  • Of course they know how to use a computer. They don’t know a thing about how a computer works, but that doesn’t mean they can’t use it. Heck, my 8 y/o cousin can figure out how to open and play Minecraft on his tablet. No need for him to know about commands, programming languages and bits and bytes.

    Most people these days know how to use their phones, at the very least, and even there cog = settings. Most people don’t know how to use a CLI or how a spreadsheet program works, but they certainly can use a browser on a computer. Which is also a form of using a computer.

    And maybe they don’t explicitly know it’s a button. But they know if they tap or click on a cog it takes them to settings.

    And even figuring out how a mouse works is a matter of seconds, even if all you’ve used before is a touchscreen (or nothing at all). There’s a reason they took off in the first place.

    Although, if someone truly has never used a computer in any shape or form before (no smartphone, no tablet, not even a smart TV), you’d probably have a point that it’s not much more difficult for them to learn the common iconography than it would be to learn the CLI. But people rarely start with such a blank slate today.

    Don’t get me wrong, I don’t think it’s a good thing that people are less and less tech-literate these days. But my point is, tech illiteracy doesn’t mean they have never used any computer ever and don’t know what an app or settings icon is. I’d wager it’s more the other way around: people are so used to their devices working and their UIs looking pretty (and very samey) that iconography like the cog for settings is especially self-explanatory to them. It’s the same on their phone, tablet and even TV, after all.



  • Game dev salaries have increased roughly in line with inflation though, so development time costs the studio about the same in real terms as it did 15 years ago, while AAA game prices are only now starting to surpass the $70 mark, and games didn’t generally go past $60 until around 2020.

    It’s a wonder they haven’t increased prices any sooner, as much as I’d like them to stay where they were.

    And again: if you don’t like the prices, vote with your wallet. Buy used, buy on sale, or don’t pay at all.


  • Was raised Roman Catholic but got disillusioned pretty quickly. I was fairly religious in elementary school, but by the time I was 14 I was agnostic/atheist.

    Partially because my parents aren’t religious (my mum is from the GDR, so she didn’t grow up with religion, and my dad had left the church before I was even born), and even my grandma, who was the religious one (albeit never very strongly compared to American Catholics; more of a “goes to church on religious holidays” type of person), drifted away from the church quite a bit after all the child-rapist priest shit that was uncovered at the time.

    By now (mid 20s) I’d probably consider myself agnostic. Can’t prove there is no higher power, but also, if there is, we wouldn’t know what religion – if any – is right anyway. It’s probably not Christianity though.



  • Yeah, I don’t generally disagree. Especially if you’re someone who plays games for hundreds of hours instead of dozens.

    But $100 is still a lot of money for a lot of people. I’d have to save up for months for that (I’m a trainee and have less than 1000€ per month for rent, food, internet, gas, etc.), so I’d rather wait until I can get games cheaper.


  • Eh, there’s some truth to both. Game development is expensive and pricing hasn’t kept up with inflation ($60 in 2010 is almost $90 today; see the rough calculation below). But also, games are ridiculously expensive at full price, especially in today’s economy and especially if they’re as badly received as Skull and Bones, while Nintendo games are at the very least usually pretty decent.

    I’d recommend voting with your wallet and only buying games on sale or used. Just wait a little. (Or pirate them, if you can live with not supporting the developers at all).
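    For what it’s worth, here’s the back-of-the-envelope version of that inflation claim as a tiny Python snippet. The CPI figures are rough approximations of US CPI-U annual averages, just to show the ballpark, not exact data.

    ```python
    # Rough check of the "$60 in 2010 is almost $90 today" claim.
    # CPI values are approximate US CPI-U annual averages, not exact figures.
    cpi_2010 = 218.1   # approximate
    cpi_2024 = 313.7   # approximate

    price_2010 = 60.0
    price_today = price_2010 * cpi_2024 / cpi_2010
    print(f"${price_2010:.0f} in 2010 is roughly ${price_today:.0f} today")
    # -> roughly $86, i.e. getting close to $90
    ```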






  • I’m aware stuff like that exists. I was being sarcastic. Just wanted to highlight that searching through recent commands would be much easier in a GUI as well. Should’ve used a “/s”, my bad.

    Also, I too wouldn’t hold up Windows as a staple of good UI design. Its jumble of four different design languages nested into each other in the most unintuitive ways, with some actions reachable in multiple different ways and others hidden away deeply, is not how I’d want a GUI to be. It’s also not user-friendly, and it’s very much one reason I’ve banished Windows from my household.

    But, people are used to it. At least enough to find basic settings. And I think that’s the best argument against pushing the terminal. People are familiar with graphical interfaces. They understand commonly used symbols (like cog = settings and similar stuff) because all mainstream operating systems (desktop or mobile) have used something similar for close to three decades. They are familiar with menus and submenus. They don’t know where everything is when they use an unfamiliar program/OS, of course, but they are familiar with the concepts. They are not familiar with CLIs. You are, because you have been using them for a while. So am I, and so are quite a few other people who use them regularly. The average Joe computer user isn’t.

    Even stuff like tab to autocomplete and arrow-up for history are foreign concepts for someone who has never used a terminal before. Sure, it’s not hard to learn, but they’d need to learn it. Not to mention that a lot of commands are abstract enough that they are hard to memorise and thus to understand. It’s like a language you have to learn. Not a difficult language if you don’t need to do complicated things, but it’s a hurdle nonetheless.

    Which is also why I don’t like the “literally just telling the computer what to do” argument I’ve heard a few times now. I mean, it’s not entirely wrong, but it’s telling the computer what to do in its language, not in yours. You don’t type “Hello computer, please update my system and programs” or even just “update”, you type “sudo pacman -Syu”. Any non-tech person will be utterly confused about what a “sudo” even is or what pacman has to do with Linux. And yes, pacman is an especially extreme example and Arch is definitely not the distro for newbies, regardless of their stance on terminals, but my point still stands even with apt, dnf and co. To tell a computer what to do via the CLI, you’ll have to either learn its language or copy it from someone who does (there’s a little sketch of how much that language varies after this comment).

    A GUI, however, tries to translate that language for you already and gives you context clues based on common culture (floppy = save, cog = settings, folder = directories, etc.). It’s a language even small children and illiterate people can understand, to some extent at least.

    But yes, I do agree, the most popular distros are fairly streamlined and mostly usable without the CLI. And that’s good. It makes it possible for Linux to slowly gain market share even among non-technical people, and I can, in good faith, recommend/install it for friends and family, knowing they’ll manage unless there’s a problem. And I do think Linux is getting better in this regard every day, and while it’s not yet on par with the current mainstream OSes in terms of ease of use, it’s not far behind anymore. But it is still behind.

    I’m just tired of the elitist enthusiast who doesn’t want Linux to become easier to use for the everyman because it’d be less special. That attitude does not further FOSS and does not help anyone, because that’s not how you reduce Microsoft’s, Google’s or Apple’s influence on the tech scene.
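    And since the “learn its language” point above is easier to show than to explain: the exact same intent (“update my system”) is a different incantation on every package manager, and translating that intent into the right command is essentially what a graphical updater does for you. A minimal sketch in Python; the command strings are the standard ones for each package manager, while the little lookup function itself is just illustrative.

    ```python
    # The same user intent ("update my system") maps to a different command
    # depending on the distro's package manager. A graphical software updater
    # basically performs this translation for you.
    UPDATE_COMMANDS = {
        "arch":     "sudo pacman -Syu",
        "debian":   "sudo apt update && sudo apt upgrade",
        "fedora":   "sudo dnf upgrade",
        "opensuse": "sudo zypper update",
    }

    def translate(intent: str, distro_family: str) -> str:
        """Map a plain-language intent to the distro-specific command (illustrative only)."""
        if intent == "update my system":
            return UPDATE_COMMANDS[distro_family]
        raise NotImplementedError(f"no translation for: {intent!r}")

    print(translate("update my system", "fedora"))  # -> sudo dnf upgrade
    ```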


  • What if we took the most used commands and, instead of having to arrow-up through them, just laid them out in a list or a grid, so you could click on them? And then gave each of them a little icon that makes it a little prettier, more quickly recognizable and easier to click on. And because there are a lot of commands, maybe sorted them by category. But who’d ever want that?

    Also, I don’t know when you last used a settings app or something similar, but once you’re more than two subpages in, you’re usually in the realm of stuff that even people who use a CLI a lot would have to look up the commands for. Because good UI design makes the stuff you need regularly easily accessible.



  • Most people do know how to use a computer though. Windows and macOS have been around for a very long time by now, and neither has required you to use the CLI for anything but very extreme cases in more than 25 years. You’re not starting with a blank slate. They know how a GUI is supposed to work. It is self-explanatory to them. Shoving them towards a CLI means making them relearn stuff they already knew how to do. There’s a reason a lot of Windows migrants end up with KDE or Cinnamon. It’s familiar, it’s easy. Most people do in fact associate a cog with settings. CLIs aren’t familiar to most people and are thus a much larger hurdle.

    Also, I’m not talking about fixing problems. The CLI is a perfectly valid tool for fixing problems. Not everything has to be graphical, just enough that you don’t need the terminal unless something breaks.


  • The terminal will never reach mainstream adoption because it already did, back in the 80s and 90s, and people moved away from CLIs and towards GUIs. It’s archaic. It’s a fallback. It’s useful, sure. I use it regularly. But not because I wouldn’t prefer having a graphical front end. It’s only more useful because the respective front end is lacking.

    Also, the “shut up and go use Windows/macOS” attitude seems very elitist to me. You’d rather have the non-techies suffer high prices, privacy violations, etc., have them suffer Microsoft/Apple, instead of making the system more inviting for them? And you’d rather have another company (like Valve is doing right now, btw) swoop in and offer what you refuse to entertain, because you want everyone to do things the way you like to do them?


  • The CLI is very much an enthusiast/professional tool. It isn’t, and it shouldn’t be, the default in this day and age. Saying everyone should know how to use the CLI is like saying everyone should know how to use a DSLR camera instead of just relying on their phone’s, or that everyone should know how to drive a manual transmission car. Those are all great skills to have, but most people just want a snapshot, or a car that gets them from A to B safely. They don’t want to think about it. And most people just want a computer that gets out of their way. And why shouldn’t they have it?

    And I’m not saying the terminal shouldn’t exist or that people shouldn’t be encouraged to learn how it works. But there should always be the option to completely avoid it. Because if you want mainstream adoption, you need to face the sad reality that the mainstream doesn’t want to look under the hood. And if you don’t want mainstream adoption, why?


  • A good modern GUI also presents itself in front of you. It directs your attention to the important buttons/options. You don’t need any prior knowledge to know that a cog-shaped button labelled “Settings” will take you to the settings. Good UIs are self-explanatory. CLIs are not.

    To be able to use the terminal, you either need another person to tell you the necessary commands or search for a tutorial yourself, either online or somewhere else.

    That’s not intuitive. It’s not too hard to learn, but you need to actively pursue learning how to do it. An average person doesn’t want to do that. An average person doesn’t even want to memorize more than one password. They should. But they won’t. Thus, password managers were created. And non-technically-minded people still don’t even use those.

    You’ve got to look at it from the point of view of someone who has no interest in knowing any more about their computer than how to turn it on, where to put their photos and how to open their browser and maybe an office suite. The kind of person who wouldn’t even update the system if there wasn’t a notification asking for it. They’re not stupid. They just don’t care about computers and don’t want to spend any more mental energy on them than necessary, the same way you wouldn’t want to think about manually keeping the timing of your car’s engine on point for the current conditions. You just want it to get you safely from A to B. Or maybe you do, but I assure you, most people wouldn’t.