• 0 Posts
  • 26 Comments
Joined 1 year ago
Cake day: June 15th, 2023

  • Trantarius@programming.dev to Science Memes@mander.xyz · CFCs
    5 upvotes · 40 downvotes · 8 months ago

    Y2K specifically makes no sense though. Any reasonable way of storing a year would use a binary integer of some length, especially when you want to use as little memory as possible. The same goes for manipulating dates: binary arithmetic is faster, more memory-efficient, and easier to implement. With an 8-bit signed integer counting from 1900, the concerning overflow would occur in 2028, not 2000 (the largest offset is 127, and 1900 + 127 = 2027, so 2028 is the first year that doesn’t fit). A base-10 representation would require at least 8 bits to store a two-digit number anyway, so there is no advantage to it, and there never has been. For Y2K to have been anything more significant than a text-formatting issue, a whole lot of programmers would have had to go out of their way to be really, really bad at their jobs. Also, usage of dates beyond 2000 would have increased gradually for decades leading up to it, so the idea that it would be any sort of sudden catastrophe is absurd.
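
    A minimal sketch of the arithmetic this comment describes, comparing the two schemes. The helper and names are illustrative only; Python integers don’t overflow, so the int8 wraparound is simulated with masking:

    ```python
    def wrap_int8(n: int) -> int:
        """Simulate two's-complement wraparound to the signed 8-bit range [-128, 127]."""
        n &= 0xFF
        return n - 256 if n >= 128 else n

    # Binary scheme: year stored as a signed 8-bit offset from 1900.
    offset = 127                             # largest value an int8 can hold
    print(1900 + offset)                     # 2027 -> last representable year
    print(1900 + wrap_int8(offset + 1))      # 1772 -> wraps to -128 in 2028

    # Two-digit base-10 scheme actually blamed for Y2K, for comparison.
    year_text = "99"                                  # stored as two digits, meaning 1999
    year_text = f"{(int(year_text) + 1) % 100:02d}"   # "00"
    print(1900 + int(year_text))                      # 1900 -> rolls over at 2000
    ```
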
  • When you hit the Windows key (a.k.a. the Meta or Super key), it brings up the app launcher. You get a dock at the bottom with pinned or running apps (like a taskbar), and all of your open windows are presented as thumbnails that let you switch between them or move them between workspaces. There is a search bar you can immediately type into to open any app with a .desktop file. There is also a button to bring up the app grid, which shows your apps much like a mobile device’s home screen.
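
    For reference, a minimal example of the .desktop launcher entry format mentioned above (the app name, command, and icon here are placeholders, not a real application):

    ```
    [Desktop Entry]
    Type=Application
    Name=My App
    Comment=Placeholder launcher entry
    Exec=myapp %U
    Icon=myapp
    Categories=Utility;
    ```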

  • Well, letters don’t really have a single canonical shape; there are many acceptable ways of rendering each. While two letters might usually look the same, it is entirely possible that some shape is acceptable for one but not the other. So it makes sense to distinguish between them in the binary representation, which lets the interpreting software decide whether or not it cares about the difference.

    Also, the Unicode code charts do note which characters look (nearly) identical, so it’s definitely possible to make a program interpret something like a Greek question mark the same as a semicolon. I guess it’s just that no one has bothered, since it’s such a rare edge case.
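
    A short sketch of that last point: U+037E (GREEK QUESTION MARK) canonically decomposes to U+003B (SEMICOLON), so Python’s standard unicodedata module already unifies the two under normalization:

    ```python
    import unicodedata

    greek_qmark = "\u037e"   # GREEK QUESTION MARK, visually identical to ";"
    semicolon = "\u003b"     # ordinary SEMICOLON

    print(greek_qmark == semicolon)                                # False: distinct code points
    print(unicodedata.normalize("NFC", greek_qmark) == semicolon)  # True: unified by normalization
    ```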