  • The thing about the MPW Shell is that it was sort of the only game in town if you actually wanted a command line with the classic Mac OS. (There’s an awesome little emulator called SheepShaver if you ever want to explore it, btw.) Well, I suppose there was A/UX. I thought it was a miracle when that came out. You have to realize that in those early days a good chunk of the operating system itself was baked into ROM. (You had to do desperate things to squeeze a GUI out of the limited resources that existed back then!) So to this day I have no idea how they managed to spin off a 'nix despite that.

    Anyways. I wonder: if you made some sort of template format today, could you write a conversion tool that scrapes a man page or whatever to rough in a definition, which you’d then tweak by hand to get what you want? man pages aren’t super standardized in their format, though, so it’s probably more trouble than it’s worth. I like to use Python’s argparse when rolling out scripts myself, and its --help format is pretty rigid given that it’s algorithmically generated, so it might be more plausible with something like that. I had a quick look just now to see whether you can drill down into the argparse.ArgumentParser class itself to pull the info out more directly, but it seems a rather opaque thing that doesn’t expose public APIs for that. Oh well…


  • This reminds me of something from my ancient past. Back in the early-ish days of Apple, there was a development system called MPW (Macintosh Programmer’s Workshop) which included its own little kludgy shell.

    The weird thing about it, though, was that while you could enter commands on the command line like in any shell, you could prefix them with the word commando (presumably a portmanteau of “command” and “window”) and a window would pop up showing various buttons, checkboxes, etc. corresponding to the command line options. When you OK’d the window, it would generate the command line for you.

    I’m rather hazy about how all this worked, but I think there was some sort of template language to define the window layout if you wanted to add commando support for your own tool? And presumably, as you say, you could restrict what’s possible through the window interface as you saw fit?


  • You mean like the comment fields we’re using right here on Lemmy?

    As others have pointed out, it’s usually some markdown that’s embedded within the text. Lemmy is using a format that’s actually called “markdown” if I’m not mistaken, or a slight variation/subset thereof.

    I’ve gotten used to the double-star for bold and whatnot, to the point that it annoys me when some messaging client or other doesn’t support it. I share code snippets with people fairly often, and the code markdown is particularly useful for keeping them legible.



  • You can always combine integer operations on smaller chunks to simulate arithmetic that’s too big to fit in a register. Python even does this transparently for you, so your integers can be as big as you want.
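
    A minimal sketch of that chunking (the names here are mine, not from any particular library): adding two 64-bit values using only 32-bit operations, chaining word-sized adds together with a carry, the way a 32-bit machine or a bignum library does it:

    ```c
    #include <stdint.h>
    #include <stdio.h>

    typedef struct { uint32_t lo, hi; } u64_t;  /* hypothetical 64-bit-in-halves type */

    static u64_t add64(u64_t a, u64_t b) {
        u64_t r;
        r.lo = a.lo + b.lo;             /* low word may wrap around */
        uint32_t carry = (r.lo < a.lo); /* wrapped => carry out of the low word */
        r.hi = a.hi + b.hi + carry;
        return r;
    }

    int main(void) {
        u64_t a = { 0xFFFFFFFFu, 0 };   /* 2^32 - 1 */
        u64_t b = { 1, 0 };
        u64_t s = add64(a, b);          /* expect 2^32: hi=1, lo=0 */
        printf("hi=0x%08X lo=0x%08X\n", s.hi, s.lo);
        return 0;
    }
    ```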

    The fundamental problem that forced the move to 64-bit was the need to address more than 4 GB of RAM. It’s similar to the problem with the Internet, where 4 billion unique IP addresses fall rather short of what we need. IPv6 has a host of improvements, but the massively expanded address space is what gets talked about the most, since that’s what is desperately needed.

    Going back to RAM though, it’s sort of interesting that at the lowest levels, memory is accessed in chunks larger than 8 bits, and that’s been the case for a long time now. CPUs have to preserve the illusion that an 8-bit byte is the smallest addressable unit of memory, since software would break badly were this not the case, but it’s somewhat amusing to me that we still shouldn’t really need more than 32 bits to address RAM at the lowest levels, even with the 16 GB I have in my laptop right now (2^32 64-byte cache lines, for instance, would cover 256 GB). I’ve worked with 32-bit microcontrollers where the byte size is greater than 8 bits, and yeah, you can have plenty of addressable memory in there if you want.
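
    In C terms, “how big is a byte here?” is exactly what CHAR_BIT in <limits.h> answers; the standard only guarantees it’s at least 8, and some DSP-style parts really do use 16 or 32. A tiny probe:

    ```c
    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        /* CHAR_BIT is the width of the smallest addressable unit (a "byte"
           in C's sense). It's 8 on mainstream hardware, but the standard
           only guarantees >= 8. */
        printf("bits per byte: %d\n", CHAR_BIT);
        printf("sizeof(int) in bytes (not necessarily octets): %zu\n", sizeof(int));
        return 0;
    }
    ```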







  • I started in C and switched to C++. It’s easy to think that the latter sort of picked up where the former left off, and that since the advent of C++11 it’s unfathomably further ahead. But C continues to develop and occasionally gets new features of its own. One example I can think of is the restrict keyword, which allows for certain optimizations. Afaik it’s not included in the C++ standard to date, though most compilers support it in some non-standard way because of its usefulness. (With Rust, the language design itself obviates the need for such a keyword, which is pretty cool.)
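
    Roughly, restrict is a promise from the programmer that the pointers don’t alias, which frees the compiler to vectorize instead of conservatively reloading memory after every store. A sketch (C99; C++ compilers typically spell it __restrict as an extension):

    ```c
    #include <stddef.h>

    /* Promise: dst, a, and b never overlap, so the compiler may keep
       values in registers and vectorize the loop freely. */
    void vec_add(size_t n, float *restrict dst,
                 const float *restrict a, const float *restrict b) {
        for (size_t i = 0; i < n; i++)
            dst[i] = a[i] + b[i];
    }
    ```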

    Another feature added to C was designated initializers, the ability to initialize a struct with something like FooBar fb = {.foo=1, .bar=2};. I’ve seen modern C code use structs this way to get something close to keyword args like in Python. C++20 sort of added this too, but with the restriction that the named fields have to come in the same order as they were originally declared in the struct, which is a bit annoying.
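
    A sketch of that keyword-args idiom (the struct and function names here are made up for illustration):

    ```c
    #include <stdio.h>

    typedef struct {
        int width;
        int height;
        const char *title;  /* anything omitted at the call site is zeroed */
    } WindowOpts;

    static void open_window(WindowOpts opts) {
        printf("%s: %dx%d\n", opts.title ? opts.title : "untitled",
               opts.width, opts.height);
    }

    int main(void) {
        /* C99 designated initializers, in any order, like keyword args: */
        open_window((WindowOpts){ .title = "demo", .width = 640, .height = 480 });
        open_window((WindowOpts){ .height = 200 });  /* other fields default to 0/NULL */
        return 0;
    }
    ```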

    Overall though, C++ is way ahead of C in almost every respect.

    If you want to see something really trippy, though, have a look at all the crazy stuff that’s happened to FORTRAN. Yes, it’s still around and had a major revision in 2018.



  • I’m in a band that performs on occasion at CFBs (Canadian Forces Bases). We typically eat there and spend the night either in barracks or guest housing.

    I have noticed that when we play for officers, dinner is like steak and lobster. When we play for enlisted, it’s more like high school cafeteria. The one and only time I had to excuse myself towards the end of a concert and miss the closing number was after eating at the enlisted mess and getting explosive diarrhea.




  • Falsehoods About Time

    Having a background in astronomy, I knew going into programming that time would be an absolute bitch.

    Most recently, I thought I could code a script to project when Easter would land each year, to mark it on office timesheets. After spending an embarrassing amount of…er…time on it, I gave up and downloaded a table of pre-calculated dates. I suppose at some point, assuming the code survives that long, it will have a Y2K-style moment, but I trusted the table over my own algorithm. I do think it is healthy, if not essential, not to trust your own code.
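
    For the record, there is a published closed-form for this: the “Anonymous Gregorian” computus (the Meeus/Jones/Butcher algorithm). A sketch, which I’d still validate against a table before trusting:

    ```c
    #include <stdio.h>

    /* Meeus/Jones/Butcher "Anonymous Gregorian" Easter computation,
       integer math only. Validate against a published table before
       trusting it with anything important. */
    static void easter(int year, int *month, int *day) {
        int a = year % 19;
        int b = year / 100, c = year % 100;
        int d = b / 4, e = b % 4;
        int f = (b + 8) / 25;
        int g = (b - f + 1) / 3;
        int h = (19 * a + b - d - g + 15) % 30;
        int i = c / 4, k = c % 4;
        int l = (32 + 2 * e + 2 * i - h - k) % 7;
        int m = (a + 11 * h + 22 * l) / 451;
        *month = (h + l - 7 * m + 114) / 31;
        *day   = ((h + l - 7 * m + 114) % 31) + 1;
    }

    int main(void) {
        int mo, dy;
        easter(2024, &mo, &dy);
        printf("Easter 2024: %d/%d\n", mo, dy);  /* expect 3/31 */
        return 0;
    }
    ```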

    Falsehoods About Text

    I’d like to add “Splitting at a code-point boundary is safe” to your list. Man, was I ever naive!
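
    A tiny illustration of why, with a hand-encoded UTF-8 string: “é” can be two code points, an ‘e’ plus a combining acute accent, and cutting between them leaves valid UTF-8 on both sides yet still mangles the text:

    ```c
    #include <stdio.h>

    int main(void) {
        /* 'e' (U+0065) followed by combining acute (U+0301),
           which UTF-8 encodes as 0x65 0xCC 0x81. */
        const char s[] = "e\xCC\x81";
        printf("whole: %s\n", s);     /* renders as e-acute */
        printf("left:  %.1s\n", s);   /* bare 'e' -- the accent is gone */
        printf("right: %s\n", s + 1); /* dangling combining mark */
        return 0;
    }
    ```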