• 4 Posts
  • 1.13K Comments
Joined 1 year ago
Cake day: January 3rd, 2024

  • I even used Claude AI to write an entire C# application, I did ZERO coding, yes, literally nothing! I have NEVER coded in C# before, I gave it all requirements, worked with it like a project manager… it created a full blown working application that was beyond my expectations.

    I achieved the same in 2000 with a home grown framework, and again in 2006 with Ruby on Rails.

    Astonishingly fast prototyping is a quarter of a century old.

    • How are you enjoying maintaining this app in production? (Or is it not there yet, because it’s just very nice for a prototype?)
    • How did Claude AI do at deploying it?
    • Are you satisfied with Claude AI’s answers to your boss’s traffic analytics and load balancing questions?
    • When will Claude AI let you know how the A/B tests proved out for optimizing sales?
    • Or doesn’t it do those things yet?

    Computers are replacing us. They’ve been at it since their inception.

    Keep learning the trade and you’ll find there’s a metric ton more that computers can’t help with than that they can. That will get better; I’m working on making it get better.

    I figure that my learning how to train the computers is job security. I didn’t count on it being a harsh lesson in how long it’s going to be before computers get not stupid.

    I do have a plan for when I automate myself out of a job. It’s just not a plan I’m really counting on, because I’ve been trying for decades and I only have so many decades left of doing this.

    I’ve been constantly advised to have an exit plan, for when the computers replaced me, for the entirety of those same decades.

    Most often by the same people who want me to charge less.

    Funny thing, that. Take care who you listen to on this topic, and what their motives are.

    My motive is to (continue to) charge the rest of you a shit ton of money before the AIs finally replace us.

    It does help me if you all don’t buy into the bullshit that CEOs have been spouting about replacing us all.

    We’ve all been undercharging for about 3 years due to it.

    AI hasn’t accomplished jack shit, but a lot of you have accepted lower pay than you probably should.

    I make very good money, but I can’t help but notice that it would be a bit more, if the rest of you would wise up to the scam and raise your own prices.




  • it seems weird to attack Canonical so much over it.

    I mean, on the technical side, sure. Canonical’s technical choice is just weird. Plenty of fully open app store environments have almost no competition, because self-hosting is still hard work.

    But all of the business reasons for having a closed, proprietary, sole app server go against everything that Canonical used to claim they stood for.

    Canonical’s business choice not to open source the snap servers is an open declaration of war against the FOSS community who have previously rallied around them.

    It’s like inviting someone into my basement and locking the door with a key as they get to the bottom step. The action isn’t illegal, but the probable motive is creepy as fuck. (Maybe I just watch too many horror movies. Lol.)


  • Oof. I’m anxious that folks are going to get the wrong idea here.

    While OCI does provide security benefits, it is not a load-bearing part of a healthy security architecture.

    If you see containers advertised on a security architecture diagram, be alarmed.

    If a malicious user gets terminal access inside a container, it is nice that there’s a decent chance that they won’t get further.

    But OCI was not designed to prevent malicious actors from escaping containers.

    It is not safe to assume that a malicious actor inside a container will be unable to break out.

    Don’t get me wrong, your point stands: Security loves it when we use containers.

    I just wish folks would stop treating containers as “load bearing” in their security plans.
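
    To make that concrete, here’s a minimal Python sketch (my own illustration, nothing from the OCI spec) that checks for two classic signs a container was launched in an easily escapable configuration: the host’s Docker socket mounted inside, and CAP_SYS_ADMIN in the effective capability set.

    ```python
    import os

    def docker_socket_mounted() -> bool:
        # A mounted host Docker socket lets any process in the container
        # ask the host daemon to start a fully privileged container.
        return os.path.exists("/var/run/docker.sock")

    def has_cap_sys_admin() -> bool:
        # CAP_SYS_ADMIN (capability bit 21) in CapEff is a classic
        # "--privileged" tell; with it, mounting host devices is possible.
        if not os.path.exists("/proc/self/status"):
            return False  # not Linux, nothing to read
        with open("/proc/self/status") as f:
            for line in f:
                if line.startswith("CapEff:"):
                    return bool(int(line.split()[1], 16) & (1 << 21))
        return False

    if __name__ == "__main__":
        print("docker socket mounted:", docker_socket_mounted())
        print("CAP_SYS_ADMIN present:", has_cap_sys_admin())
    ```

    Neither check proves anything on its own, but either one showing up in a container that faces untrusted users should set off the same alarm as containers on a security diagram.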




  • I have to object to the supposed necessity of C. In particular, the bolded claim that an OS not written in C is still going to have C involved.

    Such an OS could instead have its non-native parts written in assembly.

    Agreed! That’s a great point!

    I appreciate your clarification. Not everything has to run C. It’s just a trend in today’s products.

    I was attempting to humorously reference Monty Python’s Spam sketch, where it seems like everything on the menu has at least a little Spam in it. Every device I could think of, that I’ve toyed with enough to guess what it has running, is running at least a bit of C.

    For an attempt at a counterpoint, I thought of a few devices, like my PineWatch, that run an OS whose code is written entirely in one language. But… that one language is, of course, C.

    legacy convenience.

    Yeah. I think legacy convenience is, indeed, why there’s C in so many places, even places it doesn’t have to be.

    There’s so many folks with so much hardware driver expertise in C, and they teach our next generation, so I figure that will continue until something really compelling changes their preference.

    I appreciate your point. There are lots of non-C ways to create bytecode. My (amused) point is that we don’t seem very fond of any of those methods, today.
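
    A quick illustration of the irony, sketched in Python: even “creating bytecode from Python” bottoms out in C, because the compiler doing the work here is CPython’s, and CPython is written in C.

    ```python
    # Ask Python to compile Python source into its own bytecode...
    code = compile("print(6 * 7)", "<demo>", "exec")

    print(code.co_code.hex())  # ...the raw bytecode bytes...
    exec(code)                 # ...which the (C-implemented) VM then runs.
    ```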



  • The essence of your answers is “yes, but…”. And the “but” is mostly about how slow Python is in contexts that need to be astonishingly fast.

    It depends on how complex the hardware is and how much time we’re willing to waste.

    Technically, when I deploy a Python program to a BBC Microbit, that’s (more or less) what is happening. Pure Python code is making every decision, and is interacting directly with all available hardware.
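
    For a concrete taste, here’s roughly what such a program looks like (the names are from MicroPython’s microbit module; the pin and message are just examples):

    ```python
    from microbit import display, button_a, pin0

    # Pure Python making every decision: poll a button,
    # read an analog pin, and drive the LED matrix.
    while True:
        if button_a.is_pressed():
            display.scroll(str(pin0.read_analog()))  # 0-1023 reading
        else:
            display.show("P")
    ```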

    We could still argue semantics - virtually no (modern) computer exists that isn’t running at least one tiny binary compatibility driver written in C.

    I believe the compiled C binary on a BBC Microbit that bootstraps a pure Python OS is incredibly small, but my best guess is that it’s still present. The C library for the Microbit needed to exist for other languages to use, and Python is happy to call into C. So I don’t imagine anyone has recreated it in pure Python just for fun (and slower results).

    (Edit: As others have pointed out, I’m talking about MicroPython, which is itself written in C. The Microbit is so simple it might not use MicroPython, but I can’t imagine the BBC Microbit team bothered to reinvent the wheel for this.)
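
    To see just how much Python likes calling into C, here’s a tiny sketch using ctypes from the standard library (passing None loads the process’s own C library on Linux and macOS; on Windows you’d name a DLL instead):

    ```python
    import ctypes

    # Load the C library the Python interpreter itself is linked against.
    libc = ctypes.CDLL(None)

    # Call a plain C function directly from Python.
    libc.puts(b"Python, happily calling straight into C")
    ```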

    Of course, if you don’t mind that the lowest level code has got to be binary, and very few people are crazy enough to create that code with Python, then…

    That raises another interesting question: just how much of an OS can we get away with writing in Python?

    And that question is answered by both Red Hat Linux and Debian Linux - the answer is that both are built with an awful lot of Python.

    In contrast, Android is mostly Java on top of a C Linux kernel, with lots of C in between. Windows is mostly C++ and lots of C. iOS is mostly Objective-C and lots of C.

    You can have an OS built with almost any language you want, as long as you also want parts of it built in C. (Edit: This is meant to amuse you, not be guidance for what is possible. Today, we love our C code. C didn’t always exist, and it might someday no longer be our favorite hardware-driving language.)

    An interesting current development is the work on rebuilding parts of the Linux kernel with Rust, which can run just as fast as C. This would effectively cause Red Hat, Debian and Android to replace some of their C code with Rust. To date, initial Rust support has been merged into the mainline kernel, but only a handful of drivers actually use it - so far there’s been a lot more discussion than shipped code.