  • Why the heck would 2 projects share the same library?

    Coming from the olden days, with good package management, infrequent updates, and the idea that you really did want to save those x bytes on disk and in memory, installing only one copy was the way to go.

    Python also wasn’t exactly a highbrow academic effort to brainstorm the next big thing; it was built to be a simple tool, and that meant just fetching some library from your system was good enough. It only ended up being popular because it is very easy to get your feet wet and do something quickly.


    The difficulty with Python tooling is that you have to learn which tools you can and should completely ignore.

    Unless you are a 100x engineer managing 500 projects with conflicting versions, build systems, docker, websites, and AAAH…

    • you don’t really need venvs
    • you should not use more than one package manager (I recommend pip), and you should cling to it with all your might and never switch. Mixing, e.g., conda with Linux system installers like apt is the problem. Just using one is fine.
    • You don’t “need” any other tools. They are bonuses that you should use and learn how to use exactly when you need them, and not before (type hint checkers, linting, testing, etc…).

    Why is it like this?

    Isolation for reliability, because it costs businesses real $$$ when stuff goes down.

    venvs exist to prevent the case where “project 1” and “project 2” have to share the same installed library “foobar”. Say “project 1” is old and its maintainer is held up and can’t update as fast, while “project 2” is a cutting-edge startup that always uses the newest tech.

    When Python imports a library, it uses “the library” that is installed. If project 2 needs foobar version 15.9, which changed functionality, and project 1 needs foobar version 1.0, you always get a bug in either project 1 or project 2. Venvs solve this by providing project-specific sets of libraries and interpreters; see the sketch below.
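    To make that concrete, here is a minimal, hypothetical sketch of the conflict. “foobar” is faked with types.ModuleType so it runs without installing anything; the version numbers and the connect/open_connection names are invented for illustration.

    ```python
    # Stand-in "foobar" module; real libraries break the same way when an
    # update renames or changes an API that an older project still relies on.
    import types

    def make_foobar(version: str) -> types.ModuleType:
        mod = types.ModuleType("foobar")
        mod.__version__ = version
        if version.startswith("1."):
            mod.connect = lambda host: f"v1 connection to {host}"           # old API
        else:
            mod.open_connection = lambda host: f"v15 connection to {host}"  # renamed later
        return mod

    # A system-wide install means exactly one foobar on sys.path for everyone.
    foobar = make_foobar("15.9")

    print(foobar.open_connection("db.example"))   # project 2 is happy

    try:
        foobar.connect("db.example")              # project 1 was written against 1.0
    except AttributeError as err:
        print("project 1 breaks:", err)
    ```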

    In practice, for many if not most users, this is meaningless, because if you’re making, e.g., a plot with matplotlib, that won’t change out from under you. But people have “best practices”, so they just do stuff even if they don’t need it.

    It is a tradeoff: either you are fine with occasional breakage and fix it when it occurs, or you are not. The two approaches won’t mix.
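    For reference, the “project specific sets of libraries and interpreters” mentioned above boil down to something like the following; the directory layout and the pinned versions are made up, and normally you would just run python -m venv and pip from the shell instead.

    ```python
    # Rough sketch: one venv per project, each pinning its own library versions.
    # "project1"/"project2" and the foobar pins are placeholders, not real packages.
    import subprocess
    import venv
    from pathlib import Path

    for project, pin in [("project1", "foobar==1.0"), ("project2", "foobar==15.9")]:
        env_dir = Path(project) / ".venv"
        venv.create(env_dir, with_pip=True)          # its own interpreter + site-packages
        pip = env_dir / "bin" / "pip"                # use Scripts\pip.exe on Windows
        subprocess.run([str(pip), "install", pin], check=True)
    ```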

    very specific (often outdated) version of python,

    They are giving you the version that they know worked. Often you can just remove the specific version pinning and it will work fine, because again, things don’t actually change that much. But still, the pinned version is the state the project was known to work in.


  • At the cost of sounding naive and stupid

    It may be a naive question, but it’s a very important naive question. Naive doesn’t mean bad.

    The answer is that this is not possible, because the compiler is supposed to translate the very specific language of C into mostly very specific machine instructions. The programmers who wrote the code usually did so expecting a very specific behavior, and changing that would break it.

    But also, the “unsafety” is in the behavior of the system; it is built into the language and the compiler.

    It’s a bit of a flawed comparison, but: you can’t build a house on a foundation of wooden poles because of the advantages that wood offers, and then complain that the poles are flammable. You can rebuild it in steel, but you have to replace all of the poles; just the poles on the left side won’t do.

    And you can’t automatically detect the unsafe parts and just patch those either. If we could, we could just fix them directly, or we could automatically transpile them. DARPA is trying that at the moment.


  • Sure. Yes. I’m aware.

    The point is, if an employee isn’t productive, the company should notice, because it should have some kind of oversight over whether the work is being done or not.

    If the work is being done, even if the employee isn’t always 100% focused, the company shouldn’t care.

    If the work is not being done, the company should care, regardless of how actively the mouse moves.

    using mouse jigglers to fake being at work is the kind of thing that keeps more companies from allowing WFH.

    No, companies don’t allow WFH because they don’t trust employees, or because they can’t verify that employees are doing their work from home. Most of the time it’s because the company’s people don’t understand that work and couldn’t judge whether it’s being done correctly without adults in the room.


    tldr: people should be hired and fired based on their performance. Crazy talk, I know.