• Mikina@programming.dev

    I’ve recently discovered pipenv, and it has been a massive QoL improvement. No need to figure out a bazillion commands just to create or activate an environment, or to work out which parameters to pass like you do with venv. You just pipenv install -r requirements.txt and everything is handled for you. And when you need to run something, pipenv run python script.py and you’re good to go.
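
    For anyone who hasn’t tried it, the whole workflow above really is just those two commands (the requirements file and script name are of course whatever your project uses):

        # create the virtualenv if needed and install everything from requirements.txt into it
        pipenv install -r requirements.txt

        # run the script inside that environment, with no manual activation step
        pipenv run python script.py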

    The best thing, however, is the Pipfile, which can be distributed instead of requirements.txt, and I don’t get why that’s not more common. It’s basically requirements.txt, but written directly for pipenv, so you don’t need to point it at anything; just pipenv install and pipenv run from the same folder (sample below).
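
    For reference, a minimal Pipfile looks something like this (the package names and Python version are just example values):

        [[source]]
        url = "https://pypi.org/simple"
        verify_ssl = true
        name = "pypi"

        [packages]
        requests = "*"
        flask = ">=2.0"

        [dev-packages]
        pytest = "*"

        [requires]
        python_version = "3.11"

    With that file in the folder, a plain pipenv install recreates the environment, and pipenv also writes a Pipfile.lock with exact pinned versions so installs are reproducible.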

    • henfredemars@infosec.pub

      I’ve been burned by pipenv before on a large project where it was taking upwards of 20 minutes to lock dependencies. I think these days they use Poetry instead, but I’ve heard the performance still doesn’t scale well.

      With that said, I think it can be a nice addition, but the underlying problem is that Python packages don’t really treat dependency metadata as a top priority, favoring flexibility instead: in many cases the dependencies aren’t available as static metadata, which forces a package manager to download and execute the packages to get all the dependency information. Naturally, this is a time-consuming process if the number of packages is large.
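
      As an illustration of why executing packages is needed, here is a hypothetical setup.py (the package name, versions, and conditional dependency are made up) whose dependency list is computed at run time, so a resolver cannot know it without actually running the file:

          # setup.py for a made-up package "example-pkg"
          import sys
          from setuptools import setup

          # The dependency list is built by code rather than declared statically,
          # so tooling has to execute this file just to discover it.
          deps = ["requests>=2.0"]
          if sys.version_info < (3, 8):
              deps.append("importlib-metadata")  # backport only needed on older Pythons

          setup(
              name="example-pkg",
              version="0.1.0",
              install_requires=deps,
          )

      Multiply that by every candidate version a resolver has to consider, and lock times like the 20 minutes above stop being surprising.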

      On multiple occasions I’ve seen projects abandon it for pip and a requirements.txt because it became unmanageable. It’s left a bad taste in my mouth. I don’t like solutions that claim to solve problems but introduce new ones.