alex gaynor's blago-blog

Posts tagged with easy_install

My Workflow

Posted November 7th, 2009. Tagged with python, pip, virtualenv, virtualenvwrapper, easy_install.

About a year ago I blogged about how I didn't like easy_install, and I alluded to the fact that I didn't really like any programming-language-specific package managers. I'm happy to say I've changed my tune quite drastically in the past 2 months. Since I started working with Eldarion I've dived head first into the pip and virtualenv system, and I'm happy to say it works brilliantly. The nature of the work is that we have lots of different projects all at once, often using wildly different versions of packages in all sorts of incompatible ways. The only way to stay sane is to have isolated environments for each of them. Enter virtualenv, stage left.

If you work with multiple Python projects that use different versions of things, virtualenv is indispensable. It allows you to have totally isolated execution environments for different projects. I'm also using Doug Hellmann's virtualenvwrapper, which wraps up a few virtualenv commands and gives you some hooks you can use. When I start a new project it looks something like this:

$ git checkout some_repo
$ cd some_repo/
$ mkvirtualenv project_name

The first two steps are probably self-explanatory. What mkvirtualenv does is create a new virtual environment and activate it. I also have a hook set up with virtualenvwrapper to install the latest development version of pip, as well as ipython and ipdb. pip is a tremendous asset to this process. It has a requirements file that makes it very easy to keep track of all the dependencies for a given project, plus pip allows you to install packages out of a version control system, which is tremendously useful.
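As a sketch, a requirements file is just a plain-text list of dependencies, one per line, including pip's version-control syntax (the version numbers and repository URL below are made up for illustration):

```text
# requirements.txt -- one dependency per line
Django==1.1.1
ipython>=0.10
# pip can also install straight out of version control:
-e git+git://github.com/example/some-lib.git#egg=some-lib
```

Installing the whole set into the active virtualenv is then a single `pip install -r requirements.txt`.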

When I want to work on an existing project all I need to do is:

$ workon project_name

This activates the environment for that project. Now the PATH prioritizes stuff installed into that virtualenv, and my Python path only has stuff installed into this virtualenv. I can't imagine what my job would be like without these tools; if I had to manually manage the dependencies for each project I'd probably go crazy within a week. Another advantage is that it makes it easy to test things against multiple versions of a library. I can test whether something works on Django 1.0 and 1.1 just by switching which environment I'm in.
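The isolation itself is nothing magic: each environment is just a directory with its own interpreter and its own site-packages. A rough sketch of the idea using the standard library's venv module (a later, built-in descendant of virtualenv; the environment names here are made up):

```shell
# Create two independent environments -- e.g. one per Django version under test.
python3 -m venv django-1.0-env
python3 -m venv django-1.1-env

# Each has its own interpreter and its own site-packages, so packages
# installed into one are invisible to the other:
django-1.0-env/bin/python -c "import sys; print(sys.prefix)"
```

Activating an environment (which is what workon does under the hood, via virtualenv's activate script) simply prepends that environment's bin/ directory to your PATH.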

As promised tomorrow I'll be writing about an optimization that just landed in Unladen Swallow, and I'm keeping Monday's post a secret. I'm not sure what Tuesday's post will be, but I think I'll be writing something Django related, either about my new templatetag library, or the state of my multiple database work. See you around the internet.


Django Ajax Validation 0.1.0 Released

Posted January 24th, 2009. Tagged with ajax_validation, release, django, easy_install.

I've just uploaded the first official release of Django Ajax Validation to PyPI. You can get it there. For those that don't know, it is a reusable Django application that allows you to do JS-based validation using your existing form definitions. Currently it only works using jQuery. If there are any problems please let me know.


Why I don't use easy_install

Posted November 20th, 2008. Tagged with ubuntu, easy_install, python.

First things first, this post is not meant as a flame, nor should it indicate that you shouldn't use easy_install, unless of course your priorities are perfectly aligned with my own. That being said, here are the reasons why I don't use easy_install, and how I'd fix them.
  • No easy_uninstall. Zed mentioned this in his PyCon '08 lightning talk, and it's still true. Yes, I can simply remove the files, and yes, I could write a script to do it for me. But I shouldn't have to: if I can install packages, I should be able to uninstall packages without doing any extra work.
  • I can't update all of my currently installed packages. For any package I haven't explicitly pinned to a particular version (which, to its credit, easy_install makes very easy to do), it should be very easy to upgrade everything at once, because I probably want to keep things up to date, and I can always lock a package at a specific version if I want.
  • I don't want to have two package managers on my machine. I run Ubuntu, so I already have apt-get, which I find to be a really good system (and which doesn't suffer from either of the aforementioned problems). Having two package managers inherently brings additional confusion: if a package is available in both, which do I install it from? It's an extra thing to remember to keep up to date (assuming #2 is fixed), and it's, in general, an extra thing to think about every time I go to update anything on my machine.
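For what it's worth, "simply remove the files" is already two coupled steps, which is exactly why it wants to be automated. A sketch against a mock site-packages layout (the package name and paths below are invented for illustration):

```shell
# Mock up what setuptools leaves behind: an egg directory plus a
# reference to it in easy-install.pth.
SITE=./mock-site-packages
mkdir -p "$SITE/Example-1.0-py2.6.egg"
printf './Example-1.0-py2.6.egg\n' > "$SITE/easy-install.pth"

# "Uninstalling" by hand means deleting the egg AND editing the .pth
# file; forget the second step and the import path keeps a dangling entry.
rm -r "$SITE/Example-1.0-py2.6.egg"
sed -i '/Example-1\.0/d' "$SITE/easy-install.pth"
```

A real uninstaller would also have to check whether anything else still depends on the package, which is precisely the bookkeeping a package manager exists to do.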

So what's my solution? PyPI is a tremendous resource for Python libraries, and there are great tools in Python for working with it; for example, using a setup.py file makes it incredibly easy to get your package up on PyPI and keep it up to date. So there's no reason to throw all that stuff out the window. My solution would be for someone to set up a server that regularly mirrored all the data from PyPI and then offered the packages as .debs (for Debian/Ubuntu users), as RPMs (for Fedora users), etc. That way any user of a given package manager could just add the URL to their sources list and install everything that's available from PyPI, plus they'd get all the benefits of their package manager (for me personally, the ability to uninstall and batch upgrade).

Note: I'm not suggesting everyone use apt-get; I'm merely suggesting everyone use their native package manager, and there's no reason easy_install/pip/virtualenv can't also be used.
