
Calvin, Weaver

zcat bug on Mac OS X

There's a bug in Mac OS X Lion's default zcat utility: it automatically appends a ".Z" suffix to your .gz file names.

I discovered it when running a shell script that copies a remote PostgreSQL database dump (zipped up in .gz format) to my machine and then tries to unarchive it.

  calvin$ ./fetch_server_db.sh
  remote db dump (gzip)
  copying remote dump to localhost
  pg_dump_2011-10-17_16-06-36.db.gz 100%  881KB 220.2KB/s   00:04
  deleting remote file
  loading dump in local db
  ALTER SCHEMA
  zcat: pg_dump_2011-10-17_16-06-36.db.gz.Z: No such file or directory


THE FIX

Fortunately for us, there's another utility on Mac OS X called gzcat which works perfectly fine.  And since most of my colleagues use the same zcat command in my shell script, I decided to replace my broken zcat utility like this:-

  calvin$ sudo mv /usr/bin/zcat /usr/bin/broken-zcat
  calvin$ sudo ln -s /usr/bin/gzcat /usr/bin/zcat

A temporary band-aid of sorts until I can dig deeper to find out what's wrong with my particular version of zcat on Mac OS X.  Just so we know, my broken zcat version is:-

  calvin$ broken-zcat --version
  broken-zcat 1.3.12
  Copyright (C) 2007 Free Software Foundation, Inc.
  Copyright (C) 1993 Jean-loup Gailly.
  This is free software. You may redistribute copies of it under the terms of
  the GNU General Public License <http://www.gnu.org/licenses/gpl.html>.
  There is NO WARRANTY, to the extent permitted by law.

  Written by Jean-loup Gailly.

This goes into my TODO list.
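If you'd rather not shuffle system binaries around at all, the decompression can also be done portably with Python's gzip module. This is just a sketch of an alternative workaround, not part of the original script; the file names are hypothetical:

```python
import gzip
import shutil

def gunzip(src, dest):
    # Stream-decompress a .gz file to dest without relying on zcat/gzcat.
    with gzip.open(src, "rb") as f_in, open(dest, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)

# Hypothetical usage, mirroring the dump file from the script above:
# gunzip("pg_dump_2011-10-17_16-06-36.db.gz", "pg_dump.db")
```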


Category: OS X


Tagged as: lion zcat


Test Driven Development for Non Techies

If you are a non-techie working with software developers, dev ops folks and other techies, you might have heard plenty of brouhaha from your techie and software developer friends about this thing called "Test Driven Development" and "Unit Testing".

What does that all mean really? And what does that mean in terms of the quality of the software you own and pay for when hiring developers?

In this post, I will attempt to explain in simpler terms what unit testing and test driven development are all about.  To do so, I will break down the concept of test driven development into 3 different sections:

  • Unit testing
  • Integration testing
  • System testing

Unit testing

Unit testing is automating the execution of a test run of the smallest possible piece of a program.  This often refers to individual functions (or "methods") written in the software.  For example, if I want to automate a check that a user with the correct password is able to log in to my application and be redirected to a specific url, I would have a specific function that looks something like this:

  # ... various setup code above ...

  def test_login(self):
      # now check the login process, using the user and password
      # created in the test's setup code
      post_data = {'username': self.user.email, 'password': self.password}
      login_url = reverse('login')
      response = self.client.post(login_url, post_data)
      self.assertRedirects(response, reverse('home'))

The last statement checks that upon a successful login, the response from our web application will be redirected to the "home page" url.

Because this specific unit test, together with other unit tests, is located in a "module" (a directory containing functions and other code relating to that module) called "userprofile", I can now simply run "./manage.py test userprofile" in my terminal:

  calvin$ ./manage.py test userprofile
  Creating test database for alias 'default'...
  .......
  ----------------------------------------------------------------------
  Ran 7 tests in 4.588s

  OK
  Destroying test database for alias 'default'...

If for some reason we modify our code so that a user who has logged in no longer gets redirected to the home page url, this unit test will fail and running ./manage.py test userprofile will tell us that our unit test has failed. 

Each of these individual functions is a "unit of test" because there's no other meaningful way to divide it up further.

Unit tests are used to test a single unit in isolation, verifying that our application works as expected, without considering what the rest of the program would do.  This protects each unit from inheriting bugs from mistakes made elsewhere, and makes it easy to narrow down on the actual problem if one should occur.

In itself, unit testing isn't actually sufficient to confirm that a complete program works correctly as intended.  But unit tests are the foundation upon which everything else is based.  Just as we cannot build a house without solid unit-size materials (like a single brick or a single steel rod encased in concrete), building non-trivial software without unit tests is simply asking for trouble.
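To make the idea concrete outside of Django, here's a minimal, self-contained sketch of a unit test using Python's standard unittest module. The password rule being tested is a made-up example, not from any real project:

```python
import unittest

# A tiny, hypothetical unit under test: validates a password's length.
def password_is_valid(password):
    return len(password) >= 8

class PasswordTest(unittest.TestCase):
    # Each test method checks one behaviour of the unit, in isolation.
    def test_valid_password_accepted(self):
        self.assertTrue(password_is_valid("s3cretpass"))

    def test_short_password_rejected(self):
        self.assertFalse(password_is_valid("abc"))

# A test runner normally discovers this class; here we run it directly.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.TestLoader().loadTestsFromTestCase(PasswordTest)
)
```

If `password_is_valid` is ever changed in a way that breaks this contract, the test fails immediately and points straight at the offending unit.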

Integration testing

Now that we have a good idea of what it means to be writing (automated) unit tests in your software, we can explore the next level of testing.  In integration testing, tests begin to encompass interactions between related units.  While each test should still be run in isolation (using "./manage.py test userprofile" as explained earlier), we now care about whether the tested units behave correctly as a group.

Integration testing can be performed with the exact same tools as unit testing, so developers who are just getting acquainted with the practice of "Test Driven Development" might make the mistake of ignoring the distinction between the two.  Ignoring this distinction is dangerous, because multipurpose tests often make assumptions about the correctness of some of the units they involve.  And if test functions are written without distinct, standalone functions or methods having been defined in the first place, we lose much of the benefit that automated testing should have granted.

Understanding these nuances, knowing when to write a "unit test function" versus an "integration test function", and taking a disciplined approach towards software development is what differentiates a well-oiled, professional software team from a cowboy software team.
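As a sketch of the distinction, here's an integration-style test that exercises two hypothetical units together; it only passes if both units cooperate correctly:

```python
import unittest

# Two hypothetical units: one normalizes an email address,
# the other builds a user record and relies on the first.
def normalize_email(email):
    return email.strip().lower()

def create_user(email):
    return {"email": normalize_email(email), "active": True}

class CreateUserIntegrationTest(unittest.TestCase):
    # This test spans both units: if normalize_email breaks,
    # create_user's output breaks with it.
    def test_user_email_is_normalized(self):
        user = create_user("  Alice@Example.COM ")
        self.assertEqual(user["email"], "alice@example.com")

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.TestLoader().loadTestsFromTestCase(CreateUserIntegrationTest)
)
```

Contrast this with the unit tests earlier: there, each function was checked alone; here, a failure could originate in either unit, which is exactly why you still want the narrower unit tests alongside it.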

System testing

System tests check parts of the program after all modules are plugged together.  In practical terms, the automation is executed by running a command such as

  ./manage.py test

instead of

  ./manage.py test [module]

So if your application comprises many "modules" (representing groups of functionality) worked on by different developers, running "./manage.py test" and seeing that all the tests are passing gives you confidence that your team of software developers is working well together and writing code that integrates seamlessly.

System tests are very important but they are not very useful without integration tests and unit tests.  If individual unit functions are built well, integration tests are well written and a good foundation has been laid out for your custom software, watching your system tests go all green ("OK") is a wonderful feeling that even a non-techie would appreciate.
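The aggregation that "./manage.py test" performs can be sketched with plain unittest: collect every module's test cases into one suite and run it as a whole. The two tiny test classes below stand in for "modules" and are purely illustrative:

```python
import unittest

# Stand-ins for two different modules' test cases.
class TestMath(unittest.TestCase):
    def test_add(self):
        self.assertEqual(1 + 1, 2)

class TestText(unittest.TestCase):
    def test_upper(self):
        self.assertEqual("ok".upper(), "OK")

# A system-level run aggregates every module's tests into one suite.
loader = unittest.TestLoader()
suite = unittest.TestSuite([
    loader.loadTestsFromTestCase(TestMath),
    loader.loadTestsFromTestCase(TestText),
])
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

When the whole suite goes green, every module passed together, which is the "OK" feeling described above.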

Here's an example of a user-friendly interface from buildbot (a continuous integration application) that shows you what has passed and what has failed so even a non-technical business/product owner or stakeholder can understand at a glance.




Now you know why writing software is so much fun! :-)

Writing unit tests, integration tests and passing system tests is an interactive, dynamic and really fun team sport for professional software teams - well, almost.

Fun and "cool" factor aside, including continuous, automated test systems in your software projects means that as a Software Product Owner/Business Owner, you reduce the costs and time required to manually test your software whenever you launch a new feature.  Test driven development is a long-term investment that ultimately results in more robust software - and a higher quality software experience leads to happier users and customers of your software product!


Category: Product Management


Tagged as: testing unit


Macports House Cleaning

Once in a while, it helps to do some house cleaning to keep your Mac OS clean and lean.

If you are using MacPorts to manage your open source software packages and libraries, a problem you might notice is that MacPorts tends not to uninstall old versions of downloaded ports.  This probably means that over time, you would have accumulated a bunch of outdated and older versions of software packages and libraries on your machine.

Let's see how I can clean up my act and remove these outdated software packages.

  calvin$ sudo du -sh /opt
  Password:
  8.4G    /opt

  calvin$ sudo port clean --all installed
  calvin$ sudo port -f uninstall inactive

  calvin$ sudo du -sh /opt
  5.6G    /opt


Not too bad - 2.8 GB of outdated open source packages and libraries removed! Nice little trick to add to your MacPorts maintenance tips and tricks.  :-)


Category: Python


Tagged as: macports


[Review] Specification By Example

SpecificationByExample.com is a very good read with real-life examples of software projects executed by agile software development teams.

The Problem

The author, Gojko Adzic, explains why software projects fail.  They often fail because the software team writing the application lacks an understanding of the business goals; on the other side of the coin, product owners (business owners) often lack the in-depth understanding required to create user stories for the software team.

7 Collaboration Philosophies

To solve the problem of communication and collaboration between the stakeholders of the software (product owners and end users) and the software designers/engineers, Gojko proposes that agile software teams and product owners collaborate and focus on these specific areas:

  1. Deriving Scope from Goals
  2. Specifying User Stories Collaboratively
  3. Illustrating using examples
  4. Refining the specifications through open discussions
  5. Automating validation without changing specifications
  6. Validating frequently (by drilling down to understand which piece of code does what and why)
  7. Evolving a living documentation system over time

6 Real Life Success Stories

With 6 excellent real life examples of successful software projects ranging from high traffic consumer web applications to busy back office intranet software, SpecificationByExample.com is an excellent book and I would highly recommend it to product owners and software professionals who are keen to continually improve the quality of their software and their own understanding of product creation and innovation.

Build the Product Right, Build the Right Product

And to sum it all up, here's a clear and excellent visualization of why it is important to "build a product right" (Test Driven Development, Extreme Programming, Agile methodology etc) AND at the same time, "build the right product" (Specification by example):


Category: Product Management


Tagged as: agile extreme programming specifications test driven development user stories


List open ports on Mac OS X

Mac OS X is unix-based, which for all practical purposes for the software developer/system administrator implies that some useful Linux command options, like netstat's -p flag for showing the owning process, aren't available in your Mac Terminal.

On a linux machine, listening to the various open ports on it is as simple as running:

  netstat -atp | grep -i "listen"

On your Mac OS X machine, this is what you need to do instead:

  sudo lsof -i -P | grep -i "listen"

lsof essentially lists (like "ls") information about files opened by your processes.  An open file includes
  • a regular file,
  • a directory,
  • a block special file,
  • a character special file,
  • an executing text reference,
  • a library,
  • a stream or
  • a network file (Internet socket, NFS file or UNIX domain socket)
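As an illustration of the "network file" case, this small Python sketch opens a listening TCP socket; while it runs, the lsof command above would report it in the LISTEN state. The port is chosen by the OS, so nothing here assumes anything about your machine's existing services:

```python
import socket

# Open a listening TCP socket on an ephemeral port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # port 0 = let the OS pick a free port
server.listen(1)

host, port = server.getsockname()
print("listening on %s:%d" % (host, port))

# While this socket is open, `sudo lsof -i -P | grep -i "listen"`
# would show this process with a line ending in (LISTEN).
server.close()
```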
There you go. One more step towards becoming a Mac OS X superuser. :-)



Category: OS X


Tagged as: lsof mac os x


OS X Lion PostgreSQL

So, Apple has decided to bundle PostgreSQL with their stock Lion install.

And should you also install PostgreSQL via MacPorts (or even Homebrew), this is the little hiccup you will run into when you try connecting to your local postgresql db server.

  $ psql -d postgres -U postgres
  psql: could not connect to server: Permission denied
          Is the server running locally and accepting
          connections on Unix domain socket "/var/pgsql_socket/.s.PGSQL.5432"?


To solve this problem, we will need to unload the MacPorts postgresql server, initialize its database directory, and load it back up.

So, simply:
  $ DIR=/opt/local/var/db/postgresql90/defaultdb
  $ sudo launchctl unload -w /Library/LaunchDaemons/org.macports.postgresql90-server.plist
  $ sudo mkdir -p $DIR
  $ sudo chown postgres:postgres $DIR
  $ sudo su postgres -c "/opt/local/lib/postgresql90/bin/initdb -D $DIR"
  $ sudo launchctl load -w /Library/LaunchDaemons/org.macports.postgresql90-server.plist
  $ psql -h localhost -U postgres

And you should happily see your postgresql connection work perfectly fine on your terminal -
  psql (9.0.4)
  Type "help" for help.

  postgres=#


Say "Hip Hip Hooray".


Category: postgresql


Tagged as: lion mac postgresql


Get infected by vim

So, if you have made the correct life changing decision (o yes, that's right - please ignore the crazies who suggested you should use emacs :-p) and decided that vim is going to be your weapon (aka code editor) of choice, this little post will explain how you can organize your vim plugins cleanly, so that you will not have a "vim functionality maintenance headache" in the many good years to come.

Traditionally, you customize your vim editor by means of settings and scripts placed in your .vimrc file as well as your .vim directory.

calvin$ cd ~
calvin$ ls -la
drwxr-xr-x@ 19 calvin  staff    646 Sep  4 23:41 .vim
-rw-r--r--   1 calvin  staff  22010 Sep  5 12:45 .viminfo
-rwxrwxrwx   1 calvin  staff     25 Jul 31 20:15 .vimrc

calvin$ ls -la .vim
total 120
drwxr-xr-x@ 19 calvin  staff    646 Sep  4 23:41 .
drwxr-xr-x+ 78 calvin  staff   2652 Sep  5 12:54 ..
-rw-r--r--   1 calvin  staff    105 Sep  4 23:41 .netrwhist
drwxrwxrwx   3 calvin  staff    102 Jul 31 18:26 after
drwxrwxrwx   5 calvin  staff    170 Aug 27 00:57 autoload
drwxr-xr-x@  7 calvin  staff    238 Jul 31 18:26 colors
drwxr-xr-x@  8 calvin  staff    272 Aug 27 00:57 doc
drwxr-xr-x@  4 calvin  staff    136 Apr 10  2010 ftplugin
drwxr-xr-x@  5 calvin  staff    170 Feb  8  2010 indent
drwxrwxrwx   5 calvin  staff    170 Jul 31 18:26 lib
drwxr-xr-x@ 13 calvin  staff    442 Aug 27 09:57 plugin
drwxr-xr-x@ 10 calvin  staff    340 Aug 27 00:58 syntax
drwxrwxrwx   6 calvin  staff    204 Aug 27 01:00 tests

Unfortunately, this approach implies that for each vim plugin you choose to "install", you must manually find that plugin's "ftplugin files" and place them in your ~/.vim/ftplugin directory, its "plugin files" and place them in your ~/.vim/plugin directory, and so on.

Although this is a one-time affair when you first install a specific vim plugin and its directory-specific files, the problem begins when it's time for you to update/upgrade your plugins.  Imagine having tens of vim plugins, with their respective files placed in different directories, and having to track them all down a year after installing them when you no longer remember what went where.

Get ready to be infected by Pathogen

Fortunately for us vim users, we have some pretty talented people in the vim community.  Tim Pope has written the perfect vim plugin to rule them all.

Hop right over to  https://github.com/tpope/vim-pathogen, follow the install instructions and from now on, ALL your other vim plugins can be housed in a directory structure like this

~/.vim
    |__ autoload
    |       |__ pathogen.vim
    |__ bundle
            |__ yourotherplugin01
            |       |__ plugin
            |       |__ autoload
            |       |__ ...
            |__ yourotherplugin02
                    |__ plugin
                    |__ autoload
                    |__ ...

In this manner, each of your plugins is completely isolated in its own directory.  That means that when it is time to update/upgrade a specific plugin, it's as simple as doing a git pull (if that plugin is managed with git, of course).

And of course, if you are using .vim as a .git repository and each vim plugin is its own nested git repository, git submodules will make your vim update as easy as a walk in the park.

Elegance!  That never fails to turn me on :-))


Category: vim


Tagged as: vim vim-pathogen


Virtualenv for multiple python projects


Virtualenv is a simple and effective way of isolating a Python environment on a specific machine. 

Why? 

Why do we need an isolated Python environment? 

The need for virtualenv begins innocently enough after your first django or other types of Python project. Each distinct project you are working on may depend on different python libraries.

For example, "project A", which you started say sometime in 2009, could very well be using the Python Imaging Library (commonly known as PIL) version 1.1.6.  How do we know? We check it like this in our Python shell:

  $ python
  Python 2.7.2 (default, Aug 23 2011, 20:21:01)
  [GCC 4.2.1 (Based on Apple Inc. build 5658) (LLVM build 2335.9)] on darwin
  Type "help", "copyright", "credits" or "license" for more information.
  >>> from PIL import Image
  >>> Image.VERSION
  '1.1.6'
  >>>


It so happens that PIL 1.1.6 does not support processing of interlaced PNG files when you were working on Project A some time ago.

Fortunately, you have made a great choice in using PIL: it is an active Python library with a community of developers constantly improving it.  Having discovered that interlaced PNG support has now been implemented - http://hg.effbot.org/pil-2009-raclette/issue/2/interlaced-png-support-patch (Hip Hip Hooray!) - you happily decide to upgrade your PIL to 1.1.7 and use PIL 1.1.7 for the new Project B that you have just started.

Unfortunately, the code which you wrote in Project A conflicts with your OS-wide (system wide) PIL upgrade from 1.1.6 to 1.1.7 and now you have to waste time refactoring your code in Project A.

This is a common scenario of what can transpire when you work on multiple Python projects on the same machine: you may be forced to update code you wrote a long time ago (even though there isn't a real need to), simply because the open source Python libraries you install are kept in a single system wide "site-packages" directory.

A Brief Note on site-packages and Python Path

Now, what is this "site-packages" directory, you ask?

On your terminal, run:

  $ python -c "from distutils.sysconfig import get_python_lib; print get_python_lib()"
  /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages


As you can see, the "site-packages" directory was created automatically when I first installed Python 2.7 on my Mac OS, and it is where all the additional open source/3rd party Python libraries are stored when we install them.

This seemingly magical site-packages directory is in your Python search path (simply known as "Python Path") by default after installation.

What that simply means is that when we start a Python shell or run a Python script, stating 

  from PIL import Image


will automatically work, and we are now able to write image processing functionality in our Python code using PIL's "Image" class.  That's because the PIL source code was placed in the site-packages directory during the installation process.
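You can inspect the Python Path yourself. This short sketch prints every directory Python searches on import; whether (and where) a site-packages entry appears depends on your particular installation, so the final flag is not guaranteed to be True everywhere:

```python
import sys

# sys.path is the "Python Path": the list of directories searched
# whenever an import statement runs.
for entry in sys.path:
    print(entry)

# A site-packages directory normally appears somewhere in this list.
has_site_packages = any("site-packages" in entry for entry in sys.path)
print(has_site_packages)
```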

So, back to our Problem

So in our described scenario, what we are now facing is that Project A was using PIL 1.1.6 and we want our new project B to use PIL 1.1.7 with its new interlacing PNG capability (feature)!

Because of practical time constraints, we do not have time to upgrade Project A at the current moment and we want to be able to continue working on project B without delay.

Virtualenv solves this problem for us by allowing us to create an isolated group of directories (the environment) which ignores our OS wide (system wide) PIL 1.1.6 installation once we activate this environment.

A new set of directories, with its own "site-packages" subdirectory, is created when we create this virtual Python environment.  With a simple virtualenv command, we can tell our virtual Python environment to completely ignore the system wide site-packages directory and install PIL 1.1.7 specifically in that environment.

Let's begin our step-by-step

If you haven't installed the virtualenv software on your machine, do so.  

If you are using macports, it is a simple one-liner to get virtualenv installed on your Mac OS:

  $ sudo port -v install py27-virtualenv


To check that we have gotten our installation done correctly, simply run

  $ virtualenv --version
  1.6.1


And to take a peek at all the available options which virtualenv provides us with, run

  $ virtualenv --help
  Usage: virtualenv [OPTIONS] DEST_DIR

  Options:
    --version             show program's version number and exit
    -h, --help            show this help message and exit
    -v, --verbose         Increase verbosity
    -q, --quiet           Decrease verbosity
    -p PYTHON_EXE, --python=PYTHON_EXE
                          The Python interpreter to use, e.g.,
                          --python=python2.5 will use the python2.5 interpreter
                          to create the new environment. The default is the
                          interpreter that virtualenv was installed with (/opt/l
                          ocal/Library/Frameworks/Python.framework/Versions/2.7/
                          Resources/Python.app/Contents/MacOS/Python)
    --clear               Clear out the non-root install and start from scratch
    --no-site-packages    Don't give access to the global site-packages dir to
                          the virtual environment
    --unzip-setuptools    Unzip Setuptools or Distribute when installing it
    --relocatable         Make an EXISTING virtualenv environment relocatable.
                          This fixes up scripts and makes all .pth files
                          relative
    --distribute          Use Distribute instead of Setuptools. Set environ
                          variable VIRTUALENV_USE_DISTRIBUTE to make it the
                          default
    --extra-search-dir=SEARCH_DIRS
                          Directory to look for setuptools/distribute/pip
                          distributions in. You can add any number of additional
                          --extra-search-dir paths.
    --never-download      Never download anything from the network. Instead,
                          virtualenv will fail if local distributions of
                          setuptools/distribute/pip are not present.
    --prompt=PROMPT       Provides an alternative prompt prefix for this
                          environment

Creating our isolated Python virtual environment

Now, to create our isolated virtual environment for us to continue working with Project B using PIL 1.1.7, we run

  $ virtualenv -p python2.7 --no-site-packages --distribute projectbenv
  Running virtualenv with interpreter /opt/local/Library/Frameworks/Python.framework/Versions/2.7/bin/python2.7
  New python executable in projectbenv/bin/python
  Installing distribute.....................................................................................................................................................................................done.
  Installing pip...............done.


Notes:
  • The option -p is where we specify the Python version we want to use in the environment named "projectbenv".
  • When we specify the option "--no-site-packages", we are saying that the environment we are creating will ignore the system wide site-packages.
  • --distribute is where we tell our environment to use a python library called "python distribute" instead of the default "python setuptools".  We will discuss distribute versus setuptools as a separate topic.  Just take my word for it at the moment that distribute is a lot better than setuptools :-)

  $ ls -la projectbenv/
  total 8
  drwxrwxrwx    6 calvin staff  204 Aug 23 23:42 .
  drwxrwxrwx  103 calvin staff 3502 Aug 23 23:42 ..
  lrwxrwxrwx    1 calvin staff   66 Aug 23 23:42 .Python -> /opt/local/Library/Frameworks/Python.framework/Versions/2.7/Python
  drwxrwxrwx   12 calvin staff  408 Aug 23 23:42 bin
  drwxrwxrwx    3 calvin staff  102 Aug 23 23:42 include
  drwxrwxrwx    3 calvin staff  102 Aug 23 23:42 lib


As you can see, a bunch of directories have been created. If you drill down further, you will find a "site-packages" directory under the "lib" directory.  This is where all your open source libraries (for instance, PIL 1.1.7 in our example scenario) will be kept when you install them while the environment is active.

How Do We Get In (and out of) this Python Virtual Environment?

So that's the million dollar question - after we have created such an environment, how do we begin hopping into this "4th dimension"? :-)

In the bin directory that was created above, there is a command named "activate".

So let's say we want to create a project directory separately. Yes - there's no necessity to keep your project directory inside the project environment directory.

  $ mkdir projectb
  $ ls -la
  total 2
  drwxrwxrwx    6 calvin staff  204 Aug 23 23:42 .
  drwxrwxrwx  103 calvin staff 3502 Aug 23 23:42 ..
  drwxrwxrwx    3 calvin staff  102 Aug 23 23:42 projectb
  drwxrwxrwx    3 calvin staff  102 Aug 23 23:42 projectbenv

And we get into the virtual "projectbenv" environment by simply running the activate command from its bin directory.

  $ source projectbenv/bin/activate


You will now notice a "(projectbenv)" prefix on your command line prompt ($).

That tells you that from this point onwards you are in that virtual environment, and any open source Python packages which you install using a pip command will be installed into projectbenv's site-packages.
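If you ever need to check programmatically whether you're inside a virtual environment, a small sketch like this does the trick. It relies on sys attributes that virtualenv (older style) and venv-style environments are known to set; treat it as a heuristic rather than an official API:

```python
import sys

def in_virtualenv():
    # Older virtualenv sets sys.real_prefix; newer venv-style environments
    # make sys.prefix differ from sys.base_prefix.
    return (hasattr(sys, "real_prefix")
            or getattr(sys, "base_prefix", sys.prefix) != sys.prefix)

print(in_virtualenv())
```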

To get out of this virtual environment, simply run

  $ deactivate


Now you are equipped with the basic skills to manage multiple Python projects where each project can depend on different versions of open source Python libraries! Hack away! ;-)




Category: Python


Tagged as: distribute site-packages virtualenv


PostgreSQL cheat sheet for django beginners


Here's a simple article explaining the basics of interacting with PostgreSQL from the command line and from the shell terminal; and telling django to connect to it.

1. What is a postgresql superuser?

Once you have successfully installed postgresql on your OS, a single default superuser named postgres is created.

When you want to create your own PostgreSQL database for a particular Django project, you can use this default superuser to help you create a new PostgreSQL user, its corresponding password, and a new PostgreSQL database.

Some notes before you begin...
  • the $ sign denotes your terminal/command line shell prompt.
  • the # sign denotes your postgresql shell prompt.

2. Checking

If you have successfully installed PostgreSQL, you should be able to do the following in your terminal:-

  $ which psql
  /opt/local/lib/postgresql90/bin/psql


Entering the postgresql shell with your "postgres" superuser:-

  $ psql -U postgres
  psql (9.0.4)
  Type "help" for help.

  postgres=# \q

  $

As you can see, \q is the postgresql command you give to exit the postgresql shell (and you are returned to your terminal prompt).

3.  Changing Your Superuser's Password

Should, for any reason you want to change your postgres superuser's password, get into the postgresql shell (by running "psql -U postgres" in your terminal) again and do this:-

postgres=# \password postgres
Enter new password:
Enter it again:


This is not a particularly important step for your local machine (completely optional) but very very important for your server's postgresql database should you decide to allow remote connections into your postgresql database for any reason.

4.  Creating a new Database User, Password and a new Database

With one simple command, you can create your new database user and set a password for it from your terminal.

  $ createuser -U postgres yournewuser -P
  Enter password for new role:
  Enter it again:
  Shall the new role be a superuser? (y/n) n
  Shall the new role be allowed to create databases? (y/n) y
  Shall the new role be allowed to create more new roles? (y/n) n

Once you have created your new database user and set a password, you can now create your database and tell this new database that it is used by your new user.

  $ createdb -U yournewuser -E utf8 -O yournewuser yournewdb -T template0


5.  Update your django settings.py file!

With these done, you can now update your DATABASES setting in your django settings.py file.

  DATABASES = {
      'default': {
          'ENGINE': 'django.db.backends.postgresql_psycopg2', # Add 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
          'NAME': 'yournewdb',          # Or path to database file if using sqlite3.
          'USER': 'yournewuser',        # Not used with sqlite3.
          'PASSWORD': 'whateverpasswordyouenteredearlier', # Not used with sqlite3.
          'HOST': '',                   # Set to empty string for localhost. Not used with sqlite3.
          'PORT': '',                   # Set to empty string for default. Not used with sqlite3.
      }
  }


Now, if we write our django data models (also called 'classes') in models.py and then run the python manage.py syncdb command in the terminal, the corresponding database table(s) and field(s) will be created in our newly created postgresql database.
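Conceptually, that's all syncdb does: read your model definitions and issue the matching CREATE TABLE statements against the database. Here's an illustration of that idea using Python's built-in sqlite3 module; this is NOT Django's actual machinery, and the model fields below are hypothetical:

```python
import sqlite3

# A hypothetical "model": field names mapped to SQL column types,
# roughly what a Django model's fields boil down to.
fields = {"name": "varchar(100)", "email": "varchar(254)"}
columns = ", ".join("%s %s" % (col, sqltype) for col, sqltype in fields.items())

# syncdb-style step: turn the model definition into a CREATE TABLE.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE userprofile (id integer primary key, %s)" % columns)

# Verify the table now exists, just as syncdb would have created it
# in our postgresql database.
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
print(tables)
conn.close()
```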

And that's it. Have fun!

(Although there are two pretty decent GUI tools for managing your PostgreSQL database, i.e. pgAdmin3 and phppgadmin, I have decided to introduce and encourage the use of these basic commands in your terminal and in the postgresql shell because they are really not too complicated as you can see in the above examples.)


Categories: Django postgresql


Tagged as: django postgresql


Pypy on virtualenv and a need for speed



Pypy is a high performance Python interpreter written in RPython, a restricted subset of Python (targeting Python 2.7.1 at this point in writing).  In contrast, the original (standard) Python interpreter is written in C, and is commonly referred to as CPython.

So why would we use Pypy instead of the standard CPython?

Performance and speed = faster Python apps!

Pypy works as a just-in-time (JIT) compiler: it looks for bits of code that are executed often and optimizes those bits into machine code.

Ok. Sounds interesting, but just how fast is fast?

Let's take a look at the performance benchmark comparing django on pypy against django on standard python.  The performance contrast is stunning - more than 0.5 seconds faster in absolute terms for this benchmark.  For a more generic overview of pypy's performance, check out http://speed.pypy.org/.
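You can produce a comparable number for your own code: time a hot loop under CPython, then run the identical script under pypy and compare. A minimal sketch using the standard timeit module (the loop itself is an arbitrary example of JIT-friendly code):

```python
import timeit

# A hot loop of the kind a JIT speeds up: summing a million ints.
def hot_loop():
    total = 0
    for i in range(1000000):
        total += i
    return total

# Time it under whichever interpreter runs this script; execute the
# same file under both pypy and CPython to see the difference.
elapsed = timeit.timeit(hot_loop, number=3)
print("3 runs took %.3f seconds" % elapsed)
```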

Why is 0.5 seconds important to us as application developers or product owners? Do we really care about optimizing for 0.5 seconds?

In a talk at Web 2.0 Conference in 2006, Marissa Mayer, head of usability at Google, spoke at length about their own experiences relating to user experience and web application speed - http://www.zdnet.com/blog/btl/googles-marissa-mayer-speed-wins/3925.

Long story short, it was found that:
For Google, an increase in page load time from 0.4 seconds to 0.9 seconds decreased traffic and ad revenue by 20%. For Amazon, every 100 ms increase in load time decreased sales by 1%.

So yes, as software professionals working on large-scale apps, we care about implementing (web) apps that are speedy from an end-user perspective.  By the way, this is where some properly implemented native apps (apps built specifically for mobile devices like the iPhone or Android, or for your desktop computer) have the advantage (at the moment). But that's a story for another day.

Usability and User Psychology

To further emphasize usability issues in relation to app performance and speed, here's a 1993 article by Jakob Nielsen (oh yes, ancient by internet standards :-) ) - http://www.useit.com/papers/responsetime.html.  Quote:
  • 0.1 second is about the limit for having the user
    feel that the system is reacting instantaneously, meaning that no
    special feedback is necessary except to display the result.
  • 1.0 second is about the limit for the user's
    flow of thought to stay uninterrupted, even though the user will notice
    the delay. Normally, no special feedback is necessary during delays of
    more than 0.1 but less than 1.0 second, but the user does lose the
    feeling of operating directly on the data.
  • 10 seconds is about the limit for keeping the
    user's attention focused on the dialogue. For longer delays, users will
    want to perform other tasks while waiting for the computer to finish, so
    they should be given feedback indicating when the computer expects to
    be done. Feedback during the delay is especially important if the
    response time is likely to be highly variable, since users will then not
    know what to expect.
Now that you are (I hope) convinced that it is worthwhile to experiment with PyPy purely for the speed and performance gains, and have persuaded your pointy-haired boss to let you try some of your Python apps on it, here's a quick rundown of how to get PyPy running on your machine:-

1.  Grab the PyPy trunk (yes, I like bleeding edge software) and build from source


  1. $ cd ~/work
  2. $ git clone https://github.com/pypy/pypy.git


2.  Compile pypy on your machine to create the pypy-c binary

Be sure that you have Python 2.7 already installed in the first place.  We will need Python 2.7 to run translate.py as seen below:-
  1. $ cd ~/work/pypy/pypy/translator/goal
  2. $ python translate.py --opt=jit targetpypystandalone.py



This is going to take a while (probably a lot more than 30 minutes).  So go take a walk :-)

3.  Use pypy in an isolated python environment with virtualenv

As PyPy is a lot more bleeding edge and experimental, and may not work with Python libraries that use C extensions, let's do ourselves a favour and keep it in an isolated environment where we can experiment to our heart's content, without worrying about messing with the libraries already installed in our global site-packages.
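Inside such an isolated environment, a quick way to see whether a given dependency works on the interpreter is simply to try importing it.  A tiny sketch (here "lxml" is only a stand-in for whichever C-extension package you care about; "json" is pure stdlib and should always work):

```python
# Probe which packages import cleanly on the current interpreter.
results = {}
for name in ("json", "lxml"):
    try:
        __import__(name)
        results[name] = "OK"
    except ImportError:
        results[name] = "missing"
    print("%s: %s" % (name, results[name]))
```

Run this under pypy-c after activating the environment (see below) to spot missing or unsupported dependencies before committing to a port.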

If you do not have virtualenv installed, it's as easy as running "sudo port install py27-virtualenv" on your Mac (or equivalent commands on your preferred linux distro). 

Once that's done, we can now run:-
  1. $ cd ~/work/
  2. $ mkdir env
  3. $ cd env
  4. $ virtualenv -p ~/work/pypy/pypy/translator/goal/pypy-c --no-site-packages --distribute pypy
  5. Running virtualenv with interpreter /Users/calvin/work/pypy/pypy/translator/goal/pypy-c
  6. New pypy executable in pypy/bin/pypy-c
  7. Also creating executable in pypy/bin/pypy
  8. Installing distribute............................................................................done.
  9. Installing pip...............done.



4.  And check that it works


  1. $ source pypy/bin/activate
  2. $ python
  3. Python 2.7.1 (6f7e32a3d998, Jul 24 2011, 00:39:56)
  4. [PyPy 1.5.0-alpha0 with GCC 4.0.1] on darwin
  5. Type "help", "copyright", "credits" or "license" for more information.
  6. And now for something completely different: ``topics are for the feeble
  7. minded''
  8. >>>> exit()


And now, we can experiment with any Python apps we like in this isolated environment using PyPy!

And to get out of your pypy-c environment at any point in time, we simply run "deactivate".
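If you ever lose track of which interpreter a given environment is actually running, a one-line check helps: PyPy exposes sys.pypy_version_info, an attribute CPython does not have.  A small sketch:

```python
import sys

# PyPy sets sys.pypy_version_info; CPython does not, so this attribute
# is a reliable way to tell the two interpreters apart.
is_pypy = hasattr(sys, "pypy_version_info")
interpreter = "PyPy" if is_pypy else "CPython"
print("running on %s (Python %s)" % (interpreter, sys.version.split()[0]))
```

Run it once inside the activated pypy environment and once outside to confirm the switch took effect.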

And in case you were wondering what the image above is all about, it is a picture of a snake eating its own tail, i.e. an ouroboros.  Since PyPy is Python written in Python (Restricted Python, or RPython, to be exact), that's the logo the PyPy developers use!


Category: Python


Tagged as: pypy

Leave a Comment