Profiling Python applications

    Brief notes, with links and examples, on profiling:
    1. performance: hotshot or the standard profile / cProfile, plus the kcachegrind log visualizer (there is a Windows port, and an analogue called WinCacheGrind)
    2. memory usage: the web-based Dowser


    1. collect statistics with one of the profilers:

      • example 1, using the fast hotshot (which may become deprecated):
        import hotshot
        prof = hotshot.Profile("your_project.prof")
        prof.start()
        # your code goes here
        prof.stop()
        prof.close()

        convert the log format using the utility from the kcachegrind-converters package:
        hotshot2calltree your_project.prof > your_project.out

      • example 2, using the standard profile / cProfile:
        python -m cProfile -o your_project.pyprof your_script.py

        convert the log format using pyprof2calltree :
        pyprof2calltree -i your_project.pyprof -o your_project.out

        (with the -k option, pyprof2calltree launches kcachegrind immediately, so there is no need to create an intermediate file)

    2. open and study the log in the kcachegrind visualizer
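
    As an alternative to running cProfile from the command line, the profile can also be collected in-process, which is handy when you only want to measure one code path. A minimal sketch (`fib` is just a toy workload, and the file name mirrors the one above); pstats gives a quick text summary when kcachegrind is not at hand:

```python
import cProfile
import io
import pstats

def fib(n):
    """Toy workload to profile (illustrative only)."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# Collect a profile around just the code we care about.
profiler = cProfile.Profile()
profiler.enable()
fib(20)
profiler.disable()

# Dump stats in the same .pyprof format that pyprof2calltree consumes.
profiler.dump_stats('your_project.pyprof')

# Quick look without kcachegrind: top 5 entries by cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats('cumulative').print_stats(5)
print(stream.getvalue())
```

    The resulting your_project.pyprof can then be fed to pyprof2calltree exactly as in example 2.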

    I first had to profile a Django application when I was hunting down a tricky recursive import in someone else's code. Back then I used the existing handler for mod_python, but since the latter is no longer popular, alternative ways of hooking up a profiler, and even dedicated profiling modules, have appeared since (I have not used the latter).


    Unfortunately, for memory profiling I do not yet know of a tool that is as simple and pleasant. I did not want to rummage through the debugger, and Guppy did not satisfy me: it is powerful but difficult (fortunately, memory profiling is rarely needed). Objgraph likewise does not provide easy navigation outside a debug shell.
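
    For a first rough picture, none of these tools is strictly needed: the standard gc module can approximate the live-object census that Guppy and objgraph provide. A crude stdlib sketch (`count_objects_by_type` and `Leaky` are made up for illustration), not a replacement for either tool:

```python
import gc
from collections import Counter

def count_objects_by_type(limit=10):
    """Rough census of live objects tracked by the garbage
    collector, grouped by type name, most numerous first."""
    counts = Counter(type(obj).__name__ for obj in gc.get_objects())
    return counts.most_common(limit)

class Leaky:
    pass

# Simulate a leak: the hoard keeps 1000 instances alive,
# so 'Leaky' should show up prominently in the census.
hoard = [Leaky() for _ in range(1000)]
print(count_objects_by_type())
```

    Taking such a census before and after a suspect operation and diffing the counts is often enough to spot which type is leaking.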

    My current choice is Dowser, a CherryPy frontend application. With it everything is simpler, though not as flexible:

    1. create a controller, essentially a CherryPy 3 application (memdebug.py):
      import cherrypy
      import dowser

      def start(port):
          cherrypy.tree.mount(dowser.Root())
          cherrypy.config.update({
              'environment': 'embedded',
              'server.socket_port': port
          })
          cherrypy.server.quickstart()
          cherrypy.engine.start(blocking=False)
    2. connect it in your application:
      import memdebug

      memdebug.start(8080)  # any free port
      # your code goes here

    3. open the Dowser page in the browser and study the statistics (ignore the objects belonging to CherryPy and other libraries; we are only looking for our own)

      The functionality is spartan and takes a little clicking around to figure out, but it is enough to find problems, and there is no debugger API to learn and remember. Beware: some operations, like “Show the entire tree”, may require a considerable amount of memory for large applications.
    4. the application will not exit on its own. When you are done, suspend it with Ctrl+Z and kill it

    What methods and means of profiling are still worth learning about?
