Maximizing uptime for your @21 python app with daemonizing and crontab

It’s a good practice to make your 21 server applications as robust as possible, both because this is intrinsically right and because the 21 marketplace keeps track of uptime, which you can see via 21 publish list:

[Screenshot: output of 21 publish list, showing uptime for each published app]

(Obviously, I have had some learning experiences on the way to robustness…)

Two important techniques are “daemonizing” and cron jobs. Daemonizing detaches the server process from your shell session so that it keeps running after you log out. Cron jobs launch processes at scheduled times. To daemonize the 11 apps I now have running on the 21 marketplace, I have been copying and pasting a little chunk of code that I borrowed from an app written by James Poole.

import os
import subprocess

import click
import psutil

# `app` is the web application object (e.g. a Flask app) defined earlier in
# superiorwordcloud-server.py; only the launcher logic is shown here.

if __name__ == '__main__':

    @click.command()
    @click.option("-d", "--daemon", default=False, is_flag=True,
                  help="Run in daemon mode.")
    def run(daemon):
        if daemon:
            # If a previous instance left a pid file behind, terminate that
            # process so two copies of the server never run at once.
            pid_file = './superiorwordcloud.pid'
            if os.path.isfile(pid_file):
                pid = int(open(pid_file).read())
                os.remove(pid_file)
                try:
                    p = psutil.Process(pid)
                    p.terminate()
                except psutil.Error:
                    pass
            # Relaunch this script (without -d) as a separate child process
            # and record its pid so the next run can find it.
            try:
                p = subprocess.Popen(['python3', 'superiorwordcloud-server.py'])
                open(pid_file, 'w').write(str(p.pid))
            except OSError:
                raise ValueError("error starting superiorwordcloud-server.py daemon")
        else:
            print("superiorwordcloud server running...")
            app.run(host='::', port=5016, debug=True)

    run()

(Apologies for not making this a nicely formatted code block, but I couldn’t get WordPress to cooperate with the indentation.)
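In case it helps, this is how the script ends up being invoked (the directory and filename here are just the ones from my superiorwordcloud app):

# run in the foreground (handy while developing):
python3 superiorwordcloud-server.py

# launch the daemon:
cd /home/bitnami/pagekicker-21/superiorwordcloud/
python3 superiorwordcloud-server.py -d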

It’s a good practice to launch the daemon from a cron job. This gives you the best combination of robustness and system performance, as MarkR explained on StackOverflow:

In general, if your task needs to run more than a few times per hour (maybe […])

A daemon which is always running has the following benefits:

  • It can run at frequencies greater than 1 per minute
  • It can remember state from its previous run more easily, which makes programming simpler (if you need to remember state) and can improve efficiency in some cases
  • On an infrastructure with many hosts, it does not cause a “stampeding herd” effect
  • Multiple invocations can be avoided more easily (perhaps?)

BUT

  • If it quits (e.g. following an error), it won’t automatically be restarted unless you implemented that feature
  • It uses memory even when not doing anything useful
  • Memory leaks are more of a problem.

In general, robustness favours “cron”, and performance favours a daemon. But there is a lot of overlap (where either would be ok) and counter-examples. It depends on your exact scenario.

You can use this line in a crontab file to launch the daemon every time your system reboots (which is helpful, since otherwise you have to remember to restart it yourself!):

@reboot cd /home/bitnami/pagekicker-21/superiorwordcloud/ && python3 superiorwordcloud-server.py -d
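One weakness MarkR points out is that a daemon that dies won’t come back on its own unless you build that in. One way to cover it would be a tiny watchdog script run from cron every few minutes. I haven’t shipped this with my apps, so treat it as a sketch (watchdog.py is just a name I made up); it checks the same pid file the launcher writes and only re-runs the launcher if the recorded process is gone:

# watchdog.py -- a sketch, not part of the app above.
# Run it from cron every few minutes; it relaunches the server
# only if the process recorded in the pid file is no longer alive.
import os
import subprocess

import psutil

PID_FILE = './superiorwordcloud.pid'   # same pid file the launcher writes

def daemon_is_alive():
    if not os.path.isfile(PID_FILE):
        return False
    try:
        pid = int(open(PID_FILE).read())
    except ValueError:
        return False          # pid file empty or corrupted
    return psutil.pid_exists(pid)

if __name__ == '__main__':
    if not daemon_is_alive():
        # Let the launcher's -d mode handle the pid file bookkeeping.
        subprocess.call(['python3', 'superiorwordcloud-server.py', '-d'])

paired with a crontab entry along these lines (again, the paths are just my example):

*/5 * * * * cd /home/bitnami/pagekicker-21/superiorwordcloud/ && python3 watchdog.py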

I’m no expert, so additional tips and tricks on how to accomplish these goals are more than welcome.
