I've been using SysV interprocess communication between the various processes that control and monitor the devices around the house, and it's been working fine. But at some point I'm going to want to separate the processes onto different machines: a Pi for the lights, a Pi for the devices I hope to have somewhere else, and, a pet project of mine, a Pi to control the functions of the pool. SysV doesn't talk between machines. I could use TCP for this, but I don't want to invent a protocol on top of it; I want something I can just pick up and use. How about HTTP?
I could bring up a web server on each machine and hook it into the processes involved, but that would take a bunch of research. So I got to looking around and found two nice solutions, Tornado <link> and CherryPy <link>. They're both capable, but CherryPy is simpler to use, so I chose it for my first experiment. Besides, it's cool to be running CherryPy on a Raspberry Pi, ... right?
Following their tutorial, it came up and worked on the first try. That looks promising, but what about doing something besides just waiting for an HTTP request? How do I get several of these running on the same machine and talking between processes? How do I keep track of the various addresses?
In my usual form, I just started writing code to see what would happen. CherryPy uses a publish/subscribe philosophy and provides some basic published channels. I subscribed to their 'main' channel and used that to run my tiny timer, and suddenly it could do something besides just wait for HTTP requests to come in. So I have my timers, the ability to periodically check device status, and probably some other stuff. Now would be a good time to expand the experiment.
The idea is to be able to communicate with each control or monitor process to see what is going on, both with a browser from my recliner and with a running process that accumulates everything for presentation. Hopefully, this will make it much easier to tell what is going on when something quits working properly. I plan to use JSON as the data transfer format; that way we'll be able to read it by eye as well as use it easily in code. Eventually, I want to be able to split the processes between Raspberry Pis and have them keep working. My very own tiny data center sitting on a shelf over there somewhere.
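The nice thing about JSON is that the same payload is both human-readable and trivial to handle in Python's standard library. A sketch of the kind of status record I have in mind; all the field names here are made up for illustration:

```python
import json

# Hypothetical status report one monitor process might serve over HTTP
status = {
    "node": "pool-pi",
    "device": "pump",
    "state": "on",
    "uptime_seconds": 4200,
}

payload = json.dumps(status)    # the string that goes over the wire
print(payload)

decoded = json.loads(payload)   # what the collector process gets back
```

Any browser can display the string as-is, and the accumulator process gets a dict back with a single call.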
But I'm a bit sick of changing things, breaking them, and not having a clue what I did to break them. Enter GitHub <link>. I've put my current Raspberry Pi source on GitHub so I can track changes and back out things that just didn't work well. I can also create a fork for massive experiments, load it on another Pi, and play until it works. A side effect is that you folks can grab the source without having to copy and paste from a window. There have been many problems with Python and its indentation rules; this should fix that. Also, if you find a bug, you can fix it and I can grab the fix as well.
The latest test of my move to CherryPy is in the file cherrytest.py in the other-stuff directory of my GitHub repository <link>. The house directory is where I keep the code I'm running right now on my Pi to control the house. When I start testing CherryPy for communication, I'll fork off a new version and play there until I can tell whether it is going to work out.
My very own little sandbox to play in.