Desert Home

Using the Google Charts API with Xively (Pachube -> Cosm -> Xively)

It's actually getting a little tiring having the old Pachube change names and interfaces every year.  However, they have programmers there and they have to be doing something ... right?  Based on a request from a follower, I decided to post the code I have been using to collect data stored on Xively and hand it off to the Google Charts API.  This gives me a nice graph that I can look through to inspect my energy usage.

I've used the Google graphs to find problems with my power usage and distribution for a couple of years now.  Nice system.  There are other graphing systems out there, but each of them is different from the others and one has to settle on something to work with.  I just chose Google, not because it's the best, but because I think it might continue to exist for a while and has reasonable documentation.  Oh, it's also free.

At any rate, most of the graphing software takes a table as input.  That presents a particular problem for us inexperienced web programmers.  What the heck is a table and how do we make one?  Once we understand that to some degree, how do we get the data from Xively and put it into a table?  This little bit of knowledge is hard to come by.  I went through several iterations of trying to get data, format it into something to give to the Google API and see what was displayed.  I came up with this:

The Web Page
<html>
<head>
    <title>Desert Home Day Graph</title>
</head>

<body>
    <!--somehow, SUN thought this was better than the c include statement....sigh -->
    <!--These are the google api files that you need for their tables and graphs -->
    <script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
        <script type='text/javascript' src='https://www.google.com/jsapi'></script>
        <script type='text/javascript'>
          google.load('visualization', '1', {'packages':['annotatedtimeline']});
    </script>

    <script type="text/javascript">
    //I stole this function almost verbatim from the Pachube web site.
    //It formats the date into a form I like
    function adjustdateformat(incoming){
             if (typeof(incoming) != "function") { //no scripts allowed in data
            var ts_parts = incoming.split('T');
            var d_parts = ts_parts[0].split('-');
            var t_parts = ts_parts[1].split(':');
            var d = new Date(Date.UTC(parseInt(d_parts[0],10), parseInt(d_parts[1],10)-1 ,parseInt(d_parts[2],10), parseInt(t_parts[0],10), parseInt(t_parts[1],10), parseInt(t_parts[2],10) ) );
            return(d);
        } else {
            alert("There was a script in the data !!");
            return(0);  //for when there is a script
        }
    }
    // this creates the table that will be stuffed with data returned
    var data = new google.visualization.DataTable();

    // The google data api doesn't have a call to return a column id by name
    // so I had to invent this one
    function getColumnIdByName(name){
        var idx;
        var lastOne = data.getNumberOfColumns();
        for(idx = 0; idx < lastOne; idx++){
            if(name.toString() == data.getColumnLabel(idx)){
                return(idx);
            }
        }
        // I really don't like magic values like this, but it'll do for now.
        return(99);
    }

    // This parses into the json data to grab values.
    // there's documentation all over the web on the json format
    // but it has to be parsed to use it
    function climbtree(archivedData){
        // this will create the table for the graph
        if (typeof(archivedData.datastreams[0].datapoints) == 'undefined'){
            alert("No archived data returned");
            return;
        }
        $.each(archivedData.datastreams, function(key, value){
            // key 0 has the first datastream and the times for the table.
            // you'll always get this if there is data to deliver
            if(key == 0){ // only add a date column for the first datastream
                //alert("I got key 0 ");
                if (data.getNumberOfColumns() == 0){
                    data.addColumn('datetime', 'Date');
                    data.addColumn('number', value.tags);
                }
                $.each(value.datapoints, function(key,value){
                    //alert("got an at value of" + this.at);
                    var d = adjustdateformat(value.at);
                    var level = value.value;
                    var thisrow = data.addRow();
                    data.setValue(thisrow, 0, d);
                    data.setValue(thisrow, 1, parseFloat(level));
                })
            } else {
                //second, third, etc datastreams
                //alert("got another key = " + value.tags);
                var columnIdx;
                if((columnIdx = getColumnIdByName(value.tags)) == 99){
                    columnIdx = data.addColumn('number', value.tags); //new datastream came in
                }
                $.each(value.datapoints, function(key,value){
                    var d = adjustdateformat(value.at);   //format its datestamp the same way
                    var rowIdx = data.getFilteredRows([{column:0,value:d}]); //and find it in the table
                    var level = parseFloat(value.value);
                    data.setValue(rowIdx[0], columnIdx, level);
                })
            }
        })
    }
    </script>

    <!--     this actually goes and gets the data.  What I'm doing is grabbing the data
        for two items: Real Power and Outside Temperature.  Since Cosm has a limit on
        the number of items they will return, I do it in six hour intervals.  So, a day
        will loop four times and two days 8, etc.  There's also a limit on other stuff,
        so the most you can get is around a week.  If you want more, you'll have to do
        some more work to handle it.  Remember though, there's a lot of data here and
        it will take some time to get it back from the site.  That's why I prefer server
        generated charts.  The server has the data right there and can access it quickly;
        when we go get it ourselves, it takes time for the chunks to come to us.
    -->
    <script type="text/javascript">
    $(document).ready(function(){
        var url='http://api.pachube.com/v2/feeds/9511.json?&key=GtGuoMKJSqv2tzqGvWaITLhlEDUxrYXSixqPSmlyj-s&per_page=1000&datastreams=0,7&duration=6hours&interval=60';
        var databack;
        var startpoint = '';
        var loopcount;
        if(queryString["day"] == undefined)
            loopcount = 4;
        else
            loopcount = (queryString["day"] * 4) + 1;
        //alert('loopcount = ' + loopcount);
        for ( var i=0; i<loopcount; i++ ){
            //alert('going for ' + url + startpoint);
            $.ajax({
                    type: 'GET',
                    url: url + startpoint,
                    dataType: 'json',
                    data: {},
                    async: false, // you have to have this so the data will arrive before display
                    success: function(dataarchive) {
                     //alert("the url should have returned");
                    if (databack = (typeof(dataarchive.datastreams[0].datapoints) != 'undefined')){
                        var len = dataarchive.datastreams[0].datapoints.length;
                        //alert("got back " + len + " items");
                        //alert("from " + dataarchive.datastreams[0].datapoints[0].at +
                        //    "to " + dataarchive.datastreams[0].datapoints[len - 1].at);
                        startpoint = "&end=" + dataarchive.datastreams[0].datapoints[0].at;
                        climbtree(dataarchive);
                    };
                }
            });
        }
        //alert('after the get loop');
        var chart = new google.visualization.AnnotatedTimeLine(document.getElementById('chartholder'));
        // OK, now I've got a chart, but need to rearrange the columns
        // this is a matter of taste, so adjust as needed.
        var chartView = new google.visualization.DataView(data);
        // this sets the order of the columns
        chartView.setColumns([0,2,1]);
        chart.draw(chartView, {displayAnnotations: true, 'scaleType': 'allfixed', 'scaleColumns': [0,1]});
    });
    </script>

    <!--
        This little thing allows you to put a parameter in the URL to get
         multiple days.  Use a ? then the number of days, i.e. ?day=2 will get
        you two days' worth of data.
    -->
    <script type="text/javascript">
        var queryString = new Array();
        var parameters = window.location.search.substring(1).split('&');
        for (var i=0; i<parameters.length; i++) {
            var pos = parameters[i].indexOf('=');
            // If there is an equal sign, separate the parameter into the name and value,
            // and store it into the queryString array.
            if (pos > 0) {
                var paramname = parameters[i].substring(0,pos);
                var paramval = parameters[i].substring(pos+1);
                queryString[paramname] = unescape(paramval.replace(/\+/g,' '));
            } else {
                // special value when there is a querystring parameter with no value
                queryString[parameters[i]] = "[nil]";
            }
        }
    </script>
    <!-- and the actual chart declaration; change this for size and stuff -->
        <div id='chartholder' style='width:100%; height:310px;'> </div>

</body>
</html>

This is the actual web page I created to display a day's worth of data on power usage and outside temperature taken from my sensors.  To use it, just copy the code into a file, save it as html (htm, whatever) and then click on it.  It'll load my data and give you a graph you can play with.  Then, modify it to use your feed id, key, and other stuff to display your own data.  It can serve as a template for your own graph as well.

If you want to use it as part of a web page with other information, pictures, etc., just put it into an iframe and include the page.  I keep this page on Dropbox so I can use it from anywhere, even my tablet.  Some browsers have a little trouble with iframes (Chrome); others have trouble with Google charts (Opera, Silk), so your mileage may vary, but it can certainly serve as an example.

This is what you will get with a line like:
<iframe width="100%" height="325" scrolling="No" seamless frameBorder="0"
        src="http://dl.dropboxusercontent.com/u/128855213/CosmGraphPower.html"> </iframe>
And yes, the src attribute up there points to the page I keep on Dropbox.  Feel free to grab the source directly from there as well.

As I said before, my web programming experience is slim, so there may well be better, more elegant ways to do this.  If so, leave a comment for other folks (and me!) to help make our lives simpler.  For example, you have no idea what a pain in the butt it was to get this page to work using the tools that Blogger provides.  Their editors kept changing things and messing up the display.  I finally got it to work, but I haven't tried it on multiple browsers, so it may be messed up.  Look at it in Firefox; it seems to handle this display the best.

Holy Cow, I've Got Competition

As this blog can attest, I've been shifting my power usage to non-peak periods for a few years now. I didn't realize what an accomplishment this was because I did it out of GREED. Yes, greed. I got tired of giving my money to the power company and wanted to keep it for myself. Now I happened across an effort in Europe to do the same thing I'm doing, but they made it into a contest and are actually offering a reward. 

What??

Yep, what I give away for free is being solicited and suggested to corporations, education facilities, and NGOs (non-governmental organizations).  Sheesh!  Why didn't they just make a few suggestions to the army of nerds out here that are attacking this problem in their own way?  So, take a look at http://dynamicdemand.nesta.org.uk/ ; the ideas they present are somewhat pie-in-the-sky, but some of them might actually do some good.  Funny though, they didn't mention things like controlling the refrigerator compressor during peak periods, alternating expensive appliances so that they can't be on at the same time, using tile flooring to hold the heat and cool until needed, automatic ceiling fans: the stuff that normal people are working on all over the world.

Some of the more mundane items I have, like shade on the sunny side of the house, solar tubes in the ceiling for light during the day, and an extremely low power pool pump with special venting to handle vacuum problems, completely escaped their notice as well.

Anyway, have a look, but remember, you saw it here first.

Waterco Multicyclone Filter for My Swimming Pool

I have had nothing but trouble with my pool since about a month after it was installed.  Sure, it's fun to laze about in it on the hot summer days; it makes desert life bearable, but darn it's hard to take care of.  I have a salt water pool, as mentioned several times in the tab above labeled "Swimming Pool" <link>, and that generates its own chlorine, but it also creates "pH creep", which I deal with by pumping acid into the pool at regular intervals using an Acid Pump <link> that I created.  However, there are still problems.

See, I have a cartridge filter instead of the more traditional diatomaceous earth or sand filter.  These things require cleaning and replacement.  Cleaning can be as often as every couple of weeks in dirty weather, or a few months when there are no sandstorms, bug migrations, tree blooms, or dirty dogs to mess it up.  However, this year I've been pestered by caterpillars, crickets, beetles, frogs, sandstorms, and tree blooms such that I've had to clean the cartridges several times.  Add to that the calcium carbonate snow that accumulates due to the chlorine generation, and summer has been a pain in the pool department.

I decided to get a pre-filter to get rid of some of the crap to keep the work load down.  I got one of these:
This is the Waterco Multicyclone 16.  Basically, it's a Dyson vacuum cleaner that works for your pool.  It goes after the motor and before the big pool filter and removes most of the crap before it gets into the filter.  Theoretically it should keep me from having to clean the cartridges as often, maybe as little as once a year.

Frankly, I'm skeptical.  However, after looking at the various videos and hunting around for honest reviews of the thing (I couldn't find one), I took the plunge and bought one.  I just installed it today, so I don't have a history to call on, but I can say that it wasn't hard to install and looks like it's doing what they say it does.  Here's the installation:

Yes, it's tall.  I could have mounted it closer to the pump, but then I would have had to get a new drain faucet and it would have only been six inches shorter.  I used two 45 degree fittings because 90 degree fittings impede flow and I wanted all the flow I could get.  The idea of the device is that the water goes up through a pipe in the middle and swirls around through 16 cyclone fixtures inside, where centrifugal force removes debris, allowing it to fall to the bottom of the clear area where it can be discharged using the red valve at the bottom.  The cleaner water then goes to the cartridge filter and gets 'polished' before discharging into the pool.

I used the glue technique I learned when I was working on the solar heater.  I found out that to get the absolute best seal, forget the normal PVC cement and purple primer; go directly to the blue 'Red Hot' cement and skip the primer entirely.  That little trick took me almost a week and calls to the pipe manufacturer to discover.  Here's a picture of the filter in action:
No leaks!!  You really can't tell from this picture that there is water in there, but I expected it to be swirling around; it didn't.  The cyclone action is all in the top section, and the relatively serene clear bowl at the bottom is just clear water.  At least it was clear at first.  Over the next 20 minutes particles started to appear in the bowl, then a bug, then some of the white snow created by the chlorinator.  After about 20 minutes, I purged it by opening the valve and everything that had accumulated shot out the pipe to the side.  So far, it is doing exactly what they claimed.  Just before I purged it, I took this picture of the bowl:
See the accumulation at the bottom and in the discharge pipe?  That's after 20 minutes!  Yes, that's the kind and amount of crap that is getting into the cartridge filters and clogging them up.  I suspect I'll be purging this thing every couple of days during the dirty season.  That is, after the newness wears off; right now I purge it every time I go out there because it's cool to watch the debris shoot out.

In all honesty, I can't recommend this device for other people yet; I haven't had enough experience with it.  However, if the accumulation of particulates is any indication, this thing works.  I'm going to watch it like a hawk for the next week or so (naturally, it's new and cool), then determine if it is actually doing the job it's advertised to do.  I'll hook up the vacuum head and clean the pool and see what happens when I flood it with the stuff that hangs around on the bottom of a pool in the desert; that should be a good test.

At any rate, stay tuned, I should know more in a few weeks.

Edit: I've had some experience with it now, more information here <link>

Waterco Multicyclone Filter for My Swimming Pool part 2

Part one of this is here <link>.

It's been a week with the new pre-filter on my pool and I can actually say that I'm satisfied.  No leaks and it works as advertised; nice device.  Here's what it accumulated in the reservoir:

Yes, this is a week's worth of the sand and stuff that would have been starting to plug up my cartridge filter already.

The white layer on top is the calcium carbonate and calcium sulfate that is created by my salt generator.  That's the stuff they call 'snow' that accumulates in the corners of the pool and annoys the heck out of me.  This filter catches that stuff and keeps it from getting into the cartridges...nice.

Now I feel comfortable recommending this device for other folks to use on their pools.  I don't have enough experience with how much it will save me in the longer term, but if it keeps me from having to clean the cartridges every couple of weeks in the stormy season, it's worth every penny I paid.  Cleaning the filters tears them up over time, forcing me to buy new ones.  At almost 90 dollars a cartridge, with 4 of them in the filter, it adds up in a real hurry.  Saving me just one purchase of replacement cartridges will pay for the cyclone filter.

The fact that it's capable of filtering out the 'snow' gives me an idea.  If I put another one in the path of the chlorine generator, it should remove the snow before it gets to the pool to annoy me.  This would also lower the calcium level in the water because I would be constantly removing it after the generator formed it into the snow.  That would be good, as long as it didn't lower the calcium level enough to cause leaching from the walls of the pool.  I don't have enough money to experiment with that right now, but maybe in a year or so I could get another one and install it to remove that stuff.  Heck, two of them might be enough to get rid of the cartridge filter entirely, or scale it back to a much smaller device that doesn't cost so much to maintain.  I have to worry about flow rate though; smaller filters may not have enough flow.

At any rate, if you live in a dirty area and have to clean your filter unreasonably often, this is a good device that just may relieve some of the burden and expense.  I really like it.

Finally got a Raspberry Pi

OK, I broke down and took the plunge to a different kind of little computer; I went out and bought myself a Raspberry Pi <link>.  Nice little machine.  It wasn't an easy decision on my part since I can finally make the little Arduinos do most anything I want to with hardware, but I got a bug up my butt.

I was corresponding with a person working on garage door control and he was pushing the door state out to his cell phone.  So, when the door opened, it sent a message to his phone and told him.  I decided that would be a really great way to keep track of things around the house.  Besides the garage door, it would be nice to know that the power was too high while I was away so I can log into the house and turn something off, that kind of thing.  The problem is that that stuff can get hard to do on an Arduino.  When you get into the internet realm, the Arduino doesn't shine.  What was needed was a machine that didn't use much power, and had plenty of support for ethernet and internet.  A laptop would do the job, but that costs way too much and relies on something that can take too much time to boot up and run.  The Raspberry Pi sounded like the perfect solution.  It has a serial port and an ethernet port built right in and runs Linux; how cool is that?

It came in, and I started looking at how to get started with the thing.  Let me tell you, there are hundreds of tutorials out there and they all leave something out that is necessary.  This is because you actually need a console for the little device to get it going.  I stepped through the process of creating an SD card with the operating system, then plugged it into my home ethernet network.  I downloaded a terminal program and used ssh to talk to it, did the various configuration items and finally got it to work in a reasonable fashion.  So, I actually got it working without hooking it to my TV and going out to buy a bluetooth keyboard and mouse.

Yes, the inventors said they wanted people to be able to use the board without having to buy anything, but how many of us have a bluetooth keyboard around the house?  Sure, we have a TV, but it's mounted in an entertainment center and turning it around to get to the video ports is a pain; doable, though.  They have a nice little port on there that can be a serial console, but who has an old-fashioned terminal anymore?  Besides, it's 3.3 volts, not 12 like the old RS-232 terminals; you have to buy a special adapter to use it.  And, I'm here to tell you the power supply you use matters ... a lot.  These things use a lot of power for a tiny little computer, and many of the wall warts out there can't do it.  See, not only does it need several hundred milliamps, it needs it highly regulated.  I had one wall wart that would supply 1.5 amps, but the regulation wasn't good enough, and when the ethernet came up, it would fail because the voltage dropped a bit before the regulator kicked in and brought it back up.  There was another one that just didn't live up to its rating, and a third that wasn't filtered well enough.  But, like I said, I have a bunch of them and a couple of them worked just fine; they had enough capacity and were properly regulated and filtered.  See, wall warts are mostly made to charge batteries, not run little power-hungry computers.  I'll probably be looking into power supplies for these little things in the future.

But I got it to work.  When you try it, look for 'Raspberry Pi headless' on bing or google and there are a few thousand examples of how to proceed.  None of the examples I found were perfect, but bouncing from one to the other I was able to get it going.

The little device came up, set the time, started all of the processes and just worked.  Now I had to do something with it.  I chose to use python as the language instead of C because I'm getting tired of having to do every little thing necessary and wanted some more sophisticated abilities.  Sure, each of the languages has its strong points, but might as well start somewhere.  Loaded the libraries I thought I needed and wrote a little code to talk to my two air conditioner thermostats.  IT WORKED FIRST TRY.  Well, not exactly first try, I had some syntax errors and such to fix, but it worked.  Then I wanted to be able to schedule things like polling the thermostats every minute, so I loaded another library and it worked.

The Raspberry Pi Script
import sys
import time
import BaseHTTPServer   # openSite() uses it to look up HTTP error descriptions
from apscheduler.scheduler import Scheduler
import urllib2

NThermoUrl = "HTTP://192.168.0.202"
SThermoUrl = "HTTP://192.168.0.203"

NthermoStatus = []
SthermoStatus = []

def openSite(Url):
        try:
                webHandle = urllib2.urlopen(Url)
        except urllib2.HTTPError, e:
                errorDesc = BaseHTTPServer.BaseHTTPRequestHandler.responses[e.code][0]
                print "Cannot retrieve URL: " + str(e.code) + ": " + errorDesc
                sys.exit(1);
        except urllib2.URLError, e:
                print "cannot retrieve URL: " + e.reason[1]
        except:
                print "Cannot retrieve URL: Unknown error"
                sys.exit (1)
        return webHandle

def getThermoStatus(whichOne):
        if whichOne == "North":
                website = openSite(NThermoUrl + "/status")
        else:
                website = openSite(SThermoUrl + "/status")
        # now read the status that came back from it
        websiteHtml = website.read()
        # After getting the status from the little web server on
        # the arduino thermostat, strip off the trailing cr,lf
        # and separate the values into a list that can
        # be used to tell what is going on
        return  websiteHtml.rstrip().split(",")

def ThermostatStatus():
        print(time.strftime("%A, %B %d at %H:%M"))
        NThermoStatus = getThermoStatus("North")
        print "North reports: " + str(NThermoStatus)

        SThermoStatus = getThermoStatus("South")
        print "South reports: " + str(SThermoStatus)

        print

sched = Scheduler()
sched.start()
sched.add_interval_job(ThermostatStatus, minutes=1)

ThermostatStatus()
while 1:
        time.sleep(1)


Frankly, I'm impressed.  Here's a little bitty board that can be made to do the things I need to do around the house.  Sure, it needs the support of intelligent devices out there to do the actual work, but this little guy can coordinate activity and report things to me over various devices.

I still have a lot of work to do before it can take over the house: I have to get an XBee working on it, experiment with a web server, figure out where to put it and what to put it in, that kind of thing.  I've gotten used to my house controller having flashing lights to tell me things are happening and a special light to let me know something is wrong, so those things have to be considered.  I'll have to experiment with my cloud data store at Xively; I like having my data available there.  It took a long time and a lot of experimentation to get the current house controller working, and this will be roughly the same.

However, this little guy has a real operating system on it and can truly multitask.  This could be fun.

Raspberry Pi and XBee

The continuation of this post is here <link>.

So, since I have the little Pi working with the internet, how about getting an XBee hooked to it and start receiving data from my home network?  Well, it isn't hard to hook an XBee up to a Pi; the little device uses 3.3 volts and has an output pin, so I hooked one up.  Four wires later, I had the XBee attached, powered, and ready to go.  Of course, the serial port (the only one on the Pi) is already used for a console, but there are (again) a thousand web sites out there that show how to change the init files to allow the serial port to be used directly, so I used one of the examples and freed the serial port for use.

A quick aside here.  In looking around the web, I haven't found a single instance where someone successfully hooked a Pi up to an XBee and did something real with it.  This didn't bode well for me; I want this thing to monitor a network of a dozen XBees and keep the devices they're attached to under control, not print pretty messages on the screen.  It actually looks like a lot of people buy the Pi because it's cool and don't do much besides bring up a little web server on it.

But, the first thing is to see how I can read data from the XBee on the Raspberry Pi.  So, prowling around I found an XBee library for Python and it seems to support everything I need as well as some things I might want to use someday.  I installed it on the Pi and wrote a little test program to see what happens:

The Python Script
#!/usr/bin/python
import serial
from xbee import ZigBee

serial_port = serial.Serial('/dev/ttyAMA0', 9600)

zb = ZigBee(serial_port)

while True:
    try:
        data = zb.wait_read_frame() #Get data for later use
        #print data # for debugging only
        print data['rf_data']

    except KeyboardInterrupt:
        break

serial_port.close()

This actually wasn't as simple as it looks.  This tiny little piece of code took almost all day to get to work and gave me a lot of trouble.  Sure, it looks easy, but that's after failing over and over again.  Most of my problems came from not having a clear example to work from and a horrible dearth of documentation.  As usual, libraries provided by community efforts have minimal documentation, but this was an example of even less than that.  There were also problems with not having any description of the proper XBee settings.  For example, the author wrote that the XBee had to be in API mode, so following the Andrew Rapp library example, I set it to API mode 2.  I got nothing.  That led me to look at the code for the library, and it turns out this library uses API mode 1 by default.  Switching to API mode 1 allowed me to actually see data.  And there were a number of problems like that.
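
(A side note: if you'd rather leave the radio configured for API mode 2, the library looks like it will accept an 'escaped' argument when you create the object.  I haven't tried that myself, so treat this as an untested sketch rather than gospel:)

#!/usr/bin/python
# untested sketch: tell the library to expect API mode 2 (escaped) frames
import serial
from xbee import ZigBee

serial_port = serial.Serial('/dev/ttyAMA0', 9600)
zb = ZigBee(serial_port, escaped=True)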

Another one that drove me nuts for an hour or so was that he (the author) sees the XBees as either Series 1, which uses different packets, or Series 2, which is ZigBee based (mostly), but requires you to use special classes for each of them.  So, if you import like this:

from xbee import XBee

It won't work at all using a Series 2 XBee running the various ZigBee software.  It will work with the older software.  That's why the example above has:

from xbee import ZigBee

The author didn't understand that the series 2 devices can run multiple styles and versions of the protocols.

So, I can read XBee packets and pick the data out of them to do something with, but now along comes another problem.  All he supplies is a blocking read.  That's about useless for any project that needs to be able to catch packets and do something else at the same time.  He does have an asynchronous read in the code, but it uses a callback running in a separate thread to handle the XBee interaction, and since the data arrives in that other thread, it takes queues or locks to pass it back to the main code.

Something like that takes all the usefulness out of an incredibly simple device like the XBee.  If you have to have multiple threads, queues, locks, and such to read a message, no one is going to bother.

I'm still looking at it, but this particular path doesn't show much promise so far.

Raspberry Pi and XBee Asynchronous Operation

In my last post <link> I whined a bit about using the Python XBee library on the Pi; it's complicated, but I made it work.  This is not actually too hard to understand, but it is more complicated than just checking to see if something is out there and then using it.

What I did was use the XBee library's asynchronous call, which creates another thread, to receive the message, then put the message on a queue.  In the original thread, I check the queue and pull off the message, dismantle it and use it.  Actually, this isn't a terrible way to do it, just very different from what I've done before.

Then, I realized that waiting on the queue to have something in it was silly.  I simply checked the queue to see if something was there and if not continued.  If there is something there, I call a routine to handle it.  To test it, I added the Python scheduler to the code and set up a timer so that, every 30 seconds, I send a status request message out to my network, and see if the answer comes back.  This is a feature of my network, not a general thing.  What I did was enable the current Arduino house controller to respond to a very simple message by sending the status of a few devices as a broadcast.  This allows me to have devices that send the query and look at the response to see what's going on.  It also helps by being a source of messages that I could look for.

The code got pretty complex, but I tried to comment the heck out of it so you can see what is going on:

The Raspberry Pi Script
#! /usr/bin/python
# This is an example of asynchronous receive.
# What it actually does is spin off a new thread
# to do the XBee receive.  This way, the main
# code can go do something else and hand off waiting
# for the XBee messages to come in to another
# thread.

from xbee import ZigBee
from apscheduler.scheduler import Scheduler
import time
import serial
import Queue

# on the Raspberry Pi the serial port is ttyAMA0
PORT = '/dev/ttyAMA0'
BAUD_RATE = 9600

# The XBee addresses I'm dealing with
BROADCAST = '\x00\x00\x00\x00\x00\x00\xff\xff'
UNKNOWN = '\xff\xfe' # This is the 'I don't know' 16 bit address


packets = Queue.Queue()

# Open serial port
ser = serial.Serial(PORT, BAUD_RATE)

# this is a call back function.  When a message
# comes in this function will get the data
def message_received(data):
        packets.put(data, block=False)
        print 'gotta packet'

def sendPacket(where, what):
        # I'm only going to send the absolute minimum.
        zb.send('tx',
                dest_addr_long = where,
                # I always use the 'unknown' value for this
                # it's too much trouble to keep track of two
                # addresses for the device
                dest_addr = UNKNOWN,
                data = what)

# In my house network sending a '?\r' (question mark, carriage
# return) causes the controller to send a packet with some status
# information in it as a broadcast.  As a test, I'll send it and
# the receive above should catch the response.
def sendQueryPacket():
        # I'm broadcasting this message only
        # because it makes it easier for a monitoring
        # XBee to see the packet.  This is a test
        # module, remember?
        print 'sending query packet'
        sendPacket(BROADCAST, '?\r')

# OK, another thread has caught the packet from the XBee network,
# put it on a queue, this process has taken it off the queue and
# passed it to this routine, now we can take it apart and see
# what is going on ... whew!
def handlePacket(data):
        print 'In handlePacket: ',
        print data['id'],
        if data['id'] == 'tx_status':
                print data['deliver_status'].encode('hex')
        elif data['id'] == 'rx':
                print data['rf_data']
        else:
                print 'Unimplemented frame type'


# Create XBee library API object, which spawns a new thread
zb = ZigBee(ser, callback=message_received)

sendsched = Scheduler()
sendsched.start()

# every 30 seconds send a house query packet to the XBee network
sendsched.add_interval_job(sendQueryPacket, seconds=30)

# Do other stuff in the main thread
while True:
        try:
                time.sleep(0.1)
                if packets.qsize() > 0:
                        # got a packet from recv thread
                        # See, the receive thread gets them
                        # puts them on a queue and here is
                        # where I pick them off to use
                        newPacket = packets.get_nowait()
                        # now go dismantle the packet
                        # and use it.
                        handlePacket(newPacket)
        except KeyboardInterrupt:
                break

# halt() must be called before closing the serial
# port in order to ensure proper thread shutdown
zb.halt()
ser.close()


Yes, this also means I can both send and receive XBee messages.  So, I have a scheduled XBee message that can get the response, a separate thread to receive XBee messages without loading down the main thread, and the ability to get the data out of the message and use it.  Now, I need to combine that with a web server that can display the results.

One thing that needs to be understood about the Python XBee library: it only returns good packets.  If there is a collision and the checksum doesn't work, you don't get the message.  If the message is fragmented by noise, you don't get the message.  If anything goes wrong, you don't get the message.  That makes debugging a network problem almost impossible since you can't see anything when things go bad.  So, keep an Arduino around to look at stuff or build a sniffer like I did <link> to follow the traffic.  XCTU can help, but remember, if the message is sent to a specific address, it's invisible to a different XBee, and XBees don't have a promiscuous mode like Ethernet chips do.

I'm going to look at web servers now.

Edit: It took exactly 5 minutes to get a web server running.  I'm starting to like this board.

Raspberry Pi and Xively Part 1

Frankly, I've been disappointed in Xively since it changed from the old Cosm.  Remember, before that it was Pachube.  Once upon a time it was a group of ambitious programmers that were working on a dream; now it's something else, I'm not sure what.  However, I haven't given up on them.  They still haven't completed things like their graphing API that everyone wants to use, but they have expanded their default graphs so I can select the period I want to see.  So, maybe there's still some interest among their developers.  However, I did ask them to come up with a way to port the legacy feeds to their new development system, and they put me in contact with a developer that was supposed to work on it.  That was over a month ago, and I haven't heard from him since.  The single most annoying thing was their removal of the forum they had.  Now, if you have a question you're supposed to go to StackOverflow, and those folks can be rude and condescending.  Sure, I'm not the sharpest tool in the shed, but there are people on there that are just plain jerks.  I absolutely hate asking questions there.

Anyway, I started playing with Python and their site.  Since I might as well try their new interface and various capabilities at the same time, I followed their example and used their library.  The first thing I ran into is that their python library is considered a preview, so I had to use the '--pre' option to load the darn thing.  Next, they want me to use a virtual environment to develop in; since I don't want to learn anything unnecessary at this point, I just ignored that part.  Using the example shown at http://gnublade.github.io/xively-python/tutorials/raspberrypi.html, I put together a module and tried it.  It worked.

However, do not under any circumstances name your source module the same as the library you're trying out.  I named my source module 'xively.py' and it took me a couple of hours to figure out what the heck the problem was.  When I changed it to 'testxively.py', I could take out the hundred or so debug statements that I had scattered all over the place and get on with trying it out.

Right now, I'm trying to combine gathering data from my XBee network with the Xively example to create a module that can update Xively with real data.  In my pursuit, I had to learn about logging with Python; seems the scheduler module uses system logging if it encounters an error.  The Xively example only updates one data item, so that means I have to learn how to update a whole series of them (ain't no example for that).
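
Spoiler from what eventually became Part 2: updating several datastreams at once turned out to be a matter of building a list of Datastream objects and handing it to the feed.  Something like this fragment; the key, feed id, and values here are placeholders, and the working version is in the full script in the next post:

import datetime
import xively

API_KEY = 'put your key here'
FEED_ID = 'and your feed id here'

api = xively.XivelyAPIClient(API_KEY)
feed = api.feeds.get(FEED_ID)

# update two datastreams in one shot instead of one at a time
now = datetime.datetime.utcnow()
feed.datastreams = [
        xively.Datastream(id='outside_temp', current_value=109, at=now),
        xively.Datastream(id='power_usage', current_value=636, at=now)
        ]
feed.update()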

So, maybe you understand why I labeled this entry as 'Part 1'; I'm not sure how many parts there will be, but this is getting really confusing.


Raspberry Pi and Xively Part 2

Yesterday I was annoyed at Xively after spending hours trying to get their library to work with the limited documentation; well, if I had written this earlier today, I would have been totally livid.  I overcame my problem with updating more than one datastream (Xively term) at a time, but it certainly wasn't because their documentation was clear.  Far from it: I found an example of updating a couple of items in a download of their library, BUT IT HAD COMPILE ERRORS.

We've all seen this.  Examples that need libraries they don't mention, code fragments that don't make sense, full blown examples that only illustrate a trivial case, and the ultimate insult, examples that don't compile.  I chased one of those for about an hour before I found the solution.

I did get a python script to work updating more than one datastream to a brand new feed I created on Xively.  Yes, you have to learn a new set of terms to use this stuff.  Armed with this tiny bit of success, I put in the code to gather XBee packets and update some global variables, then push them up to Xively and ran it for an hour or so to watch what happened.  It worked pretty well.  I don't have code in it to gather all the data I want, or a way to store it such that my new web server can present it, but I've got a nice start.

Let's talk about some of the things I've discovered.  The python scheduler works really well and does all the stuff I want.  I can set a routine to run any time I want, even to the point of six months from now at noon.  It's not quite as good at tiny periods, but I don't need that right now.  The Xively library is actually pretty extensive, but the documentation is terrible.  They don't even have a list of classes that are available; they rely on samples that suck instead.  The XBee library doesn't work the way a person coming from an Arduino experience expects it to; it forces you to use threads, which makes passing data around harder than expected.  To make up for this, there is a cool library that can queue things up for you so another thread can grab them.  This little queue is really nice and could be used in a lot of different ways.
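
To give you an idea of what I mean about the scheduler, here's a little sketch using the same old apscheduler calls my scripts use; the job function and the dates are made up for illustration:

import time
from datetime import datetime
from apscheduler.scheduler import Scheduler

def doSomething():
        print 'the scheduler called me'

sched = Scheduler()
sched.start()

# run every 30 seconds, just like the query packet in my scripts
sched.add_interval_job(doSomething, seconds=30)
# run once at a specific date and time (say, six months from now at noon)
sched.add_date_job(doSomething, datetime(2014, 2, 1, 12, 0, 0))
# or cron style: noon on the first of every month
sched.add_cron_job(doSomething, day=1, hour=12)

# the scheduler runs in its own threads, so keep the main thread alive
while True:
        time.sleep(1)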

The HUGE advantage is the Raspberry Pi's handling of the internet.  It just works.  No long delays while the ethernet chip makes up its mind to work, you have tons of connections to play with, and processing the returned data is a snap since python has an enormous string handling library.  There's so much that can be done there it's amazing.

Here's what I have so far:

The Python Script
#! /usr/bin/python
# This is an example of asynchronous receive.
# What it actually does is spin off a new thread
# to do the XBee receive.  This way, the main
# code can go do something else and hand off waiting
# for the XBee messages to come in to another
# thread.

from xbee import ZigBee
from apscheduler.scheduler import Scheduler
import logging
import datetime
import time
import serial
import Queue
import xively

#-------------------------------------------------
# on the Raspberry Pi the serial port is ttyAMA0
XBEEPORT = '/dev/ttyAMA0'
XBEEBAUD_RATE = 9600

# The XBee addresses I'm dealing with
BROADCAST = '\x00\x00\x00\x00\x00\x00\xff\xff'
UNKNOWN = '\xff\xfe' # This is the 'I don't know' 16 bit address

# The Xively feed id and API key that is needed
FEED_ID = 'putsomethinghere'
API_KEY = 'and here to'

# Global items that I want to keep track of
CurrentPower = 0
DayMaxPower = 0
DayMinPower = 50000
CurrentOutTemp = 0
DayOutMaxTemp = -50
DayOutMinTemp = 200

#-------------------------------------------------
logging.basicConfig()

#------------ XBee Stuff ------------------------
packets = Queue.Queue() # When I get a packet, I put it on here

# Open serial port for use by the XBee
ser = serial.Serial(XBEEPORT, XBEEBAUD_RATE)

# this is a call back function.  When a message
# comes in this function will get the data
def message_received(data):
        packets.put(data, block=False)
        #print 'gotta packet'

def sendPacket(where, what):
        # I'm only going to send the absolute minimum.
        zb.send('tx',
                dest_addr_long = where,
                # I always use the 'unknown' value for this
                # it's too much trouble to keep track of two
                # addresses for the device
                dest_addr = UNKNOWN,
                data = what)

# In my house network sending a '?\r' (question mark, carriage
# return) causes the controller to send a packet with some status
# information in it as a broadcast.  As a test, I'll send it and
# the receive above should catch the response.
def sendQueryPacket():
        # I'm broadcasting this message only
        # because it makes it easier for a monitoring
        # XBee to see the packet.  This is a test
        # module, remember?
        #print 'sending query packet'
        sendPacket(BROADCAST, '?\r')

# OK, another thread has caught the packet from the XBee network,
# put it on a queue, this process has taken it off the queue and
# passed it to this routine, now we can take it apart and see
# what is going on ... whew!
def handlePacket(data):
        global CurrentPower, DayMaxPower, DayMinPower
        global CurrentOutTemp, DayOutMaxTemp, DayOutMinTemp

        #print data # for debugging so you can see things
        # this packet is returned every time you do a transmit
        # (it can be configured out), to tell you that the XBee
        # actually sent the darn thing
        if data['id'] == 'tx_status':
                if ord(data['deliver_status']) != 0:
                        print 'Transmit error = ',
                        print data['deliver_status'].encode('hex')
        # The receive packet is the workhorse, all the good stuff
        # happens with this packet.
        elif data['id'] == 'rx':
                rxList = data['rf_data'].split(',')
                if rxList[0] == 'Status':
                        # remember, it's sent as a string by the XBees
                        tmp = int(rxList[1]) # index 1 is current power
                        if tmp > 0:  # Things can happen to cause this
                                # and I don't want to record a zero
                                CurrentPower = tmp
                                DayMaxPower = max(DayMaxPower,tmp)
                                DayMinPower = min(DayMinPower,tmp)
                                tmp = int(rxList[3]) # index 3 is outside temp
                                CurrentOutTemp = tmp
                                DayOutMaxTemp = max(DayOutMaxTemp, tmp)
                                DayOutMinTemp = min(DayOutMinTemp, tmp)
        else:
                print 'Unimplemented XBee frame type'

#-------------------------------------------------

# This little status routine gets run by scheduler
# every 15 seconds
def printHouseData():
        print('Power Data: Current %s, Min %s, Max %s'
                %(CurrentPower, DayMinPower, DayMaxPower))
        print('Outside Temp: Current %s, Min %s, Max %s'
                %(CurrentOutTemp, DayOutMinTemp, DayOutMaxTemp))
        print

# This is where the update to Xively happens
def updateXively():
        print("Updating Xively with value: %s and %s"%(CurrentPower, CurrentOutTemp))
        print
        now = datetime.datetime.utcnow()
        feed.datastreams = [
                xively.Datastream(id='outside_temp', current_value=CurrentOutTemp, at=now),
                xively.Datastream(id='power_usage', current_value=CurrentPower, at=now)
                ]
        feed.update()

#------------------Stuff I schedule to happen -----
sendsched = Scheduler()
sendsched.start()

# every 30 seconds send a house query packet to the XBee network
sendsched.add_interval_job(sendQueryPacket, seconds=30)
# every 15 seconds print the most current power info
sendsched.add_interval_job(printHouseData, seconds=15)
# every minute update the data store on Xively
sendsched.add_interval_job(updateXively, seconds=60)

# Create XBee library API object, which spawns a new thread
zb = ZigBee(ser, callback=message_received)

# Initialize api client
api = xively.XivelyAPIClient(API_KEY)
# and get my feed
feed = api.feeds.get(FEED_ID)

#Do other stuff in the main thread
while True:
        try:
                time.sleep(0.1)
                if packets.qsize() > 0:
                        # got a packet from recv thread
                        # See, the receive thread gets them
                        # puts them on a queue and here is
                        # where I pick them off to use
                        newPacket = packets.get_nowait()
                        # now go dismantle the packet
                        # and use it.
                        handlePacket(newPacket)
        except KeyboardInterrupt:
                break

# halt() must be called before closing the serial
# port in order to ensure proper thread shutdown
zb.halt()
ser.close()


This gives the following output on the Raspberry Pi:

Console Output
pi@deserthome:~/src$ python powertoxively.py
Power Data: Current 0, Min 50000, Max 0
Outside Temp: Current 0, Min 200, Max -50

Power Data: Current 639, Min 639, Max 639
Outside Temp: Current 109, Min 109, Max 109

Power Data: Current 639, Min 639, Max 639
Outside Temp: Current 109, Min 109, Max 109

Power Data: Current 636, Min 636, Max 639
Outside Temp: Current 109, Min 109, Max 109

 Updating Xively with value: 636 and 109

Power Data: Current 636, Min 636, Max 639
Outside Temp: Current 109, Min 109, Max 109

Power Data: Current 638, Min 636, Max 639
Outside Temp: Current 109, Min 109, Max 109

Power Data: Current 638, Min 636, Max 639
Outside Temp: Current 109, Min 109, Max 109

Not a huge bunch of impressive stuff, but it illustrates the point.  I put as many comments in there as I could, both to help people understand and to nudge my own memory when I come back to this after doing something else for a while.

What I do is create a thread to catch XBee packets and queue them up to be handled.  In the main thread, I grab the packets off the queue and take them apart, saving a couple of important items in global variables.  I have an event scheduled to print the value of the global variables every 15 seconds and another event scheduled to run every minute and send updates to Xively.  Yes, this is an odd way of doing it, but it's what I already do on my current house controller.  I've found that scheduling things to happen is a much simpler way of handling tasks than anything else I've tried.

There's no internet handling in this module at all.  I will do that next.  As I mentioned before, I have two thermostats that are hooked to my local lan that can take commands and respond; I'll put the code in to query them every so often and save the results.  Since the internet handling in python is so robust, that shouldn't be a problem at all.

The big problem is deciding how to store the house data in such a way that the web server I have running on the Pi can get at it.  Everyone uses a database, but I'm not sure a big hunk of code like that is reasonable for a task like this.  Gotta think about it and experiment a bit before I go that route.

Perseverance, or maybe bull-headedness, has gotten me this far and I truly hope other people that are thinking about doing something like this stumble across this site.  It just might save them some of the headaches I've had.

Raspberry Pi and Xively Part 3

I ran the python script I posted previously <link> overnight.  Here are the graphs I picked up from Xively:


The little downward spikes in the temperature chart are an artifact of the way I measure outside temperature <link>.  I have techniques to correct this, but I didn't bother for this experiment.  The power chart is real and corresponds to the stuff that is actually running in the house.

It was interesting working out how to run the script in the background with the terminal disassociated, but like everything else, it turned out to be easy once I figured it out.  Seems the python interpreter buffers its output when it isn't attached to a terminal, so you can't tell it's working.  That is, unless you discover the secret parameter '-u' that makes it flush the output.  The command turned out to be:

nohup python -u scriptname.py > logfile &

Then you can 'tail -f logfile' to see what's happening.  None of this is unusual for *nix systems; it's just annoying to have to do a web search every 2.7 seconds to find out something that isn't obvious.  I'm sure it will get better over time ... maybe.

So, I have a way of grabbing data and logging it to the cloud now, and I'm testing ideas for temporary storage of the various data for web presentation.  Feels like some progress.

Raspberry Pi and Sqlite3

If you got here through a search engine looking for a tutorial on Sqlite3 on the Raspberry Pi, that's not what this is.  I've been setting up a new Raspberry Pi to control my house and decided to experiment with a data base to store current data in so multiple processes can get at it.  After reading way too much information on various data base managers, I decided to test Sqlite3.  It seems like a reasonable choice for a really small system since it doesn't require a manager process and isn't split across multiple controllers.  Since the interface language is SQL, whatever I wind up doing can be ported to a larger data base tool when I need to.

So, after installing Sqlite3 on my Pi, I spent about two hours cobbling together enough code to try it out.  I chose to monitor my thermostats <link>, since they are ethernet enabled and respond directly to requests, then to store their current state in the data base.  The idea is that I can put up a web page that shows the status of the two thermostats and eventually add controls to change the settings.  Once again, there was a ton of information on the web about how to use Sqlite3, and things went pretty smoothly.

I like this little data base manager.  There's not a huge setup; actually, there wasn't much setup at all.  Just install it, and use it.  As usual, here's the code I came up with:


The Python Script
import sys
import time
import BaseHTTPServer   # openSite() uses it to look up HTTP error descriptions
from apscheduler.scheduler import Scheduler
import urllib2
import sqlite3
import logging


logging.basicConfig()

def openSite(Url):
        try:
                webHandle = urllib2.urlopen(Url)
        except urllib2.HTTPError, e:
                errorDesc = BaseHTTPServer.BaseHTTPRequestHandler.responses[e.code][0]
                print "Cannot retrieve URL: " + str(e.code) + ": " + errorDesc
                sys.exit(1);
        except urllib2.URLError, e:
                print "cannot retrieve URL: " + e.reason[1]
        except:
                print "Cannot retrieve URL: Unknown error"
                sys.exit (1)
        return webHandle

def getThermoStatus(whichOne):
        website = openSite("HTTP://" + whichOne[0] + "/status")
        # now read the status that came back from it
        websiteHtml = website.read()
        # After getting the status from the little web server on
        # the arduino thermostat, strip off the trailing cr,lf
        # and separate the values into a list that can
        # be used to tell what is going on
        return  websiteHtml.rstrip().split(",")

def ThermostatStatus():
        # The scheduler will run this as a separate thread
        # so I have to open and close the database within
        # this routine

        print(time.strftime("%A, %B %d at %H:%M:%S"))
        # open the database and set up the cursor (I don't have a
        # clue why a cursor is needed)
        dbconn = sqlite3.connect('/home/pi/database/desert-home')
        c = dbconn.cursor()
        for whichOne in ['North', 'South']:
                c.execute("select address from thermostats "
                        "where location=?; ", (whichOne,))
                thermoIp = c.fetchone()
                status = getThermoStatus(thermoIp)
                print whichOne + " reports: " + str(status)
                c.execute("update thermostats set 'temp-reading' = ?, "
                        "status = ?, "
                        "'s-temp' = ?, "
                        "'s-mode' = ?, "
                        "'s-fan' = ?, "
                        "peak = ?,"
                        "utime = ?"
                        "where location = ?;",
                        (status[0],status[1],
                        status[2],status[3],
                        status[4],status[5],
                        time.strftime("%A, %B %d at %H:%M:%S"),
                        whichOne))
                dbconn.commit()

        print
        dbconn.close()

# I like things that are scheduled
# that way I don't have to worry about them
# being called, because they take care of themselves
sched = Scheduler()
sched.start()

# schedule reading the thermostats for every minute
sched.add_interval_job(ThermostatStatus, minutes=1)

# This is a priming read to show immediate results
ThermostatStatus()
# all the real work is done by the scheduler
# so the main code can just hang
while 1:
        try:
                time.sleep(1)
        except KeyboardInterrupt:
                break

# I want to look into using atexit for this
sched.shutdown(wait=False)


This works really well to interrogate each thermostat and put the results into a data base.  While I was working on it, I decided to get a little bit fancy and actually put the location and IP addresses of the two thermostats in the data base, retrieving and using them in the communication process.  Now that I know how, I'll have to think about doing the same thing when I adapt this to talk to my little XBee devices.
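
I didn't show setting up the table itself, so here's a minimal sketch of how a table like that could be created and seeded.  The column names are taken from the select and update statements in the script above; the IP addresses are just placeholders, not my real ones:

import sqlite3

# One-time setup sketch: build a thermostats table whose columns match
# the statements used in the monitoring script above.
dbconn = sqlite3.connect('/home/pi/database/desert-home')
c = dbconn.cursor()
c.execute("create table if not exists thermostats "
        "(location text primary key, address text, "
        "'temp-reading' text, status text, "
        "'s-temp' text, 's-mode' text, 's-fan' text, "
        "peak text, utime text);")
# Seed one row per thermostat; the addresses below are placeholders.
c.execute("insert or ignore into thermostats (location, address) "
        "values ('North', '192.168.0.50');")
c.execute("insert or ignore into thermostats (location, address) "
        "values ('South', '192.168.0.51');")
dbconn.commit()
dbconn.close()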

Once again, I set up the scheduler to cause things to happen.  This led to an interesting discovery: the scheduler starts a new thread to do the work.  I noticed this when Sqlite3 refused a connection because I had opened the data base in one thread and then tried to use it in another.  Rather than being an inconvenience, this is actually great.  I managed to prove that I can pass data between independent processes through the data base.  This will make the eventual loading of data by a web server much easier.

Since that was easy, I decided to put the actual time I last talked to a particular device in the data base as well.  Later, I can look at that time to see if the devices are having a problem; a rough sketch of how such a check might work is below.  Knowing whether a device is still doing well is a constant concern when you have a bunch of them working independently around the house.
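
This isn't something I've built yet, just an illustration of the idea.  Since the utime column holds a formatted string that's nice to read but hard to compare, the sketch below assumes a hypothetical 'utime-epoch' column that gets filled with time.time() whenever a device reports in, and flags anything that hasn't been heard from in five minutes:

import time
import sqlite3

STALE_SECONDS = 300  # five minutes without a report counts as trouble

# Assumes a hypothetical 'utime-epoch' column holding time.time() values.
dbconn = sqlite3.connect('/home/pi/database/desert-home')
c = dbconn.cursor()
for location, lastReport in c.execute(
        "select location, \"utime-epoch\" from thermostats;"):
        if lastReport is None or time.time() - lastReport > STALE_SECONDS:
                print location + " has not reported in lately"
dbconn.close()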

Now, I need to decide if I should work on a web page to display the status of the thermostats or start combining the various pieces of code I've constructed together to monitor the house.  But right this second it started to sprinkle outside.  Since rain is so rare here, me and the dog are going for a walk.

Raspberry Pi and SQLite3 and a Web Server

Remember when I thought it would be easy to get something out of an SQLite3 data base and put it on a screen with a web server?  I don't care how old I get, I'm still way too naive.  This turned out to be a royal pain in the bottom.  It started when I went looking for a language to do it in.  Since I hadn't ever used php, I decided to try it out.  It took me almost three hours of research to figure out what I needed to do to get it to work with the web server, and then another couple of hours to get php to work with SQLite3.  Yes, there were examples on the web; most of them were incoherent, and most of the rest of them were wrong.

What happens is that all the authors assume you know something about what you're doing.  Well, that wasn't the case.  They also assume you know about permissions, file locations, and such things for ALL the variations of unix out there.  That wasn't the case either.  They try desperately to give enough detail, but too often leave something crucial out that kills the effort.  To add another level of complication, the unix variants change pretty rapidly, so if you hit a blog post over a few months old, be wary of it; it may not work anymore.  If you don't give up though, you'll find some lonely little post in an obscure forum that has it all reduced down to a simple one line command that actually works.

So, after I got enough stuff installed to actually write code, I had to learn enough php to do something.  I put together a tiny bit of code to open the database and immediately had my first failure.  Did you know that a php module absolutely has to have a file name that ends with .php or it won't work?  I do ... now.  After chasing down that problem, I started adding lines and fixing misunderstandings and stepped through about 8 hours of failure until I actually got up a web page with data I had taken from my thermostats in python, put in a database using sql, retrieved in php, and put up on a web page in html.

It may not look like much, but I'm sure proud of it:


The code to get this out of the data base I described in the previous post looks like this:

The php Module
<?php
# This is about the minimal debugging
# I could find that was easy to use
ini_set('display_errors', 'On');
error_reporting(E_ALL|E_STRICT);

# This is the database open call, it returns a
# database object that has to be used going forward
$db = new SQLite3('/home/pi/database/desert-home');

# you do a query statement that returns a strange
# SQLite3Result object that you have to use
# in the fetch statement below
$result = ($db->query('SELECT * FROM thermostats;'));
#print_r($result); # I wanted to see what it actually was

# The fetch call will return a boolean False if it hits
# the end, so why not use it in a while loop?
#
# The fetchArray() call can return an 'associative' array,
# which means it gives you back an array of ordered
# pairs with name, value.  This is cool because it means
# I can access the various values by name. Each call to
# fetchArray returns one row from the thermostats table
while ($res = $result->fetchArray(SQLITE3_ASSOC)){
        #var_dump($res);
        print ("<strong>" . $res["location"] ." thermostat,</strong><br />");
        print ("Currently: " . $res["status"] . " <br \>");
        print ("Temperature: " . $res["temp-reading"] . " <br \>");
        print ("Settings are: <br \>");
        print ("Mode: " . $res["s-mode"] . " <br \>");
        print ("Temperature: " . $res["s-temp"] . " <br \>");
        print ("Fan: " . $res["s-fan"] . " <br \>");
        print ("<br>");
}
$db->close(); # I opened it, I should close it
?>


Php is an odd language, but it has some real strengths.  At some point I may post about the ones that I noticed and fell in love with, but there are several thousand articles out there that describe its strengths and weaknesses so I'd just be adding chaff if I went into it too much.

So, now I have one device being read, recorded, and monitored.  There are a lot more devices, and I still have to think about controlling them.

I'm starting to understand why more people aren't doing things like this around their own home.

Raspberry Pi, XBee, SQLite3, and a Web Page

It's been raining on and off, so I thought this would be a good chance (excuse) to experiment with reading XBee packets (in python), and saving them in a database.  It would be cool to also send the data out to Xively to be saved, and maybe display them on a web page.  So, I sat down and did it.

Granted, it was a pain switching between python, SQL, and php, but after a while, you kind of get used to it.  I took the XBee example I previously posted that does asynchronous reads of XBee network packets and modified it to save some of the data items to my SQLite3 database.  Then I modified the php web page from a couple of days ago to get the items out of the database and display them.  Things went much faster and easier since I didn't have to install anything, just add things here and there.

That's the way it usually goes; getting started is over 50% of the work.  Then the next 40% moves along pretty well with the last 10% taking forever.

Here's the updated XBee python code to catch stuff and save it in the database.  If you look closely, I started grabbing the packets from the Acid Pump and saving them.  That's my first attempt at grabbing a particular device and saving it.  The status packet I've been grabbing is forwarded by my controller <link> that I've been using for a couple of years and hope to replace.  Next, I'll have to work on the packets being sent by my XBee thermostat <link> that is sitting in a housing on a fence post outside.

The python Script
#! /usr/bin/python
# This is the actual house Monitor Module
#
# I take the techniques tried in other modules and incorporate them
# to gather data around the house and save it in a data base.  The
# data base can be read for presentation in a web page and also
# forwarded to Xively for cloud storage and graphing.
#
# This particular version only reads the XBee network, it
# doesn't go to my internal lan to gather data, that's next
# on my list.
#
# For the XBee network, I fork off a new process
# to do the XBee receive.  This way, the main
# code can go do something else and hand off waiting
# for the XBee messages to come in to another
# process.

from xbee import ZigBee
from apscheduler.scheduler import Scheduler
import logging
import datetime
import time
import serial
import Queue
import xively
import sqlite3


#-------------------------------------------------
# on the Raspberry Pi the serial port is ttyAMA0
XBEEPORT = '/dev/ttyAMA0'
XBEEBAUD_RATE = 9600

# The XBee addresses I'm dealing with
BROADCAST = '\x00\x00\x00\x00\x00\x00\xff\xff'
UNKNOWN = '\xff\xfe' # This is the 'I don't know' 16 bit address

# The Xively feed id and API key that is needed
FEED_ID = '1428598370'
API_KEY = 'put_your_xively_key_here'

# Global items that I want to keep track of
CurrentPower = 0
DayMaxPower = 0
DayMinPower = 50000
CurrentOutTemp = 0
DayOutMaxTemp = -50
DayOutMinTemp = 200

#-------------------------------------------------
logging.basicConfig()

#------------ XBee Stuff ------------------------
packets = Queue.Queue() # When I get a packet, I put it on here

# Open serial port for use by the XBee
ser = serial.Serial(XBEEPORT, XBEEBAUD_RATE)

# this is a call back function.  When a message
# comes in this function will get the data
def message_received(data):
        packets.put(data, block=False)
        #print 'gotta packet'

def sendPacket(where, what):
        # I'm only going to send the absolute minimum.
        zb.send('tx',
                dest_addr_long = where,
                # I always use the 'unknown' value for this
                # it's too much trouble to keep track of two
                # addresses for the device
                dest_addr = UNKNOWN,
                data = what)

# In my house network sending a '?\r' (question mark, carriage
# return) causes the controller to send a packet with some status
# information in it as a broadcast.  As a test, I'll send it and
# the receive above should catch the response.
def sendQueryPacket():
        # I'm broadcasting this message only
        # because it makes it easier for a monitoring
        # XBee to see the packet.  This allows me to monitor
        # some of the traffic with a regular XBee and not
        # load up the network unnecessarily.
        #print 'sending query packet'
        sendPacket(BROADCAST, '?\r')

# OK, another thread has caught the packet from the XBee network,
# put it on a queue, this process has taken it off the queue and
# passed it to this routine, now we can take it apart and see
# what is going on ... whew!
def handlePacket(data):
        global CurrentPower, DayMaxPower, DayMinPower
        global CurrentOutTemp, DayOutMaxTemp, DayOutMinTemp

        #print data # for debugging so you can see things
        # this packet is returned every time you do a transmit
        # (can be configured out), to tell you that the XBee
        # actually sent the darn thing
        if data['id'] == 'tx_status':
                if ord(data['deliver_status']) != 0:
                        print 'Transmit error = ',
                        print data['deliver_status'].encode('hex')
        # The receive packet is the workhorse, all the good stuff
        # happens with this packet.
        elif data['id'] == 'rx':
                rxList = data['rf_data'].split(',')
                if rxList[0] == 'Status':
                        # remember, it's sent as a string by the XBees
                        tmp = int(rxList[1]) # index 1 is current power
                        if tmp > 0:  # Things can happen to cause this
                                # and I don't want to record a zero
                                CurrentPower = tmp
                                DayMaxPower = max(DayMaxPower,tmp)
                                DayMinPower = min(DayMinPower,tmp)
                                tmp = int(rxList[3]) # index 3 is outside temp
                                CurrentOutTemp = tmp
                                DayOutMaxTemp = max(DayOutMaxTemp, tmp)
                                DayOutMinTemp = min(DayOutMinTemp, tmp)
                                dbconn = sqlite3.connect('/home/pi/database/desert-home')
                                c = dbconn.cursor()
                                # do stuff
                                c.execute("update housestatus "
                                        "set curentpower = ?, "
                                        "daymaxpower = ?,"
                                        "dayminpower = ?,"
                                        "currentouttemp = ?,"
                                        "dayoutmaxtemp = ?,"
                                        "dayoutmintemp = ?,"
                                        "utime = ?;",
                                        (CurrentPower, DayMaxPower, DayMinPower,
                                        CurrentOutTemp, DayOutMaxTemp,
                                        DayOutMinTemp,
                                        time.strftime("%A, %B, %d at %H:%M:%S")))
                                dbconn.commit()
                                dbconn.close()


                elif rxList[0] == 'AcidPump':
                        # This is the Acid Pump Status packet
                        # it has 'AcidPump,time_t,status,level,#times_sent_message
                        # I only want to save status, level, and the last
                        # time it reported in to the database for now
                        dbconn = sqlite3.connect('/home/pi/database/desert-home')
                        c = dbconn.cursor()
                        c.execute("update acidpump set status = ?, "
                                "'level' = ?,"
                                "utime = ?;",
                                (rxList[2], rxList[3],
                                time.strftime("%A, %B, %d at %H:%M:%S")))
                        dbconn.commit()
                        dbconn.close()
                else:
                        #print ("can\'t handle " + rxList[0] + ' yet')
                        pass
        else:
                print ('Unimplemented XBee frame type' + data['id'])

#-------------------------------------------------

# This little status routine gets run by scheduler
# every 15 seconds
def printHouseData():
        print('Power Data: Current %s, Min %s, Max %s'
                %(CurrentPower, DayMinPower, DayMaxPower))
        print('Outside Temp: Current %s, Min %s, Max %s'
                %(CurrentOutTemp, DayOutMinTemp, DayOutMaxTemp))
        print

# This is where the update to Xively happens
def updateXively():
        print("Updating Xively with value: %s and %s"%(CurrentPower, CurrentOutTemp))
        print
        # Currently I have to use UTC for the time,
        # there's a bug somewhere in the library or
        # Xively.  It doesn't matter though because
        # it's easy to convert
        now = datetime.datetime.utcnow()
        # open the database
        dbconn = sqlite3.connect('/home/pi/database/desert-home')
        c = dbconn.cursor()
        # Yes, there are better ways to do the stuff below,
        # but I wanted to use a single statement to get it
        # from the data base and update the field going to
        # Xively.  It turns out that it is a rather odd
        # looking statement, but it works.
        feed.datastreams = [
                xively.Datastream(id='outside_temp',
                        current_value = c.execute(
                                "select currentouttemp from housestatus")
                                .fetchone(),
                        at=now),
                xively.Datastream(id='power_usage',
                        current_value = c.execute(
                                "select curentpower from housestatus")
                                .fetchone(),
                        at=now)
                ]
        dbconn.close() # close the data base
        feed.update()  # and update Xively with the latest

#------------------Stuff I schedule to happen -----
sendsched = Scheduler()
sendsched.start()

# every 30 seconds send a house query packet to the XBee network
sendsched.add_interval_job(sendQueryPacket, seconds=30)
# every 15 seconds print the most current power info
sendsched.add_interval_job(printHouseData, seconds=15)
# every minute update the data store on Xively
sendsched.add_interval_job(updateXively, seconds=60)

# Create XBee library API object, which spawns a new thread
zb = ZigBee(ser, callback=message_received)

# Initialize api client
api = xively.XivelyAPIClient(API_KEY)
# and get my feed
feed = api.feeds.get(FEED_ID)

#This is the main thread.  Since most of the real work is done by
# scheduled tasks, this code checks to see if packets have been
# captured and calls the packet decoder
while True:
        try:
                time.sleep(0.1)
                if packets.qsize() > 0:
                        # got a packet from recv thread
                        # See, the receive thread gets them
                        # puts them on a queue and here is
                        # where I pick them off to use
                        newPacket = packets.get_nowait()
                        # now go dismantle the packet
                        # and use it.
                        handlePacket(newPacket)
        except KeyboardInterrupt:
                break

# halt() must be called before closing the serial
# port in order to ensure proper thread shutdown
zb.halt()
ser.close()
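
One thing worth calling out from the script above: fetchone() hands back a one element tuple, not the bare value, so indexing with [0] is what pulls out the single item.  A quick illustration, using the table and column names from the script above:

import sqlite3

dbconn = sqlite3.connect('/home/pi/database/desert-home')
c = dbconn.cursor()
row = c.execute("select curentpower from housestatus").fetchone()
print row       # prints something like (1234,) - a one element tuple
print row[0]    # prints just the value, e.g. 1234
dbconn.close()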


This has my most recent discoveries regarding python and its interface to the SQLite3 database.  It took me a while to figure out how to grab a single item out of the data base, but it's illustrated above.  Of course I updated the web page I'm working on to show off the current power usage and outside temperature that is forwarded by my controller.  Here's the modified web page:

The PHP Web Page
<?php
# This is about the minimal debugging
# I could find that was easy to use
ini_set('display_errors', 'On');
error_reporting(E_ALL|E_STRICT);

# This is the database open call, it returns a
# database object that has to be used going forward
$db = new SQLite3('/home/pi/database/desert-home');
#
# I'm going to get the power usage and outside temperature
# from the database and display it.  I use the querySingle()
# call because I haven't messed with it yet.  I could have gotten
# the entire record and just displayed the parts I needed instead
$power = $db->querySingle("select curentpower from housestatus;");
$outtemp = $db->querySingle("select currentouttemp from housestatus;");
print ("<Strong>Current Power usage: ". $power . " Watts<br>");
print ("Current Outside Temperature: " . $outtemp . "&deg;F</strong><br><br>");

# you do a query statement that returns a strange
# SQLite3Result object that you have to use
# in the fetch statement below
$result = ($db->query('SELECT * FROM thermostats;'));
#print_r($result); # I wanted to see what it actually was
#
# The fetch call will return a boolean False if it hits
# the end, so why not use it in a while loop?
#
# The fetchArray() call can return an 'associative' array,
# which means it gives you back an array of ordered
# pairs with name, value.  This is cool because it means
# I can access the various values by name. Each call to
# fetchArray returns one row from the thermostats table
while ($res = $result->fetchArray(SQLITE3_ASSOC)){
        #var_dump($res);
        print ("<strong>" . $res["location"] ." thermostat,</strong><br />");
        print ("Currently: " . $res["status"] . " <br \>");
        print ("Temperature: " . $res["temp-reading"] . "&deg; <br \>");
        print ("Settings are: <br \>");
        print ("Mode: " . $res["s-mode"] . " <br \>");
        print ("Temperature: " . $res["s-temp"] . "&deg; <br \>");
        print ("Fan: " . $res["s-fan"] . " <br \>");
        print ("<br>");
}
$db->close(); # I opened it, I should close it
?>

There's not a whole lot I can point out that isn't already in the comments above.  Here's the web page as the browser displays it:


Yep, I've discovered how to put up a degree symbol.  Now, I have two processes that run in background all the time updating the database with different items.  That's a bit silly, so I'm going to combine them into one piece of code that goes over my lan and talks to the web devices as well as monitoring the XBee network for updates.  That way I only have one thing to make sure is running all the time.

And yes, the temperature up there is correct.  The rainstorm has lowered the temperature enough that I have all the doors open that don't get rained into.  Time to air out the house.

Raspberry Pi, Now it's actually getting somewhere

So, when I started trying to put things together, it got so darn complex that I changed a ton of stuff to make it somewhat simpler, at least for me.  Here's the current display:


The buttons don't do anything except show an alert to prove they do something, I still have a number of devices that I have to include, and there's a ton of stuff to pretty it up and put in some presets I've grown used to.  I currently have presets for things like A/C off, summer night settings, winter night settings, that kind of thing.

But, isn't it cool ?

All the data is live and real, including the SteelSeries gauges up at the top.  Sorry, it isn't on the internet yet, I have to wait until I get it finished enough to turn off the old controller before I can put it online.  I broke the web interface into two components, one that presents the web page and the other that sucks things out of the data base and hands them off to the web interface.  The code to create a complete screen was just getting out of hand, so I split it apart.

This gave me the opportunity to try out tables and colors and things, as well as experiment with getting data out of the database to construct a JSON response.  This actually turned out to be relatively easy, but it drove me nuts thinking of variable names.  I got to mess with jquery() which turns out to be quite a workhorse and makes things even simpler.
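
The actual data provider is a php file, but the idea is simple enough that a rough Python sketch shows the shape of it.  The table and column names below are the ones from the earlier posts; the function name and the exact set of fields are just made up for illustration:

import json
import sqlite3

# Rough sketch of the data provider idea: pull the current readings
# out of the database and hand them back as one JSON blob that the
# web page can pick apart.
def currentStatus():
        dbconn = sqlite3.connect('/home/pi/database/desert-home')
        dbconn.row_factory = sqlite3.Row  # lets me grab columns by name
        c = dbconn.cursor()
        reply = {
                'power': c.execute(
                        "select curentpower from housestatus").fetchone()[0],
                'outsideTemp': c.execute(
                        "select currentouttemp from housestatus").fetchone()[0],
                'thermostats': [dict(zip(row.keys(), row)) for row in
                        c.execute("select * from thermostats;")]
                }
        dbconn.close()
        return json.dumps(reply)

print currentStatus()

On the page side, something like a jquery $.getJSON() call against whatever URL serves this up is enough to pull the blob in and stuff the values into the page.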

I'm starting to get tired of reading tutorials on various things.  But the single most annoying thing is the difference between comment delimiters in the various languages.  Heck, you have to use multiple different kinds in a single web page ... silly.

I'll do a post on the overall architecture of the system soon, I want to conquer the buttons and build in the rest of the devices so I can replace the current controller.  With the way things have been going I'll probably change the architecture two or three times before it's done.

With the cost of tablets dropping, my ultimate plan is to have a couple around the house that are used to control things.  Y'know, one of them by the bed so I don't have to get up to change something.  The ultimate remote control.

Heck, I may hook the barbecue up to it someday.

Raspberry Pi, Getting Control Of The Devices

Still making progress on this thing.  There's a little pressure to get it finished since I want it to take over the house control functions.  I managed to get the buttons working to control the thermostats, and next is my first XBee device, which is going to be painful.  I haven't found or experimented with a php library for controlling the XBees, so it's back to searching the web for clues.

Meanwhile, I was having a lot of trouble getting the various syntax correct and was also having some trouble with the commands to the devices.  I needed something on the web page to tell me what was actually going on, something that could maybe become a monitor of the network internals.  I sneaked the code that I use to put up the code boxes on this blog and shoved it into the controller web page.  It worked like a charm; I was even able to find a way to force it to scroll to the bottom as new items came in.  This could turn out to be a great way of monitoring actual conversations between devices later when the entire device set is hooked in.  Here's an example of it in operation:


Recognize the blue box?  It's got a bunch of debug on it from me hooking in the Acid Pump, but that particular piece doesn't work yet since it's XBee controlled.  It's actually pretty easy to hide things in web pages, so I may hide it and put a button up to show it when needed.  There's actually no limit to what kinds of things I can do now that I have a real web server running to present the data.

I still haven't settled on the entire architecture.  I have a background process that collects data from around the house and shoves it into a database.  Then the web page interacts with a php file to suck the data back out.  All the buttons connect to another php file that translates the clicks into commands that are tailored for each particular device since they are all different from each other.  Yes, I screwed up and made them different.  But as I went along, I learned things and did them differently as each new device came online.  Actually, the separation from data display and command makes it easier to add in a device.  I construct a basic page, implement the commands, and go back and play with presentation.  Since presentation is a never ending task, it's nice to have it isolated.

Raspberry Pi Home Automation Process Architecture

Don't you dare make fun of the title of this post.  Yes, it's a bit presumptuous, but I couldn't think of anything else.  After a lot of experimenting and trying to overcome various problems, I finally settled on an architecture of the various devices and controls that I think can work over time.

Let's talk about the XBee network I have as an example.  I have a network of XBees that monitor and control things around the house, and this has to hook into the Pi.  In an earlier post, I discussed how easy it was to hook it into the Pi.  Now that I've used up the only serial port on the Pi, though, I've run into a problem.  Only one process can reliably hook to the Pi's serial port and communicate both ways.  If you have two processes talking to one port, bad things happen: overlaid data, seeing a character come in and not being able to get it because some other process got to it first; that kind of thing.  This isn't like an arduino where there is only one piece of code that needs the port; there are a number of them here.  So, what to do?  Functionally, the entire process interaction looks like this:


The House Monitor box is a process that runs all the time and constantly monitors the XBee network as well as periodically interrogating the ethernet devices.  When data comes in, it updates the database with the current readings.  It is the sole interface to the XBee network.  To get commands from the internet to do something like open a garage door, the web page sends a jquery() PUT to the command handler who forwards it (translating on the way) to the House Monitor.  The house monitor forwards it along to the XBee network.  In due course, the device will receive the command, do whatever, and send back a status that is recorded in the database.

If I load one of the presentation pages, it will send a jquery() GET to the Data Provider code that will suck the data out of the database and hand it back to the page for display on a user's browser.  The web pages have a timer running on them that refreshes the data every so often so it can be used as a monitor of what's going on.  For example, the swimming pool does several things on its own, like turn the motor on high in the morning, drop it to low for a while, then finally turn it off entirely during the peak period.  Each of these things is caught by the House Monitor process and recorded as the current state in the data base.  So a presentation page will update and show that the motor did its thing.  If I open the garage door, the garage controller will report it and House Monitor will stick that in the data base too.  Basically, the database is the holder of all the current status of devices, including the current power usage.

Yes, this means there is a small delay between commanding something to happen and seeing the result of it on the display, but that's not important to me.  There's always a delay between sending a signal to a real device and seeing the result.  

Doing things this way, I can make an interface inside the house that talks directly to the House Monitor and it can control stuff with no delay (other than what's required to do whatever it is).  The idea is that the House Monitor process is the thing in actual control; other software and devices talk to it.

I separated the presentation from the data gathering so I can change an HTML page anytime I want to without worrying about how the data handling will have to change to support it.  This way I can mess with tables, pictures, colors and stuff without accidentally screwing up the data handling.  It also hides the database entirely away from the web, so certain hacking techniques can't harm it.  It also means that I can have multiple presentation pages without having to duplicate the way I grab stuff out of the data base.

I also have ALL the updates I send to the various cloud services coming from the Pi now.  I have updated my account on Xively to the latest stuff they came out with and am recording the actual readings there every minute or so.  I may lose the old data, since they still haven't given me any way to move it into the new system.  Or, I may just write a python script to suck the data out and put it back into the new feeds.  It would only take a few million transactions to get it here and put it there.  He he.

Code?  You want me to show you the Code?  Actually, there's nothing secret about it and I'm totally willing to share, but I'm going to hold off a few more days until I can remove the old controller from service and reuse the page it is described on.  

Physically, the Pi is naked, no box, no special lights on it, and hooked to a breadboarded XBee.  Nope, I haven't even started looking for something to put it in, or decided if I want it to have a display.  It would be nice to have a display that shows something, but there are so many possibilities that I'm kind of stuck deciding what to do.   The odd way they constructed the little board doesn't lend itself to any simple ideas.  It'll work out in time; I may just use a cardboard box.

Here's the latest web page presentation:


Yes, the gauges reflect the very latest readings from the sensors, the boxes show the real state of the various devices, and the buttons work.  It isn't very pretty, but I wasn't working on pretty.  Actually, I made all the boxes float so they display in different places depending on the resolution of the browser.  I can see all of them on my phone by just scrolling around.  This part will be constantly modified as I think of something to try out.  It's just a presentation page and I don't have to change anything else to make it look completely different.   Oh, the little diag button up there will expand into a scrolling screen of debug information; makes it nice to see some of the underlying communication when I need to.

Less than an amp of power and roughly fifty bucks so far.  Go ahead, try and beat it.

Raspberry Pi and ThingSpeak

As I was continuing to program my Raspberry Pi, it came to me that I don't just update Xively; I also send data to ThingSpeak and emoncms.  They're both good services and I don't really want to drop them.  My previous post on ThingSpeak <link> was fun, so I thought about it a bit and decided to take that service on first.

One thing I thought of is that with the architecture I chose for the processes and data base, the updating of an external (or even internal) service could be a separate process that gets readings from the data base and forwards them off.  So, first I separated the Xively code into a separate process and tested the idea.  It worked like a charm.  So, I created a process for doing the same thing for ThingSpeak.

It followed the model of: not enough examples, not enough documentation in the examples that did exist, and nothing that actually fit what I wanted to do.  But, since I'm getting a little more buzz-word and jargon savvy, this effort was about a morning instead of days.

So, the python script below sucks items out of the database, formats them, and forwards them to ThingSpeak.  I used the same tactic that I've found makes my job much, much easier: scheduling.  If you glance at the code, all the main processing does is set up a few things, schedule a task every minute, then hang in a loop.  Every minute, it wakes up a routine that does all the work.  That's all there is to it.  Unlike many of the examples out there, this code is running right this second on my little Pi and doing a fine job of sending the data off.

It runs in the background like the code that monitors the house and records the data in my data base.  I just invoke it by "nohup python -u updatethingspeak.py >logfile &" (leave out the quotes of course), and it just works its little heart out.  I'll eventually look into automating the bring up of the various pieces so that I don't have to bother with it after a reboot or power failure, but for now, doing it by hand is fine.

As usual, it is HIGHLY commented.  I hate examples that cause you to search the internet for hours to get any understanding of what is going on.  I also used the simplest statements I could while still leveraging the power of python.  It's funny that python was designed to be self-documenting and easy to read, but excels at being obtuse and strange.  That's almost entirely because people have chosen to use it that way.  My thinking is that I'm going to come back to this code in a year or two and have to remember what the heck I did and why.  When you do something similar, keep that in mind.

The Python Script
#! /usr/bin/python
from apscheduler.scheduler import Scheduler
import datetime
import logging
import time
import sqlite3
import httplib, urllib

# The feed id and API key that is needed
thingSpeakKey = "putsomethinginhere"
# the database where I'm storing stuff
DATABASE='/home/pi/database/desert-home'
# This is where the update to ThingSpeak happens
def updateThingSpeak():
#print "Updating ThingSpeak ", time.strftime("%A, %B %d at %H:%M:%S")
# open the database
dbconn = sqlite3.connect(DATABASE)
c = dbconn.cursor()
# Getting it out of the database a field at a time
# is probably less efficient than getting the whole record,
# but it works.  
#
# On ThingSpeak I only update real power, power factor, voltage,
# frequency, outside temperature and inside temperature
# so I'm simply going to put the values in variables instead of
# some complex (and probably faster) compound statement.
outsideTemp = c.execute(
"select currenttemp from xbeetemp").fetchone()[0]
# This a really cool thing about some languages
# the variable types are dynamic, so I can just change it
# from a string to a int on the fly.
outsideTemp = int(float(outsideTemp) +.5)
power = c.execute(
"select rpower from power").fetchone()[0]
power = int(float(power)+.5)
voltage = c.execute(
"select voltage from power").fetchone()[0]
voltage = int(float(voltage)+.5)
apparentPower = c.execute(
"select apower from power").fetchone()[0]
apparentPower = float(apparentPower)
current = c.execute(
"select current from power").fetchone()[0]
current = int(float(current)+.5)
frequency = c.execute(
"select frequency from power").fetchone()[0]
frequency = float(frequency)
powerFactor = c.execute(
"select pfactor from power").fetchone()[0]
powerFactor = float(powerFactor)
insideTemp = c.execute(
"select avg(\"temp-reading\") from thermostats").fetchone()[0]
insideTemp = int(float(insideTemp)+.5)
# OK, got all the stuff I want to update
dbconn.close() # close the data base
#
# This is a debug statement that I put in to show
# not only what the values were, but also how they
# can be formatted.
# print ("Power = %d \nVoltage = %d \nApparent Power = %d "
# "\nCurrent = %d \nFrequency %.2f \nPower Factor = %.2f "
# "\nOutside Temp = %d \nInside Temp = %d" %
# (power, voltage, apparentPower, current,
# frequency, powerFactor, outsideTemp, insideTemp))

# OK, now I've got all the data I want to record on ThingSpeak
# So, I have to get involved with that thing called a REST interface
# It's actually not too bad, it's just the way people pass 
# data to a web page, you see it all the time if you watch
# how the URL on your browser changes.
#
# urlencode takes a python tuple as input, but I just create it
# as a parameter so you can see what it really is.
params = urllib.urlencode({'field1': power, 'field2':powerFactor,
'field3':voltage, 'field4':frequency, 
'field5':insideTemp, 'field6':outsideTemp,
'key':thingSpeakKey})
# if you want to see the result of the url encode, just uncomment
# the line below.  This stuff gets confusing, so give it a try
#print params
#
# Now, just send it off as a POST to ThingSpeak
headers = {"Content-type": "application/x-www-form-urlencoded", "Accept": "text/plain"}
conn = httplib.HTTPConnection("api.thingspeak.com:80")
conn.request("POST", "/update", params, headers)
response = conn.getresponse()
#print "Thingspeak Response:", response.status, response.reason
# I only check for the 'OK' in the reason field.  That's 
# so I can print a failure to any log file I happen to set 
# up.  I don't want to print a lot of stuff that I have to
# manage somehow.
if (response.reason != 'OK'):
print "Problem, ", response.status, response.reason

conn.close

# This is where the main code begins.  Notice how basically nothing
# happens here?  I simply show a sign on message, set up logging, and
# start a scheduled task to actually do the work.
print "started at ", time.strftime("%A, %B, %d at %H:%M:%S")
logging.basicConfig()

#------------------Stuff I schedule to happen -----
scheditem = Scheduler()
scheditem.start()
# every minute update the data store on ThingSpeak
scheditem.add_interval_job(updateThingSpeak, seconds=60)

while True:
        time.sleep(20) # This doesn't matter much since it is schedule driven


Yep, that's all there is to it.  Using this technique, I can add or subtract services really easily.  If I run across something that catches my eye, it'll only be a little while before I'm testing it.

Have fun.

Raspberry Pi and emoncms

In the previous post <link> I described how I set up a separate process for logging to the cloud servers at ThingSpeak and Xively.  This removes the code from my house monitor process and makes changing or adding things much easier.  So, I had two more to do, my legacy feeds on Xively and the datastore on emoncms.  I just finished the emoncms process last night.  It was relatively easy except for one tiny item.  The emoncms server almost never returns an error.  I messed up the feed over and over again getting it to work and the server never returned an error.  So, I had to run the process, watch the site for updates and hope it updated fast enough for me to tell if it was getting through.

I got it working and ran it over night to be sure it was going to hold up.  The code looks very, very much like the code for updating ThingSpeak, so there aren't too many surprises.  Once again, this is a standalone process that takes items from a simple database and forwards them to emoncms for archival.

The Python Script
#! /usr/bin/python
from apscheduler.scheduler import Scheduler
import datetime
import logging
import time
import sqlite3
import httplib, urllib

# The API key that is needed
EMONKEY = "PUTSOMETHINGINHERE"
# the database where I'm storing stuff
DATABASE='/home/pi/database/desert-home'
# This is where the update to emoncms happens
def updateEmonCms():
        #print "Updating emoncms ", time.strftime("%A, %B %d at %H:%M:%S")
        # open the database
        dbconn = sqlite3.connect(DATABASE)
        c = dbconn.cursor()
        # Getting it out of the database a field at a time
        # is probably less efficient than getting the whole record,
        # but it works.
        #
        # On emoncms I only update real power, power factor, voltage,
        # frequency, outside temperature and inside temperature
        # so I'm simply going to put the values in variables instead of
        # some complex (and probably faster) compound statement.
        outsideTemp = c.execute(
                "select currenttemp from xbeetemp").fetchone()[0]
        # This is a really cool thing about some languages:
        # the variable types are dynamic, so I can just change it
        # from a string to an int on the fly.
        outsideTemp = int(float(outsideTemp) + .5)
        power = c.execute(
                "select rpower from power").fetchone()[0]
        power = int(float(power) + .5)
        voltage = c.execute(
                "select voltage from power").fetchone()[0]
        voltage = int(float(voltage) + .5)
        apparentPower = c.execute(
                "select apower from power").fetchone()[0]
        apparentPower = float(apparentPower)
        current = c.execute(
                "select current from power").fetchone()[0]
        current = int(float(current) + .5)
        frequency = c.execute(
                "select frequency from power").fetchone()[0]
        frequency = float(frequency)
        powerFactor = c.execute(
                "select pfactor from power").fetchone()[0]
        powerFactor = float(powerFactor)
        insideTemp = c.execute(
                "select avg(\"temp-reading\") from thermostats").fetchone()[0]
        insideTemp = int(float(insideTemp) + .5)
        # OK, got all the stuff I want to update
        dbconn.close() # close the data base
        #
        # This is a debug statement that I put in to show
        # not only what the values were, but also how they
        # can be formatted.
        # print ("Power = %d \nVoltage = %d \nApparent Power = %d "
        #       "\nCurrent = %d \nFrequency %.2f \nPower Factor = %.2f "
        #       "\nOutside Temp = %d \nInside Temp = %d" %
        #       (power, voltage, apparentPower, current,
        #       frequency, powerFactor, outsideTemp, insideTemp))

        # OK, now I've got all the data I want to record on emoncms,
        # so I have to put it in json form.  json isn't that hard if you
        # don't have multiple levels, so I'll just do it with a string
        # format.  It's just a set of ordered pairs for this.
        params = ("RealPower:%d,PowerFactor:%.2f,"
                "PowerVoltage:%d,PowerFrequency:%.2f,"
                "InsideTemp:%d,OutsideTemp:%d" %
                (power, powerFactor, voltage, frequency, insideTemp,
                outsideTemp))
        # if you want to see the result of the formatting, just uncomment
        # the line below.  This stuff gets confusing, so give it a try
        #print params
        #
        # Now, just send it off to emoncms
        conn = httplib.HTTPConnection("emoncms.org:80")
        request = "/input/post?apikey=" + EMONKEY + "&" + "json=" + params
        #print request
        # emoncms uses a GET not a POST
        conn.request("GET", request)
        response = conn.getresponse()
        #print "emoncms Response:", response.status, response.reason
        # I only check for the 'OK' in the reason field.  That's
        # so I can print a failure to any log file I happen to set
        # up.  I don't want to print a lot of stuff that I have to
        # manage somehow.  However, emoncms seldom returns an error,
        # I messed this interaction up a number of times and never got
        # an error.
        if (response.reason != 'OK'):
                print "Problem, ", response.status, response.reason
        conn.close()

# This is where the main code begins.  Notice how basically nothing
# happens here?  I simply show a sign on message, set up logging, and
# start a scheduled task to actually do the work.
print "started at ", time.strftime("%A, %B, %d at %H:%M:%S")
logging.basicConfig()

#------------------Stuff I schedule to happen -----
scheditem = Scheduler()
scheditem.start()
# every minute update the data store on emoncms
scheditem.add_interval_job(updateEmonCms, seconds=60)

while True:
        time.sleep(20) # This doesn't matter much since it is schedule driven


As usual, it's way over commented.  I think there's a lot more lines of comment than lines of code in these things.

Now, I have only one more of these to build, the one to keep my legacy feeds at Xively up to date.  I want to keep them running when I turn off the old controller so I can transfer the data from them to the new Xively interface.  Then, I'm going to take a look at a new service that just came on line a little while back to see how they work.

Cloud services are the way of the future IMHO.

Sending Mail From a Raspberry PI

Remember, one of the reasons I got the Pi was to be able to send mail to myself when something bad happened?  Finally got around to looking at that.  What a royal red pain in the butt!  First, regular mail is a configuration nightmare that I gave up on after looking at it for a while.  Then, I went looking for a simple client that could be scripted or had some kind of api I could use...no luck.  Then I ran across two pretty good possibilities: sendEmail and smtp.cli.  So, after googling around, I decided to try sendEmail because there were some really good posts about it out there.

I installed it using apt-get and gave it a shot: failed.  There was a message about not having the right TCP libraries, so I went to the author's site to see what to do and he said to install a couple of perl libraries and give it another shot.  Did some searching and there's an installation tool in perl that could do it, so I tried it.  About an hour later after a ton of compiles, and a hundred miles of console output, I tried it again:  it failed.  OK, FINE, maybe the version of perl isn't the right one, I'll just upgrade perl to get the latest.  apt-get upgrade perl... about an hour of wasted time watching the console being worked to death and got to try it again: it failed.

Now, I'm getting seriously annoyed, so I hunted some more.  Found someone that had exactly the same set of problems which isn't surprising since this is a vanilla Pi (pun intended) and somebody out there must have had the problem.  There were solutions in this post that actually made much more sense than the mess I had gotten myself into.  So, I followed the suggestions, and of course, they were wrong for my particular system.

It wasn't the author's fault, he did the best he could describing it, but I had to allow for different versions of perl, slight directory differences, different source files for things, etc.  I finally found the offending file and edited it to fix the bug and ... wait for it ... it worked.

Remember, your file locations and the actual line that is messed up will vary from system to system, so if you try this, use it as a guide, not a tutorial.

After installing sendEmail using "sudo apt-get install sendEmail", I ran it with this line:

sendEmail -f "myname@gmail.com" -t "myname@gmail.com" -u "this is a subject" -m "inside the message" -s "smtp.gmail.com":587 -o tls=yes -xu "meagain" -xp "mypassword"

And sendEmail nicely told me:

sendEmail[14590]: ERROR => No TLS support!  SendEmail can't load required libraries. (try installing Net::SSLeay and IO::Socket::SSL)

First, what the heck kind of message is that?  What the heck language is it written in?  Ancient Sanskrit makes much more sense to look at than this thing.  So, I searched for the terms and found out that perl has a tool to get updates.  This is where I tried using perl to install the packages and sat forever watching it install, test, print reports, and generally mess around wasting my time.  But, it finished, so I tried it again: same result.  So, I upgraded perl.  If you're not using perl, don't do this.  It takes forever.  And it didn't help a bit.

This is where I found the post that was actually some help <link>, so I used good ol' apt-get to load the darn libraries that were needed:

sudo apt-get install libnet-ssleay-perl libio-socket-ssl-perl

Which only took a couple of minutes, and I was back to trying the sendEmail command again.  It failed, but the message was different this time:

invalid SSL_version specified at /usr/local/share/perl/5.14.2/IO/Socket/SSL.pm line 418

I guess you can call this progress.  At least it makes a little more sense and looks a little different from random characters, and it was talked about in the post I mentioned above.  So, I hunted for the SSL.pm file (it was right where the message above said it would be) and edited it.  I found the line that the post talks about around line 1640 or so and changed it.  Now, I'm ready to try it again:

 Email was sent successfully!

Whew!  I went to check my mail, and it was there in all its glory.

I didn't even get to trying smtp.cli darn it.  However after spending so much time getting this to work, I think I'll let someone else do that for me.
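
For what it's worth, there's also a pure python route that skips the perl libraries entirely.  This is just a sketch using python's standard smtplib module, not what I ended up running; the addresses and password are the same placeholders as in the sendEmail line above:

import smtplib
from email.mime.text import MIMEText

# Build a simple message; the addresses and password are placeholders.
msg = MIMEText("inside the message")
msg['Subject'] = "this is a subject"
msg['From'] = "myname@gmail.com"
msg['To'] = "myname@gmail.com"

server = smtplib.SMTP("smtp.gmail.com", 587)
server.starttls()     # same TLS step the -o tls=yes option asks for
server.login("meagain", "mypassword")
server.sendmail(msg['From'], [msg['To']], msg.as_string())
server.quit()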

Look at the Thermostat I found at Home Depot

A friend of mine was looking for a new thermostat.  He's not a techie, and didn't want to build his own so he was prowling around Home Depot seeing what they had.  He called me from the store and told me they were selling my thermostat <link>.

Of course they weren't, but it was really flattering when I took a look at what he was talking about.  Here is the Honeywell model RTH221B.  It's a basic programmable thermostat that'll do a nice job for someone, but the resemblance to mine is COOL.


Here's mine, on the wall working away.


Of course, I don't have a crew of engineers to fine-tune the location and shape of the buttons, another crew that contracts out the fabrication, or even another crew that does the silkscreen work on the front, so theirs is classier.

Of course, theirs isn't infinitely internally programmable, ethernet connected, or adapted for peak demand billing, and it doesn't have a backlit high contrast display.

Hey, I'm flattered.