Raspberry Pi Week: Guestblog: Serve Your Message With a Slice of Pi

May 27

This week's guestblog is brought to us by Daniel Messer, aka the Cyberpunk Librarian. Find out more about Daniel, his podcast, and his awesome website over at cyberpunklibrarian.com



Digital signage is a passion of mine, which is odd because, for the most part, I hate advertising. When you think “digital signs” you have to think about advertising, because the two go hand in hand, right? You see them all over the place, from your local big box store, where they use monitors on end-caps to sell you stuff you don't need, to the trendy Apple store, where they'll use an iPad Air 2 as a digital sign.

That’s a baseline US$500 device sitting there. It kicked off the tablet revolution and ushered in a so-called “Post PC Era.” And there it is, bolted to a table, telling you why you’d want to buy a PC. That’s irony so thick you could spread it on toast.

But it doesn’t have to be this way.

As a public library webmaster, one of my jobs is managing the digital signs in our branches. I don't create the content so much anymore, but I handle the tech side when needed. These things run off of small, dedicated PCs running Windows 7. Their administrative interface is lousy, the PCs are overkill as they're rolling glorified PowerPoint presentations, and the big-screen monitors and PCs kick out enough heat to keep you warm in the winter.

For the record, I live in Phoenix. We really don’t have a winter here.

When I first got my hands on a Pi, I knew it would be better for getting the library's messages out on digital signs. They're tiny, produce little heat, can be velcroed to the back of a monitor, and they run on free software. Diving into different software packages, I played with Screenly OSE, Concerto, RiseVision, and others before landing on something I really liked: an idea I got at an airport.

The Phoenix-Mesa Gateway Airport, aka AZA, uses Pis for almost all of their digital signs, from the ones dropping ads to the ones that tell you when your flight departs and from what gate. Even the monitors beside the gates use Raspberry Pis to display the gate number, flight number, destination, and so on. I talked with a couple of their IT staff to find out what they used, and the answer was surprising and refreshing.

They use a web browser. When you look at those signs, you’re simply looking at a website, in a browser, in full screen mode. The website refreshes itself every so often, and there’s a quick blank screen while it does this, but then you’re presented with the latest information.

I went back to my desk and got to work.

An old server bound for surplus found new life in our racks with an Ubuntu Server installation. I knew I'd have to do things slightly differently, as our library district covers a huge area while the airport covers a single large building. Instead of using the Pis to call a website, I'd have them bring up web content stored locally, but synced from the server.

I installed Chromium on the Pis because I like how easily you can feed it switches through the command line. That'll be useful later. I also needed to make sure that the screens don't go blank or drop into power-save mode. There are a couple of ways to handle this, but an easy one is to simply install XScreenSaver and then disable it.
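
On Raspbian that's one command away:

sudo apt-get install xscreensaver

Then open its preferences once and set the mode to "Disable Screen Saver."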

My Pi OS of choice is Raspbian, which uses LXDE as its desktop environment. That's excellent, because making all the necessary changes means editing just a couple of files (maybe three if you're running the latest version). Opening up LXTerminal and running

sudo nano /etc/xdg/lxsession/LXDE/autostart

I added the following lines:

@xset s off

@xset -dpms

@xset s noblank

@chromium --kiosk --incognito /home/pi/display/index.html

(Depending on your version of Raspbian, you might need to put the @chromium line in

/etc/xdg/lxsession/LXDE-pi/autostart

The other three can go in the first file.)

The first three lines disable all the screen blanking that Raspbian would normally do, which forces our screen to stay on. The last one launches Chromium in a full-screen kiosk mode. Launching Chromium in incognito mode means it won't remember any previous shutdown errors and thus never throws a restore error on startup. It then displays my index.html from a local directory. Since we've put these lines in the autostart file(s), they run automatically when the pi user logs into the GUI (which you can set to happen immediately after boot-up through sudo raspi-config).

The index.html is simply a slideshow powered by JavaScript. It displays images, sized to fit the monitor’s resolution, and that’s it. It also refreshes itself every ten minutes to pick up new content.
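
Something like this minimal page would do it; a sketch, assuming a handful of slide images synced into the same directory (the filenames and the fifteen-second slide interval are placeholders, and the meta refresh tag handles the ten-minute reload):

<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<!-- Reload the page every 600 seconds to pick up newly synced slides -->
<meta http-equiv="refresh" content="600">
<style>
body { margin: 0; background: #000; }
img { width: 100vw; height: 100vh; object-fit: contain; display: none; }
</style>
</head>
<body>
<!-- Placeholder slide names; rsync drops whatever the server has in here -->
<img src="slide1.png">
<img src="slide2.png">
<img src="slide3.png">
<script>
var slides = document.getElementsByTagName('img');
var current = 0;
slides[current].style.display = 'block';
// Advance to the next slide every 15 seconds, wrapping around at the end
setInterval(function () {
  slides[current].style.display = 'none';
  current = (current + 1) % slides.length;
  slides[current].style.display = 'block';
}, 15000);
</script>
</body>
</html>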

But how to update it?

Since the geographic area covered by the library district is bigger than some east coast states, I wanted things to update quickly, in the background, on a schedule, while reliably pulling down the data and resuming the odd failed transfer. Fortunately, you can do all of that with rsync and cron.

Remember that old server now running Ubuntu? That’s the only place I update the code and content. I can change and add slides, modify the code, save everything, and ten minutes later all the Pi displays are updated. Here’s how that works:

On each Pi, I set up rsync to talk to the server. To do this without a password, you need to set up a keypair for the Pi and the server. The first thing to do is make sure ssh works between the Pi and the server. If so, generate a keypair on the Pi using ssh-keygen. Don't set a passphrase on the keypair, and don't use sudo, as the user pi will be doing all the work.
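
A one-liner like this does the trick (a sketch; -N "" sets an empty passphrase and -f puts the key in the standard location):

ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

Once you have the keypair, transfer it to the server using: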

ssh-copy-id -i ~/.ssh/id_rsa.pub user@servername

Replace the user@servername with the user on the server where you’re hosting these files. In this case, the syncing directory is under my own username, dan, in a directory called display.

ssh-copy-id will ask for your login password to the server and, if all goes well, that’ll be the last time it asks for it. Once the keypair is set up between the Pi and the server, you should be able to ssh and rsync without a password.

Now we’re ready to sync things up! Set up a cron job using:

crontab -e -u pi

This will launch nano, and you can set up a cron job to call rsync as you like. Me, I do it every ten minutes. That means that, at the very outside, any change I make to the master files will take up to twenty minutes to show up on the monitors in the branches: up to ten minutes for the next sync, plus up to ten more for the next browser refresh. I could set it to go more often, but there's nothing as critical as flight information on those screens. So my cron job looks like this:

*/10 * * * * rsync -az --partial dan@piserver:/home/dan/display/ /home/pi/display/
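
(For the record: -a recurses through the directory and preserves timestamps and permissions, -z compresses the data in transit, and --partial keeps partially transferred files so an interrupted sync can pick up where it left off.)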

Looking back, let’s see what we’ve built. We’ve got a Raspberry Pi, connected to a monitor in a remote location. It’s running a slide show through Chromium and all the content is local, so it comes up fast with no lag. That content is synced to a server via rsync running as a cron job and everything updates every ten minutes, both the browser and the content.

So in the end, we've used no software geared specifically towards digital signage. The digital sign is powered by open source operating systems, running open source software, on open source hardware. For a librarian who's into open access, that's the kind of thing that really makes my day.

Related Posts

kw604: How to buy a second-hand digital camera.

Jun 19


We have a special guest in the studio this week as we give you the tips and tricks you need to know when you are ready to buy your first second-hand DSLR camera. House photographer Konrad drops by and tells us the do's and don'ts of buying a digital camera and how you can get the most bang for your buck. All of that in this week's video edition of kw604.

Shownotes:



The coming of the cyber archaeologists.

Nov 03

Will we need cyber archaeologists?

Looking at it, it's the oddest of things. This flimsy plastic box with two round holes in it seems to come from another age. A worn little brown plastic tape worms itself from one side of the container to the other. Only twenty-some years old, the cassette is as obsolete as the dinosaurs. Yet a few weeks ago my dear aunt called me up in a panic, telling the tale of how the evil old cassette player she had owned for so many years had 'eaten' a cassette with a recording of my late grandmother singing. I of course offered to fix it. After half an hour of poking and prodding with a pair of tweezers and some sticky tape, I managed to get the cassette back together. Now I just had to find a cassette player to play it on… It was at that moment I realised I did not have one anymore. It struck me that we store so much information these days on so many carriers, yet all these media are fragile, and soon we won't be able to recover anything we stored ten years ago because technology moves so fast. Will we need cyber archaeologists in the future?

Media are fleeting

There are few media that survive the test of time. Even paper turns to dust after a few hundred years, depending on how it is stored. And so will the media we store things on today. The average lifespan of a cassette tape, a CD-R, a DAT tape, or even a floppy disk does not even come close to the lifespan of paper. Yet while a single piece of paper can hold out for a hundred years, a DVD-ROM with all the collected works of Plato won't last a hundred years at all. The loss of information that can occur when our media turn sour is only multiplied by the enormous amounts of data they can carry. To lose a single sheet of paper over the course of a thousand years might be a loss; to lose a thousand documents on a single CD-ROM after ten years is even worse. So what is there to do but transfer information from medium to medium to let it stand the test of time? And even if we find the carrier that will last us to infinity, what format must we use to write our data?

Formats are fleeting

If your average DLT tape turns brittle and breaks in a hundred years, you might just have been lucky. Think not only of the medium the information is written on; think of the format the information is stored in. Format types like .doc, .xls, and so on are even more fleeting than their carriers. You can make your programs backwards compatible in the extreme, but supporting exotic file formats of days long gone is a painful task. Some, like .html, .txt, .pdf, and .rdf, might be supported for years to come, but what about other, exotic and proprietary standards, the formats of backup programs, and so on? One might hold a treasured box of data in one's hands, but if the file format is no longer supported, how can we ever access it? Perhaps we will find the key to the format, but what about the system it was written for?

Systems are fleeting

It can be even worse. Say we have salvaged the medium and have somewhere found the original application to read it with. What if it only runs on specific hardware? An evolution that moves even faster than the formats and the media is the hardware! What if the information we need only runs on some ancient system, say a Commodore 64? Where to find one? And even more importantly: where to find the parts if something breaks? Even to this day, some "legacy" programs still used in production run on hardware that is no longer supported by the manufacturer. So what do we have to do? Store the information, the media, the original application AND the hardware it runs on in our archives? What can be so important that we need to go through all this hassle?

What is important

"So what .." I hear you say ?  What if we loose that excell file thats 8 years old ? Who cares ? … But that is just it. We might know what information is important today, but we will never be able to tell what information is pivotal or trivial in the future. The first posting by Linus Torvalds on usenet might have been unimportant,  Yett only history will tell wether this one event might be something for the historybooks. The fact is we store more and more information these days on systems, media and in formats that might not stand the test of time. Wether or not something will be important in the future is impossible to tell at this time, thus we risk turning the digital era we live in today, into tomorrows informational dark ages , from which nothing will be remembered in the future.

Cyber archaeologists

I see a new profession emerging. Perhaps starting out as a niche market, later evolving into something of an exact science: people who spend their time looking through old digital archives, who have the skills to work with legacy hardware, know which side is up on a floppy disk, and, God forbid, even speak the language of the old Commodore 64. Cyber archaeologists digging through our digital past, able to unlock and uncover its secrets and bring them back into the light of whatever modern civilisation there might be. A profession that holds the keys both to FINDING information and to being able to ACCESS it as well. A breed of archaeologists speaking not of the Jurassic but of the "BASIC" or the "x86" period of the past…

Epilogue

As evolution speeds up, so does the regression of the past into oblivion.

I for one do think we will have them in the future: experts in finding what was stored and yet was lost, keepers of keys that can unlock the files from our past and bring them back. With the amount of information we produce, the digital legacy we leave behind, it's unthinkable that these things would be lost forever in a period of only a few decennia. Prove me wrong: dig into your past and find the first digital document you ever made. Perhaps you'll need a cyber archaeologist to complete the task.
