Column: Single Display Simplicity.

Feb 15

If you ever watched the movie WarGames (I am showing my age here), or remember the scene inside the control room of the Nebuchadnezzar, and felt your geek heart rate quickening at the sight of all those screens… then you know what I mean when I say: "One display is never enough".

There is something strangely appealing about a workstation with multiple computers blasting lines of random code onto several monitors, all surrounding a hyper-connected, fast-typing individual. More computers, more screens, more keyboards, more input, more data… it somehow enhances our sense of power.

I remember the near-fistfight scenarios at the office where users would demand a dual-screen setup. They would vaguely dream up scenarios about the need to run applications simultaneously or compare data across screens. The troglodyte rat race that had prefaced this situation featured the urge to get "the biggest screen" in the office. Now there was a new kind of 'hip' in town… the need to have TWO screens. Never mind whether they needed it… their neighbour had two, so one screen was just not enough.

The domestic geek in the confines of his own private dungeon is very much the same. ONE computer cannot be enough… you need SEVERAL machines. And since we are on a roll tilting the devices-per-user quota, let's add some tablets and smartphones to the mix… shall we?
So what you end up with is a completely over-connected bat cave with plenty of systems that you need to maintain. More screens than you can take in, each displaying a separate part of your information streams (or each redundantly displaying the SAME information streams). Giant "waves" of notifications across all the different systems whenever you get a new reply on Twitter. Plenty of keyboard-swapping and a dizzying amount of chair-swivelling. Yes: you have built a veritable mission control centre with enough machines to keep five people occupied, but you have a staff of one.


So what if we go back to simplicity? Last week, I had to confine myself to our kitchen table downstairs so I could keep my wife (who had the flu) company as she slept on the couch. My "Mission Control Centre" sat unmanned upstairs. All I had was my laptop, a notebook riser, an external keyboard and mouse… and a pair of earphones. My geek universe shrank to ONE 13-INCH screen. Can you imagine the horror?

So was it horrible? Did I suffer information deprivation? Did my fingers grasp at the empty air where secondary (or tertiary) keyboards used to be? Initially the answer was: YES. It took me quite a while to squeeze all of my workflows onto one machine and one display, but after a couple of hours I started to settle in. I have to admit, my personal digital architecture (the way I have organised my cyber lifestyle) highly favours "sliding" from OS to OS and from machine to machine, so I seldom have data or workflows locked down to one specific machine. I could cope with using just one machine for a short time, but would it last?

To my own surprise, I actually started enjoying it! The "one machine" approach meant that I could focus on what I was doing. Distractions were kept to a minimum (windows I did not need simply remained closed) and notifications came in ONCE, on ONE system. When I closed my email client, I did not get any mail notifications. I didn't have to fight the urge to "check Twitter at a glance". With fewer monitors and fewer systems I gained more focus than ever before.

On top of that, I got back some "intimacy" with my system. When there was a problem, or I needed to figure out how to do something, it was just on that ONE machine. It had been a long time since I experienced the feeling of having "one" computer that was "MY" computer and not just "A" computer. This kind of human-device intimacy resulted in me taking extra good care of that machine. I tweaked it to my liking. I spent 25 minutes finding the right wallpaper and really 'settling in' on my machine, instead of just 'passing by' and quickly rapping on the keyboard before moving on. That one little computer became my 'home'.

So I have learned something: more is not always better. I admit it is quite hard to do full-screen video editing on a single 13-inch laptop, but it does help you focus on the content. More machines mean more maintenance, more distractions and more 'distance' from the machine. We lose not only the intimacy with "OUR computer", we also lose the intimacy with the applications we use because we "bounce around" so much. We hop from phone to web interface, from mail client to tablet, and lose any 'deep knowledge' of an OS or an application. We don't seem to have the time anymore, or better said: we don't TAKE the time.

It felt refreshing to be back upstairs after a couple of days. Basking in the glow of all three 24-inch displays, overlooking my digital horizon while leaning back with my hot cup of tea… but somehow I missed my little excursion. Going back to the basics of one simple machine reminded me that sometimes less… is so much more.


The coming of Cyber archaeologists.

Nov 03

Will we need cyber archaeologists?

Looking at it, it's the oddest of things. This flimsy plastic box with two round holes in it seems to come from another age. A brown, worn little plastic tape worms itself from one side of the container to the other. Only twenty-some years old, the cassette is as obsolete as the dinosaurs. Yet a few weeks ago my dear aunt called me up in a panic, telling the tale of how the evil old cassette player she had owned for so many years had 'eaten' a cassette with a recording on it of my late grandmother singing. I of course offered to fix it. After half an hour of poking and prodding with a pair of tweezers and some sticky tape, I managed to get the cassette back together. Now I just had to find a cassette player to play it on… It was at that moment I realised I did not have one anymore. The thought popped up that we store so much information these days on so many carriers, yet all these media are fleeting, and soon we won't be able to recover anything we stored ten years ago because technology moves so fast. Will we need cyber archaeologists in the future?

Media are fleeting

There are few media that survive the test of time. Even paper turns to dust after a few hundred years, depending on how it is stored. And so it goes with the media we store our data on today. The average lifespan of a cassette tape, a recordable CD, a DAT tape or even a floppy disk does not come close to the lifespan of paper. Yet while a single piece of paper can hold out for a hundred years, a DVD-ROM with all the collected works of Plato won't last a hundred years at all. The loss of information that can occur when our media turn sour is only multiplied by the enormous amounts of data they can carry. To lose a single sheet of paper over the course of a thousand years might be a loss; to lose a thousand documents on a single CD-ROM after ten years is even worse. So what is there to do but transfer information from medium to medium in order to let it stand the test of time? And even if we find the carrier that will last us to infinity… what format must we use to write our data?

Formats are fleeting

If your average DLT tape only turns brittle and breaks after a hundred years, you might just have been lucky. Think not of the medium the information is written on, but of the format the information is stored in. Format types like .doc, .xls and so on are even more fleeting than their carriers. You can make your programs backwards compatible to the extreme, but supporting exotic file formats of days long gone is a painful task. Some, like .html, .txt, .pdf and .rdf, might be supported for years to come, but what about other, exotic and proprietary standards, the formats of backup programs and so on? One might hold a treasured box of data in one's hand, but if the file format is no longer supported… how can we ever access it? Perhaps we will find the key to the format… but what about the system it was written for?

Systems are fleeting

It can be even worse. Say we have salvaged the medium and have somewhere found the original application to read it with. What if it only runs on specific hardware? An evolution that is even faster than the formats and the media must be the hardware! What if the information we need only runs on some ancient system, like, say, a Commodore 64? Where to find one? And even more importantly: where to find the parts if something breaks? Even to this day, some "legacy" programs that are still being used in production run on hardware that is no longer supported by the manufacturer. So what do we have to do? Store the information, the media, the original application AND the hardware it runs on in our archives? What can be so important that we need to go through all this hassle?

 

What is important?

"So what…" I hear you say? What if we lose that Excel file that's eight years old? Who cares? … But that is just it. We might know what information is important today, but we will never be able to tell what information is pivotal or trivial in the future. The first posting by Linus Torvalds on Usenet might have seemed unimportant at the time, yet only history will tell whether that one event belongs in the history books. The fact is we store more and more information these days on systems, on media and in formats that might not stand the test of time. Whether or not something will be important in the future is impossible to tell right now, and thus we risk turning the digital era we live in today into tomorrow's informational dark ages, from which nothing will be remembered.

Cyber archaeologists

I see a new profession emerging. Perhaps starting out as a niche market, later evolving into something that will become an exact science. People who spend their time looking through old digital archives. Who have the skills to work with old legacy hardware, know which side is up on a floppy disk and, God forbid, even speak the language of the old Commodore 64. Cyber archaeologists digging through our digital past, able to unlock and uncover its secrets and bring them back into the light of whatever modern civilisation there might be. A profession that holds the keys both to FINDING information and to being able to ACCESS it as well. A breed of archaeologists speaking not of the Jurassic but of the "BASIC" or the "x86" period of the past…

 

 

Epilogue

As evolution speeds up… so does the slide of the past into oblivion.

I for one do think we will have them in the future. Experts in finding what was stored yet lost. Keepers of keys that can unlock the files from our past and bring them back. With the amount of information we produce, the digital legacy we leave behind… it's unthinkable that these things would be lost forever in a period of only a few decades. Prove me wrong… dig into your past and find the first digital document you ever made. Perhaps you'll need a cyber archaeologist to complete the task.
