I’ve written several times before about how our industry (and society at large, for that matter) needs to start planning for when gigapixel displays become commonplace. I’m convinced that’s going to happen much sooner than most people think.
So I was very enthusiastic when Adrian Cotterill, editor-in-chief of the DailyDOOH, asked me if I’d speak about gigapixels at the Thought Leadership Summit: Videowalls Unplugged conference. The DailyDOOH organized that event last week at the #NECshowcase in London.
The problem was that I was in Canada and the event was in England. “That’s not a problem,” said Adrian. “Make a video and we’ll present that instead!” So, with the help of Arc-Media, a talented local production firm, we did just that. On the left is a picture of ‘virtual me’ on stage (photo credit: Andrew Neale).
Here’s what I had to say. Many thanks to The Daily DOOH for letting me use it here. All the opinions in it are mine alone. Comments and suggestions are most welcome.
Yesterday I came across the really interesting video below. Matt Richardson married a picoprojector to a Raspberry Pi computer and created an excellent demonstration of yet another use for digital light. Good work, Matt!
Whether or not you actually want this on your bike (I do!), it points out the very useful things that can be done with relatively low brightness, because he illuminated a fairly small area. It also shows you can often get away with low resolution; in this example, big characters needed to be projected so the rider could read them easily.
Picoprojectors are an ideal tool for playing with digital light. I talked about that in a post about 18 months ago (Seeing the (digital) light). Natan Linder did some interesting work at the MIT Media Lab, too (see LuminAR).
(this post originally appeared in The Daily DOOH on February 13 2013)
How far does a megapixel go? Not very far at all. One of the points I tried to make in my recent post about the MegaPixel Summit at #ISE2013 was that a million pixels is puny.
That raised a few eyebrows. But let’s take a look at some numbers and you’ll understand what I mean. Displays of many megapixels are easily created today, and even a gigapixel (one thousand megapixels) is on the edge of doable now. Soon, a gigapixel will become mainstream.
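To make that concrete, here’s a quick back-of-the-envelope calculation. The surface sizes and pixel pitch below are my own illustrative numbers, not figures from the Summit:

```python
# Rough pixel arithmetic: how many pixels does it take to tile a surface?
def pixels_for_surface(width_m, height_m, pitch_mm):
    """Pixels needed to cover a surface at a given pixel pitch (in mm)."""
    return (width_m * 1000 / pitch_mm) * (height_m * 1000 / pitch_mm)

# A full-HD display is only about 2 megapixels:
hd = 1920 * 1080                           # ~2.1 million pixels

# A modest 10 m x 3 m videowall at a fine 1.5 mm LED pitch:
wall = pixels_for_surface(10, 3, 1.5)      # ~13.3 megapixels

# At that same pitch, a full gigapixel covers a 75 m x 30 m building facade:
facade = pixels_for_surface(75, 30, 1.5)   # 1.0 billion pixels
```

In other words, a single megapixel at fine pitch covers barely a couple of square metres; a gigapixel is the side of a building.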
(This post also appeared on The Daily DOOH http://www.dailydooh.com/archives/79153)
When I first heard that there was going to be a Megapixel Summit at #ISE2013, my immediate reaction was “it’s about time!”
I’ve been preaching about ‘pixels everywhere’ for a long time, and others have too. Pixels on any surface. Pixels as a new building material. Pixels as a new way to think about lighting. Pixels as the paint for massive digital canvases. Any shape, any size, anywhere, with pixels that are almost ‘free’.
But on reflection, my first reaction was actually wrong. I think the organizers have missed an opportunity: no new ground is being broken, it seems. To be fair, the promotional material is pretty clear about what will be covered, so no one is being misled.
It’s just that I had hoped for more, and here’s why…
Way back in July, I wrote about Digital Light… showing the way. In that post, I talked about intelligent outdoor lighting that would know where we are and where we want to go, and would help show us how to get there. And it would save energy at the same time.
A site in London (in the City of Westminster, to be precise) designed by Jason Bruges Studio is partway there, as you can see from the video below from the studio’s website. He treats this as art, and well he should, saying:
“The artwork responds to the different speeds, rhythms and concentration of people in the alley, and a flowing pattern of light is built up in the passageway which reflects the recent movement.”
But at the same time, it’s lighting, too. He goes on to say:
“White LED uplights, recessed into the paving, increase in intensity as people pass by causing a rippling wave of light to move through the passageway tracing their movement. When there are no pedestrians the lights dim to a low brightness to save energy while also providing a safe level of illumination.”
I like this a lot. It’s art. It’s functional. It’s smart about energy. It’s responsive. Now imagine what could be done with color and with a lot more pixels.
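As a thought experiment, the behaviour Bruges describes could be sketched in a few lines. The decay rate and brightness floor here are my own hypothetical values, not the studio’s:

```python
# Hypothetical sketch of the alley lighting: each uplight jumps to full
# brightness when motion is detected nearby, then decays back toward a
# safe minimum, leaving a rippling trail behind the pedestrian.

MIN_LEVEL = 0.15   # safe low-light floor when the alley is empty (assumed)
DECAY = 0.9        # fraction of brightness retained each time step (assumed)

def step(levels, motion):
    """Advance one time step; `levels` and `motion` are per-light lists."""
    return [1.0 if m else max(MIN_LEVEL, level * DECAY)
            for level, m in zip(levels, motion)]

lights = [MIN_LEVEL] * 5
lights = step(lights, [False, False, True, False, False])  # pedestrian at light 2
lights = step(lights, [False, False, False, True, False])  # moving on to light 3
# Light 3 is now fully lit, light 2 is fading behind them, the rest sit at the floor.
```

Add color and many more pixels per fixture and the same simple loop becomes a canvas.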
Check out his website. The studio has some really interesting pixels-everywhere projects, including Mimosa for Philips using Lumiblade LEDs.
I found this on the Arduino blog recently: Matt Leggett has been having some fun with wearable pixels. He sewed an alcohol sensor, some LEDs and an Arduino processor into a jacket. The idea is that you breathe into the sensor and the LEDs light up to show how inebriated you are. Too many lit LEDs and your friends should call a taxi for you. Perhaps the Arduino could even make the call automatically. Maybe Moritz Waldemeyer or Vega Wang could incorporate this into their wearable electronics.
I think this could be done without needing to blow into a sensor. After all, after too many drinks you might not remember how! SoberSteering is developing a car steering wheel that senses blood alcohol levels through one’s skin. Maybe their sensor could be built into clothing and sense alcohol levels in real time.
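Sketched in code, the jacket’s logic might look something like this. The 8-LED bar, the 10-bit sensor range, and the taxi threshold are all my own assumptions, not details from Leggett’s build:

```python
# Hypothetical sketch of the jacket: map a raw alcohol-sensor reading
# onto a bar of LEDs, and flag when it's time to call that taxi.

NUM_LEDS = 8       # assumed size of the LED bar
SENSOR_MAX = 1023  # typical 10-bit range of an Arduino analog input

def leds_lit(reading):
    """Scale a raw 0-1023 sensor reading onto the LED bar."""
    return min(NUM_LEDS, reading * NUM_LEDS // SENSOR_MAX)

def call_taxi(reading, threshold=700):
    """The 'maybe the Arduino could make the call' idea, as a predicate."""
    return reading >= threshold
```

A mid-range breath (reading 512) lights half the bar; peg the sensor and all eight LEDs come on, and your friends reach for the phone.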
I know of more than a few fellow bloggers who probably wouldn’t wear this kind of clothing. For them, pixels everywhere would keep them from going anywhere, by car at least <grin>. Or, at the very least, they might find themselves cut off earlier in the evening.
In fact, MicroTiles might never have been invented if certain unnamed inventors had been wearing Leggett’s invention!
Chris Harrison of Carnegie Mellon University and Microsoft researchers have been collaborating on using the human body as an input surface. They call this approach “OmniTouch”. Harrison’s earlier project, Skinput, had similar goals (and a more interesting name), but OmniTouch offers some interesting advances over it. Read on.
If this looks familiar, you may be thinking of Pranav Mistry’s SixthSense project at the MIT Media Lab a while back. OmniTouch has some interesting advances over SixthSense, though.
Mistry’s SixthSense also projected an image on one’s body, but used fiducials (colored marks) on the tips of the user’s fingers. A wearable camera tracked the fiducials, and a computer deduced whether a finger was touching a projected input point.
Harrison’s Skinput used a picoprojector to display an image on a body surface like a hand or a forearm. Bioacoustic techniques using a specially designed armband detected taps on the skin and, with a bit of signal processing, could determine where on the skin the tap occurred. Skinput’s armband could be covered by clothing, so it was a bit less obtrusive than SixthSense’s fingertip markers.
OmniTouch enables a wide variety of surfaces to be input devices, not just the body. It uses a wearable projector and camera like SixthSense, but doesn’t require SixthSense’s markers on one’s fingertips. OmniTouch uses a depth-sensing camera, similar to the Kinect but capable of shorter focus distances, which increases flexibility. So, for example, one could use a wall or a pad of paper as an interactive surface. Depth sensing allows hovering gestures as well as touch.
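The touch-versus-hover distinction that depth sensing makes possible can be sketched roughly like this. The thresholds are my own illustrative values; the real OmniTouch finger-tracking pipeline is considerably more sophisticated:

```python
# Rough sketch of a depth-camera touch decision: compare the finger's
# distance from the camera with the surface's distance behind it.

TOUCH_MM = 10   # finger within 10 mm of the surface counts as a touch (assumed)
HOVER_MM = 60   # within 60 mm counts as hovering (assumed)

def classify(finger_depth_mm, surface_depth_mm):
    """Classify a fingertip as touching, hovering, or neither."""
    gap = surface_depth_mm - finger_depth_mm  # finger is nearer the camera
    if gap <= TOUCH_MM:
        return "touch"
    if gap <= HOVER_MM:
        return "hover"
    return "none"
```

Marker-based systems like SixthSense can only see *where* a fingertip is in the image plane; the depth channel is what adds this third, along-the-view axis.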
The concept is very interesting: interactive pixels everywhere in front of you. Wearing a projector and a depth-sensing camera is clunky, but this is just a proof of concept… now we need smart companies to make it small, stylish, and usable. Check out the video below from Chris Harrison’s website.
Here are a couple of links if you want more detail: http://chrisharrison.net/projects/skinput/SkinputHarrison.pdf and http://chrisharrison.net/projects/omnitouch/omnitouch.pdf