Physical-Digital Liminality in Learning Spaces
Published: July 17, 2012 by Dr. Jeremy Kemp
My work explores the smudgy edges of digital and physical learning spaces and the new computer tools and teaching methods that support the ongoing innovation within and between them. I think of these as “liminal” spaces* because they mark a limit, dividing line, boundary, threshold…
I’m fascinated by this continual and iterative transition between physical and digital realms in learning contexts, so much so that I keep poking at it from many different angles:
- I’ve been teaching online for 13 years now and continue to explore learning management systems.
- Second Life caught my eye in spring 2006, and it became an obsession as I used it to simulate physical learning spaces.
- Then, with Daniel Livingstone in Scotland, I invented SLOODLE, a mashup of online classrooms and virtual environments. Most of my traditional academic publishing is on this topic.
- I’ve embraced the Maker community and am learning to pass bits into atoms through 3D printing and sundry computer numerically controlled building devices.
- My work with Anthony Bernier gathers reference video of young adult library spaces and converts it to walkable models and data visualizations.
- Finally, I am exploring the use of microcontrollers aka Smart Objects to enliven learning spaces with more complex user experiences.
Let me share a few more examples from my work this past decade. But rather than describe them in text, let me show you a few videos first.
See the 15-minute YouTube playlist below, with excerpts from 11 of my videos from the last five years. Each excerpt has a specific connection to this line of inquiry. As you watch, consider these rationales:
- Neo Realizes – A fun piece of fiction that shows Keanu Reeves realizing he is inside a simulated environment.
- President Avatar Sample – A past SJSU president is re-imagined as an avatar
- Build a Bench – Physical items on campus become virtual objects
- ANGEL in Second Life 3D Dropbox Nugget – Grant-funded coding I did that allows students to create virtual objects and turn them in as assignments
- ET4Online venue Flyover – See the 3D models of real buildings
- 3D Printer creates Logo – Digital designs are instantiated by 3D printers
- QR Code Embroidery – A website URL becomes a QR code that is then sewn onto cloth by a CNC embroiderer, to be captured later by a cell phone
- iBooks Author Widgets and Layout – 3D models embedded in Books
- Sloan-C Emerging Technology – A data visualization in Google Earth
- Smart Objects – Using microcontrollers to control the physical world
- Mockup 2: Clinic Setting – Stimuli I created for fMRI studies of avatars and emotion for a research group at Stanford University
Now that you have a basic background, let me offer a little word picture to explain the thinking I do here and how I put it into practice. Let’s try a thought experiment that I hope makes salient the digital and physical exchange you and I are conducting right now. It also emphasizes the fact that we are communicating asynchronously and from a distance.
Ready?

*This phrase is related to Turner’s liminality but repurposed for digital media, as in Harrison and Morgan. (Thanks to Anthony Bernier for commenting on this.)
Here… Read this symbol now:
Mark the time: 10:48am PDT, July 17, 2012
Mark the location: 37°20’9.5814″ latitude by -121°52’55.7112″ longitude
Fast forward to the future – uh, your present… I’m predicting that you are reading this on some kind of computing device. Pretty clever, huh? I also know that you are very distant and removed from me. In fact, you are in my future!
Mark your time and location… See? Challenge: Be the first reader to reply with your “when and where” by clicking the leave a comment link.
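If you take up the challenge, the physical distance between us is easy to estimate. Here is a small Python sketch using the haversine formula; the San Jose coordinates approximate those marked above, and the reader’s location is invented for illustration:

```python
# Great-circle distance between writer and reader, via the haversine formula.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance between two points on Earth, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

san_jose = (37.3360, -121.8822)  # roughly the coordinates marked above
reader = (40.7128, -74.0060)     # hypothetical reader in New York City

print(round(haversine_km(*san_jose, *reader)), "km")
```

A reader in New York would be roughly four thousand kilometres away from the spot where these words were typed.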
Now let’s dissect this a little further and really geek out. What I’ve done here is to share a tiny piece of information between our two cerebral cortexes. I thought of the letter “t” and now YOU are thinking of the same letter….
Our brains each have a million neurons lit up to represent this symbol. And we are now imagining the movement of our tongues across our teeth to make this sound in speech: “tuh.” A useless reaction given our situation, but unavoidable.
It is worth noting here that we humans are automatically driven to turn this information into physical action. “tuh”
So the start and end of this communication between us – me in your past and you in my future – are completely physical. That’s because we are not machines, after all. Our neuron interactions are electro-chemical but the “wiring” is fleshy and wet. Indeed, if we were sitting inside MRI machines right now, the scan of our brains might reveal this language processing with a slightly increased blood flow in a certain region. See – I told you we were going to geek out here.
But what about the passage BETWEEN these two final semantic destinations in our brains — two inches behind our foreheads? What journey did this “t” travel on to get from my neurons to your neurons?
In fact, this little letter “t” has flipped back and forth between digital representations and physical instantiations thousands of times. For example, after my finger hit the “t” key, the contact under the plastic key closed, and the circuits in my keyboard converted this click to a set of zeroes and ones and sent these down the USB cable to my computer. The letter “t” symbol was then turned into power fluctuations in my computer’s memory. Next it was stored on my hard drive as a magnetic variance.
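That keystroke-to-bits step can be sketched in a couple of lines of Python. The character becomes a number (its ASCII/Unicode code point), and that number is the pattern of zeroes and ones the keyboard actually sends:

```python
# The letter "t" as a symbol, a number, and a pattern of bits.
letter = "t"
code_point = ord(letter)          # the character's ASCII/Unicode value
bits = format(code_point, "08b")  # the eight "zeroes and ones"

print(code_point)  # 116
print(bits)        # 01110100

# And the round trip back from number to symbol:
assert chr(code_point) == "t"
```

Everything downstream of the keyboard – memory, disk, network – is shuffling some encoding of that same eight-bit pattern around.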
And when I finished this piece, it split into millions of packets that travelled here to San Jose State and were reassembled onto a hard drive named “slisapps.sjsu.edu.”
Then you arrived. Your computer requested this page and our little letter “t” broke apart again into packets, left San Jose and flew to you at the speed of light. One of those packets included our letter “t” and travelled through Mountain View or Chicago or Tokyo. As the symbol bounced around the ‘Net it moved through the memories of dozens of computers and probably a few hard drives as well.
Finally our letter was re-rendered on your monitor and turned into light, detected by your retina…
In fact, the real situation is much more interesting and geeky. I actually held up my iPhone and spoke the sound “tee” to dictate what you see at the right. Sounds from my vocal cords were converted to bits by Apple’s Siri speech recognition tool.
I originally thought adding all these details to the start of this post would only confuse the issue, but they serve as yet another illustration: once more you see how our communication crosses the physical-digital threshold over and over again.
Siri is a cloud-based application where the local sound file gets shot out from the phone and interpreted in Apple’s data centers before returning as ASCII to the device – in this instance the capital letter “T” because helpful Siri always uppercases the first character on a new line.
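That round trip can be caricatured in a few lines of Python. To be clear, the function names below are invented for illustration and the network call is replaced by a stub; Apple’s actual protocol is private.

```python
# A hypothetical sketch of Siri's cloud round trip: digitized audio leaves
# the phone, a remote service transcribes it, and text comes back.

def cloud_transcribe(audio_bytes: bytes) -> str:
    """Stand-in for the real network round trip to Apple's data centers."""
    return "t"  # pretend the service recognized the spoken sound "tee"

def dictate(audio_bytes: bytes) -> str:
    text = cloud_transcribe(audio_bytes)
    # Siri uppercases the first character on a new line:
    return text[:1].upper() + text[1:]

print(dictate(b"\x00\x01..."))  # T
```

Even in this cartoon version, the sound crosses the physical-digital threshold twice: once at the microphone, and again when the returned text is rendered as light on the screen.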
Location data came courtesy of the “WhereAmIAt” app.