Tuesday, February 28, 2012

Google Goggles

National Public Radio and the New York Times, among other sources, have been reporting on the forthcoming "Google Goggles" purportedly being designed at the Google X lab in Mountain View, CA.  The goggles would offer their wearer an augmented-reality display that overlays contextually relevant information, much as news channels give viewers a graphic layer of time, weather, market updates, breaking news, etc. over the live-action shot.

The concept of augmented-reality goggles isn't unique or new.  Two other developers are working on similar designs, and contact lenses with similar functionality have been discussed in the past.  If we include science fiction, the idea is quite old indeed.  So the reason this story has captured my attention is not its novelty, but the reflection it provokes when we consider that it will be a reality.

The consequences of the Google Goggles are similar to what my collaborators and I have previously described as an "intercept scenario" simulation (Jones, Lombard & Jasak, 2012).  In such a simulation, the brain and body are separated so that all nerve impulses are intercepted and fed into a mainframe computer, in the style of The Matrix (1999).  The goggles are different only insofar as they intercept perception outside of the body.  Permitting the physical world to be framed and interpreted by streams of information flowing before your eyes effectively changes the bedrock of reality as surely as any simulation.  So why waste processing power building a complete simulation when you can control the way people interpret their existing reality through dependence on external tools of interpretation?

Some will see this prediction as alarmist, pointing out that if you don't want the glasses you don't have to buy them, and that you can always take them off.  I would respond with the example of ordinary glasses used for vision correction.  Those of us who wear glasses know how different and disorienting the world is when we remove them.  Now imagine glasses that aid not only your perception but your cognition as well.  Indeed, at some point, removing these glasses may be like removing a part of yourself.  Though... it would be a part of yourself that wasn't really yours to begin with.

Tuesday, February 21, 2012

The Cyborg's Dilemma... Again.

"The Dilemma of Being a Cyborg" is an article by Carina Chocano which recently appeared in the New York Times magazine addressing the growing anxiety that many people face concerning what appears to be universal digitization.  In 1997, Frank Biocca first described the cyborg's dilemma in these terms:
The more natural the interface the more 'human' it is, the more it adapts to the human body and mind.  The more the interface adapts to the human body and mind, the more the body and mind adapts to the non-human interface.  Therefore, the more natural the interface, the more we become 'unnatural,' the more we become cyborgs.
At the TED conference in 2010, Amber Case seized on this theme with her talk, "We're All Cyborgs Now," noting the capacity for digital storage devices to augment memory, cognition and consciousness.  Aside from this, much has been written in the field of Embodied Cognition on the subject of cognitive offloading and how it suggests that the boundaries of our minds may not coincide with the boundaries of our physical brains.

Chocano brings a humanistic perspective to the subject, noting the important role that physical objects have played in establishing context and meaning in her life.  She is leery of the transformation, reduction, and compression of everything into "data" and the increasing dependence most of us have on the volumes of digital information at our fingertips.  Chocano thus describes her "cyborg's dilemma" in the following way:
We're collectively engaged in a mass conversion of what we used to call, variously, records, accounts, entries, archives, registers, collections, keepsakes, catalogs, testimonies and memories into, simply, data.
Throughout the article, Chocano seems self-critical and conscious of her own reluctance to immediately embrace the digital future, but the reasons for her concern are not without basis.  Her suspicion that access to information is not the same thing as knowledge is supported at length by the work of Nicholas Carr ("Is Google Making Us Stupid?" and The Shallows).  Related to this concern, though perhaps more sinister, is the standardization and accessibility facilitated by digital technology.  In a completely standardized digital world, we don't own our own knowledge.  Our life histories, buying habits, social networks, intellectual labor - indeed our very identities - are swallowed by and subordinated to the metaprograms that beckon us to join, share and participate in this new culture even as we relinquish our control over it.

A week ago, my girlfriend brought home a long-overdue VHS/DVD transfer machine.  She bought it to preserve old home movies, but I will likely use it to transfer some of my considerable collection of obscure cinema.  When I was in high school and college, I used to collect strange foreign and experimental films at local video stores and through the mail.  I remember the experience fondly, as each little box represented a different trip through someone else's subconscious mind.  Stripped of their plastic housing and cardboard dressing, the films will undoubtedly be easier to store, but they will also have lost the aura that drew me in from the start.  They will be part of a network of data; every pixel will be accounted for and assigned a number, and any hint of mystery will be utterly gone.

Monday, February 13, 2012

The Obsolete Classroom?

In a recent item from The Chronicle of Higher Education, Nick DeSantis reports on the plans of Stanford professor Sebastian Thrun to break away from his tenured position and create his own institution: "Udacity" (presumably a clever union of the words "University" and "Audacity").  Thrun apparently believes that the traditional classroom is outdated and should be abandoned for a multi-mediated digital platform where collaboration and group learning replace lectures and note-taking.

The initial response most people will have to this is positive.  If new media can help people learn better and faster by appealing to a diverse set of learning styles and unleashing the power of cooperation, why shouldn't it be used?  I fully agree that it should.  The problem comes when we examine the false-choice fallacy frequently associated with new technologies and the cornucopia of remedies promised by their cheerleaders.  In this case, what gets omitted is that teaching an exclusively online course doesn't increase, but decreases, the number of communication modalities available.  All of the tools at the disposal of the online instructor are also available to the classroom instructor, plus one: the gold standard of face-to-face communication.

Tuesday, February 7, 2012

Smartphones and The Cyborg's Dilemma

In this month's issue of Fast Company, there's an article by Adam Bluestein titled, "As Smartphones Get Smarter, You May Get Healthier: How mHealth Can Bring Cheaper Health Care To All."  mHealth stands for "mobile health," a new trend in health care technology that takes advantage of the smartphone platform to make expensive medical devices available around the world, and on the go, for only a fraction of the cost of conventional health care machinery.  For example, Bluestein's article features engineer Ramesh Raskar, whose smartphone-based autorefractor takes advantage of existing display technology to make an ordinarily cost-prohibitive instrument available in poor nations.  The technology cleverly exploits an already existing economy of scale in smartphones.  In addition to Raskar's autorefractor, mobile ultrasounds, electrocardiograms, microscopes, and other applications build on the display technology, wireless communication, and power supplies of existing smartphones.  See Bluestein's diagram below for the full range of innovations:

[Diagram from Bluestein's article: smartphone-based medical innovations]

Amid all of this incredible innovation, there is one line in Bluestein's article that some may find unsettling.  It's a quote from the Chief Innovation Officer at Humana: "It's like the human body has developed a new organ."  As I considered this, I was reminded of Frank Biocca's famous paper, "The Cyborg's Dilemma," which describes a sort of paradox in human-machine relations: "The more natural the interface the more 'human' it is, the more it adapts to the human body and mind.  The more the interface adapts to the human body and mind, the more the body and mind adapts to the non-human interface.  Therefore, the more natural the interface, the more we become 'unnatural,' the more we become cyborgs."

Of course, the prospect of becoming a cyborg is significantly less disturbing when one considers the alternative plan.  Nevertheless, it cannot be ignored that this is another critical example of how the intimate connection we have with our tools is in the process of transforming us.