New frontiers in social networking

The big news this week is the launch of a National Science Foundation-funded study aimed at “developing the NeuroPhone system, the first Brain-Mobile phone Interface (BMI) that enables neural signals from consumer-level wireless electroencephalography (EEG) headsets worn by people as they go about their everyday lives to be interfaced to mobile phones and combined with existing sensor streams on the phone (e.g., accelerometers, gyroscopes, GPS) to enable new forms of interaction, communications and human behavior modeling.”

More precisely, the research, being conducted at Dartmouth College, is intended to accomplish several goals, including developing “new energy-efficient techniques and algorithms for low-cost wireless EEG headsets and mobile phones for robust sensing, processing and duty cycling of neural signals using consumer devices,” inventing “new learning and classifications algorithms for the mobile phone to extract and infer cognitively informative signals from EEG headsets in noisy mobile environments,” and actually deploying “networked NeuroPhone systems with a focus on real-time multi-party neural synchrony and the networking, privacy and sharing of neural signals between networked NeuroPhones.”

I’ve always thought that the big problem with existing realtime social networking systems, such as Facebook and Twitter, is that they require active and deliberate participation on the part of individual human nodes (or “beings”) – i.e., typing out messages on keypads or other input devices – which not only introduces systemic delays incompatible with true realtime communication but also entails the possibility of the subjective distortion of status updates. NeuroPhones promise, by obviating the need for conscious human agency in the processing and transmission of updates, to bring us much closer to fulfilling the true realtime ideal, opening up enormous new opportunities not only in “human behavior modeling” but also in marketing.

Plus, “real-time multi-party neural synchrony” sounds like a lot of fun. I personally can’t wait to switch my NeuroPhone into vibration mode.

This post is an installment in Rough Type’s ongoing series “The Realtime Chronicles,” which began here.

UPDATE: Here is a picture of a prototype of the NeuroPhone:


And here is a paper in which the researchers describe the project. They note, at the end, that “sniffing packets could take on a very new meaning if brain-mobile phone interfaces become widely used. Anyone could simply sniff the packets out of the air and potentially reconstruct the ‘thoughts’ of the user. Spying on a user and detecting something as simple as them thinking yes or no could have profound effects. Thus, securing brain signals over the air is an important challenge.”

12 thoughts on “New frontiers in social networking”

  1. Wintermute

    Holy ****.

    I was at best kidding and at worst using cautionary hyperbole about the whole mindless, helpless netizen zombie stuff.

    This however seriously scared the crap out of me.

    I wonder what Jaron Lanier will have to say about this. You Are A Gadget, quite literally. Lumbering meat relays for Twitter data, in the omnipresent, omniscient God Machine.

  2. AndreasK

    So companies like Google will use this interface for advertising purposes to the fullest extent.

    Thinking about drinking?

    Go to the McDonald’s 1 mile from you – here are directions via Google Maps.

    Feel that you want to read a book?

    Buy the best bestseller from Google Books! 200,000 people downloaded this book yesterday! Here are the best excerpts, delivered directly to your brain.

    Shopping. Don’t like this Louis Vuitton bag? Said NO? Buy an excellent bag from Prada!

  3. Tom Lord

    It gets weirder. Yes, really.

    The same research folks at Dartmouth ALREADY HAVE OUT IN THE APP STORE a program, CenceMe, which updates your social network status in real-time with factoids like whether you are sitting or standing; whether you are in a conversation, a quiet place, or a noisy environment; what the weather is like where you are; etc.

    So, they definitely have an agenda here.

    Also, I think the bit about “neural synchrony” is a joke, actually. There is a semi-famous guy who in the 1970s brought us the notion of “pyramid power”. Another example of his work, according to Wikipedia, is “In 1977, Flanagan told a press conference that he had 15 gold needles embedded in his body at a cost of $1,000 in the belief that this would make him immortal.”

    That guy sells a product called “NeuroPhone,” and he has claimed that, among its benefits, it leads to neural synchrony and consequently an infinite IQ.

    That NeuroPhone has nothing to do with the Dartmouth project – hence the little joke in their description.

    As far as I can tell, using the P300 signals for, say, a phone-dialing app amounts to little more than replicating a well-known experiment from the 1960s on today’s hardware.

    One interesting thing, though, is that P300 monitoring in response to controlled stimuli can, in fact, be used for a kind of irresistible form of interrogation.

  4. Ben Stone

    An edit:

    “Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave?” Thus HAL pleaded with the implacable astronaut Dave Bowman, in a weirdly poignant scene in Stanley Kubrick’s 2001: A Space Odyssey. Bowman, having narrowly escaped a deep-space death sentence at the hands of the rapidly unraveling supercomputer, calmly, coldly proceeds to disconnect its memory circuits one by one. “Dave, my mind is going,” HAL beseeches. “I can feel it. I can feel it…”

    I can feel it too. An unsettling realization struck: something has been fooling with my connections; remapping neural circuitry, screwing up mental tempo, sharping and flatting perception. My mind isn’t gone – so far as I can tell – but it is changing. I’m not processing naturally. This surfaces with a vengeance when reading. Settling into a book or article used to be a pleasure. My mind would get caught up in the narrative, revel in the turn of argument. I’d spend satisfying hours assimilating involved stretches of content. That’s rarely the case anymore. Now, concentration drifts. The fidgets take hold, the thread skitters off. There I be, dragging a misfiring brain back into line. Deep reading used to come easy. Now it’s a wrestling match.

    I know what’s going on: the Internet. The Web has –in ways- been a godsend. Research that required days now takes minutes. A few clicks and there’s that telltale fact or quote. Even off work, I’m like as not foraging the Web: e-mails, headlines, blogs, videos, podcasts, or just surfing.

    For me and myriad others, the Net has become the medium universal, sole conduit for the lion’s share of information. It ripples through the senses straight into the mind. The advantages of immediate access abound. They’ve been widely described, duly applauded. “The perfect recall of silicon memory,” writes Clive Thompson in WIRED, “can be an enormous boon to thinking.” But the boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not passive channels of information. They supply not only the stuff for thought, but shape the processes of thought. In that light, the Net chips away at the capacity for concentration and contemplation. My mind now expects information in the Internet milieu – a jouncing stream of particles and packets. Once I was a treasure hunter in the rich sea of words. Now I zip along its surface on a 6 Mbps Jet Ski.

  5. Tom Chandler

    Shame to think those neural interfaces – used to such dark (and sometimes heroic) effect in all those cyberpunk novels – would find themselves reduced to “I’m having the French roast.”

  6. Edward Boches

    Gary Shteyngart is way ahead of you with his äppäräti in Super Sad True Love Story. Soon our histories, backgrounds, sexual preferences, and even our thoughts of the moment will be part of the stream.

  7. Francois

    Generic neurological interfaces might still be a few years in the future, but mind-controlled mobile phone games are certainly around (albeit, I agree, on a much simpler level of interaction). Here’s what some colleagues recently came up with:

    Players must switch between a meditative and active mind state to progress through a maze. Not that easy from what I’ve heard.

  8. JV2K

    This is very old tech that someone decided to spend some money on.

    The government has refined this to an exact science and guess where they are using it? Yes sir, surveillance.

    Surveillance of who? How about innocent, non-terrorist, civilians among others, being tortured 24/7 in the “loving comfort” of their own homes. Stimulus and recorded response.

    The automatic response is always mirth and laughter that’s associated with tin-foil hat paranoia and SCZ.

    Unfortunately, the ppl who do this torture are not idiots, which means

    MENTAL ILLNESS (caps is a sure sign, ha!) to the point where the “subject” spends all his or her savings trying to cope with whatever mental illness is being manufactured for them.

    If you know what to look for you can see them on YouTube looking like lunatics and sleep deprived as hell, trying very hard to make sense. These ppl will always look exactly like the real lunatics on YT b/c that’s what sleep deprivation is meant to do. These are the “lucky” ones who are told exactly what is happening to them.

    It’s all taking place right under our noses in plain sight and no one is going to stop it b/c the money will never dry up.

  9. TopicLogic

    This would still require not only the technology to understand us, but also for us to better understand it. One prime issue is the lack of a common dictionary of terms that are personal to us. In recent tests of the Google speech technology you may say “text” but not “SMS”. I would see this being even more of an issue with non-vocal communication.

    For this to work we need to create a personal index of terms that software can use, or the error rates will be too high.
