Broadway, as you’ll recall, was the nickname of the fellow that 50 Cent hired to ghost his tweets. “The energy of it is all him,” Broadway said of the simulated stream he produced for his boss. Or, as Baudrillard put it: “Ecstasy of information: simulation. Truer than true.”
Now that we’re all microcelebrities, we need to democratize Broadway. No mortal can keep up with Twitter, Facebook, Instagram, Tumblr, LinkedIn, Snapchat, etc., all by himself or herself. We all need a doppeltweeter to channel our energy.
Since the ability to clone Broadway is still three or four years out, Google is stepping into the breach by automating the maintenance of one’s social media presence. The company, as the BBC reports, was earlier this week granted a patent for “automated generation of suggestions for personalized reactions in a social network.” The description of the anticipated service is poetic:
A suggestion generation module includes a plurality of collector modules, a credentials module, a suggestion analyzer module, a user interface module and a decision tree. The plurality of collector modules are coupled to respective systems to collect information accessible by the user and important to the user from other systems such as e-mail systems, SMS/MMS systems, micro blogging systems, social networks or other systems. The information from these collector modules is provided to the suggestion analyzer module. The suggestion analyzer module cooperates with the user interface module and the decision tree to generate suggested reactions or messages for the user to send.
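Stripped of the patent-speak, the described architecture is simple enough to caricature in a few lines. What follows is a toy sketch, not Google’s implementation: every class, function, and collector name here is invented for illustration, and the “decision tree” is reduced to a couple of keyword checks.

```python
# Hypothetical sketch of the patent's "suggestion generation module".
# All names are invented; the patent describes components, not code.

def collect_updates(collectors):
    """The 'plurality of collector modules': each pulls items the user
    can access (e-mail, SMS/MMS, micro-blogs, social networks) and tags
    them with their source."""
    updates = []
    for source, fetch in collectors.items():
        for item in fetch():
            updates.append({"source": source, "text": item})
    return updates

def suggest_reaction(update):
    """Stand-in for the 'suggestion analyzer module' plus 'decision
    tree': map an event in the update to a canned reaction."""
    text = update["text"].lower()
    if "new job" in text:
        return "Congratulations!"
    if "birthday" in text:
        return "Happy birthday!"
    return None  # nothing to suggest for this update

# Toy collectors standing in for real account connections.
collectors = {
    "social": lambda: ["Alice announced she has gotten a new job"],
    "email": lambda: ["Meeting moved to 3pm"],
}

for update in collect_updates(collectors):
    reaction = suggest_reaction(update)
    if reaction:
        print(f"Suggested reply on {update['source']}: {reaction}")
```

The point of the sketch is how little of “you” is needed: the pipeline runs entirely from harvested inputs to suggested outputs.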
Translation: At this point, we have so much information on you that we know you better than you know yourself, so you may as well let us do your social networking for you.
Google notes that the automation of personal messaging will help people avoid embarrassing social faux pas:
Many users use online social networking for both professional and personal uses. Each of these different types of use has its own unstated protocol for behavior. It is extremely important for the users to act in an adequate manner depending upon which social network on which they are operating. For example, it may be very important to say “congratulations” to a friend when that friend announces that she/he has gotten a new job. This is a particular problem as many users subscribe to many social different social networks. With an ever increasing online connectivity and growing list of online contacts and given the amount of information users put online, it is possible for a person to miss such an update.
A computer will generate a personal “congratulations!” note to send to a friend, and upon receiving the note, the friend’s computer will respond with a personal “thanks!” note, which will trigger the generation of a “no problem!” note. I think this is getting very close to the social networking system Mark Zuckerberg has always dreamed about. When confronted with an unstated protocol for behavior, it’s best to let the suggestion analyzer module do the talking.
Beyond the practical stream-management benefits, there’s a much bigger story here. The Google message-automation service promises to at last close the realtime loop: A computer running personalization algorithms will generate your personal messages. These computer-generated messages, once posted or otherwise transmitted, will be collected online by other computers and used to refine your personal profile. Your refined personal profile will then feed back into the personalization algorithms used to generate your messages, resulting in a closer fit between your computer-generated messages and your computer-generated persona. And around and around it goes until a perfect stasis between self and expression is achieved. The thing that you once called “you” will be entirely out of the loop at this point, of course, but that’s for the best. Face it: you were never really very good at any of this anyway.
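The closed loop described above can be simulated in miniature. This is a deliberately crude toy, assuming a “profile” that is nothing but word counts and a “generator” that emits the persona’s most characteristic word; none of it corresponds to any actual personalization system.

```python
# Toy caricature of the closed personalization loop: the profile
# generates a message, the message is harvested back into the profile,
# and the cycle repeats. All names and logic are invented.

from collections import Counter

def generate_message(profile):
    """Emit the persona's most characteristic word: the
    computer-generated message."""
    word, _count = profile.most_common(1)[0]
    return word

def refine_profile(profile, message):
    """Collect the posted message and fold it back into the profile."""
    profile[message] += 1

# A starting persona, distilled from harvested output.
profile = Counter({"congrats": 3, "thanks": 2, "lol": 1})

for step in range(5):
    message = generate_message(profile)
    refine_profile(profile, message)
    print(step, message, dict(profile))
```

Run it and the dominant phrase only reinforces itself on every pass: the stasis between self and expression, with the thing once called “you” out of the loop.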
This post is an installment in Rough Type’s ongoing series “The Realtime Chronicles,” which began here. A full listing of posts can be found here. Image from Google patent filing.