Sometimes, when a revolution is under our noses, it’s impossible to see.
This is a response to a thoughtful piece by Dan Kline, who wonders what Jane McGonigal's awesome talk presages for AI programmers.
It’s my opinion that what an AI programmer brings to the table is more relevant than ever. The thing about AI is that once it achieves viability, it is no longer recognized as AI. Expert systems, voice recognition, pathfinding: all started as AI research problems, but once they became practical, they stopped being considered AI.
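To make that concrete, here is a minimal sketch (my own illustration, not from the original piece): grid pathfinding with breadth-first search, once a genuine AI research topic, now reads as a perfectly ordinary algorithm.

# A small, self-contained sketch: breadth-first search on a grid.
# Pathfinding like this began as AI research; today it is treated as
# a plain, unremarkable algorithm.
from collections import deque

def find_path(grid, start, goal):
    """Return a list of (row, col) steps from start to goal, or None.

    grid is a list of strings where '#' marks a blocked cell.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}

    while frontier:
        current = frontier.popleft()
        if current == goal:
            # Walk back through the parent links to rebuild the path.
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            neighbor = (nr, nc)
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#'
                    and neighbor not in came_from):
                came_from[neighbor] = current
                frontier.append(neighbor)
    return None  # goal unreachable

if __name__ == "__main__":
    grid = ["....",
            ".##.",
            "....",
            "...."]
    print(find_path(grid, (0, 0), (3, 3)))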
AI Programmers are the Algorithmic Avant-Garde
The title AI programmer is almost a smear, but the AI programmer should wear the title proudly. The role of the AI programmer is to invent what comes next, so that the rank-and-file programmer can take the results for granted. The AI programmer should never expect to be feted, because it won't happen. When you are inventing something new, people won't understand what you're doing. Once the inventing is done, they'll wonder why such an obvious thing took so long to figure out. You can spot the pioneers by the arrows in their backs.
The AI programmer has to be visionary and self-confident
Much like the absurd debate over whether games are art, trying to pin down any particular technique as AI is a self-defeating endeavor. The more you pin it down, the more you discover that something that might be AI is just a plain old algorithm after all. People forget:
Someone has to do the discovering
Ex-AI is all around you. Every repetitive or uninteresting job in the real world gets moved to AI in three stages. First, the cost of a person doing the job becomes too high (for example, secretaries answering phones). Second, the job goes to a low-labor-cost locale (call centers). Third, AI gets the job (try checking the status of your flight). But then we don't call it AI; we call it "call automation."
Looking at Jane’s piece, which I love, you could draw the conclusion that AI doesn't have a place because people will do all the work and provide all the interaction. I draw the opposite conclusion. I believe that Jane’s piece implies an astonishing ubiquity of AI, so endemic and mind-blowing in its extent that we, the ants in the mound, won't even be able to see it; to be blunt, we already don't see it. Frankly, she proposes the Matrix, and she proposes that we will love it. And you, oh AI programmer, are building it.