Why Technology Never Trumps Humans
It seemed like a good idea: Build a fun, flirty iPhone app that generates millions of custom pick-up lines on the fly, simply by tapping in specifics of a situation:
A user enters the place, the time of day and a few characteristics of his intended date, hits a button and chooses from lines ranging from clever to clumsy. And just to keep things under control, we let the user pick between lines that are either Safe or Sexy. The app is called LittleWingman, and it tested through the roof.
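For the curious, here's a minimal sketch of how a template-based generator like this might work. The names (Situation, Tone, generateLine) and the templates are illustrative stand-ins, not LittleWingman's actual code:

```swift
// Illustrative only -- not LittleWingman's actual implementation.
// Each template is a closure that drops the user's inputs into a sentence;
// the Safe/Sexy switch just selects which pool of templates to draw from.

enum Tone { case safe, sexy }

struct Situation {
    let place: String      // e.g. "coffee shop"
    let timeOfDay: String  // e.g. "morning"
    let trait: String      // e.g. "great smile"
}

typealias Template = (Situation) -> String

let safeTemplates: [Template] = [
    { s in "Is it just me, or does this \(s.place) get nicer every \(s.timeOfDay)? That \(s.trait) doesn't hurt, either." },
    { s in "I never talk to strangers in a \(s.place), but a \(s.trait) like that deserves an exception." }
]

let sexyTemplates: [Template] = [
    { s in "A \(s.place) in the \(s.timeOfDay) and a \(s.trait) -- one of us should buy the other a drink." }
]

func generateLine(for situation: Situation, tone: Tone) -> String {
    let pool = (tone == .safe) ? safeTemplates : sexyTemplates
    return pool.randomElement()!(situation)
}

// Example: one tap's worth of input.
print(generateLine(
    for: Situation(place: "coffee shop", timeOfDay: "morning", trait: "great smile"),
    tone: .safe))
```

With a few dozen templates per situation and several fill-in slots each, the combinations multiply fast -- which is how a small app can plausibly claim millions of distinct lines.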
It's an equal-opportunity application, generating pick-up lines regardless of gender or orientation, which means it spews out lines for men to women, women to men, men to men and women to women -- all on the fly. And because it contains no graphics, no profanity and no abusive language of any kind, we figured approval from Apple's iTunes Store would be a cinch.
And it did. Eventually.
Nine grueling months after it was originally submitted.
Why was LittleWingman constantly rejected? As it turns out, not for any specific objectionable words or graphics -- it doesn't have any. In fact, it may be the first and only app ever rejected purely for the sexual ideas it stimulates in users' minds.
Are phrases like "tight-fitting jeans" and "legs" objectionable? Not to most people. But when LittleWingman composed them into the following line, iTunes had a big problem with it:
"I'm tonight's official legs inspector. I'm going to have to ask you to remove those tight-fitting jeans."
At first, we thought iTunes objected to words like "breasts" and "ass" -- two commonly used words in many other apps. So we replaced those with "casabas" and "tush," only to be rejected again. Within a week or two, the same canned rejection came back:
At 5:51 PM -0800 3/5/09, devprograms@apple.com wrote:

"Thank you for submitting LittleWingman to the App Store. We've reviewed LittleWingman again and determined that we still cannot post this version of your iPhone application to the App Store because it contains inappropriate sexual content and is in violation of Section 3.3.12 from the iPhone SDK Agreement which states: 'Applications must not contain any obscene, pornographic, offensive or defamatory content or materials of any kind (text, graphics, images, photographs, etc.), or other content or materials that in Apple's reasonable judgement may be found objectionable by iPhone or iPod touch users.' If you believe that you can make the necessary changes so that LittleWingman does not violate the iPhone SDK Agreement we encourage you to do so and resubmit it for review."

We combed through the content again, looking for any profanity or objectionable material. But we couldn't find any, because there wasn't any. It was the application itself that was writing the content, based on what the user had chosen. For example, LittleWingman generated this line for a user who finds herself at a wedding:
"Think any of the rabbis at this ceremony can lend us some personal lubricants?"
Random? Funny? Sure. But hardly as objectionable as the flushing toilets, upskirt shots or jiggling breasts you'll find in other iPhone applications. Yet iTunes rejected that generated line flat out.
The correspondence flew back and forth, with LittleWingman getting rejected for combining innocent phrases like "kiss" with innocent body parts like "lips" into pick-up lines that resulted in wonderfully appealing ideas about what people might actually kiss with their lips.
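That was the bind, as best we could tell: screen the individual words and nothing trips the filter. Here's a toy sketch of the kind of word-level check we guessed was in play -- the blocklist and the line are made up for illustration:

```swift
import Foundation

// Illustrative only: a toy word-level screen of the sort we guessed Apple ran.
// Every word in the composed line is innocent on its own; only the assembled
// sentence carries the suggestion a human reader reacts to.

let blocklist: Set<String> = ["breasts", "ass"]  // the words we swapped out

let line = "I'd love to kiss those lips."
let words = line.lowercased()
    .components(separatedBy: CharacterSet.alphanumerics.inverted)
    .filter { !$0.isEmpty }

let flagged = words.filter { blocklist.contains($0) }
print(flagged.isEmpty ? "Nothing to flag." : "Flagged: \(flagged)")
// Prints "Nothing to flag." -- which is exactly why there was no word to fix.
```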
Each time, the App Store returned the same canned response, with no guidance on how to fix the problem -- mainly because there was no problem to fix. Unlike the now-banned "baby shaker" app, LittleWingman offered pure, positive pick-up lines -- and healthy ones, at that.
At six months, we thought we had a breakthrough: iPhone 3.0's 17+ adult rating was just the ticket to get us past our non-existent objectionable content. We re-submitted. And got rejected. Again.
The maddening, automated responses were finally disrupted when, after seven months, a real, breathing App Store human being actually left a voicemail at our offices. We began a dialogue that, two months later, ended with LittleWingman approved -- with only two word changes from its original submission. And that, as it turns out, is the main problem with technology: it lacks human judgment, which cost us time, energy -- and nine months of sales.