Recently there's been a dramatic rise in the number of fraudulent apps getting attention -- even top sales positions -- in the iPhone and iPad App Store. Some scam apps are copy-cats that duplicate as closely as possible the name and icon of popular games in order to confuse consumers and get them to buy a scam app instead of the real thing. This costs the consumer money for the scam app and the developer money for the lost sale. Other scam apps appear to be byte-for-byte copies, stolen whole-cloth and offered for sale side-by-side with the original. This still costs the developer money for the lost sale, and while consumers get a functioning app, it's likely not one with any support going forward. Still other scam apps rip off the copyright of a popular brand (like Pokemon) for bogus apps that do nothing but cheat customers out of their money.
They all combine to damage confidence in the App Store, and harm the experience of the iOS platform.
For a developer, it's just one more risk they need to consider when developing for iOS -- even if they make a superbly crafted app, avoid dilution and downward price pressure from lower quality apps in the same space, and hit the jackpot by landing on a top seller list, their market share and customer base can be quickly assaulted by scammers.
For consumers, it's just one more hurdle to face when trying to find the good apps -- even if they hear about something fantastic from a trusted source, even if they manage to find the right app, they now have to worry if the one they find is the right, right app.
For Apple, it's just one more problem they have to figure out in order to maintain the appeal and value of their ecosystem -- even though they have a curated system that makes it easy to sell and easy to buy, they now have to deal with scammers damaging both the selling and buying trust of their store.
Right now, from the outside, Apple's approach seems to be that of YouTube -- approve any app that meets technical criteria and then respond to publicity or legal takedown demands from copyright holders when and if they come in. It's one of the smartest, safest approaches, legally, for Apple. They certainly don't want to take on the responsibility of pre-emptively moderating intellectual property, and then have their necks on the lawsuit line when something slips through and the rights holders sue both the offending party and Apple.
It's also open to abuse by large companies misusing infringement claims to remove competing apps made by smaller companies who can't afford the litigation.
So it's a complex, entangled, messy piece of business that harms Apple, developers, and consumers. The fault lies entirely with the scammers making the fraudulent apps -- they're the ones to blame. But ultimately Apple will have to fix it, because it's Apple's store.
How to fix it is the question.
Paul Haddad from Tapbots had an interesting suggestion on Twitter: Start with the Top 100 lists. Keep those extra, extra curated. Scam apps in search results are a more difficult problem to tackle, but scam apps in the Top 100, especially in games, are probably manageable. Get a team that knows the biggest classics and the hottest new games, and when something that looks like a scam app shows up in the Top 100, contact the developer and ask for proof of ownership and license, and contact the owner of the original app and inform them of a potential violation of their IP.
If the scammers can't make money, they'll be less inclined to spend time scamming.
It likely puts Apple in a more legally actionable position, and will probably get them sued more often when a scam app slips through or when a non-scam app gets incorrectly targeted, but that just might be the cost of doing business to maintain a better, more valuable store for developers.
Apple could also make it easier to report a scam app via the App Store. You can currently report problems with apps and games you've purchased, but it would be great if you could flag inappropriate content right from the dropdown menu on every app price sticker. It does take some of the shine off, and would result in a lot of noise for Apple, but huge spikes in reporting could also get them some crowd-sourced help in finding offenders faster, ultimately letting them keep a cleaner, better store.