What value is testing junk v junk?

Ad copy testing is all very well, but if the ad copy is at fault in the first place, what real difference will testing make?

Testing bad copy against bad copy will only tell you which piece of bad copy is better than the other (or should that read "less bad"?).  Confusing, isn't it?  But whichever way you look at it, the phrase "rearranging the deck chairs on the Titanic" springs to mind.

Another phrase that leaps to mind concerns monkeys sitting at typewriters, one of which eventually knocks out the complete works of Shakespeare.  Yes, it's hypothetically possible, but it relies on astronomical odds that, in practical terms, don't really exist in the real world.  Or do they?  If there's one thing that mirrors this situation, it's the far-reaching, ever-expanding world of Google's algorithms, which brings me to paid search.

You can test AdWords ad copy to death simply because Google gives you the means to do so.  But bad copy will only ever be bad copy, no matter how much you test it.  Far better to write good copy in the first place.  That approach, of course, needs a copywriter, and not just any copywriter but one trained in Ad Text Optimization (ATO): specialist direct-response copywriting for Google AdWords that takes into account how Google's tools work and how its algorithms rank ads.

Another way to look at it is to imagine a test session at Silverstone.  Heat one has two double-decker buses side by side on the grid; heat two has two F1 cars.  I know which set of results I'd find more interesting.