Here’s an experiment I’m dying to try out: At the next industry event (AdMonsters or another), I want to see how quickly the conversation about header bidding turns into potshots aimed at Google. Based on what I’ve seen lately, my guess is that it’s almost as fast as the response time of a typical header bidding partner.
Lost in a lot of the discussion is that header bidding has been around for a long time. I recall Criteo pushing the technique back in 2013 at AdMonsters Publisher Forums, though it didn’t yet have a “sexy” name like header bidding.
Wide publisher adoption of header bidding is relatively recent, and it happened because people figured out that the system was rigged: from a revenue standpoint, the auction process was conducted to Google’s advantage, not the publishers’ (read more about it here). Once word got out that publishers were seeing double-digit revenue growth with this “hack,” it wasn’t just the plucky programmatic-only pubs that joined in – suddenly all the big pub players started contemplating whether header bidding could work for them.
As it became clear that header bidding is more than a fad, Google responded with a press release announcing it would open Dynamic Allocation to select third-party demand partners. To address the biggest header bidding concern – latency – Google made sure to highlight that those partners would be running through a server-to-server connection.
The release has probably had a cooling effect on some who might have wanted to dive into the header bidding pool, but many publishers simply aren’t going to wait. Why would anyone wait for more revenue? In fact, many pubs are joining this gold rush because the revenue is needed.
For all of that, do you know what’s getting bashed even harder than Google? Ad tech. Ads carrying so many tracking pixels that they sometimes don’t serve (I liken it to a runner wearing so many Fitbit-type devices that they can no longer run). Anything that makes the ad experience worse than it already is will only increase the number of people who block ads. This is the existential problem our industry faces, and it’s where Google is absolutely in the right. Latency kills. People react to one bad experience and judge us all collectively by blocking all of our ads.
I’m not saying header bidding is causing ad blocking, but if it’s adding latency, I’m saying it’s not helping.
So the real game is keeping the lights on (revenue) while keeping the electricity coming into the house (users). Here are some thoughts on accomplishing that feat.
Know what you might be missing. I’m glad Google is coming out with a solution that might (notice my hedging) bring us closer to a unified auction, but ultimately will it mean more revenue? How will you know? What I mean to say is, get into header bidding now if you’re not already. Test. Learn. When Google’s solution comes out, you’ll know what you’re gaining or giving up.
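To make “test and learn” concrete, here’s a minimal sketch of a client-side header bidding setup in the style of Prebid.js. It assumes Prebid.js and Google Publisher Tag are already loaded on the page, and the ad unit code, bidder names, and placement IDs are placeholders, not recommendations.

```typescript
// Minimal header-bidding test setup in the style of Prebid.js.
// Assumes prebid.js and GPT are already on the page; all IDs below
// are placeholders.
declare const pbjs: any;      // Prebid.js global
declare const googletag: any; // Google Publisher Tag global

const adUnits = [{
  code: 'div-gpt-ad-leaderboard', // must match the GPT slot's div id
  sizes: [[728, 90]],
  bids: [
    { bidder: 'appnexus', params: { placementId: '1234567' } },
    { bidder: 'rubicon', params: { accountId: '1001', siteId: '2001', zoneId: '3001' } },
  ],
}];

pbjs.que.push(() => {
  pbjs.addAdUnits(adUnits);
  pbjs.requestBids({
    timeout: 500, // how long (ms) we are willing to wait for bids
    bidsBackHandler: () => {
      // Pass the winning bids to the ad server as key-values, then make
      // a single ad request. That sequencing is the heart of header
      // bidding: the outside auction finishes before the ad server runs.
      googletag.cmd.push(() => {
        pbjs.setTargetingForGPTAsync();
        googletag.pubads().refresh();
      });
    },
  });
});
```

Run something like this on a slice of traffic next to your normal setup and compare the numbers. That’s the test-and-learn part, and it’s what gives you a baseline to judge Google’s solution against.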
Faster, header bidding, faster! If you’re testing header bidding and creating significant latency for your users, stop. You need the right dev chops to play in this game, and not everyone is staffed for it. I’m hearing more and more people say they aren’t seeing latency issues. I don’t believe everyone who says it, but if it’s true, then the task is to get even faster. We can’t settle for slow. Keep pushing for faster response times.
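And if you want to know exactly who is slowing your pages down, measure it. Here’s a sketch, again assuming Prebid.js, that logs each partner’s response time; in production you’d ship these numbers to your analytics system rather than the console.

```typescript
// Measure each bidder's response time so slow partners can be identified.
// Assumes Prebid.js is loaded; ad units are configured as in the earlier sketch.
declare const pbjs: any;

pbjs.que.push(() => {
  pbjs.requestBids({
    timeout: 500,
    bidsBackHandler: () => {
      const responses = pbjs.getBidResponses(); // keyed by ad unit code
      Object.keys(responses).forEach((adUnitCode) => {
        responses[adUnitCode].bids.forEach((bid: any) => {
          // timeToRespond is how long this bidder took, in milliseconds.
          console.log(`${bid.bidderCode} on ${adUnitCode}: ${bid.timeToRespond} ms`);
        });
      });
      // ...then hand targeting to the ad server, as in the earlier sketch...
    },
  });
});
```

Partners who consistently blow the timeout aren’t adding revenue, they’re adding latency, and having the data makes that conversation easy.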
Don’t forget what we’ve learned. “Header bidding is dead” articles have started to clog the trades. Sigh. I think this thing called header bidding is going to change, and hopefully, with server-to-server solutions that provide transparency to the market, we’ll get the lightning-fast unified auction model we all hoped for. It most likely won’t be called header bidding at that point but something else. But let’s not forget one key concept from header bidding that I feel will live on: we can build solutions outside the ad server that better inform the ad server. That opens up a scenario where exciting new companies can come in and provide real value. To me, that’s one of the most exciting things we’ll see emerge in the future.
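To sketch what that could look like: in a server-to-server world, the browser makes one request to an auction endpoint, the fan-out to demand partners happens server-side, and the result still informs the ad server from the outside. The /auction endpoint, response shape, and targeting keys below are hypothetical (the hb_* names just follow the common header bidding convention).

```typescript
// Conceptual sketch of the server-to-server model. The endpoint, response
// shape, and targeting keys are hypothetical, not any vendor's actual API.
declare const googletag: any;

interface AuctionResult {
  priceBucket: string;   // e.g. "1.50", rounded to match ad server line items
  winningBidder: string;
}

async function runServerSideAuction(slotId: string): Promise<void> {
  try {
    // One round trip from the browser replaces many client-side bidder calls.
    const res = await fetch(`/auction?slot=${encodeURIComponent(slotId)}`);
    const result: AuctionResult = await res.json();
    googletag.cmd.push(() => {
      // Inform the ad server from outside the ad server.
      googletag.pubads()
        .setTargeting('hb_pb', result.priceBucket)
        .setTargeting('hb_bidder', result.winningBidder);
      googletag.pubads().refresh();
    });
  } catch {
    // If the auction endpoint fails, fall back to a plain ad server call.
    googletag.cmd.push(() => googletag.pubads().refresh());
  }
}

runServerSideAuction('div-gpt-ad-leaderboard');
```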