Programmatic Buying · Sell-Side Stack

Header bidding for CTV in India: how publishers implement unified auction

Header bidding on CTV is technically different from header bidding on web. There is no browser header, no Prebid.js, and no synchronous JavaScript execution. CTV header bidding — more accurately called server-side bidding or unified auction for video — uses server-to-server OpenRTB calls that run parallel demand requests from multiple SSPs before the publisher's ad server makes a final decision. For India CTV publishers, implementing server-side bidding is the primary route to increasing programmatic fill rates and CPMs.

Header bidding on CTV vs web

On web, header bidding runs in the browser: a JavaScript wrapper (Prebid.js) fires parallel bid requests to multiple SSPs before the page loads, collects bids, and passes the highest to the ad server. On CTV, this is impossible — smart TV apps are native applications without a browser runtime.

CTV unified auction instead uses a server-side wrapper:

  • When an ad opportunity arises (pre-roll, mid-roll break), the publisher's app notifies the publisher's ad server (Google Ad Manager, FreeWheel, or a custom ad decisioning layer).
  • The ad server fires parallel OpenRTB bid requests to multiple SSPs simultaneously from its server — not from the user's device.
  • SSPs respond with bids. The ad server compares all bids and selects the winner based on price and any deal prioritisation rules.
  • The winning ad is served through SSAI (server-side ad insertion) — the ad video is stitched into the content stream server-side, so the viewer's device never needs to load a separate ad call.
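The parallel-request-then-select flow above can be sketched in a few lines. This is a simplified simulation, not a real ad-server integration: the SSP names, latencies, and bid values are invented, and real calls would be HTTP POSTs carrying OpenRTB payloads.

```python
import asyncio
import random

# Hypothetical SSP partners; real integrations POST an OpenRTB bid
# request to each SSP's endpoint. Responses are simulated here.
SSPS = ["pubmatic", "magnite", "index_exchange"]

async def request_bid(ssp: str) -> tuple[str, float]:
    """Simulate one server-to-server bid request to an SSP."""
    await asyncio.sleep(random.uniform(0.05, 0.15))  # simulated network time
    return ssp, round(random.uniform(2.0, 8.0), 2)   # simulated CPM bid

async def run_auction() -> tuple[str, float]:
    # Fire bid requests to all SSPs in parallel, from the server side
    bids = await asyncio.gather(*(request_bid(s) for s in SSPS))
    # Ad server compares all returned bids and selects the highest
    return max(bids, key=lambda b: b[1])

ssp, cpm = asyncio.run(run_auction())
print(f"winner: {ssp} at ${cpm} CPM")
```

The key point the sketch illustrates: total auction time is bounded by the slowest SSP, not the sum of all SSPs, because the requests run concurrently.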

This server-side architecture means auction latency (the time between opportunity and ad decision) is added at the server level, not the client level — which is critical for buffering experience on India's variable-bandwidth connections.

Server-side bidding implementation for video

A CTV publisher implementing unified auction needs:

  1. Ad server with server-side bidding support: Google Ad Manager (via Open Bidding) or FreeWheel support server-side demand competition. The ad server must be configured to call external demand partners (SSPs) in parallel and accept their bids.
  2. SSP integrations: The publisher registers with each SSP they want to include in the auction (e.g. PubMatic, Magnite — which absorbed SpotX's video platform — and Index Exchange). Each SSP integration requires technical setup (endpoint configuration, seat ID, floor price rules).
  3. OpenRTB bid request construction: The ad server must send a well-formed OpenRTB 2.x bid request to each SSP containing required fields: app bundle ID, content metadata, device information, user ID (device ID), ad slot specifications (duration, format, skippability), and floor price.
  4. SSAI integration: The winning ad must be delivered via SSAI. The ad server communicates the winning VAST URL to the SSAI layer, which fetches and stitches the ad before delivering the combined stream to the device.
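Item 3's required fields can be made concrete with an illustrative OpenRTB 2.x bid request for a CTV pre-roll slot. Field names follow the OpenRTB 2.x specification; the concrete values (bundle ID, device ID, floor, genre) are invented for the example.

```python
import json

bid_request = {
    "id": "req-001",                       # unique auction ID
    "imp": [{
        "id": "1",
        "video": {
            "mimes": ["video/mp4"],
            "minduration": 6,
            "maxduration": 30,             # ad slot duration bounds (seconds)
            "protocols": [2, 3, 7, 8],     # VAST 2.0, 3.0, 4.0, 4.0 wrapper
            "skip": 0,                     # non-skippable
            "w": 1920, "h": 1080,
        },
        "bidfloor": 3.5,                   # floor price, USD CPM
        "bidfloorcur": "USD",
    }],
    "app": {
        "bundle": "com.example.ctvapp",    # app bundle ID (invented)
        "content": {"genre": "drama", "language": "hi"},  # content metadata
    },
    "device": {
        "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",  # device ad ID
        "devicetype": 3,                   # 3 = Connected TV in OpenRTB
        "geo": {"country": "IND"},
    },
    "tmax": 200,                           # bid timeout, milliseconds
}

print(json.dumps(bid_request, indent=2))
```

SSPs reject or ignore malformed requests silently, so validating that every required field is populated before the request leaves the ad server is worth automating.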

Timeout management is the most operationally complex part of CTV server-side bidding. The publisher must set a bid timeout (typically 150–250ms for CTV) after which non-responding SSPs are excluded from the auction. Too short a timeout excludes demand unnecessarily; too long risks visible buffering at ad breaks.
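The exclusion mechanics can be sketched with `asyncio.wait`: tasks still pending when the timeout expires are cancelled and never enter the bid comparison. SSP names and latencies are invented for the demonstration.

```python
import asyncio

# Simulated per-SSP response times in seconds; "slow_ssp" exceeds
# the 200ms timeout and will be excluded from the auction.
SSP_LATENCY = {"fast_ssp": 0.05, "ok_ssp": 0.12, "slow_ssp": 0.40}

async def request_bid(ssp: str, latency: float) -> tuple[str, float]:
    await asyncio.sleep(latency)
    return ssp, 4.0  # flat simulated CPM; price is not the point here

async def auction_with_timeout(timeout: float = 0.2) -> list[str]:
    tasks = [asyncio.create_task(request_bid(s, l))
             for s, l in SSP_LATENCY.items()]
    done, pending = await asyncio.wait(tasks, timeout=timeout)
    for task in pending:
        task.cancel()              # non-responders miss the auction
    return sorted(task.result()[0] for task in done)

print(asyncio.run(auction_with_timeout()))  # ['fast_ssp', 'ok_ssp']
```

Raising the timeout from 200ms to 450ms would admit `slow_ssp`'s demand, but every ad break would then wait up to a quarter-second longer — the trade-off the paragraph above describes.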

India publisher adoption of unified auction

Server-side unified auction adoption among India CTV publishers is uneven:

| Publisher tier | Unified auction status | Notes |
| --- | --- | --- |
| JioHotstar | Internal yield optimisation — not standard open unified auction | JioHotstar manages demand through its own proprietary system; standard SSP unified auction is not the mechanism |
| Zee5 | Partial — Google Ad Manager Open Bidding implemented | Running server-side bidding with select SSP partners; expanding demand competition |
| SonyLIV | Partial — GAM-based | Ad decisioning via GAM; SSP competition varies by inventory tier |
| MX Player / Glance | Advanced — multiple SSP integrations | MX Player has historically been more programmatically open than premium OTT; strong SSP footprint |
| Regional OTT, FAST | Variable — many still waterfall-based | Smaller publishers often lack resources for full server-side bidding implementation; waterfall remains common |

Buyer implications of CTV unified auction

More demand competition means higher clearing prices. When a publisher moves from waterfall to unified auction, clearing CPMs typically increase 15–30% — which is good for publishers but means buyers face higher floors. Programmatic CPM benchmarks in India are rising partly because of this increased competition.

Bid landscape visibility. DSPs that participate in multiple SSPs' CTV programs bid against each other at unified auction publishers. A DV360 buyer and a TTD buyer are competing for the same impression simultaneously. Understanding which SSPs a publisher uses helps predict competitive bid landscapes.

Deal ID priority is preserved. Even in unified auction, publisher ad servers prioritise guaranteed deal IDs (PG, preferred deals) above the open auction. The auction only runs for impressions not claimed by a deal. Buyers with direct deal IDs are effectively shielded from unified auction competition.

Latency considerations for India

India-specific latency factors that affect CTV server-side bidding:

India CTV publishers typically use ad server and SSP infrastructure hosted in Singapore (closest major cloud region). A bid request from a publisher's server in Singapore to an SSP server in the US or Europe adds 150–250ms round-trip latency on top of the publisher's internal auction processing time. For a 200ms bid timeout, cross-region SSPs may consistently miss the timeout.
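The budget arithmetic above is simple enough to check directly. The round-trip figures below are illustrative placeholders in the ranges the text describes, plus an assumed internal processing overhead.

```python
TIMEOUT_MS = 200       # publisher's bid timeout (from the text)
PROCESSING_MS = 30     # assumed internal auction/bid-parsing overhead

# Illustrative round-trip times from a Singapore-hosted ad server
ssp_rtt_ms = {
    "singapore_ssp": 10,
    "mumbai_ssp": 60,
    "eu_ssp": 170,
    "us_east_ssp": 220,
}

for ssp, rtt in ssp_rtt_ms.items():
    fits = rtt + PROCESSING_MS <= TIMEOUT_MS
    print(f"{ssp}: {rtt}ms RTT -> {'in budget' if fits else 'misses timeout'}")
```

Under these assumed numbers, a US-East SSP is out of budget before it has sent a single byte of bid response — which is why co-located infrastructure matters so much for effective demand density.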

Publishers and SSPs that have co-located their CTV bidding infrastructure in Singapore or Mumbai reduce this latency materially. PubMatic and Magnite operate India-region infrastructure, which reduces effective latency for their India CTV demand. Buyers running through DSPs with India-region SSP connections have a latency advantage in CTV auctions with tight timeouts.