A question of consent

Welcome to the great digital trade-off: you get cat videos, we get your data. Users click “Accept All” with the enthusiasm of someone dismissing a smoke alarm, and adversaries collect what they need, technically with permission. It is not so much informed consent as it is ambient surrender.

Marketers bid more for your attention than you would ever pay for the content you are consuming, and the system knows it. So here you are: surrounded by cookies you never asked for, tracked by pixels you cannot see, and apparently fine with it because the website loaded half a second faster. The garden is not gated. It is wide open, and every time you wander in, you are agreeing to that.

Meaningful consent has three properties. It is informed: the person knows what they are agreeing to, in plain language, before they agree to anything. It is specific: they have consented to this use of their data, not to a vague intention to use it for “service improvement and related purposes.” And it is revocable: they can withdraw it, and something actually changes when they do.
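As a thought experiment, the three properties can be expressed as a data model. This is an illustrative sketch, not any real consent framework's API; the names (`ConsentRecord`, `may_use`) and fields are invented for the example.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """One grant of consent, carrying the three properties in the text."""
    purpose: str                            # specific: one named use, not "related purposes"
    disclosure: str                         # informed: the plain-language text shown before agreeing
    granted_at: datetime
    revoked_at: Optional[datetime] = None   # revocable: withdrawal is first-class state

    def revoke(self) -> None:
        self.revoked_at = datetime.now(timezone.utc)

    def is_active(self) -> bool:
        return self.revoked_at is None

def may_use(records: list[ConsentRecord], purpose: str) -> bool:
    """Data may be used only under an active grant for this exact purpose."""
    return any(r.purpose == purpose and r.is_active() for r in records)
```

Note what the model forbids: there is no catch-all purpose, and a revoked record fails the check rather than lingering as a historical "yes". That is precisely the behaviour the current model lacks.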

The current model offers none of this. It offers a dialogue box designed to frustrate, a pre-ticked checkbox, and a “Reject All” button that takes three more clicks than “Accept All” and resets every time you return.

The structural problem

Even users who understand the system and try to opt out face a design environment built to resist them. Cookie banners are deliberately confusing. Privacy settings are buried. Data portability tools, where they exist at all, produce exports that are technically correct and practically useless.

This is not accidental. The incentive structure rewards data collection and punishes friction. The result is a system that produces the legal artefacts of consent (a log entry confirming a click) while systematically preventing the substance of it (an informed, meaningful choice).

For adversaries, the consent economy is not a problem to work around. It is a feature. Consent frameworks legitimise data collection at scale, create repositories that can be breached or subpoenaed, and establish a paper trail that protects the collector while exposing the individual. When data collected “with consent” is sold to a broker, shared with a government agency, or incorporated into a re-identification attack, the original click in a dialogue box is doing a lot of work it was never designed to do.

The fiction of consent does not protect the person who clicked. It protects the organisation that asked them to.