Impacts¶
Surveillance does not need to act to be effective. The awareness of its possibility changes behaviour. The awareness of its certainty changes more. The impacts here are layered because the consequences of the same surveillance infrastructure are materially different depending on who you are and what you do.
Individuals¶
The primary and most direct impact on ordinary citizens is the chilling effect: the modification of behaviour driven by the belief that communications may be monitored, associations may be recorded, and political or social activity may be logged and retained.
Research consistently shows that awareness of mass surveillance reduces engagement with politically sensitive content, reduces willingness to communicate about controversial subjects, and increases self-censorship among the groups most likely to be targeted. These effects do not require any individual to have been specifically surveilled. The structural awareness of capability is sufficient.
Beyond the chilling effect:
Misclassification risk: population-scale data collection and automated analysis will produce false positives. Individuals who match a pattern associated with a risk category may be flagged for further investigation, included on watchlists, subjected to enhanced scrutiny at borders, or denied services, without having done anything that would justify any of it. The algorithm does not know the difference; it knows correlations. Challenging a misclassification requires knowing you are on a list, and list membership is rarely disclosed.
Loss of anonymity in public life: movement through public space, attendance at meetings, participation in protests, and digital activity in nominally private contexts are all increasingly observable and retainable. Anonymity in civic life, which has historically been a protection for dissent, minority views, and vulnerable populations, is structurally eroding.
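The misclassification point above is a base-rate problem: when the behaviour being searched for is rare, even an accurate classifier flags far more innocent people than genuine targets. A toy calculation, using entirely assumed numbers (population size, error rates, and target count are illustrative, not figures from any real system), makes the scale concrete:

```python
# Illustrative base-rate arithmetic for automated flagging at population
# scale. Every number below is an assumption chosen for the example.

population = 450_000_000        # roughly the EU population
true_positive_rate = 0.99       # assume the classifier catches 99% of real targets
false_positive_rate = 0.001     # assume it wrongly flags 0.1% of everyone else
actual_targets = 1_000          # assume the behaviour of interest is genuinely rare

true_flags = actual_targets * true_positive_rate
false_flags = (population - actual_targets) * false_positive_rate
total_flags = true_flags + false_flags

# Probability that a flagged person is actually a target (precision).
precision = true_flags / total_flags

print(f"people flagged:        {total_flags:,.0f}")
print(f"wrongly flagged:       {false_flags:,.0f}")
print(f"chance a flag is real: {precision:.2%}")
```

Under these assumptions, roughly 450,000 people are wrongly flagged in order to find fewer than a thousand genuine targets, and the chance that any given flag is correct is well under one percent. The exact numbers do not matter; the asymmetry does.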
NGOs and civil society¶
Organisations working on human rights, refugee support, environmental protection, political advocacy, or any activity that involves documenting or challenging government behaviour face a specific threat profile that combines elements of all the adversary types in this model.
Exposure of beneficiaries and sources: an NGO’s case management data, contact lists, and secure communications may contain the identities of people who face direct personal risk if exposed. Refugees, whistleblowers, human rights defenders in conflict zones, and people fleeing persecution are identifiable through an organisation’s records. Surveillance of the organisation exposes them.
Operational chilling: legal teams, advocacy teams, and strategy documents are surveillance assets if the organisation is of interest to a domestic intelligence agency, a foreign government, or a donor state’s intelligence service. Awareness that internal communications may be accessed constrains what can be discussed and how.
Reputational and funding attacks: intelligence about internal disagreements, financial pressures, or sensitive cases can be used to generate public-facing attacks, trigger regulatory investigations, or pressure donors. This is the functional equivalent of the disinformation operations described in the deanonymisation model, with state resources behind it.
Companies¶
Industrial espionage is the most direct impact on commercial organisations, and it is not limited to adversarial states. Allied intelligence agencies have a documented record of collecting economic intelligence on EU companies and making it available to domestic competitors; the Snowden disclosures included specific examples. The intelligence collected includes R&D data, trade negotiation positions, merger and acquisition planning, and pricing strategies.
Compliance pressure as access vector: regulatory and law enforcement agencies can use compliance requirements as leverage to obtain data or access. A business that depends on its operating licence, tax treatment, or regulatory standing in a jurisdiction may face informal pressure to cooperate with data requests that fall short of the formal legal threshold. This pressure may never be explicit and may never produce a written record.
Vendor and infrastructure dependency: the structural dependence of European business on non-EU cloud infrastructure, software, and hardware means that commercially sensitive data regularly transits or resides in jurisdictions where it is subject to legal access mechanisms under foreign law. This is not a hypothetical risk. It is a continuous operational condition for most large European enterprises.
Research institutions¶
Universities, research institutes, and think tanks hold intellectual property, pre-publication research, and communications with international collaborators that are attractive to state intelligence collection for economic and political reasons.
Cross-border collaboration creates structural exposure: research conducted jointly with institutions in other jurisdictions involves data flows that may be subject to collection under any of those jurisdictions’ legal regimes. A collaborative research project involving institutions in three countries is potentially subject to the intelligence laws of all three.
The academic community’s tradition of open communication and international collaboration is a professional norm that creates an attack surface. Secure communication practices that would be routine in a high-risk civil society organisation are rarely standard in research settings.
Nations and EU¶
The EU faces a structural problem that GDPR and data protection law were not designed to solve: asymmetric intelligence capability between member states, and collective dependency on infrastructure that is jurisdictionally outside the Union.
Asymmetric intelligence: some member states are deeply integrated into the Fourteen Eyes sharing arrangement; others are not. The intelligence capability and the surveillance data available to different member state governments are therefore not equivalent. This creates asymmetric information within EU institutions themselves.
Infrastructure dependency: European citizens’ data is processed by US-headquartered platforms, stored on US cloud infrastructure, and transmitted through networks that include equipment from manufacturers with contested relationships to foreign governments. Three iterations of the EU-US data transfer framework have been unable to resolve the fundamental tension between US national security law and EU data protection rights. This is a sovereignty problem, not a technical one.
Erosion of digital sovereignty: the ability of the EU to govern its own digital environment, protect its citizens’ data, and maintain the integrity of its democratic processes depends on infrastructure it does not control, legal frameworks that other jurisdictions do not respect, and intelligence sharing arrangements that are not symmetrical or transparent.
Cross-layer: trust erosion¶
Across all of these layers, the deepest impact is on trust: in institutions, in systems, and between people.
When legal structures designed to protect privacy contain exemptions large enough to drive a signals intelligence agency through, when commercial platforms are required to cooperate with surveillance requests and prohibited from disclosing them, when the data generated by ordinary life is available for purchase by any actor with a budget and an interest, the reasonable response is uncertainty about what is private.
That uncertainty is not neutral in its effects. It falls most heavily on those who most need privacy: political dissidents, journalists, human rights workers, minorities under political pressure, and anyone whose legitimate activity might, under some future government’s definition, become a threat.
The system does not need to punish often. It needs to be perceived as capable of it. That perception is, at this point, well-founded.