Protecting Cyborg-Consumers in the Internet of Bodies
By Guido Noto La Diega, Benjamin Clubbs Coldron, Christian Twigg-Flesner, Christoph Busch, Tabea Stolte and Marc-Oliver de Vries - Posted on 8 January 2026
TL;DR
- Consumer IoT is moving from our homes and pockets into and onto our bodies—think glucose monitors, smartwatches, EEG headphones, connected cars.
- This “cyborgification” boosts capabilities but also creates new, more intimate forms of vulnerability that current laws only partially address.
- Today’s consumer protections still lean on a narrow idea of “the average consumer” and “inherent” vulnerability (e.g., age, disability). We argue for a shift toward contextual, relational, and more universal protections.
- Transparency helps, but it’s not enough. We need “fairness and security by design,” stronger limits on manipulation and bricking, meaningful user controls over personalization, and easier redress and collective enforcement.
What do we mean by “consumer IoT”—and why now?
The consumer Internet of Things (IoT) covers connected devices that collect and exchange data: smart speakers and thermostats, wearables and fitness trackers, connected cars, and increasingly, technologies that sit on—or even under—our skin.
These devices promise convenience, personalization, and performance. But as they become ever more integrated into our bodies and lifeworlds, they introduce new risks. Remote control, software updates, algorithmic nudges, and “always-on” data flows mean companies can influence environments, consumption, and even habits in real time.
We call the result “cyborgification”: a growing human–machine symbiosis that enhances capabilities while complicating and deepening our vulnerabilities.
From digital to cyborg vulnerability
Classic consumer protection often focuses on:
- Inherent vulnerability: e.g., age or disability
- Contextual/relational vulnerability: power imbalances in specific interactions
- Universal/ontological vulnerability: we’re all vulnerable, in different ways, over time
Digital vulnerability added a layer: how platforms, dark patterns, and data-driven personalization can exploit attention, decision fatigue, and information asymmetries.
Cyborg vulnerability goes further. It emerges at the intersection of:
- Physical actuation: devices can do things in or around our bodies
- Datafication: intimate, granular, and continuous data streams
- Dependency: software updates, cloud services, and closed ecosystems
- Identity: devices become part of how we perceive ourselves and our world
In short, when the internet becomes physical—and personal—the stakes rise. Malfunctions or exploitative practices can mean more than bad ads; they can touch safety, autonomy, and dignity.
What does cyborg vulnerability look like in everyday life?
- Under-the-skin dependencies: insulin pumps, continuous glucose monitors, hearing aids, or other implants that rely on proprietary apps and services. Updates can “brick” (that is, disable) features or push users toward paid upgrades and subscriptions.
- Wearables as identity: smartwatches, rings, or earbuds that become lenses for self-understanding—step counts, heart rate, stress scores—creating both motivation and anxiety.
- Just-in-time pressures: connected cars inferring hunger or fatigue to serve hyper-targeted offers; smart homes adapting interfaces that steer choices at precisely vulnerable moments.
- New attack surfaces: more sensors and actuators mean more ways for third parties (or poor corporate practices) to affect bodies, data, and environments.
In our focus groups with smartwatch users in Scotland and Germany, participants described how numeric targets quickly became the frame for self-evaluation—helpful for some, but anxiety-inducing for others. Interviews with connected-car and smart-home experts confirmed how product support, software control, and ecosystem lock-ins shape consumers’ experience and bargaining power.
How is the law trying to keep up?
There’s progress, but there are also gaps.
- Unfair Commercial Practices Directive (UCPD): tackles misleading/aggressive practices and adds extra protection for “particularly vulnerable” groups. Problem: vulnerability is often read as inherent (age, infirmity) rather than contextual, and personalization lets firms target individuals, sidestepping group-based protections.
- Unfair Contract Terms Directive (UCTD): transparency helps, but it’s a weak shield when contracts are take‑it‑or‑leave‑it and devices create dependencies. “Informing” people doesn’t neutralize manipulation or lock-in.
- Digital Services Act (DSA): pushes transparency and user control of recommender systems (Article 27) and special protection for minors. Good direction, but cyborg vulnerabilities affect everyone—not just children.
- GDPR and DMA: consent, data minimisation, purpose limitation, and limits on profiling. In practice, consent fatigue and opaque purposes blunt effectiveness, while “personalization” is marketed as a benefit. The DMA tightens obligations for gatekeepers, but transparency alone doesn’t rebalance power.
- Product safety and liability (PLD, MDR, vehicle safety rules): updates aim to cover software, cybersecurity, interconnectedness, and self-learning functions—vital for devices that can affect bodies. Still, complex IoT supply chains make it hard for consumers to prove defects and responsibility.
- AI Act: bans certain manipulative AI practices and imposes strict requirements on high-risk systems, with transparency duties for others. The focus on “subliminal” techniques and inherently vulnerable groups risks leaving many real-world manipulation tactics untouched.
Bottom line: today’s framework is a patchwork that often assumes an “average, rational” consumer and puts too much weight on disclosure. Cyborgification widens the gap between what consumers can control and what systems can do.
Where we think policy should go next
- Treat cyborg vulnerability as contextual and widely shared
  - Move beyond a rigid “average vs. vulnerable” consumer split.
  - Recognize that IoT design, data flows, and actuation can render any consumer vulnerable in certain moments.
- Make “fairness and security by design” the norm
  - Binding design duties for both hardware and software: safe defaults, least-privilege data flows, graceful degradation if cloud services fail, and meaningful offline/edge fallbacks where feasible.
  - Clear limits on bricking, coercive updates, and paywalls for essential safety features.
- Curb manipulation, not just reveal it
  - Set red lines for personalized dark patterns and high-pressure timing.
  - Require platforms and IoT ecosystems to provide non-profiled modes that are actually usable, not buried.
- Shift from “notice-and-consent” to meaningful control
  - DSA-style controls over recommender parameters should extend into IoT apps and device ecosystems.
  - Standardized, comprehensible controls for data sharing, inferences, and actuation permissions (see the sketch after this list).
- Strengthen redress and collective enforcement
  - Make it easy for consumers (and groups) to obtain remedies for unfair practices and defective updates.
  - Resource regulators to audit recommender systems and IoT ecosystems, not just read their policies.
- Align standards and law (“law by design”)
  - Tie EU/UK legal duties to harmonized technical standards for safety, explainability, and user control, shaping how products are built, not just how they are marketed.
- Reimagine contracting in the IoT
  - Explore collective bargaining, standardized “no-surprises” terms, or even agent-to-agent negotiation on behalf of consumers, with safeguards to prevent capture.
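To make the idea of standardized, least-privilege controls concrete, here is a minimal sketch of what non-profiled-by-default device settings could look like in code. It is illustrative only: the names (`Purpose`, `DevicePolicy`, `recommend`) are hypothetical and not drawn from any real SDK, regulation, or standard.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class Purpose(Enum):
    """Purposes a data flow can serve (illustrative categories)."""
    CORE_FUNCTION = auto()    # e.g., showing heart rate to the user
    PERSONALIZATION = auto()  # profiling-based suggestions and offers
    THIRD_PARTY = auto()      # sharing with partners or advertisers


@dataclass
class DevicePolicy:
    """User-controlled policy: least-privilege by default.

    Only CORE_FUNCTION is enabled out of the box; everything else
    requires an explicit, revocable opt-in.
    """
    enabled: set = field(default_factory=lambda: {Purpose.CORE_FUNCTION})

    def allow(self, purpose: Purpose) -> bool:
        return purpose in self.enabled

    def opt_in(self, purpose: Purpose) -> None:
        self.enabled.add(purpose)

    def opt_out(self, purpose: Purpose) -> None:
        # Core functionality is never held hostage to profiling.
        if purpose is not Purpose.CORE_FUNCTION:
            self.enabled.discard(purpose)


def recommend(policy: DevicePolicy, history: list) -> str:
    """A feature that degrades gracefully instead of demanding profiling."""
    if policy.allow(Purpose.PERSONALIZATION):
        return f"Personalized tip based on {len(history)} past workouts"
    return "Generic tip: a 20-minute walk helps most people"


if __name__ == "__main__":
    policy = DevicePolicy()                    # non-profiled by default
    print(recommend(policy, history=["run"]))  # generic tip
    policy.opt_in(Purpose.PERSONALIZATION)     # explicit, revocable opt-in
    print(recommend(policy, history=["run"]))  # personalized tip
```

The design choice worth noting is that the non-profiled path still returns something useful: a usable non-profiled mode, not a buried or degraded one.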
What companies can do now
- Build for failure: ensure safety-critical functions don’t depend on a single cloud service or subscription tier (see the sketch after this list).
- Don’t brick: separate safety/security updates from monetization; honour reasonable device lifecycles.
- Minimise and compartmentalise data: collect only what’s necessary for a given feature, with clear on-device processing where possible.
- Make controls real: give users discoverable, plain-language toggles for personalization and data sharing, with non-profiled modes that still work well.
- Test for manipulation risks: run UX reviews for dark patterns, timing exploits, and vulnerable-moment nudges.
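As an illustration of the “build for failure” point, here is a minimal sketch of graceful degradation for a safety-critical alert. Everything in it is hypothetical: the function names, the simulated cloud call, and the threshold value are invented for demonstration, not taken from any real medical device or vendor API.

```python
import random


class CloudUnavailable(Exception):
    """Raised when the vendor's cloud service cannot be reached."""


def cloud_glucose_threshold(user_id: str) -> float:
    """Hypothetical cloud call that personalises an alert threshold.

    Simulated here with a random failure; a real device would make a
    network request with a strict timeout.
    """
    if random.random() < 0.5:
        raise CloudUnavailable("backend unreachable")
    return 70.0  # mg/dL, tailored server-side in this fiction


LOCAL_DEFAULT_THRESHOLD = 70.0  # conservative on-device fallback (mg/dL)


def low_glucose_alert(user_id: str, reading: float) -> bool:
    """Safety-critical check that never depends on cloud availability."""
    try:
        threshold = cloud_glucose_threshold(user_id)
    except CloudUnavailable:
        # Graceful degradation: fall back to a safe local default
        # rather than silently dropping the alert.
        threshold = LOCAL_DEFAULT_THRESHOLD
    return reading < threshold


if __name__ == "__main__":
    # The alert fires whether or not the cloud is reachable.
    print(low_glucose_alert("user-123", reading=65.0))
```

The point of the pattern is that the cloud can only enhance a safety function, never gate it: if the service, the subscription, or the company disappears, the on-device fallback still works.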
What consumers can do (imperfect but practical)
While the burden should fall first on companies to embed consumer protections in the design of their products and services, there are some practical steps consumers may want to consider:
- Check update policies and device lifecycles before you buy.
- Prefer devices with local functionality and open standards over closed, cloud-only ecosystems.
- Use non-profiled modes where offered; regularly review app/device permissions.
- Keep firmware updated; enable multi-factor authentication; change default passwords.
- Know your rights: portability, access, objection to profiling, and redress for unfair commercial practices.
Why this matters
Cyborgification isn’t sci‑fi; it’s daily life for billions of people. Our homes, cars, and bodies are becoming part of an always-on feedback loop that can empower—but also manipulate, coerce, or exclude. If consumer protection keeps treating vulnerability as an exception, it will miss where power now resides: in design choices, data architectures, and remote control over “Things” that increasingly define our autonomy.
A modern consumer law should:
- Assume vulnerability can be manufactured by context and design,
- Put clear limits on manipulation,
- Require fairness and security by default, and
- Make redress easy—individually and collectively.
Done well, the IoT can support public health, sustainability, and human capability. Done poorly, it turns people into programmable nodes. The choice is architectural—and legal.
Note on our research
- We used a mixed-method approach: doctrinal legal analysis alongside interviews with connected-car and smart home experts, and focus groups with smartwatch users in Scotland and Germany.
- Thematic analysis highlighted how personalization, lock-ins, and opaque support ecosystems shape lived experiences—amplifying the need to move beyond “transparency” toward enforceable design duties and practical user control.
Acknowledgements: This work draws on a project funded by the Arts and Humanities Research Council (AHRC) and the German Research Foundation (DFG) under the UK-German Funding Initiative in the Humanities (ref. AH/W010518/1).
The full article is open access: Benjamin Clubbs Coldron, Guido Noto La Diega, Christian Twigg-Flesner, Christoph Busch, Tabea Stolte & Marc-Oliver de Vries, ‘When the Internet Gets Under Our Skin: Reassessing Consumer Law and Policy in a Society of Cyborgs’ (2025) 48 J Consum Policy 205–232.