After digging through old files from my PhD, I ran into a copy of a class presentation I worked on with Sarah Macleod and Mohammad Habibnezhad on cognitive enhancement technologies. Examples of cognitive enhancement technologies include cognition-enhancing drugs such as Ritalin, or even whatever the heck Elon Musk is working on at Neuralink; what they have in common is that they make us better at thinking or learning. One of the potential future applications of my thesis work is the development of computer-based education that adapts to users’ attention. Such a technology may one day deliver education radically better than either our current MOOCs or in-person lectures, and would therefore be an example of a cognitive enhancement technology.
The presentation we gave concerned whether cognitive enhancement technologies should be strictly regulated. We argued that they do not require additional regulation. Our argument is best summarized by the following chart:
Cognitive enhancement technologies can seem scary, but there are already existing frameworks for understanding them. In Canada, the Food and Drugs Act governs all food, drugs, and medical devices, including any devices that modify or correct the body structure of humans. Both cognition-enhancing drugs and devices that modify the functioning of the human brain (as in Musk’s case) would therefore be governed by the Food and Drugs Act. This regulatory regime is fundamentally designed to ensure the safety of such technologies or, failing that, to ensure that they offer sufficiently great rewards for the risks they pose. If such high-risk technologies are unable to offer great benefits to their users, they are likely to be banned.
I find the lower half of the matrix more interesting. When technologies are low risk to humans, we can envision them as either goods or rights. For example, there is pretty good evidence that coffee is a cognitive enhancer. There is some evidence that caffeine provides benefits to learning and memory, and even better evidence that it improves reaction time. However, the benefits of caffeine are slight. In market economies, we normally think of these sorts of things as ‘goods,’ insofar as they satisfy a consumer’s want. The main benefit of coffee is that it satisfies my desire for coffee; potential cognitive enhancement is secondary.
Sometimes, however, goods can be low risk and offer benefits so great that lacking them limits a person’s capability to be a functioning member of society. For example, access to primary and secondary education is categorically different from access to coffee. People who do not have access to primary and secondary education are severely limited in their ability to take part in society. Children who have access to education can learn the skills required to participate in the economy or polity, or go on to pursue a tertiary education of their choosing. Those who do not have access are severely limited in their capabilities and agency, and will never be able to choose how to contribute to society. This idea is better articulated by Amartya Sen and Martha Nussbaum, and is often called the capability approach to rights.
I believe that low-risk, high-benefit cognitive enhancement technologies may fall into this category of ‘rights’ if they truly offer large political or economic advantages to their consumers. If we imagine a radically better way to teach students using learning technology, students who used such hypothetical technology would have significantly greater capabilities in society. They could potentially learn in a fraction of the time, becoming much more productive and much more competitive. People who do not have access to this hypothetical technology would likewise be severely limited in their capabilities. We would therefore have to consider universal access to such cognitive enhancement technologies, at least if they were adopted at a large scale. These days, there are even business models that incentivize both innovation and free access to such high-benefit technologies; these might serve as a starting point as the technologies eventually make their way to open access.
Long story short, this is why I believe that cognitive enhancement technologies do not need additional regulation. If they are high-risk technologies, we have existing regulation that covers them. If they are low risk, they are either goods or rights depending on their benefits, and we have existing methods of distributing those. Black Mirror will have to wait on this one.