Private thoughts may not be private for much longer, heralding a nightmarish world where political views, thoughts, stray obsessions and feelings could be interrogated and punished, thanks to advances in neurotechnology.
Or at least that is what one of the world’s leading legal ethicists of neuroscience believes.
In a new book, The Battle for Your Brain, Duke University bioscience professor Nita Farahany argues that such intrusions into the human mind by technology are so close that a public discussion is long overdue and that lawmakers should immediately establish brain protections as they would for any other area of personal liberty.
Advances in hacking and tracking thoughts, with Orwellian fears of mind control running just below the surface, are the subject of Farahany’s scholarship, alongside urgent calls for legislative guarantees of thought privacy, including freedom from “cognitive fingerprinting”, within an area of ethics broadly termed “cognitive liberty”.
Certainly the field is advancing rapidly. The recent launch of ChatGPT and other AI innovations showed that some aspects of simulated thought, in the form of machine learning, are already here. It has also been widely noted that Elon Musk’s Neuralink and Mark Zuckerberg’s Meta are working on brain interfaces that can read thoughts directly. A new class of cognitive-enhancing drugs, called nootropics, is being developed. Technology that allows people experiencing paralysis to control an artificial limb or write text on a screen just by thinking is in the works.
But aside from the many benefits, there are clear threats around political indoctrination and interference, workplace or police surveillance, brain fingerprinting, the right to have thoughts, good or bad, the implications for the role of “intent” in the justice system, and so on.
Farahany, who served on Barack Obama’s commission for the study of bioethical issues, believes that advances in neurotechnology mean that intrusions through the door of brain privacy, whether through military programs or well-funded research labs at big tech companies, are at hand via brain-to-computer innovations like wearable tech.
“All of the major tech companies have massive investments in multifunctional devices that have brain sensors in them,” Farahany said. “Neural sensors will become part of our everyday technology and a part of how we interact with that technology.”
These neural sensors, coupled with advances in the science of decoding and rewriting brain functions, pose a discernible risk, Farahany argues, and one that requires urgent action to bring under agreed controls.
“We have a moment to get this right before that happens, both by becoming aware of what’s happening and by making critical choices we need to make now to decide how we use the technology in ways that are good and not misused or oppressive.”
The brain, Farahany warns, is the one space we still have for reprieve and privacy, where people can cultivate a true sense of self and keep their feelings and reactions to themselves. “In the very near future that won’t be possible,” she said.
In a sense, we already use technology to translate our thoughts and help our minds. Social media already offers a form of mind-reading, free of charge, through likes and dislikes, predictive algorithms, predictive text and so on.
But advances in neurotechnologies – exploiting a direct connection to the brain – would offer more precise and therefore potentially dangerous forays into a hitherto private realm.
“I wrote this book with neurotechnology at the forefront as a wake-up call, but not just neurotechnology but all the ways our brains can be hacked and tracked and already are being hacked and tracked,” Farahany said.
Concerns about military-focused neuroscience, called the sixth dimension of warfare, are not in themselves new.
The Defense Advanced Research Projects Agency (Darpa) has been funding brain research since the 1970s. In 2001, the agency launched a program to “develop technologies to augment warfighters”.
François du Cluzel, a project manager at the Nato Act Innovation Hub, issued a report in November 2020, entitled Cognitive Warfare, which said the practice “is not limited to the military or institutional world. Since the early 1990s, this capability has tended to be applied to the political, economic, cultural and societal fields.”
The US government has blacklisted Chinese institutes and firms it believes to be working on dangerous “biotechnology processes to support Chinese military end uses”, including “purported brain-control weaponry”.
In late 2021, the commerce department added 34 China-based entities to a blacklist, citing some for involvement in the creation of biotechnology that includes “purported brain-control weaponry” and of “acting contrary to the foreign policy or national security interests” of the US.
Nathan Beauchamp-Mustafaga, a policy analyst at the Rand Corporation and an author of the China Brief, has warned of an “evolution in warfare, moving from the natural and material domains – land, maritime, air and electromagnetic – into the realm of the human mind”.
Farahany argues that societies need to go further than addressing cognitive warfare or banning TikTok. Legislation to establish brain rights or cognitive liberties is needed, alongside raising awareness of the risks of intrusion posed by digital platforms integrated with advances in neuroscience.
“Neuro rights” laws, which include protections on the use of biometric data in health and legal settings, are already being drawn up. Two years ago, Chile became the first nation to add articles into its constitution to explicitly address the challenges of emerging neurotechnologies. The US state of Wisconsin has also passed laws on the collection of biometric data regarding the brain.
Most existing legal protections concern disclosure of the collection of brain data, not neuro rights themselves.
“There’s no comprehensive right to cognitive liberty, as I define it, that applies to far more than neurotechnologies but applies to self-determination over our brains and mental experiences, which applies to so many of the digital technologies we’re approaching today,” Farahany said.
Or, as Farahany writes in her book: “Will George Orwell’s dystopian vision of thoughtcrime become a modern-day reality?”
The answer could be yes, no or maybe, but none of it precludes an urgent need for formal brain protections that legislators or commercial interests may not be inclined to establish, Farahany believes.
She said: “Cognitive liberty is part of a much broader conversation that I believe is incredibly urgent given everything that is already happening, and the increasing precision with which it’s going to happen, within neurotechnology.”