This is problematic on multiple levels. The proposal would create a new agency, the Health Advanced Research Projects Agency (HARPA), within HHS, modeled after a similar agency at the Pentagon (DARPA). /1 washingtonpost.com/politics/2019/…
In conjunction with the Suzanne Wright Foundation, the project would cost $40-60M and use “volunteer” data to identify neurobehavioral signs that someone is headed toward a violent, explosive act. Speaking of signs, how about using the sign that /2
a person is involved in the white nationalist movement, since that demographic seems to dominate the profile of recent mass shooters. Even FBI Director Wray has identified it as a domestic terrorist threat. /3
Most studies show that no more than a quarter of mass shooters have a diagnosed mental illness. More commonly, experts say, the attributes of mass shooters include a strong sense of resentment, a desire for notoriety, a history of domestic violence, narcissism, and access to firearms. /4
These attributes may be a significant problem in the U.S., but they are by no means confined to our borders. All of them exist throughout the world, with one exception: access to firearms. /5
As a side note, many of the attributes listed above could be assigned to many current and former members of the Trump administration and family. /6
HARPA would develop a “sensor suite” using AI to identify, in real time, mental-status changes that could make an individual more prone to violent behavior. /7
For the real-time data analytics, the proposal lists widely used consumer technologies as data sources: Apple Watches, Fitbits, Amazon Echo, Google Home. /8
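To be clear, the proposal (as reported) names no actual algorithm, so here is a purely illustrative Python sketch of the kind of naive detector a wearable “sensor suite” implies. The synthetic data, the 3-sigma threshold, and the premise that a heart-rate spike equals a “mental status change” are all my hypotheticals, not HARPA's.

```python
# Purely illustrative: the reported proposal specifies no algorithm.
# The synthetic data, the 3-sigma threshold, and the idea that a
# heart-rate spike signals a "mental status change" are hypotheticals.
import random
import statistics

def flag_anomalies(samples, window=30, sigma=3.0):
    """Flag readings more than `sigma` standard deviations from the
    trailing window's mean: the crudest possible 'real-time' detector."""
    flags = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mean = statistics.mean(history)
        stdev = statistics.stdev(history)
        if stdev > 0 and abs(samples[i] - mean) > sigma * stdev:
            flags.append(i)
    return flags

# Synthetic resting heart rate: ~65 bpm with noise, plus one spike.
random.seed(0)
hr = [random.gauss(65, 3) for _ in range(120)]
hr[90] = 110  # a sprint to catch the bus? a panic attack? the data can't say

print(flag_anomalies(hr))  # flags the spike at index 90 (noise may flag others)
```

Note what the detector cannot do: the spike looks identical whether the wearer is sprinting for a bus or spiraling into crisis. Nothing in heart-rate or accelerometer data encodes intent.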
All “voluntary”. Will this be like the time Facebook users “volunteered” their data to SCL and Cambridge Analytica? Will the consent be buried deep in TOS fine print, incomprehensible to anyone who is not an attorney? /9
What happens if someone is identified as having a mental status change suggestive of a violent act? Will they be involuntarily committed if they refuse treatment? What if they refuse medication? Will they be medicated through court orders? Based on predictive modeling alone? /10
People can be involuntarily committed for 72 hours. However, this is a rigorous process: the person has to pose a real threat to themselves or others, for instance telling someone they intend to kill themselves and having both the means and a plan to carry out the suicide. /11
Would we now be forcibly institutionalizing people based on algorithms? Will those be the same algorithms that flood my Twitter TL with nonsense tweets from Jim Jordan, whom I don’t even follow? Because those algorithms suck. /12
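And even if the algorithms didn't suck, the arithmetic does. Here is a back-of-the-envelope positive-predictive-value calculation. Every number below is an assumption for illustration: the true base rate of would-be mass shooters is unknown but minuscule, and no real classifier comes anywhere near 99% accuracy on this task.

```python
# Back-of-the-envelope Bayes: all numbers are illustrative assumptions.
# Assume an implausibly good detector: 99% sensitivity, 99% specificity.
# Assume 100 would-be mass shooters among 250 million monitored adults
# (the real base rate is unknown; any plausible value is similarly tiny).
population = 250_000_000
true_threats = 100
sensitivity = 0.99
specificity = 0.99

caught = true_threats * sensitivity                      # ~99 real threats flagged
false_alarms = (population - true_threats) * (1 - specificity)

ppv = caught / (caught + false_alarms)
print(f"False alarms: {false_alarms:,.0f}")              # ~2.5 million innocent people
print(f"Chance a flagged person is a real threat: {ppv:.5%}")  # ~0.004%
```

Under these absurdly generous assumptions, roughly 2.5 million innocent people get flagged to catch about 99 genuine threats, and a flagged person has about a 1-in-25,000 chance of being a real one. Those 2.5 million are the people facing the involuntary-commitment questions above.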
Instead of wasting time and money having the government identify risk factors for violence, why don’t they just, I dunno.....READ. efsgv.org/wp-content/upl…
Trump and the skin tags he calls advisors are looking for a political win. Solutions to problems that START with the intent to give someone a political advantage are disingenuous and ineffective.
This is a serious problem. It deserves a serious solution. Not one pushed by a former TV executive whose claim to fame is hiring Trump for “The Apprentice”.
The same backers already pitched this moonshot approach for pancreatic cancer. It failed.
Lastly, who will pay to treat the people this technology identifies? Does this mean they will now have a pre-existing condition? Who will this information be shared with? Doctors? Family? Employers? Health insurance companies?