• 0 Posts
  • 63 Comments
Joined 9 months ago
Cake day: October 11th, 2023


  • Deinstitutionalization was dreamt up by deluded idealists who slept with a copy of Naissance de la Clinique firmly lodged in their asses. Abolishing the asylums was good, because at the time the asylums really were the aforesaid Victorian dungeons. But from the outset, the movement was built on the belief that a magic pill would cure everything and that all long-term treatment was inherently oppressive.

    Antipsychotics are what made community treatment possible at all. But the wholesale rejection of both long-term and secure treatment facilities was an indefensible failure of reasoning and an abject tragedy, one set in motion by Goffman and his peers when they penned the foundational texts of the movement.

    We desperately need secure treatment facilities. There is no solution without them, just the continuing abject failure of basic human decency that we have now. This system is broken, and that is directly the fault of everyone who started the deinstitutionalization movement and their total inability to foresee the obvious consequences of their actions. Reagan was evil and JFK was understandably bitter, and though they worked toward the end of the asylums for very different reasons, they are both still guilty for their roles in bringing this current hell down on us.


  • What?? We desperately need mental health institutions back. No, we don’t need the romanticized Victorian dungeons back, but what we do need is an alternative to jails: secure treatment facilities. We have… four, on the west coast. Two of them have at most ~160 beds. The priority waiting list for admission is decades long (no, that isn’t an exaggeration), and there isn’t a non-priority waiting list. If you’re not a priority, you just go to jail!

    Community treatment is critical and we utterly lack anything like it, but good god, deinstitutionalization was one of the biggest public health and social equity disasters this country has ever had.


  • That’s on the companies to figure out, tbh. “You can’t tell us we aren’t allowed to build biological weapons, that’s too hard” isn’t literally what you’re saying, but it’s the hyperbolic version of it. The industry needs to figure out how to control the monster they’ve happily sent staggering towards the village, and they’re really the only people with the knowledge to stop it. If that isn’t possible, maybe we should restrict this tech until it is. LLMs probably aren’t going to end the world, but a protein-design AI that hallucinates while reconstructing a flu virus could be real bad for us as a species, to say nothing of the pearl-clutching scenario of bad actors getting ahold of it.