nixCraft 🐧

Researchers say an AI-powered transcription tool used in hospitals invents things no one ever said apnews.com/article/ai-artifici This is known as hallucination -- it can include racial commentary, violent rhetoric, and even imagined medical treatments. Yet, they decided to use an OpenAI-based LLM? Has hospital management lost its mind, or is human life no longer valued?

[Image: Allison Koenecke, assistant professor of information science at Cornell University and an author of a recent study that found hallucinations in a speech-to-text transcription tool, in her office in Ithaca, N.Y., Friday, Feb. 2, 2024. Text preceded by "#Ground truth" shows what was actually said, while sentences preceded by "text" show how the transcription program interpreted the words. (AP Photo/Seth Wenig)]
AP News · "Researchers say AI transcription tool used in hospitals invents things no one ever said," by Garance Burke

Boeing hid a significant problem with the software in its 737 MAX airplanes that resulted in two crashes, killing 346 people. Boeing faced financial penalties, but no Boeing executives were criminally charged over these failures. They know they can get away with it by paying money. As long as that option exists, CEOs and executives will keep cutting corners, and innocent humans will die. It is the same thing with OpenAI. Sama knows he will not go to jail if someone dies.

@nixCraft
Not using your mind is often indistinguishable from losing it

@nixCraft my sister does transcriptions (German/English) and doesn’t hallucinate at all. Please redirect all payments from the useless tech company to her. Thx.

@nixCraft

"hallucination"...

Marketing speak designed to downplay the severity of the huge problem: so-called AI is
- lying
- fantasizing
- fabricating
- absolutely wrong
- falsifying
- completely just making shite up.

The severity of the problem is being whitewashed and use of terms that foment such whitewashing should be shunned.
Especially in health care where the stakes can be life/death.

Until/Unless this is corrected, "AI" output simply cannot be trusted.

@lupus_blackfur @nixCraft all of these terms are anthropomorphising a glorified next-sequence-of-words estimator.

Hallucination does as well, but perhaps not as much.
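
To make "next-sequence-of-words estimator" concrete, here is a minimal sketch of what such a model actually computes, assuming the Hugging Face transformers package and the public gpt2 checkpoint (both illustrative choices, not what the hospitals ran):

    # Minimal sketch: a causal language model only scores which token is
    # likely to come next; it has no notion of what was actually said.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    prompt = "The patient was prescribed"
    ids = tokenizer(prompt, return_tensors="pt").input_ids

    with torch.no_grad():
        logits = model(ids).logits[0, -1]   # a score for every vocabulary token
    next_token_id = int(logits.argmax())    # greedily pick the top-scoring token
    print(prompt + tokenizer.decode(next_token_id))

Whatever scores highest gets emitted, plausible or not, which is why "fabricating" fits better than "hallucinating".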

@nixCraft That’s hopefully changing when software vendor liability kicks in in 2026 in the EU.

@nixCraft
And for the last 30 years they've replaced one set with another.
"Meet the new boss, same as the old boss"
Until they start concentrating on products ... they can expect more of the same.

@nixCraft “Fines” just means “legal for a price”. A price that execs are willing to pay with the company’s money, instead of any personal price they should be paying

@nixCraft especially since it's not their own money they pay the fines with. There's no incentive for personal responsibility.

@nixCraft AI should only be used for joke material and experimentation; it is still too brainless to be reliable.

@nixCraft Boeing execs are like many, many others who will do anything to justify their obscene paychecks as long as they run no risk of jail time. Fines? Multimillion-dollar fines are less than parking tickets to the budgets of these corporations. Planes full of people crashing, millions addicted to drugs, millions of people fed lies and suicidal posts by their programmed attention algorithms. "Just another day at the office."

@nixCraft why the fuck do you even need an LLM for transcripts? We have had speech-to-text for decades.

@Chase @nixCraft I can see an ML model helping with matching specific sounds (however they're mangled for processing) to specific words, but I also wonder why people think a language model is the tool for that job.

It doesn't seem to be; my naive take is that this is "just" a classification task.

Wrong tool for the job, for sure!
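
For contrast, a minimal sketch of a conventional, non-generative recognizer, assuming the Python speech_recognition package with pocketsphinx installed and a local clip.wav file (all illustrative assumptions):

    # A non-generative recognizer maps audio to words in its lexicon;
    # it can mis-hear a word, but it does not compose whole new sentences.
    import speech_recognition as sr

    recognizer = sr.Recognizer()
    with sr.AudioFile("clip.wav") as source:
        audio = recognizer.record(source)   # load the entire clip

    try:
        print(recognizer.recognize_sphinx(audio))  # offline CMU Sphinx decoding
    except sr.UnknownValueError:
        # Unlike an LLM, it can report "no match" instead of inventing text.
        print("could not understand audio")

Note how the failure mode is an explicit "don't know" rather than a fluent guess.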

@nixCraft The problem here is that speech recognition is a hard problem, and although tools like Dragon dictation and others have been around for a long time, no tool will be 100% perfect, because every human talks slightly differently.

@nixCraft How is this even possible? Yeah, and it's probably getting worse.

@nixCraft@mastodon.social I can't pull citations, but when this kind of use case was floated at the beginning of the AI hype, my friends and I knew this was going to happen. Anyone with half a braincell knew this was going to happen.

But hospital admins are braindead, and hospital staff are often remarkably tech-illiterate. I imagine that hospitals will continue to adopt and use the technology until someone makes a major network eat shit in a liability case, as liability is about the only way you can make hospital admins care about anything beyond profit.