Friday, October 4, 2024

A billion alternate facts are the same as no facts at all

Aya Waller-Bey wrote a guest editorial in last Sunday’s Detroit Free Press saying educators should not recommend Black students write about their trauma in college admissions essays. Waller-Bey is a PhD candidate in sociology at the University of Michigan. She has worked in college admissions for a decade and studies race and identity in college admissions essays.

High school counselors often tell Black students to write about the traumatic times in their lives, saying it will boost their chances of acceptance. Part of this is because an application platform used by thousands of schools suggests students write about a time they “faced a challenge, setback, or failure” and relate that to lessons learned. And part of it is the 2023 Supreme Court ruling that restricts race-conscious admissions, which makes the essay one of the few remaining ways for a minority student to let the admissions office know they are a minority. Together those two things signal to the adults around the student that admission chances go up when the student writes about their pain.

But Waller-Bey sees a great deal wrong with this emphasis.
+ Writing about trauma is not a requirement. Students who don’t write about trauma are accepted.
+ A student is much more than their trauma.
+ Focusing on the trauma “contorts the colorful lives that Black students live into the anti-Black fantasies of others.” A student should not have to “serve my trauma on a platter,” as Waller-Bey puts it, divulging painful moments of their life for the consumption of the admissions team.
+ An essay may be a critique of the racism, of the social systems and policies, that contributed to their suffering. Yes, racism permeates their lives. But writing about it says “I beat the odds,” which is another racist stereotype.

The adults should stop telling students they need to focus on, or even frame their stories around, their pain. What should they write about instead? What they love about themselves and their communities. What gifts and talents they have, and their vision of their future. What they learned “playing in their high school bands, running for class office, working at a local shoe store, or designing computer games.” Most important, the topic should be the student’s choice.

Mark Sumner of Daily Kos hasn’t been doing many Ukraine Updates lately, so I was glad to see that he posted an update last week. One reason there have been few updates is that the invasion of Ukraine has been rather static, so not much has changed since his last update in June. Even so, Sumner discussed a few things.
+ Putin is not losing his grip on power. However, when regime change happens in Russia it tends to happen suddenly.
+ Russia is making steady progress in the area west of Donetsk. For a long time the battle front had been static, but Ukrainian forces are getting tired and Russian troops get a steady supply of conscripts. The US Congress blocking aid money damaged the Ukrainian effort.
+ Ukraine invaded the Russian region of Kursk back at the start of August and is still there. But Ukraine may not be able to expand the incursion, and one might debate whether the effort hurts Zelenskyy more than Putin.
+ Zelenskyy met with Harris, then with the nasty guy. Harris said she will continue America’s support. The nasty guy made the visit all about himself.

When I opened this post I noticed Sumner is listed as “Staff Emeritus.” He was a major source of sane information about this war and about the COVID pandemic. I mentioned his articles frequently.
But now, instead of posting daily (sometimes more than once) like a regular news person, he will post when he feels he has something to say. Like this.

Sumner wrote about the current threat: AI, as cumbersome as it is right now, is still a profound threat to humanity. Consider a simple math representation. Let L be the effort to create a convincing lie and D be the effort to debunk it. If L decreases or D increases, the lie has more time to persist and its damage increases. With AI, lies are easy to create. They can be so convincing that special tools are needed to tell they are fake. Low L and high D. Add to D the need to spread the word that the lie is false so the general public no longer believes it. That must be done for every lie, and that is hard, and people will keep spreading the lies anyway. As AI improves, lies will be harder to debunk and some won’t be debunkable at all. Which means the chance of encountering a verifiable fact gets smaller and smaller. Since AI can churn this stuff out, the lies will flood the facts.
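To make that arithmetic concrete, here is a toy calculation of my own (not Sumner’s; the function and every number in it are made up for illustration): with a fixed pool of fact-checking effort, the share of lies that ever get debunked collapses as L shrinks and D grows.

```python
# Toy model of the L-vs-D point (illustrative numbers only):
# L = effort to create a convincing lie, D = effort to debunk one.
def debunked_share(L, D, liar_hours, checker_hours):
    """Fraction of lies that ever get debunked when liars and
    fact-checkers each have a fixed number of hours to spend."""
    lies_created = liar_hours / L
    lies_debunked = min(lies_created, checker_hours / D)
    return lies_debunked / lies_created

# Pre-AI-ish guess: a convincing lie takes an hour, a debunk takes two.
print(debunked_share(L=1.0, D=2.0, liar_hours=100, checker_hours=100))    # 0.5

# AI-ish guess: a lie takes a minute, a debunk takes ten hours
# (forensic tools plus the work of spreading the correction).
print(debunked_share(L=1/60, D=10.0, liar_hours=100, checker_hours=100))  # ~0.0017
```

Under those made-up numbers, half the lies used to get knocked down; now about one in six hundred does. That is the flood.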
The idea that there might be two sets of alternate facts was enough to send half the nation tumbling down a rabbit hole. But a billion different alternate facts are the same as no facts at all. Fighting to salvage even a core of common beliefs under that kind of pressure will become ever more difficult as the most basic ideas become frayed by an abrasive force of well-supported undebunkable lies. It isn’t easy to conceive how any society stands up to that challenge.
It also isn’t easy to believe that those driving AI will recognize in time that safety rails are needed. There is a bit of hope. AI has serious performance limitations that more computing power, more data, and more energy (as in let’s reopen Three Mile Island – kids, ask your grandparents) can’t remedy.
That’s not to say that the current LLM-oriented technology doesn’t have value. It absolutely does. It just might not have the kind of all-pervasive utility that the multi-billion dollar investments have been chasing. Certainly not great enough to justify the energy costs, which are also environmental costs. If this generation of AI doesn’t turn out to produce all the benefits promoters have been promising, that will be a shame. But also maybe a blessing.
Bill in Portland, Maine, in his Cheers and Jeers column for Kos, includes a daily feature, By the Numbers, where he reports on various numbers in the news. Like this one:
Gallons of rain Hurricane Helene dumped, according to NOAA: 40 trillion
