I've been following this saga and all other tech shenanigans with much anxiety as well -- heartlessness is a very good way to describe it. Because of course, in the face of greed, having a heart is a disadvantage. It's mind-boggling how all this isn't a bigger focus for government (or even public) scrutiny. Here's a post that complements yours well, Martha: https://open.substack.com/pub/mollywhite/p/effective-obfuscation?r=8q1pz&utm_medium=ios&utm_campaign=post
Thanks, Neva, I appreciate that link to another stack. Focusing on a future utopia is often an excuse for not dealing with real-world problems in the present. The effective-altruism strand in the OpenAI debate is a little complex, though, given that some of that movement derives from the work of Peter Singer and isn't all about saving humanity in the unspecified future. Regardless, the business heads are in charge at OpenAI, which was probably predictable, but it still needs to be called out for what it is.
"With AI, I’m not worried about machines taking over and destroying humanity. What troubles me are the current and near-term impacts of a transformative technology: job losses with no backup plan; the misery of those already on the streets; a dumbing down of human expression; the destruction of serious media and journalism; just a few tech companies controlling how information is distributed globally." - This is the part that always gets minimized in mainstream coverage even though they're the most imminent. Thank you for laying out what many of us really think and feel about unfettered technology. Wishing you safe space and comfort in your grief.
Martha,
My condolences to you at the loss of your friend.
I think you have a well-developed vision of how AI might impoverish our humaneness. I haven't thought about it enough, studied it enough, or tried to understand it well enough to foresee its specific harmful effects. I hope you continue writing about AI and some of the specific scenarios you fear.
In the meantime I streamed Oppenheimer this weekend and it brought back nuclear weapons fears. I thought about those fears when you mentioned the two current wars most in the news. It's been almost 80 years since 1945, and I know that when something hasn't happened in such a long period of time, we start to fool ourselves into believing it can't happen.
The only happy note I can end on is that I'm glad I get to read your thoughts.
Best,
David
David, thanks so much for your condolences, which mean a lot. I'm still grappling with the shock of my friend's death, but it heightened my sense that we're moving more than ever toward a culture that values success and rationalism over human kindness and heart.
I see the rapid growth of AI in the public sphere as part of a longer trend kicked off by tech platforms for digital communication, one that began by calling all writing and art "content" in the early 2000s and has essentially destroyed most legacy forms of journalism. Now these same trends in favor of optimization, information processing, and utility are swallowing humanities departments in universities and reframing how we read. The emphasis on the digital record is erasing much that didn't end up there, especially now that the "record" is becoming institutionalized in generative AI systems.
Some might say this is just how progress happens. I don't mind progress, but I say we're in trouble — and by "we," I mean anyone who believes that human experience depends on more than utility and information, who feels we are embodied beings, fragile and wonderful and worth knowing despite the potential for heartbreak.
I appreciate your support, David, truly.
So sorry about your loss, Martha. And I am really enjoying your think pieces about AI. I find myself slowing down to take them in, in a day that is overrun with pings on my time. Ironies abound in that statement, I know. 🙂
Thanks, Shelley. If we don't acknowledge the ironies of digital communication, we'll never be able to resist them :-)
Martha,
I'm so sorry about the sudden loss of your friend, what a shock. I'm unsophisticated about tech but have found myself curious (and increasingly concerned) about AI, and surprised it's not being more widely discussed, at least in my circles. It's hard, I suppose, for humans to conceive of the many possibilities we've not yet lived. Thank you for your thoughtful reflections. I'm sure there's always been some resistance to massive technological shifts and a desire to turn back toward what we know, and yes, what feels ultimately most grounded and human. AI seems of a magnitude greater than many changes before it, however, and you're right, the people at the top making these massively consequential choices for us aren't representative of the population and wield a scary amount of power.
Addie, thank you so much for your condolences, they are much appreciated this morning. And I appreciate your thoughtful comments about how difficult it is to wrap our minds around a massive shift like AI — the shift is underway, so we're already grappling with something that extends far beyond individual humans. But we can still come together collectively to resist changes that increase structural social inequalities and to speak up for a future that is not strictly driven by technology optimization and profit. As you say, the most worrying thing is that the AI boosters at the top "wield a scary amount of power."
Martha, I'm so very sorry for your loss. May their memory be a blessing. And thank you, Addie, for this comment; it distilled a lot of my similar feelings. And again, Martha, thank you for your reply and your perspective in the piece. I found myself nodding along, scratching my head, and wanting to burn my laptop all at the same time.
Emmy, I appreciate your comment. And I hope you don't burn your laptop! Your Substack looks like one I'd love to read.
Thank you for saying that, I'd love to have you! <3 (laptop is still alive and well)
Hi Martha,
Sad to hear about your loss, sigh. — E. Musk noted that a founding principle of OpenAI was to counter Google's powerful, concentrated lead in AI by being 'open source'. Fast forward to today: where is Sam's head now... The actions of the company signal a menacing turn. However, no information is escaping from this black-hole event.
You’re right - it’s still being treated as a black-hole event, as if nothing really happened, even in Ezra Klein’s recent podcast episode about what went on at OpenAI.