
Plus: End to government shutdown may be in sight; how newsrooms are using AI.
One wrong word can quickly become a PR crisis.
On Thursday, OpenAI CFO Sarah Friar created a firestorm when speaking at a Wall Street Journal event. She suggested that the federal government could “backstop,” or guarantee, the debt that AI companies take on for the pricey chips that power the AI revolution, CNN reported.
The idea of the government taking responsibility for the debts of a private company rankled some and led Friar to quickly attempt to walk back her comments with a LinkedIn post.
“I used the word ‘backstop’ and it muddied the point,” Friar wrote. “As the full clip of my answer shows, I was making the point that American strength in technology will come from building real industrial capacity which requires the private sector and government playing their part.”
People can disagree about whether Friar muddied her point or was taken out of context (she did, however, also use the word “guarantee”). But the post itself added fuel to the fire, given the ambiguity of what, exactly, it means for “the private sector and government (to play) their part.”
David Sacks, the Trump administration’s AI and crypto czar, wrote on X that “There will be no federal bailout for AI. The U.S. has at least 5 major frontier model companies. If one fails, others will take its place.” Still, he gave Friar the benefit of the doubt in a subsequent post: “I don’t think anyone was actually asking for a bailout. (That would be ridiculous.)”
Hours after Sacks’ post, OpenAI CEO Sam Altman took to X to post his own response to the furor. He stated flatly that “we do not have or want government guarantees for OpenAI datacenters.” He went on to explain, however, that the company would potentially be interested in governments building their own AI infrastructure or possibly offering guarantees for semiconductor fabrication.
“Our CFO talked about government financing yesterday, and then later clarified her point underscoring that she could have phrased things more clearly,” he said.
Why it matters: Misspeaking or explaining oneself inarticulately is an unfortunate part of public commentary. Even the most well-prepared executive can put their foot in their mouth. Let’s assume that Friar had all the appropriate training and her thoughts just got away from her in the hot seat. It happens.
Arguably the bigger PR issue was the follow-up LinkedIn post, which left room for doubt and questions. The post didn’t go far enough in walking back her backstop comments and instead continued to focus on the government “playing (its) part.” Ambiguity remained.
Those mistakes were rectified in Altman’s post, which began by stating the position unequivocally: OpenAI doesn’t want guarantees. Here’s what it does want from the government. Here are answers to the questions raised by Friar’s muddied communication.
The whole affair, including the successful final message from the CEO, shows how important it is for communicators working with different executives to be on the same page when things go awry. Had Friar posted something like Altman did in the wake of her comments, this all might have died down much more quietly. But a second inelegant statement elevated the story’s profile and raised doubts about OpenAI’s true intentions.
Editor’s Top Reads:
- The record-breaking government shutdown appears to be nearing its end after eight Senate Democrats sided with Republicans on a stop-gap measure that would reopen the government and fund it through Jan. 30, with some departments, including the Department of Agriculture, the FDA and the VA, funded through the fiscal year. Republicans have promised to rehire workers President Donald Trump fired during the shutdown and to hold a vote in December on extending expiring Affordable Care Act insurance subsidies. The deal is not done, however, and the government remains closed. That means air traffic is still snarled and millions are without SNAP benefits. Until the government fully reopens, federal employees are paid and people receive their food benefits again, expect continuing aftershocks among employee bases and questions from the public. Continue to be open and transparent, and be prepared for unexpected twists and turns.
- A New York Times article explains how newsrooms are using AI, and the approaches are incredibly varied. Some newsrooms, like Bloomberg, are indeed using AI to write public-facing content, including AI summaries of stories that have resulted in scads of corrections. But others are finding more creative uses for the technology, like analyzing legislative testimony and cross-referencing it with voting records, or building a chatbot to answer questions about Time’s Person of the Year coverage. The methods vary, but most major newsrooms are working with AI in some way, big or small. Understanding how newsrooms are integrating this tech is key to getting pitches in front of receptive audiences: the more you know about what’s being automated and what remains human, the more finely you can tune your pitches.
- In other journalism news, two BBC leaders have resigned after allegations of misleading editing of a speech Trump made before the attack on the U.S. Capitol, splicing together comments made about 50 minutes apart. The editing came to light after a memo from a former BBC advisor was leaked, then picked up by White House Press Secretary Karoline Leavitt, which ratcheted up the heat. The departing leaders admit to some mistakes but deny allegations of systematic bias at the publicly funded British broadcaster. We’ve seen the Trump administration notch wins against the American media through the courts, through pressure on proposed mergers and through other means. Now it’s flexing its muscle internationally. PR practitioners can expect extra time and caution in the editing process for anything remotely political, along with possibly spooked and skittish journalists.
Allison Carter is editorial director of PR Daily and Ragan.com. Follow her on LinkedIn.