By Daniel Payne, Erin Schumaker, Carmen Paun and Ruth Reader

Many providers say they want government regulators to create and implement a more comprehensive standard for AI tools. | Geoffroy Van Der Hasselt/AFP via Getty Images

Health providers now have a new way to test artificial intelligence applications.

The Healthcare AI Challenge creates a virtual testing ground for new AI systems, where providers can better understand how a program would work in real-world scenarios.

Clinical experts from the leading medical centers building the environment review outputs from several commercial AI systems and rate them. Those ratings would help other providers, as well as AI developers and regulators, make sense of how models perform compared with one another and with doctors.

The challenge environment could surface issues not raised by developers or by early research on AI models.

"What we're finding early — we've been doing this for just a little while before we announced it — is that these models don't perform as well as what you think they do from a publication," Dr. Keith Dreyer, Mass General Brigham's chief officer for data science and imaging information, told Daniel.

Dreyer said the collaboration's goal is to test algorithms on de-identified patient data "and see exactly what would happen" before using them on patients.

The program comes from a collaboration among Mass General Brigham, Emory Healthcare, the radiology department at the University of Wisconsin School of Medicine and Public Health, the University of Washington School of Medicine's radiology department and the American College of Radiology. More partners are expected to join in the future.

Even so: Many providers say they want government regulators to create and implement a more comprehensive standard for AI tools, seeking to lessen the amount of vetting they must do themselves.
Saratoga Springs, N.Y. | Erin Schumaker/POLITICO

This is where we explore the ideas and innovators shaping health care.

A ProPublica investigation finds that insurer UnitedHealth Group has been using an algorithm, deemed illegal in three states, to deny patients mental health coverage.

Share any thoughts, news, tips and feedback with Carmen Paun at cpaun@politico.com, Daniel Payne at dpayne@politico.com, Ruth Reader at rreader@politico.com, or Erin Schumaker at eschumaker@politico.com.

Send tips securely through SecureDrop, Signal, Telegram or WhatsApp.
Surgeon General Vivek Murthy laid out a series of public health strategies that have worked in the past, like increasing taxes on tobacco products. | Susan Walsh/AP

A new report today from Surgeon General Vivek Murthy suggests that, despite massive progress driving down tobacco-related deaths and disease in the U.S. since the 1960s, there's more work to be done.

Cigarette and secondhand smoke exposure account for nearly 1 in 5 deaths, or 500,000 lives lost, yearly, according to the report. And progress has been unequal: Black individuals, people living in poverty and people with lower education levels are disproportionately likely to use tobacco.

Policy path forward: In his report, Murthy laid out a series of public health strategies that have worked in the past, like increasing taxes on tobacco products, expanding tobacco-free public spaces and implementing aggressive counter-marketing and public education measures. Additionally, Murthy recommends new strategies, including banning menthol and reducing nicotine content in tobacco products.

"Black individuals actually attempt cessation at higher rates but succeed at lower rates. I believe one of the major contributors is menthol," Murthy said. Tobacco companies target the Black population with menthol products, he pointed out, which are easier to get hooked on and harder to quit than traditional tobacco.

"These measures can literally save millions of lives," Murthy told Erin. "That's the goal here. The end game, ultimately, is to create a society where no lives are lost and no disease is caused by tobacco."

What's next: The Food and Drug Administration has dragged its feet on a national menthol cigarette ban, and it remains to be seen where the incoming Trump administration will stand on that ban or any of the other policies Murthy outlined.

A hint at Donald Trump's allegiances: The president-elect has promised to "save Vaping again!" during his second term.

When questioned about retaining his post under Trump, who asked Murthy to resign during his first term, the surgeon general said, "I've got time to think about my future. I'm just focused on trying to get as much done as we can before the end of the year."
State attorneys general say there's evidence social media platforms are aware of their negative mental health effects on underage users. | Rick Rycroft/AP

A bipartisan coalition of 31 state attorneys general is trying to revive a stalled effort in Congress to protect kids online by passing legislation to mitigate the mental health harms associated with social media. The attorneys general say there is growing evidence that social platforms know about the mental health harms they impose on underage users but still won't make changes.

States have been active on the issue. Some 42 states are suing Meta, the parent company of Facebook and Instagram, and another 14 are suing TikTok, alleging the platforms have harmed teens' mental health. In September, 39 state attorneys general called on Congress to mandate warning labels on social media platforms, as suggested by Surgeon General Vivek Murthy.

Why it matters: In July, the Senate voted 91-3 to pass the Kids Online Safety Act, sponsored by Sens. Marsha Blackburn (R-Tenn.) and Richard Blumenthal (D-Conn.), as part of a two-bill package. The bill would require tech companies to design their platforms with children's best interests in mind and to offer the strongest safety settings by default.

In September, the House Energy and Commerce Committee took up a weakened version that would require tech companies to protect against physical harm but not mental health risks. It hasn't moved since because some Republicans, including House Majority Leader Steve Scalise, say it would infringe on free speech.

Context: Tech companies and an industry group, NetChoice, are lobbying against the bill, arguing it would violate free speech rights. Some civil rights groups, including the ACLU, say the bill would make it harder for marginalized groups, such as LGBTQ+ people, to find supportive online communities.

What's next: A coalition of parents and tech accountability groups has pressed Congress to pass KOSA, and even some tech companies, including Pinterest, Snap, X and Microsoft, support the bill. Supporters are trying to convince lawmakers to include it in an end-of-year spending package.

"KOSA doesn't do anything to restrict content — it's a product safety bill," Blackburn said at a children's mental health conference at Brown University on Monday.