By Daniel Payne, Carmen Paun, Ruth Reader and Erin Schumaker | Presented by 340B Health

California is forging ahead on rules for procuring AI. | Damian Dovarganes/AP

California state agencies that plan to buy artificial intelligence products will need to vet them for bias under new guidance released at Democratic Gov. Gavin Newsom’s direction. The state is already pursuing a pilot project that would use AI to streamline health care licensing, and state officials see promise in using AI to analyze health data for fraud, among other non-health-care purposes, POLITICO’s Jeremy B. White reports.

Why it matters: AI is known to reflect the human biases toward different demographics embedded in the data it’s trained on. Given California’s size and status as the tech industry’s epicenter, the state’s approach to managing the fast-evolving technology could ripple beyond its borders.

“There are no well-developed procurement guidelines anywhere in the country for what a government does to buy this technology,” Newsom’s AI point person, Jason Elliott, said. “We hope these become a model for other states and cities around the world.”

The big picture: Federal and state policymakers are rushing to manage the rapid proliferation of advanced AI. California’s guidance is partly modeled on White House recommendations for deploying AI and managing its risks.

The goal is “evaluating whether the generative AI is actually providing benefits,” Amy Tong, California’s secretary of government operations, said of AI tools that mimic human intelligence. “You’re going to hear people who are skeptical about this and people who think we should ban this yesterday, but the average person is cautiously optimistic about what this could bring.”

State agencies will need to train employees and designate people to monitor AI developments, and, depending on the risk assessed using a federal framework, they’ll face higher levels of scrutiny when buying AI products.
A message from 340B Health: Support the 340B PATIENTS Act. The 340B PATIENTS Act eliminates harmful big pharma restrictions on 340B savings that are vital for expanding health care and support for patients and rural communities in need. By restricting 340B pharmacy partnerships, drugmakers have siphoned billions from the health care safety net solely to bolster their profits. The 340B PATIENTS Act stops this damaging behavior. We call on Congress to support this vital legislation. Learn more.

Chappaquiddick, Mass. | Erin Schumaker/POLITICO

This is where we explore the ideas and innovators shaping health care.

Two chemicals in their bodies make teenagers smell like goats, The New York Times reports, while toddlers smell like flowers, according to a small new study that a parent of a toddler in our office said was faulty.

Share any thoughts, news, tips and feedback with Carmen Paun at cpaun@politico.com, Daniel Payne at dpayne@politico.com, Ruth Reader at rreader@politico.com or Erin Schumaker at eschumaker@politico.com. Send tips securely through SecureDrop, Signal, Telegram or WhatsApp.
A new study suggests radiologists shouldn't rely on just any AI. | Spencer Platt/Getty Images

When an AI tool for radiologists produced a wrong answer, doctors were more likely to reach the wrong conclusion in their diagnoses. That was one finding of a study by Harvard and MIT researchers published in Nature Medicine this week.

What happened? The study examined how 140 radiologists used AI to make diagnoses based on chest X-rays. How AI affected care didn’t depend on the doctors’ levels of experience, specialty or performance, and lower-performing radiologists didn’t benefit more from AI assistance than their peers. Some radiologists’ diagnoses improved when aided by AI, while others’ worsened.

Why it matters: Doctors increasingly rely on AI tools to aid diagnosis, but some products could be misleading them.

What’s next? The researchers said doctors need to be able to judge an AI model’s accuracy for care quality to improve when the tools are used. One way to help doctors decide how much faith to put in AI? Explain how the system arrived at its answer.
Rodgers and Pallone have found common ground on data privacy. | Francis Chung/POLITICO

Data brokers would be barred from selling information about Americans’ health to foreign adversaries such as China and Russia under legislation the House passed unanimously this week. If enacted, it would be Congress’ most significant move on data privacy in years, POLITICO’s Alfred Ng reports.

How so?
The bill includes more than health data protections; it would also affect sales of any data deemed sensitive. But it specifically cites “any information that describes or reveals the past, present, or future physical health, mental health, disability, diagnosis, or healthcare condition or treatment of an individual.”

The bill, by the top Republican on the House Energy and Commerce Committee, Cathy McMorris Rodgers of Washington, and Frank Pallone of New Jersey, the panel’s top Democrat, borrows definitions of sensitive data from the American Data Privacy and Protection Act, their unsuccessful effort two years ago to pass a national privacy law. Rodgers said last week that she intends to reintroduce the American Data Privacy and Protection Act this year.

What’s next? It’s still unclear what will happen to the bill targeting foreign adversaries in the Senate, where no one has introduced similar legislation. But Maria Cantwell (D-Wash.), chair of the Senate Commerce Committee, which has jurisdiction, said she supported it. “Data brokers’ sale of Americans’ most sensitive information to our foreign adversaries is wrong and damages our national security,” Cantwell said in a statement.