AI Is Coming To An iPhone Near You, But Has Stumbled In Healthcare

Posted by Richard on December 17, 2024

Ten years ago, the notion of carrying a powerful, complex AI program in your pocket might have seemed like science fiction. Now it's about to happen: Apple is partnering with OpenAI, the maker of ChatGPT, to bring AI to iPhones.

Apple Intelligence was unveiled at June's Worldwide Developers Conference and has generated ample buzz. AI tools are typically accessed over the internet: users connect to the provider's servers, which do all the heavy lifting to make the AI actually work. Apple Intelligence, by contrast, will let users run AI natively on Apple's powerful proprietary chips.

Apple Intelligence can write, proofread, and summarize text, which makes it ideal for composing emails or articles. It can also serve as a research tool for tasks like troubleshooting a problem with your swimming pool pump or your website code, and it can create images, manage your inbox, and more. In some cases, however, Apple may ask permission to send a specific request to ChatGPT, allowing iPhone and Mac users to tap into more powerful servers when necessary.

Compared to Microsoft and Google, Apple has been slow to roll out its own AI offerings. But the timing has put the company in a prime position to study AI's problem areas and learn from its competitors' mistakes. In the meantime, Apple has bought up more than two dozen AI companies, and CEO Tim Cook has argued that AI will play an integral role in many of the company's products. Apple Intelligence began rolling out this fall. Among the iPhones already on the market when it was announced, only the 15 Pro and 15 Pro Max will get it. The platform is also available on Mac computers, but only those with an M1 or newer processor.

However, even though artificial intelligence has made enormous strides in the past few years, it's probably not yet ready to try to save your life.

According to Yahoo! News, the online patient portal MyChart has run into serious problems with an AI-powered tool that helps clinicians respond to patient queries. When a patient's question comes in, the AI assistant, called In Basket Art, automatically drafts a response, which clinical staff can then edit, approve, and send.

In Basket Art is intended to save doctors time and let them devote their mental energy to more pressing tasks. But users have reported that the feature sometimes hallucinates, generating inaccurate and occasionally dangerous information that can be inadvertently passed on to patients. One primary care physician in North Carolina reported that In Basket Art gave a patient information about her vaccine history even though it did not have access to her vaccination records.

According to the New York Times, these errors are not uncommon. One recent study examined a batch of 116 AI-generated drafts and found that seven of them contained hallucinations. Another found that unedited drafts contained potentially harmful advice about 7 percent of the time. And patients may be none the wiser: physicians, health systems, and MyChart are not required to disclose when a message was drafted by AI.

Epic, which makes MyChart, says it has refined In Basket Art to keep it from offering clinical advice. But the safeguards aren't foolproof, and users continue to report problems with its responses.