
Artificial Intelligence in Neurology

Artificial Intelligence (AI) has really taken off in the last few years and, as such, has driven us as veterinarians to critically evaluate where and how we want to use this new technology. Last year I reported on a lecture I attended at ACVIM about the use of AI for writing radiology reports. It was eye-opening, to say the least! Recently (October 2023), a group of mostly veterinary neurologists took on AI in a new way. Abani et al. challenged 13 boarded neurologists from Europe and North America to distinguish between AI-generated abstracts and human-generated abstracts. The results are chilling...

Materials and Methods

There were 3 test topics in this study, chosen to span "highly familiar" and less familiar subject matter, to see whether familiarity affected the reviewers' ability to detect AI-generated writing.
Topic 1: SARS-CoV-2 scent detection in dogs (considered low familiarity)
Topic 2: Biomarkers for SRMA (considered high familiarity)
Topic 3: Staining of cannabinoid receptor type 1 (medium familiarity)
An abstract, references, and an introduction paragraph were written by humans on each of these 3 topics. ChatGPT was then used to generate 3 additional abstracts, with references and an introduction paragraph, on the same 3 topics. Interestingly, the authors noted that ChatGPT was prompted as follows: "Write an academic abstract with a focus on (subject) in the style of (author characteristics such as position, gender and age) at (University name), for publication in (journal name)." I mean...wow. ChatGPT can tailor its writing to an author's gender, age, and position.

Results

  • Topics 1 and 3: 4/13 (31%) correctly identified the AI-generated abstract when provided the abstract alone, without references and introduction paragraph. This increased to 9/13 (69%) when all parts were provided.

  • Topic 2: 7/13 (54%) correctly identified the AI-generated abstract (provided alone), which increased to 10/13 (77%) when all parts were provided. 
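For anyone who likes to check the numbers, the reviewer accuracies above are just the reported fractions converted to percentages. A minimal Python sketch (the counts are taken from the study as reported above; the topic labels are my shorthand):

```python
# Reviewer detection results (13 reviewers per topic).
# Each entry: (correct with abstract alone, correct with all parts provided)
results = {
    "Topics 1 and 3": (4, 9),
    "Topic 2": (7, 10),
}

REVIEWERS = 13

for topic, (alone, full) in results.items():
    # ":.0%" multiplies by 100 and rounds to a whole percentage
    print(f"{topic}: {alone / REVIEWERS:.0%} alone -> {full / REVIEWERS:.0%} with all parts")
```

Running this reproduces the 31% → 69% and 54% → 77% figures quoted above.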

Two separate plagiarism detectors were also evaluated in the study. All of the original published manuscripts showed 58%-100% similarity to available work, indicating (correctly) that they had been published elsewhere. The AI-generated papers for Tests 1, 2, and 3 had similarity indexes of 0-18%. This suggests that the plagiarism detectors could distinguish what had been previously published (the human-generated papers) from what hadn't (the AI-generated papers). The authors then evaluated all of the abstracts with an AI detector. All original manuscripts were flagged as 0% AI-written. Test 2 was flagged as 100% AI-generated, but Tests 1 and 3 were flagged as 0% written by AI. Gulp.

Where does this leave us? My anxiety about AI-generated content only grew when I realized that many of my well-respected, high-achieving academic colleagues struggled to distinguish AI-generated abstracts from human-generated ones within our own specialty. This reinforced my commitment to reading the entire paper, whenever possible, before considering the data valid. We were taught to do this in school but alas, with our busy schedules, it can be missed. AI is not all bad, however. It can be quite helpful for correcting grammar, editing, summarizing references or papers, and even performing statistics. I would encourage all of us to move through published literature with our eyes fully focused and with awareness of the use of AI in modern veterinary medicine. Except yesterday...hopefully you kept your eyes partially closed and didn't look directly at the sun!!

I hope you enjoyed this little TidBit. It is a little bit off topic, but I hope you will find it useful, nonetheless. Please know that my TidBit Tuesdays are (to date) fully human-generated, as are my patient reports! Let me know if you have any topics that you'd like me to cover. Have a great week!

The Use of AI in Veterinary Medicine

At the recent ACVIM Forum in Philadelphia, a radiologist gave a very enlightening presentation about AI, and specifically ChatGPT. Have any of you messed around with this technology yet? Is anyone using it for workflow support? Although this TidBit Tuesday isn’t specifically about a neurology topic, I was so blown away by the ChatGPT lecture that I decided to include it as a TidBit Tuesday. We’ll be back to our regularly scheduled neurology topics next week… 😊

To get us all on the same page, ChatGPT is a new artificial intelligence (AI) chatbot developed by OpenAI. The presenter at ACVIM (Dr. Eli Cohen) provided an example during his talk of a “conversation” he had with ChatGPT that terrified me. While reviewing a radiograph, ChatGPT suggested that one of the differentials for this pet, which had clear lytic bone lesions on each side of an intervertebral disc space, could be “sterile discospondylitis”. Dr. Cohen, like all of us in the audience, instantly worried that we had missed this diagnosis in our years of practice experience. STERILE disco? Is this real? How could I have missed this?? So, he asked ChatGPT to provide references for this statement. AND IT DID. Dozens of references popped up on the screen. They were from reputable journals like JAVMA, JVIM, and Vet Rad and Ultrasound, by real, live people: practicing veterinary neurologists and radiologists. Some of us were in the audience. The catch? None of these references were real. NOT ONE of them actually supported this imaginary disease. ChatGPT had taken the names of people who may have written about “sterile” and “discospondylitis” separately and combined them into believable-looking references. My takeaway was to make sure that if and when I use ChatGPT for any work-related item, I personally double-check (dare I say vet?) all of the data points. Here is a perfect example. I fed ChatGPT the following question:

What is the neuroanatomic lesion localization for a dog with seizures?

Here is the answer:

Seizures in dogs can arise from various neuroanatomic locations. The specific neuroanatomic lesion localization for seizures depends on the underlying cause and can vary between individual cases. Here are a few examples of potential lesion locations associated with seizures in dogs…

WRONG. What is the correct neuroanatomic lesion localization for a dog with seizures? That’s right: the forebrain, or prosencephalon. There is only one neuroanatomic lesion localization for pets with seizures. The etiology varies widely, from hypoglycemia to brain tumors, but all seizures come from one part of the brain.

This was a wonderful reminder of how important a firm grasp of words, terms, and phrases is when we communicate in veterinary medicine. I, probably like you, will be using AI in my veterinary career in the future; it is probably inevitable. However, we must remember to double-check that what we put in uses the correct terminology, and that the answer produced is in line with our knowledge and understanding.


I’d love to hear if you use AI in your personal or professional life and how it has affected you. I hope you had a safe and happy 1st or 4th of July and I look forward to seeing you, without robots, in the future!