Medicina Interna

Print ISSN 0872-671X

Medicina Interna vol. 31, suppl. spe1, Lisboa, May 2024. Epub 20 May 2024

https://doi.org/10.24950/rspmi.2585 

ARTIGOS DE REVISÃO / REVIEW ARTICLES

Artificial Intelligence and Ultrasonography

Inteligência Artificial e Ultrassonografia

Michael Blaivas1

1Department of Medicine. University of South Carolina School of Medicine; Columbia, United States of America


Abstract

Artificial intelligence (AI) and its many aliases, including machine learning, deep learning and big data, have invaded modern medicine, impacting most aspects of practice. One of the most controversial and potentially impactful applications is the use of artificial intelligence in medical imaging. While most commercial and academic attention has focused on higher-cost imaging modalities such as magnetic resonance imaging (MRI) and computed tomography (CT), ultrasound has also become a target for AI application developers. Ultrasound presents additional barriers to AI application development and execution not seen in axial imaging such as MRI and CT.

Point-of-care ultrasound (POCUS), with its lack of standardization and plethora of inexperienced users, poses the greatest imaging challenge to AI. However, POCUS is also the key to widespread access to diagnostic and interventional ultrasound at the patient's bedside throughout the world. This article discusses AI, its utilization in POCUS, and current challenges, risks, limitations, needs and future possibilities.

Keywords: Artificial Intelligence; Deep Learning; Internal Medicine; Machine Learning; Point-of-Care Systems; Ultrasonography.

Resumo

A inteligência artificial (IA) e os seus muitos pseudónimos, incluindo a aprendizagem automática, a aprendizagem profunda e os grandes volumes de dados, invadiram a medicina moderna, afetando a maioria dos aspetos da prática moderna. Um dos mais controversos e potencialmente impactantes é a utilização da inteligência artificial na imagiologia médica. Embora a maior parte da atenção comercial e académica se tenha centrado em modalidades de imagiologia de custo mais elevado, como a ressonância magnética (RM) e a tomografia computorizada (TC), os ultrassons também se tornaram o alvo dos criadores de aplicações de IA. O ultrassom apresenta barreiras adicionais ao desenvolvimento e execução de aplicações de IA, não observadas na imagiologia axial, como a RM e a TC. A ecografia point-of-care (POCUS), com a sua falta de normalização e a multiplicidade de utilizadores inexperientes, representa o maior desafio de imagiologia para a IA. No entanto, a POCUS também é a chave para o acesso generalizado ao diagnóstico e à ultrassonografia intervencionista à beira do leito do paciente em todo o mundo. Este artigo discute a IA, sua utilização em POCUS, desafios atuais, riscos, limitações, necessidades e possibilidades futuras.

Palavras-chave: Aprendizagem Automática; Aprendizagem Profunda; Ecografia; Inteligência Artificial; Medicina Interna; Sistemas Point-of-Care.

Introduction

Point-of-care ultrasound (POCUS) refers to the use of both diagnostic and interventional ultrasound by clinicians at the patient's bedside. It is distinct from traditional ultrasound applications by imaging specialists such as radiologists and cardiologists.1 While POCUS examinations may be viewed as simpler and shorter than the comprehensive ones performed by traditional imaging specialists, they are often far more urgent and critical in nature.2,3 Originating in specialties such as emergency medicine and trauma surgery in the United States and Europe, the POCUS concept was in its infancy in the 1990s and only started to expand exponentially in the mid-2000s.4 While widely adopted in the United States by emergency medicine, the key to its current broad popularity was POCUS infiltration into critical care, internal medicine and other clinical specialties, not just in North America and Europe but also worldwide.

Despite three decades of expansion and at least two international POCUS-focused societies devoted to the education of providers and medical students, the majority of current and future POCUS users have very little ultrasound skill, and many have difficulty obtaining additional training for a variety of valid logistical and economic reasons.5 While a range of policies exist worldwide on what constitutes adequate training in POCUS to allow its integration into clinical practice, unlike the finite amount of time required for competency in a procedure like lumbar puncture or endotracheal intubation, POCUS presents considerably higher barriers.6,7 Formal training required for competency appears to average approximately 40 hours of continuing education and requires approximately 25 to 50 proctored ultrasound examinations for most individual POCUS scan types.8 Greater expertise requires a considerably larger number of scans and extended experience. Coupled with the understanding of basic ultrasound physics and ultrasound machine controls required to perform a POCUS scan or guided procedure, these demands leave the vast majority of potential POCUS users unable to utilize this technology. The advent of smaller and simplified ultrasound devices such as handheld scanners has failed to create widespread use of the technology because of persistent skill limitations.9 However, with the introduction of machine learning or artificial intelligence into ultrasound imaging, specifically POCUS, these barriers are falling away and will, in the future, enable use of POCUS equipment by anyone without the need for training.10

Artificial Intelligence

The definitions of artificial intelligence and its sub-categories, such as machine learning and deep learning, can cause confusion for anyone not involved in the field. Artificial intelligence simply means an artificial process that simulates human intelligence to some degree. Machine learning (ML) refers to training machines to learn and simulate intelligence, while deep learning is a subtype of machine learning that uses an artificial neural network-like structure modeled on the hierarchy seen in the human frontal cortex.11 In other words, simpler processes at one level are combined into more complex ones as you progress from one layer to the next. In the case of an ultrasound image, recognition of light or dark pixels feeds a level where edges of structures may be recognized, which in turn feeds a level where curves or straight edges are identified, and eventually a level that recognizes a specific image type (such as the liver) or individual structures in an ultrasound image. Unfortunately, unlike the AI of science fiction, there is little magic or mystery in how images are analyzed by ML algorithms, the name for the computer programs that produce results such as identification of organs on the screen or estimation of the cardiac ejection fraction. All of these processes are governed by well-established, complex mathematical operations involving calculus and linear algebra with many feedback loops. What can be mysterious is exactly which factors a machine learning algorithm focuses on to obtain the correct answer.
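
As a rough illustration of the layered hierarchy described above, the sketch below defines a tiny convolutional network in which early layers respond to pixel-level patterns and edges, and deeper layers combine them into organ-scale features. It is a minimal, untrained example assuming PyTorch is available; the layer sizes, input dimensions and class labels are illustrative assumptions, not taken from any cited POCUS product.

```python
# Minimal sketch of a hierarchical (deep learning) ultrasound image classifier.
# Layer sizes, input size and class labels are illustrative assumptions only.
import torch
import torch.nn as nn

class TinyUltrasoundNet(nn.Module):
    def __init__(self, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            # Early layers respond to simple light/dark pixel patterns and edges.
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            # Middle layers combine edges into curves and local shapes.
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            # Deeper layers combine shapes into organ-scale patterns.
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        # Final layer maps pooled features to hypothetical view labels,
        # e.g. liver, kidney, heart, lung.
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f = self.features(x)             # shape: (batch, 32, 1, 1)
        return self.classifier(f.flatten(1))

# Usage with a fake 256x256 grayscale frame; a real model would be trained on
# labeled ultrasound images before its outputs mean anything.
frame = torch.rand(1, 1, 256, 256)
logits = TinyUltrasoundNet()(frame)
print(logits.shape)  # torch.Size([1, 4])
```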

A good example of the limitations of AI algorithms involves an analogy with a famous horse exhibited in Germany around the turn of the 20th century. The horse was thought to be able to complete basic computations verbalized to it by onlookers. It was eventually discovered that the horse, named Clever Hans, was watching the humans for cues and tapped his hoof until he saw he should stop. In this case the correct answer was given for the wrong reasons, and there was never any true capability to perform addition or subtraction. A similar problem was discovered with an early ML algorithm designed to identify images of boats. It turned out that the algorithm, trained to identify boats, failed to do so in the absence of water in the image. The algorithm was mistakenly keying on water rather than boats. This was a failure of the designers to conceptualize the possibility of such an error and to provide the algorithm with sufficiently varied training data that included boats in a range of settings. Additionally, the designers did not perform saliency testing or heat mapping to identify which pixels in an image were most important to the ML algorithm in achieving the correct answer.12 Given that this occurred during the earlier years of AI image analysis, it is unlikely that the algorithm designers thought to question the basis of their algorithm's success until the flaw was discovered.
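
One simple, model-agnostic way to perform the kind of saliency or heat-map check mentioned above is occlusion testing: blank out one region of the image at a time and watch how the model's confidence changes. The sketch below assumes a generic `predict` callable returning a probability for the class of interest (an assumed interface, not a specific library API) and is illustrative only.

```python
import numpy as np

def occlusion_heatmap(image: np.ndarray, predict, patch: int = 16) -> np.ndarray:
    """Occlude square patches and record the drop in predicted probability.

    `predict` is any callable mapping a 2-D grayscale image to a probability
    for the class under study (e.g. "boat" or "pleural effusion").
    """
    baseline = predict(image)
    h, w = image.shape
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = image.mean()  # blank one patch
            heat[i // patch, j // patch] = baseline - predict(occluded)
    return heat  # large values mark the regions the model relied on most
```

If a "boat" classifier produced its largest heat values over water rather than over the hull, the Clever Hans-style shortcut would be visible immediately.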

AI POCUS Application Principles

There are important differences between traditional ultrasound, such as that employed in radiology or cardiology, and POCUS. The very nature of the examinations and the time pressures involved are radically different in most POCUS settings from those of the radiology suite. However, the greatest difference may be the skill level of the ultrasound operator. In radiology, a highly trained ultrasound technologist who is not permitted to make an interpretation from images may perform a well-scripted, detailed and lengthy ultrasound examination. In some areas of the world, the radiologist performs the same examination themselves rather than reviewing images stored by the technologist at a later time. In either case, the person holding the transducer knows exactly what to do with it and understands all of the anatomy and pathology they are viewing on the ultrasound machine screen. Introducing AI into a workflow such as this means making an expert's life easier, and typically involves creating aids for calculations, labeling, documentation and general workflow.13 The POCUS user is much more likely to have little to no skill or experience and may be able to devote less than 5 minutes to the examination, compared with 45 minutes in radiology. These realities dictate a different approach to AI application design in the world of POCUS.

Despite nearly three decades of ultrasound education by various medical societies and commercial entities, most clinicians are still not competent to perform and interpret an ultrasound examination, and the vast majority have no ultrasound skill at all. This means POCUS AI application developers have to target the unskilled user if they wish to expand their markets and improve patient access to ultrasound. The ultrasound machine itself therefore has to act like a professor or expert looking over the shoulder of the novice operator, indicating where the transducer has to be placed, how it should be moved, and what anatomy and pathology are being viewed on the screen. At a higher level, POCUS AI applications have to enable objective assessments, such as automatically performing measurements of normal and abnormal structures as well as obtaining Doppler and other evaluations. This creates a completely different challenge, not only for AI developers, but for the clinicians involved in application creation.

AI Guidance

One of the most pivotal realizations in POCUS AI is the need to guide the user from the most basic operation, such as placing the ultrasound probe in the correct anatomic area to be scanned, to instructing them in real time how to rotate, angle and slide the transducer throughout the entire examination. In POCUS, guidance often refers to guiding a needle to a nerve, fluid collection or blood vessel, and this is an important application for AI. However, AI also allows a completely novice user to be guided through every portion of image acquisition, interpretation and examination performance. Such a capability is essential for achieving any dramatic increase in POCUS use by clinicians all over the world, given that lack of skill with ultrasound is the biggest barrier to usage. As multiple studies have shown, access to education, hands-on instruction and time for supervised examination practice are the most frequently cited barriers to POCUS use across multiple medical specialties. Hardware and software vendors have been slow to realize that guiding a novice POCUS user, from telling them where to place the transducer to how to manipulate it and even where to move it, is critical even before activating the algorithms they actually set out to build, such as automatic cardiac ejection fraction calculation, cardiac output estimation and others.
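
Conceptually, the guidance loop described above can be thought of as repeatedly comparing the probe's current estimated pose with the pose needed for the target view and translating the difference into a plain-language instruction. The sketch below is a deliberately simplified illustration of that idea under stated assumptions: real systems estimate pose from the image stream with trained models, and the fields, thresholds and wording here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ProbePose:
    rotation_deg: float   # rotation about the probe's long axis
    tilt_deg: float       # angling (fanning) of the imaging plane
    slide_cm: float       # position along the skin surface

def guidance_instruction(current: ProbePose, target: ProbePose,
                         tol_deg: float = 5.0, tol_cm: float = 0.5) -> str:
    """Turn the pose error into the kind of prompt a novice could follow."""
    d_rot = target.rotation_deg - current.rotation_deg
    d_tilt = target.tilt_deg - current.tilt_deg
    d_slide = target.slide_cm - current.slide_cm
    if abs(d_slide) > tol_cm:
        return f"Slide the probe {'toward the head' if d_slide > 0 else 'toward the feet'}"
    if abs(d_rot) > tol_deg:
        return f"Rotate the probe {'clockwise' if d_rot > 0 else 'counterclockwise'}"
    if abs(d_tilt) > tol_deg:
        return f"Tilt the probe {'steeper' if d_tilt > 0 else 'flatter'}"
    return "Hold still - target view acquired"

# Example: the user is 30 degrees under-rotated, so rotation is corrected first.
print(guidance_instruction(ProbePose(10, 0, 1.2), ProbePose(40, 5, 1.0)))
```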

Needle or instrument guidance can be challenging for all but the most expert users when small or difficult targets are sought. These can include nerves with adjacent vasculature or tortuous veins with nearby arteries and nerves. The in-plane needle visualization technique allows the greatest precision and, potentially, safety, but is often too difficult for novice users.14 An out-of-plane needle visualization approach allows easier transducer stabilization over a vessel, but makes precise needle tip visualization nearly impossible for most novices, resulting in potential hazards such as posterior wall penetration.15 AI algorithms available to date have shown the capability not only to enhance the difficult-to-see needle but also to project its real-time course into the tissues once inserted and to make suggestions regarding the best path toward a target.16
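
A simplified geometric sketch of the trajectory-projection idea: given a detected needle tip and its in-plane insertion angle, extrapolate the straight course and flag whether it passes acceptably close to the target. Commercial systems combine this with needle-enhancement image processing; the coordinates, units and tolerance below are assumptions made purely for illustration.

```python
import numpy as np

def projected_path_hits_target(tip_xy, angle_deg, target_xy, tolerance_mm=2.0):
    """Extrapolate a straight needle course from its tip and in-plane angle,
    then compute the closest approach to the target (all units in mm)."""
    tip = np.asarray(tip_xy, dtype=float)
    target = np.asarray(target_xy, dtype=float)
    direction = np.array([np.cos(np.radians(angle_deg)),
                          np.sin(np.radians(angle_deg))])
    # Only points ahead of the tip count as part of the projected course.
    t = np.dot(target - tip, direction)
    if t < 0:
        return False, float(np.linalg.norm(target - tip))
    closest = tip + t * direction
    miss = float(np.linalg.norm(target - closest))
    return miss <= tolerance_mm, miss

hit, miss = projected_path_hits_target((0, 0), 45, (10, 9.5))
print(hit, round(miss, 2))  # True if the course passes within 2 mm of the target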

AI In POCUS Education

POCUS-specific AI applications frequently have educational capability built in, even if not realized by the developer. Because of the low or absent ultrasound skill of the typical POCUS user and the need for anatomical labeling, image acquisition guidance and other automation, many of these applications are primed to teach students with little repackaging. Additional educational features are easily added and may already be part of the background workflow solutions POCUS AI applications have to provide. These include tracking errors in transducer movement and the ability to quiz students on the anatomy and pathology displayed on screen. Further, in many regulatory settings, purely educational AI applications that are not intended for actual patient diagnosis are unregulated or less regulated, allowing immediate release of software that would otherwise take more than a year to attain clearance for actual patient use. This path has already been followed by several current commercial POCUS AI creators, but it also leaves an opportunity for POCUS educators, who should not overlook the ability to create their own educational AI applications that can be used without regulatory clearance in many locations.17
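
The quizzing feature mentioned above can reuse the structure labels that a labeling or guidance algorithm (or an educator's own curated image set) already produces. The sketch below is a minimal illustration under that assumption; the file names, labels and answer interface are hypothetical.

```python
import random

# Hypothetical labeled frames such as a labeling algorithm or a curated
# teaching set might provide; names and labels are illustrative only.
LABELED_FRAMES = [
    {"file": "frame_001.png", "structure": "liver"},
    {"file": "frame_002.png", "structure": "right kidney"},
    {"file": "frame_003.png", "structure": "inferior vena cava"},
]

def quiz_learner(frames, answer_fn) -> float:
    """Present each frame, ask the learner to name the labeled structure,
    and return the fraction answered correctly."""
    frames = random.sample(frames, len(frames))  # shuffle question order
    correct = 0
    for f in frames:
        answer = answer_fn(f["file"])  # e.g. display the image and read input
        if answer.strip().lower() == f["structure"]:
            correct += 1
    return correct / len(frames)

# Example with a stand-in answer function that always guesses "liver".
score = quiz_learner(LABELED_FRAMES, lambda path: "liver")
print(f"Score: {score:.0%}")
```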

POCUS AI Applications

Numerous AI POCUS applications have been developed or are under development, both on an academic/research basis and by commercial vendors. While the research endeavors are often cutting edge and push commercial developers to innovate, these projects themselves will rarely turn into a product that POCUS users can one day use. The concept of AI guidance, one that deserves its own discussion, is addressed in the section above since it applies across many other application types. The remaining currently available POCUS AI applications are largely a series of "one-off" apps, meaning they possess a narrow focus rather than a broader one. An example might be an algorithm that identifies retinal detachment on ocular ultrasound but does not look for other ocular pathology or measure the optic nerve sheath diameter. While this may seem in keeping with the traditionally binary nature of POCUS, it represents a much more limited scope of AI assistance for the novice user. For example, numerous vendors have created lung B-line counting AI applications.18 While useful, these applications do not account for an entire lung examination, a POCUS exclusive, which evaluates multiple different normal and abnormal lung ultrasound signs over the anterior, lateral and posterior aspects of both hemithoraces. An additional level of required sophistication is bringing together findings from the numerous scanned lung zones to arrive at a diagnosis, which may even require incorporation of clinical data. This level of functionality takes the lung ultrasound AI application from a useful workflow addition (counting B-lines) to an actual diagnostic solution for pulmonary evaluation. While under some level of construction by multiple vendors at this time, no such solution is yet available on the market.
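
The multi-zone aggregation step described above can be pictured as a simple rule layer sitting on top of per-zone outputs such as B-line counts. The sketch below is illustrative only: the zone names, the three-B-line threshold and the output wording are simplified placeholders, not validated diagnostic criteria or any vendor's method.

```python
def interpret_lung_exam(bline_counts: dict[str, int]) -> str:
    """Combine per-zone B-line counts into a whole-examination impression.

    `bline_counts` maps zone names (e.g. "right anterior superior") to the
    maximum B-line count seen in that zone. The thresholds and bilateral
    rule below are simplified placeholders for illustration.
    """
    positive = {z for z, n in bline_counts.items() if n >= 3}
    right = any(z.startswith("right") for z in positive)
    left = any(z.startswith("left") for z in positive)
    if right and left:
        return "Diffuse bilateral interstitial pattern - correlate clinically"
    if positive:
        return f"Focal interstitial pattern in: {', '.join(sorted(positive))}"
    return "No significant B-line burden detected"

exam = {"right anterior superior": 5, "right lateral": 4,
        "left anterior superior": 1, "left lateral": 6}
print(interpret_lung_exam(exam))
```

A production-grade version would also weigh other signs (effusion, consolidation, pleural line abnormalities) and, as noted above, clinical data.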

Cardiac applications are perhaps the most popular target of developers worldwide. Given the incredible utility of echocardiographic evaluation of the heart in real time across a wide array of clinical situations, from the most mundane routine checkup to urgent, critical or outright emergent presentations, it is not surprising that most developers started their AI solution suites with some type of cardiac application. The majority of available AI solutions measure cardiac ejection fraction, and some others address cardiac output. More individual cardiac AI apps are becoming available and are under development, but POCUS users really require a suite of solutions that can run automatically once the device guides the user to the proper imaging planes.
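
The ejection fraction itself is a simple ratio once the end-diastolic and end-systolic volumes are known; the difficult part, which the AI performs, is estimating those volumes from the images, typically by automated tracing of the endocardial border. A minimal sketch of the final calculation:

```python
def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """Left ventricular ejection fraction (%) from end-diastolic (EDV) and
    end-systolic (ESV) volumes. The volumes are what the AI must estimate
    from the images, e.g. by automated border tracing."""
    if edv_ml <= 0 or esv_ml < 0 or esv_ml > edv_ml:
        raise ValueError("volumes must satisfy 0 <= ESV <= EDV and EDV > 0")
    return 100.0 * (edv_ml - esv_ml) / edv_ml

print(ejection_fraction(120, 50))  # ~58.3%, a normal-range value
```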

Other applications that are either currently available or soon to be cleared include a POCUS DVT examination AI application that guides a complete ultrasound novice such as a nurse through a lower extremity examination. Musculoskeletal ML algorithms help identify structures and guide needles to nerves or blood vessels. Trauma AI applications seek to label relevant anatomy and intraabdominal fluid on the FAST examination.

The Future of POCUS AI and Unmet Needs

In the not-too-distant future, as POCUS AI algorithms are provided by more and more vendors and become commonplace on POCUS-specific devices, we are likely to see much more widespread use of POCUS at the bedside. If every aspect of an examination can be enabled by AI algorithms, much like having your own expert or professor standing behind you and guiding your every move, then any clinician can perform a broad range of POCUS examinations (from the simplest to the most complex), something currently reserved for the most expert POCUS users. Traditionally, experts in any skill can be the most resistant to adopting new, helpful technology. This was widely seen with the introduction of ultrasound guidance for vascular access as well as video laryngoscopy for endotracheal intubation.19,20 In both cases, the technology was used by less skilled providers and often reserved for difficult cases. Experts scoffed at the need for a "crutch" technology that they felt made no impact. However, in time, all vascular access experts using a blind approach were outmatched by ultrasound guidance, and it is now the standard of care in many locations around the world. Similarly, video laryngoscopy is utilized by even the best airway managers on a routine basis. These parallels support the concept that even skilled POCUS users, who now state that AI is of no use in ultrasound, will eventually allow seamless background algorithms to make all of the difficult calculations and assessments in their POCUS examinations, which are now proudly performed manually.21

POCUS education, a time- and energy-consuming topic for all involved, from educators to eager learners, will be completely revolutionized by AI in the future, and the current POCUS education approach and systems are likely to be largely eliminated, or at least drastically altered. Much like the transition from a physician examining a urine or blood sample with a microscope in 1970 to the rapid automatic analysis performed by a modern medical device, a POCUS machine that can completely guide the user through every portion of the examination and even teach them will likely obviate the need for additional educational modalities for the vast majority of clinicians. It is unclear how ultrasound simulation vendors will be affected by such changes, but some may have foreseen this and already appear to have changed their models to incorporate AI for education, while working toward procedural assistance and diagnosis.16

Challenges of Data Acquisition for AI Training and “Garbage In Garbage Out”

One of the most popular terms thrown about by AI ultrasound designers is "garbage in, garbage out," first coined in 1957 during the early days of computing.22 It has come to symbolize the concept that, in part, quality images are required to train good algorithms, which will then enable reliable and accurate diagnosis from quality patient images. While this concept may work in many aspects of computing and AI in higher-end imaging, it often fails in POCUS. The basic premise is that POCUS providers often produce images that are considered "garbage" by traditional imaging experts. This may in fact be true from the perspective of image quality, proper imaging plane use, and management of settings like depth and gain. However, these factors are not easily addressed at large scale in the POCUS community, and AI designers would be more successful if they trained their algorithms to interpret these "garbage" POCUS images. Failure to do so can result in algorithm performance issues, as was likely the case in a Class II FDA recall of a cardiac ejection fraction AI calculation software, which was developed using top-of-class ultrasound machines and expert-obtained images but applied to a rudimentary imager.23 Similar challenges have been encountered by other handheld hardware vendors when training algorithms on best-in-class images but using those algorithms to interpret images produced by comparatively low-quality ultrasound devices.
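
One practical response to the mismatch described above is to augment high-end training images so they more closely resemble what inexpensive handheld devices and novice users actually produce. The sketch below applies a few generic degradations (speckle-like noise, resolution loss, a gain shift); the specific parameters are arbitrary illustrations, not a recipe drawn from any vendor's pipeline.

```python
import numpy as np

def degrade_to_pocus_like(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Roughly simulate a lower-quality POCUS acquisition from a clean frame.

    `image` is a 2-D grayscale array scaled to [0, 1]. Parameters are
    illustrative; real augmentation pipelines are tuned and validated.
    """
    out = image.astype(float)
    # Multiplicative speckle-like noise.
    out *= rng.normal(1.0, 0.15, size=out.shape)
    # Resolution loss: downsample by 2, then repeat pixels back to size.
    out = out[::2, ::2].repeat(2, axis=0).repeat(2, axis=1)[:image.shape[0], :image.shape[1]]
    # Random gain (brightness) shift, as from poorly adjusted controls.
    out += rng.uniform(-0.1, 0.1)
    return np.clip(out, 0.0, 1.0)

rng = np.random.default_rng(0)
clean = rng.random((256, 256))
print(degrade_to_pocus_like(clean, rng).shape)  # (256, 256)
```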

The obvious answer is for AI designers to use only actual POCUS images to train commercial algorithms. Unfortunately, unlike radiology and cardiology image repositories, few POCUS databases are available either commercially or academically. Unlike the PACS-stored and largely curated ultrasound images of most radiology and cardiology departments, such resources are rarely encountered in POCUS settings, and patient outcomes as well as other diagnostic study results (such as radiology reports) are typically unavailable. Thus, AI designers have to undertake the expensive proposition of obtaining POCUS images, together with additional data such as patient outcomes, by creating their own image repositories, or else leverage large databases available outside of POCUS, leading to potential algorithm performance challenges. These data pipeline challenges will have to be addressed on a large scale if smaller commercial endeavors are to succeed and bring to market a variety of useful AI POCUS applications. Without the competitive pressure of such disruptive and innovative small companies, large corporations with adequate funding but limited ability to innovate will be the only ones capable of acquiring large POCUS image datasets to train their AI algorithms.
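
The data pipeline point above implies that each stored POCUS clip needs to carry not just pixels but the contextual and outcome data against which an algorithm will be trained and validated. The record below is a minimal sketch of what such metadata might look like; the field names and example values are assumptions for illustration, not a published standard.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PocusStudyRecord:
    """Illustrative metadata for one de-identified POCUS clip in a training
    repository; field names and choices are assumptions, not a standard."""
    clip_path: str                      # de-identified cine loop or frame set
    device_model: str                   # handheld vs cart-based matters for training
    exam_type: str                      # e.g. "lung", "cardiac", "DVT"
    operator_experience: str            # e.g. "novice", "credentialed", "expert"
    expert_label: Optional[str] = None  # reference interpretation, if reviewed
    confirmatory_study: Optional[str] = None  # e.g. CT or formal echo result
    patient_outcome: Optional[str] = None     # discharge diagnosis, mortality, etc.
    quality_flags: list[str] = field(default_factory=list)  # e.g. "off-axis", "low gain"

record = PocusStudyRecord(
    clip_path="clips/000123.mp4",
    device_model="handheld-A",
    exam_type="lung",
    operator_experience="novice",
    expert_label="diffuse B-lines",
    confirmatory_study="chest CT: pulmonary edema",
    patient_outcome="admitted, discharged day 3",
)
print(record.exam_type, record.expert_label)
```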

POCUS AI Risks, Legal Considerations and Limitations

The greatest risk of new technology may be overreliance upon it by its users. The transition from handwritten mathematical calculation to complete reliance on calculators and then computers has robbed many of even the most basic understanding of the formulas and mathematical processes involved. The loss of ultrasound understanding and skill could have significant consequences, at least theoretically, in the event of some kind of technological or other disaster, but such events are unlikely. Additionally, waves of technological change drive themselves and can only be slowed so much by detractors and guardians of the status quo. Cybersecurity for broadly AI-driven machines of all types in medicine will become increasingly important, as larger amounts of patient data are put at risk. A significant shift to bedside diagnosis through POCUS AI could also rob medicine of its traditional ultrasound imaging expertise if, in time, radiology shifts even further toward computed tomography and magnetic resonance. The potential impact of such a loss of knowledge and expertise is difficult to predict, but may be even harder to prevent.

As with POCUS in general, concerns regarding legal liability are regularly expressed by both clinicians and administrators. Despite evidence to the contrary, fear of legal liability has been used for years to scare away potential POCUS users.24,25 Interestingly, it will be some time before the MDR, FDA and other regulatory bodies are ready to grant clearance for POCUS devices that make the diagnosis themselves, without a physician's oversight and final decision. Because of this, liability will remain with the licensed physician overseeing any injured patient's care, much as it does now. A similar reality exists for those in jurisdictions where the fear is of administrative repercussions for errors and misses rather than litigation by patients or their families. In time, when POCUS AI advances to the point where diagnostic accuracy surpasses equivalent expert results from radiology and cardiology, the machines will be able to deliver diagnostic results despite being used by ultrasound novices.

Conclusion

AI in ultrasound, and especially in POCUS, is already here and will expand dramatically in the next 5 to 10 years. So far it has spread more slowly than initially predicted, but the earlier overestimations resulted from a failure to appreciate the regulatory resistance and the financial as well as logistical barriers involved in creating a broad suite of AI solutions. Further, initial efforts by AI scientists did not anticipate the need to move backward from automatic functions, like ejection fraction calculation, to actually guiding the operator to the correct spot on the chest and then telling them how to manipulate the transducer and which imaging plane to maintain for data acquisition. The next hurdle in POCUS AI development is solution suites that enable guidance through a complete POCUS examination and automatic diagnosis, rather than one-off applications. Once this challenge is conquered, POCUS may finally be able to reach the bedside of every patient in the world who needs it.

REFERENCES

1. Díaz-Gómez JL, Mayo PH, Koenig SJ. Point-of-Care Ultrasonography. N Engl J Med. 2021;385:1593-602. doi: 10.1056/NEJMra1916062.

2. Spencer KT, Kimura BJ, Korcarz CE, Pellikka PA, Rahko PS, Siegel RJ. Focused cardiac ultrasound: recommendations from the American Society of Echocardiography. J Am Soc Echocardiogr. 2013;26:567-81. doi: 10.1016/j.echo.2013.04.001.

3. Plummer D, Brunette D, Asinger R, Ruiz E. Emergency department echocardiography improves outcome in penetrating cardiac injury. Ann Emerg Med. 1992;21:709-12. doi: 10.1016/s0196-0644(05)82784-2.

4. Michalke JA. An overview of emergency ultrasound in the United States. World J Emerg Med. 2012;3:85-90. doi: 10.5847/wjem.j.issn.1920-8642.2012.02.001.

5. Wong J, Montague S, Wallace P, Negishi K, Liteplo A, Ringrose J, et al. Barriers to learning and using point-of-care ultrasound: a survey of practicing internists in six North American institutions. Ultrasound J. 2020;12:19. doi: 10.1186/s13089-020-00167-6.

6. Ludden-Schlatter A, Kruse RL, Mahan R, Stephens L. Point-of-Care Ultrasound Attitudes, Barriers, and Current Use Among Family Medicine Residents and Practicing Physicians. PRiMER. 2023;7:13. doi: 10.22454/PRiMER.2023.967474.

7. Galen B, Conigliaro R. A Curriculum for Lumbar Puncture Training in Internal Medicine Residency. MedEdPublish. 2019;8:33. doi: 10.15694/mep.2019.000033.1.

8. Blehar DJ, Barton B, Gaspari RJ. Learning curves in emergency ultrasound education. Acad Emerg Med. 2015;22:574-82. doi: 10.1111/acem.12653.

9. Malik AN, Rowland J, Haber BD, Thom S, Jackson B, Volk B, et al. The Use of Handheld Ultrasound Devices in Emergency Medicine. Curr Emerg Hosp Med Rep. 2021;9:73-81. doi: 10.1007/s40138-021-00229-6.

10. Wang H, Uraco AM, Hughes J. Artificial Intelligence Application on Point-of-Care Ultrasound. J Cardiothorac Vasc Anesth. 2021;35:3451-2. doi: 10.1053/j.jvca.2021.02.064.

11. Erickson BJ, Korfiatis P, Akkus Z, Kline TL. Machine Learning for Medical Imaging. Radiographics. 2017;37:505-15. doi: 10.1148/rg.2017160130.

12. Borys K, Schmitt YA, Nauta M, Seifert C, Krämer N, Friedrich CM, et al. Explainable AI in medical imaging: An overview for clinical practitioners - Beyond saliency-based XAI approaches. Eur J Radiol. 2023;162:110786. doi: 10.1016/j.ejrad.2023.110786.

13. Park SH. Artificial intelligence for ultrasonography: unique opportunities and challenges. Ultrasonography. 2021;40:3-6. doi: 10.14366/usg.20078.

14. Blaivas M, Brannam L, Fernandez E. Short-axis versus long-axis approaches for teaching ultrasound-guided vascular access on a new inanimate model. Acad Emerg Med. 2003;10:1307-11. doi: 10.1111/j.1553-2712.2003.tb00002.x.

15. Blaivas M, Adhikari S. An unseen danger: frequency of posterior vessel wall penetration by needles during attempts to place internal jugular vein central catheters using ultrasound guidance. Crit Care Med. 2009;37:2345-9; quiz 2359. doi: 10.1097/CCM.0b013e3181a067d4.

16. Lloyd J, Morse R, Taylor A, Phillips D, Higham H, Burckett-St Laurent D, et al. Artificial Intelligence: Innovation to Assist in the Identification of Sono-anatomy for Ultrasound-Guided Regional Anaesthesia. Adv Exp Med Biol. 2022;1356:117-40. doi: 10.1007/978-3-030-87779-8_6.

17. Blaivas M, Arntfield R, White M. DIY AI, deep learning network development for automated image classification in a point-of-care ultrasound quality assurance program. J Am Coll Emerg Physicians Open. 2020;1:124-31. doi: 10.1002/emp2.12018.

18. Russell FM, Ehrman RR, Barton A, Sarmiento E, Ottenhoff JE, Nti BK. B-line quantification: comparing learners novice to lung ultrasound assisted by machine artificial intelligence technology to expert review. Ultrasound J. 2021;13:33. doi: 10.1186/s13089-021-00234-6.

19. Wrenn KD, Slovis CM. Misguided residency applicant questions? Acad Emerg Med. 1999;6:981-3. doi: 10.1111/j.1553-2712.1999.tb01178.x.

20. Aziz MF, Berkow L. Pro-Con Debate: Videolaryngoscopy Should Be Standard of Care for Tracheal Intubation. Anesth Analg. 2023;136:683-8. doi: 10.1213/ANE.0000000000006252.

21. Lichtenstein D. Clinical ultrasound in the age of artificial intelligence. Health Manag J. 2019;19:154-6.

22. Work With New Electronic 'Brains' Opens Field For Army Math Experts. Hammond Times. 1957;65.

23. Food and Drug Administration. Class 2 Device Recall Vscan Extend. [accessed Jan 2024] Available at: https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfRes/res.cfm?ID=173162

24. Blaivas M, Pawl R. Analysis of lawsuits filed against emergency physicians for point-of-care emergency ultrasound examination performance and interpretation over a 20-year period. Am J Emerg Med. 2012;30:338-41. doi: 10.1016/j.ajem.2010.12.016.

25. Stolz L, O'Brien KM, Miller ML, Winters-Brown ND, Blaivas M, Adhikari S. A review of lawsuits related to point-of-care emergency ultrasound applications. West J Emerg Med. 2015;16:1-4.

Conflitos de Interesse: O Dr. Blaivas prestou consultoria para as seguintes empresas relevantes nos últimos 12 meses: Anavasi Diagnostics, AutonomUS, ThinkSono, Intuitap Medical

Fontes de Financiamento: Não existiram fontes externas de financiamento para a realização deste artigo.

Proveniência e Revisão por Pares: Não comissionado; revisão externa por pares.

Conflict of Interest: Dr. Blaivas has consulted for the following relevant companies within the past 12 months: Anavasi Diagnostics, AutonomUS, ThinkSono, Intuitap Medical

Financial Support: This work has not received any contribution, grant or scholarship.

Provenance and Peer Review: Not commissioned; externally peer re-viewed.

© Autor (es) (ou seu (s) empregador (es)) e Revista SPMI 2024. Reutilização permitida de acordo com CC BY-NC 4.0. Nenhuma reutilização comercial. © Author(s) (or their employer(s)) and SPMI Journal 2024. Re-use permitted under CC BY-NC 4.0. No commercial re-use.

Received: March 08, 2024; Accepted: March 12, 2024

Correspondence / Correspondência: Michael Blaivas - mike@blaivas.org
Department of Medicine, University of South Carolina School of Medicine, 6311 Garners Ferry Rd, Columbia, SC 29209, United States of America
