  • Original Article
  • Open access

Real-time video transmission of ultrasound images to an iPhone

Abstract

Background

As point-of-care ultrasound spreads across the globe, there is an increased need for training and supervision of ultrasound studies. Real-time oversight is important, especially in critically ill patients, but often an expert ultrasound over-read is not available on location. Technological advances have improved data transmission so that images and videos can be sent great distances very rapidly. In this study, we examine the feasibility of real-time wireless transmission of ultrasound video to an iPhone.

Methods

An ultrasound machine was connected via a video converter to a laptop. iCam (SKJM, LLC) software was used to transmit the video across the Atlantic Ocean to an iPhone. Images typical of those acquired in an emergency department were sent, in random order, by a ‘scanning physician.’ An ‘interpreting physician’ overseas was asked to identify the anatomy and the presence or absence of pathology, and to comment on the quality, speed, and delay of transmission.

Results

Rapid image transmission was feasible, and the ‘interpreting physician’ was always able to correctly identify the anatomy and orientation. The average delay was minimal (2.7 s), allowing for real-time feedback. The frame rate was markedly lower in the received images than in the transmitted images, and was higher when the iPhone was connected via WiFi (1.1 fps) than via a 3G connection (0.4 fps).

Conclusion

Transmission of real-time ultrasound video to a remote iPhone using inexpensive technology is feasible, with preservation of image quality and minimal delay. Transmission was faster over a WiFi connection than over a 3G connection.

Background

Point-of-care ultrasound has undergone a dramatic expansion to many new settings over the last two decades. With technological advances, machines have become smaller, more portable, and less expensive. As a result, machines are more accessible and, therefore, worldwide there are more novice sonographers who require training and supervision. Point-of-care ultrasound is now frequently used outside of the emergency department in novel ways and for novel applications. Novice sonographers with limited experience may need real-time assistance in acquisition or interpretation of these images.

Technological advances in wireless communication and data transmission have also progressed rapidly in recent years. Products such as the iPhone (Apple, Inc.) have revolutionized the abilities of handheld phones and have made high-speed wireless transmission and reception of data via a 3G network accurate and immediate. iPhone functionality is greatly enhanced with the installation of various applications. iCam (SKJM, LLC) is an application that streams live audio and video input from a webcam directly to an iPhone.

This technological proof-of-concept study analyzes the feasibility of using the iCam software to transmit real-time ultrasound video to an iPhone in a remote setting. Such an application could have important implications for point-of-care ultrasound when expert image interpretation is not readily available, for example in pre-hospital transport, community hospitals, small villages in the underdeveloped world, high-altitude clinics, the wilderness environment, or even on the battlefield.

We sought to answer the following specific questions:

  1. Is it possible to transmit live ultrasound images via iCam software?

  2. Is the image quality (spatial resolution) sufficient to identify anatomy?

  3. Is the image frame rate (temporal resolution) fast enough to interpret moving images?

  4. Is the delay minimal enough to make real-time decisions and provide real-time guidance?

Methods

A schematic and photo of the hardware set-up are shown in Fig. 1. A Sonosite Micromaxx (Sonosite, Inc., Bothell, WA) with a curvilinear probe (2–5 MHz) on abdominal settings was used by the “scanning physician” (SP), an emergency physician with fellowship training in emergency ultrasound, to acquire images. The ultrasound machine was connected via an S-video cable to a converter. S-video carries an analog video signal as two separate components, brightness and color.

Fig. 1 A schematic of the hardware set-up. Solid lines represent hard-wired connections; dotted lines represent wireless connections

The signal entered an Advanced Digital Video Converter (ADVC-55, Grass Valley, Thomson Inc.). This converter is inexpensive (approximately $150 US), commercially available, and does not require specialized software or a separate power supply. The video signal was transmitted in NTSC format at 720 × 480 pixels and 29.97 frames per second (fps) via an IEEE 1394 cable, also known as FireWire, which is capable of transmitting data at up to 800 Mbit/s.
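
To put these numbers in perspective, the sketch below (Python) compares the uncompressed NTSC bit rate with the FireWire capacity cited above and with the roughly 140 Kb/s that was ultimately transmitted (see Results). The 16 bits per pixel figure (YUV 4:2:2 sampling) is an assumption, as the converter's actual pixel format is not reported in this study; the point is only that heavy compression is required before the video leaves the laptop.

```python
# Back-of-the-envelope bandwidth arithmetic (illustrative sketch).
# Assumption: 16 bits per pixel (YUV 4:2:2); the actual pixel format of the
# ADVC-55 output is not reported in the study.
width, height, fps = 720, 480, 29.97       # NTSC frame size and frame rate
bits_per_pixel = 16

raw_bps = width * height * bits_per_pixel * fps
print(f"Uncompressed NTSC stream: ~{raw_bps / 1e6:.0f} Mbit/s")   # ~166 Mbit/s

firewire_mbps = 800                         # FireWire capacity cited in the text
transmitted_kbps = 140                      # average reported in Results
print(f"Within FireWire capacity: {raw_bps / 1e6 < firewire_mbps}")       # True
print(f"Compression factor needed for the uplink: "
      f"~{raw_bps / (transmitted_kbps * 1e3):.0f}x")                # ~1184x
```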

The FireWire cable entered a MacBook Pro laptop (Apple, Inc.) with a 2.4 GHz Intel Core 2 Duo processor and 2 GB of memory (667 MHz SDRAM). The operating system was Mac OS X 10.5.7.

iCam and iCamSource (SKJM, LLC) were used to view and send the image. These are inexpensive applications (approximately $5 US) and are available online. iCamSource can send video and audio as a signal which can be viewed on an iPhone with the iCam software. The signal is password protected. iCamSource was installed on the MacBook Pro. The ADVC-55 video feed was selected as the input for the iCamSource. The audio input was turned off.
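
The internals of iCamSource are not described here, but the general kind of pipeline it implements (grab frames from a capture device, compress them, and push them over the network to a viewer) can be illustrated with a minimal sketch. The example below uses Python with OpenCV and a plain length-prefixed TCP stream; the device index, receiver address, and JPEG quality are hypothetical values, and this is an analogy, not the iCamSource implementation.

```python
# Minimal capture-and-stream sketch (not iCamSource): read frames from a
# video capture device, JPEG-compress them, and send each frame over TCP.
import socket
import struct

import cv2

DEVICE_INDEX = 0                    # hypothetical: the converter's video feed
RECEIVER = ("203.0.113.5", 9000)    # hypothetical viewer address and port
JPEG_QUALITY = 70                   # trades image quality against bandwidth


def stream() -> None:
    cap = cv2.VideoCapture(DEVICE_INDEX)
    if not cap.isOpened():
        raise RuntimeError("capture device not available")
    with socket.create_connection(RECEIVER) as sock:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            ok, buf = cv2.imencode(
                ".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, JPEG_QUALITY]
            )
            if not ok:
                continue
            data = buf.tobytes()
            # Length-prefix each frame so the receiver can split the stream.
            sock.sendall(struct.pack(">I", len(data)) + data)
    cap.release()


if __name__ == "__main__":
    stream()
```

A real deployment would also need authentication (the iCam stream is protected by a username and password) and a receiver that decodes and displays the frames.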

The MacBook Pro was connected through an internal 802.11g wireless network card (AirPort) to an 802.11g wireless router, which connected to a home DSL internet connection over existing telephone lines.

The “interpreting physician” (IP), a physician with training in emergency ultrasound, received the signal at a transatlantic location over 5,000 km away (Boston, USA, to Oxford, UK). An iPhone with the iCam software installed was connected to the local network via either a cellular 3G connection or a WiFi signal. The SP and IP communicated via Skype, a free internet telephony service.

The SP performed ultrasound scans that represented applications and views typical of those performed in emergency settings. These included a FAST examination, a cardiac ultrasound, an aorta scan, a thoracic ultrasound for lung sliding, and an internal jugular vein scan (to simulate procedural guidance). The IP was blinded to which parts of the body were being scanned. For each video, the IP determined: (1) the anatomy and orientation, (2) whether or not there was pathology, (3) whether the frame rate was adequate for real-time interpretation, and (4) whether the delay was minimal enough to give real-time feedback. All scans were done first with the iPhone connected to the local network via a 3G connection, and then via a WiFi connection.

Results

The IP interpreted, in real time, each of the eight video clips. Figure 2 shows a photo with a representative image as seen on the iPhone. The interpretations were communicated verbally via Skype. A summary of the results is shown in Table 1. The delay was 2.7 s, regardless of the type of clip or of whether the iPhone was connected via 3G or WiFi. The average bandwidth of the transmitted signal was 140 Kb/s (range 120–160 Kb/s). The calculated frame rate was 0.4 fps when the iPhone was connected via the 3G network and 1.1 fps when connected via WiFi. The frame rate of the original transmitted signal was 11.9 fps.

Fig. 2 An example of an image (right upper quadrant, Morison’s pouch) transmitted to an iPhone

Table 1 Interpretations of transmitted clips

Discussion

Point-of-care ultrasound is a well-established practice within emergency departments in the United States and much of the developed world. This kind of diagnostic imaging is different from traditional or “formal” ultrasonography or radiology-performed imaging in which a study is ordered by a treating physician, performed by an ultrasound technician, interpreted by a radiologist, and then the results acted upon by the initial treating physician and incorporated into clinical decision-making. Point-of-care ultrasound involves the melding of these three functions into one as scans are performed at the bedside by a clinician trained in the acquisition and interpretation of ultrasound images.

As the use of point-of-care sonography spreads worldwide, so too does its potential. Ultrasound is being used increasingly often in less traditional places and manners. However, not every location that uses point-of-care ultrasound has the resources, education, and supervision that most academic emergency departments have. Training all users and potential users of ultrasound in image acquisition and interpretation would be a formidable endeavor. It is in these situations that remote transmission of real-time video images could play an important role.

Telesonography has been studied in some of these settings [1–3]. It has been shown to be feasible in the pre-hospital setting [4, 5], in austere environments [6], aboard merchant ships with ship officers as operators [7], in outer space [8], and in transfers from lower-acuity hospitals to trauma centers [9, 10]. Studies have also shown that image acquisition itself can be performed remotely using robots [11–16]. In addition, one could imagine the potential in other settings where ultrasound machines exist but usually without much educational or clinical oversight, such as underdeveloped countries or community hospitals without an ultrasound expert readily on hand for difficult cases. A needs-analysis study in Queensland, Australia showed that up to 8% of studies done in community hospitals would have benefited from telesonography consultation, in the form of diagnostic advice, scanning technique, or patient management advice [17]. 3G networks already exist in much of the developing world where resources are scarce; a sonographer could transmit images wirelessly to an expert interpreter anywhere in the world, who could then provide immediate bedside assistance from thousands of kilometers away. Military applications are also possible: portable ultrasound machines could be carried onto the battlefield by soldiers and the images interpreted remotely by experts, providing decision support regarding treatment or transport.

This study is a technological proof-of-concept. It differs from many other studies of telesonography in that it uses inexpensive, commercially available technology to transmit and view the images. Other studies have examined ultrasound images sent from handheld phones [18]; what is novel about this study is that the images are transmitted to a handheld phone. Cellular phones have the advantage of being easily portable and of already being items that a clinician would have readily available. The use of cellular phones as viewing instruments also obviates the need for dedicated ‘reading rooms.’

The iCam software allows for preservation of image quality and remote interpretation of ultrasound scans. Some of its many features expand its potential use as a clinical and educational tool. Up to four scans can be viewed simultaneously, allowing for one remote expert interpreter to oversee multiple scans from different parts of the world. The video is protected with a password, which enhances security of transmission. Likewise, anyone with the username and password can log into the live video stream, allowing for multiple interpreters of the same video. In addition, since audio feed can also be transmitted, iCam software has potential as a teaching tool—educational modules with live ultrasound video and real-time audio commentary could be transmitted, and learners worldwide could log in and have access to this teaching in the palms of their hands.

The question of adequacy of images relates most to the capabilities of the wireless internet connections. With infinite bandwidth, received images should have the same quality and frame rate as transmitted ones. When bandwidth is limited, the number of pixels transferred per second has a finite maximum. This results in degradation of image quality, a drop in the number of frames per second, a delay in transmission, or some combination of these. Literature exists on the effects of technological advances, variable compression, and bandwidth on real-time video transmission [19–23].
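
As a rough illustration of this trade-off, the achievable frame rate is approximately the available bandwidth divided by the size of a compressed frame. The sketch below assumes a compressed frame of about 15 kB, a figure chosen for illustration and not measured in this study; with the 140 Kb/s average bandwidth reported in the Results, this assumption yields a frame rate of the same order as the 1.1 fps observed over WiFi.

```python
# Frame rate vs. bandwidth (illustrative sketch).
# Assumption: each compressed frame is ~15 kB; this is not a measured value.
def max_fps(bandwidth_kbps: float, frame_size_kB: float) -> float:
    """Approximate maximum frame rate for a given link bandwidth."""
    return (bandwidth_kbps * 1e3) / (frame_size_kB * 8e3)

ASSUMED_FRAME_KB = 15
for kbps in (140, 500, 2000):       # 140 Kb/s was the measured average
    print(f"{kbps:>5} Kb/s -> ~{max_fps(kbps, ASSUMED_FRAME_KB):.1f} fps")
# 140 Kb/s with ~15 kB frames gives ~1.2 fps, consistent in order of
# magnitude with the 1.1 fps observed over WiFi.
```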

The results of this study support the concept that remote interpretation of images on an iPhone is feasible. Image quality (spatial resolution) as viewed on the iPhone was always preserved. In every instance, the IP was able to correctly identify the anatomy and orientation, and whether or not pathology was visible. The delay of images was also minimal, only 2.7 s on average. The SP and IP had a parallel live audio connection, and the IP was able to interpret images seconds after they were acquired.

Frame rate (temporal resolution) dropped significantly in the received images. Frame rate as viewed on the laptop was equal to the frame rate viewed on the ultrasound machine, approximately 11.9 fps at a scanning depth of 13 cm. On the iPhone, the frame rate dropped to approximately 1.1 fps at this depth when connected via WiFi (approximately 11 times slower), and 0.4 fps when connected via 3G (approximately 31 times slower). For the majority of applications (FAST, aorta, pericardial effusion), images viewed at this slow frame rate were still sufficient to make an accurate diagnosis. Frame rate did affect the ability to judge cardiac ejection fraction, as accurate estimation requires real-time visualization of motion. For the same reason, one would expect other motion-dependent applications, such as advanced echocardiography, Doppler, or procedural guidance, to be challenging as well. While actual lung sliding was difficult to perceive at 0.4 fps, the lung tissue deep to the pleural line was visualized in different positions. In addition, comet tails were observed, and these findings together were enough for the IP to make the interpretation of “no pneumothorax.” Frame rate depends on scanning depth; on the ultrasound machine it ranged from 24.8 fps at a depth of 4.7 cm to 5.8 fps at a depth of 30 cm. We did not measure the frame rate of the received images at these other depths, but presumably it would have decreased proportionately.
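
The proportionality assumption in the last sentence can be made concrete: if the received frame rate scales by the same factor measured at 13 cm over WiFi (1.1 / 11.9), the projected received rates at the other depths would be roughly as follows. These are extrapolations under that assumption, not measurements.

```python
# Projected received frame rates under the stated proportionality assumption
# (extrapolation only; only the 13 cm values were actually measured).
WIFI_FACTOR = 1.1 / 11.9            # received / transmitted, measured at 13 cm

machine_fps_by_depth_cm = {4.7: 24.8, 13.0: 11.9, 30.0: 5.8}
for depth, machine_fps in machine_fps_by_depth_cm.items():
    projected = machine_fps * WIFI_FACTOR
    print(f"depth {depth:>4.1f} cm: machine {machine_fps:>4.1f} fps "
          f"-> projected received ~{projected:.1f} fps over WiFi")
```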

Although no procedures were performed, the SP did transmit images of an internal jugular vein to simulate the ultrasound guidance associated with central venous access. While not specifically studied, it is the opinion of the authors that a delay of 2.7 s would likely be too long for real-time guidance of central venous catheterization; by the time an image was received and feedback given, the needle would likely be in a different location. A slow and deliberate procedure would in theory be possible, but great care would be needed to minimize any movement between image transmission and interpretation. A WiFi connection for reception of images had a faster frame rate than 3G and would be preferable if this were attempted.

Limitations

There were several limitations to our study. First, no abnormal findings were transmitted. The IP felt that the image quality was good enough that pathology would have been easily recognizable, but since this was not actually studied, this remains a postulation rather than a proven finding. Further studies with both normal and abnormal scans would better characterize the ability to accurately differentiate between positive and negative findings. In addition, multiple IPs and a scoring system would be needed to grade image quality.

The SP had substantial prior experience with ultrasound, and hence the images were of high quality. These may not represent the images that novice sonographers would obtain, so the generalizability of this study may be limited.

The set-up used to transmit the images to the iPhone does not allow for two-way audio. For the IP to provide feedback and real-time interpretation to the SP, a parallel laptop-to-laptop audio conference was established via Skype. This limits the utility of the iCam software as a sole means of real-time video transmission with live feedback. In fact, if an IP is able to communicate via a Skype connection, an iPhone may not be needed. A computer-based audio connection is not strictly required (a simple telephone call may suffice), but a separate audio connection is required for real-time communication.

A digital video converter was required to convert and transmit signals from the ultrasound machine to a laptop. Some ultrasound machines are PC-based and have the ability to connect directly to the internet. While these were not evaluated in this study, it may be possible to send images directly from such a machine, obviating the need for a video converter and a separate laptop; further research is needed. Finally, the cost of transmission and reception was not studied. Both the SP and the IP had unlimited data transfer available, so this was not an issue, but it would need to be considered in situations where this is not the case.

Conclusion

Transmission of real-time ultrasound video to a remote iPhone using inexpensive technology is feasible, with preservation of image quality and minimal delay. Transmission was faster over a WiFi connection than over a 3G connection. Further studies with a wider range of anatomy and with more novice sonographers are needed.

References

  1. Emerson DS, Felker RE (1995) Remote real-time ultrasound interactive telediagnosis: putting it into practice. J Ambul Care Manage 18:20–34

  2. Landwehr JB, Zador IE, Wolfe HM, Dombrowski MP, Treadwell MC (1997) Telemedicine and fetal ultrasonography: assessment of technical performance and clinical feasibility. Am J Obstet Gynecol 177:846–848

  3. Popov V, Popov D, Kacar I, Harris RD (2007) The feasibility of real-time transmission of sonographic images from a remote location over low-bandwidth internet links: a pilot study. Am J Roentgenol 188:W219–W222

  4. Garrett PD, Boyd SY, Bauch TD, Rubal BJ, Bulgrin JR, Kinkler ES (2003) Feasibility of real-time echocardiographic evaluation during patient transport. J Am Soc Echocardiogr 16:197–201

  5. Strode CA, Rubal BJ, Gerhardt RT, Bulgrin JR, Boyd SY (2003) Wireless and satellite transmission of prehospital focused abdominal sonography for trauma. Prehosp Emerg Care 7:375–379

  6. Huffer LL, Bauch TD, Furgerson JL, Bulgrin J, Boyd SY (2004) Feasibility of remote echocardiography with satellite transmission and real-time interpretation to support medical activities in the austere medical environment. J Am Soc Echocardiogr 17:670–674

  7. Nikolic N, Mozetic V, Modrcin B, Jaksic S (2006) Might telesonography be a new useful diagnostic tool aboard merchant ships? A pilot study. Int Marit Health 57:198–207 (discussion 208–212)

  8. Arbeille P, Herault S, Roumy J, Porcher M, Besnard S, Vieyres P (2001) 3D realtime echography and echography assisted by a robotic arm for investigating astronauts in the ISS from the ground. J Gravit Physiol 8:P143–P144

  9. Al-Kadi A, Dyer D, Ball CG et al (2009) User’s perceptions of remote trauma telesonography. J Telemed Telecare 15:251–254

  10. Dyer D, Cusden J, Turner C et al (2008) The clinical and technical evaluation of a remote telementored telesonography system during the acute resuscitation and transfer of the injured patient. J Trauma 65:1209–1216

  11. Arbeille P, Capri A, Ayoub J, Kieffer V, Georgescu M, Poisson G (2007) Use of a robotic arm to perform remote abdominal telesonography. Am J Roentgenol 188:W317–W322

  12. Boman K, Olofsson M, Forsberg J, Bostrom SA (2009) Remote-controlled robotic arm for real-time echocardiography: the diagnostic future for patients in rural areas? Telemed J E Health 15:142–147

  13. Canero C, Thomos N, Triantafyllidis GA, Litos GC, Strintzis MG (2005) Mobile tele-echography: user interface design. IEEE Trans Inf Technol Biomed 9:44–49

  14. Delgorge C, Courreges F, Al Bassit L et al (2005) A tele-operated mobile ultrasound scanner using a light-weight robot. IEEE Trans Inf Technol Biomed 9:50–58

  15. Istepanian RH, Philip N, Martini MG, Amso N, Shorvon P (2008) Subjective and objective quality assessment in wireless teleultrasonography imaging. Conf Proc IEEE Eng Med Biol Soc 2008:5346–5349

  16. Takeuchi R, Harada H, Masuda K, Ota G, Yokoi M, Teramura N, Saito T (2008) Field testing of a remote controlled robotic tele-echo system in an ambulance using broadband mobile communication technology. J Med Syst 32:235–242

  17. Lewis C (2005) A tele-ultrasound needs analysis in Queensland. J Telemed Telecare 11(Suppl 2):S61–S64

  18. Blaivas M, Lyon M, Duggal S (2005) Ultrasound image transmission via camera phones for overreading. Am J Emerg Med 23:433–438

  19. Bassignani MJ, Dwyer SJ, Ciambotti JM et al (2004) Review of technology: planning for the development of telesonography. J Digit Imaging 17:18–27

  20. Brebner JA, Ruddick-Bracken H, Brebner EM et al (2000) The diagnostic acceptability of low-bandwidth transmission for tele-ultrasound. J Telemed Telecare 6:335–338

  21. Finley JP, Justo R, Loane M, Wootton R (2004) The effect of bandwidth on the quality of transmitted pediatric echocardiograms. J Am Soc Echocardiogr 17:227–230

  22. Malone FD, Athanassiou A, Nores J, D’Alton ME (1998) Effect of ISDN bandwidth on image quality for telemedicine transmission of obstetric ultrasonography. Telemed J 4:161–165

  23. Wootton R, Dornan J, Fisk NM et al (1997) The effect of transmission bandwidth on diagnostic accuracy in remote fetal ultrasound scanning. J Telemed Telecare 3:209–214


Conflict of interest

This research was not sponsored by any company. None of the authors has any financial relationship with Sonosite, Inc.; SKJM, LLC; Grass Valley or Thomson, Inc.; Apple; or Skype; or any of their products including Micromaxx, iCam, ADVC-55, or iPhone. The authors have full control of all primary data.

Author information

Correspondence to Andrew S. Liteplo.

Additional information

Previous Presentation: Winfocus V in Sydney, Australia, 3–5 September 2009 by Dr. Liteplo.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License ( https://creativecommons.org/licenses/by/2.0 ), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Liteplo, A.S., Noble, V.E. & Attwood, B. Real-time video transmission of ultrasound images to an iPhone. Crit Ultrasound J 1, 105–110 (2010). https://doi.org/10.1007/s13089-010-0025-4
