Tuesday, July 2, 2019

A Baker's Dozen of Highlights from Deep Medicine by Eric Topol

Eric Topol's Deep Medicine is a significant contribution to American medicine and should be required reading for anyone interested in the present and future of healthcare delivery. I read the book as a student and teacher of digital health technologies and quickly discovered that my understanding of AI in medicine was superficial at best. In my quest to overcome this deficit, I found the following observations from the Topol book to be of particular relevance.

1. The first major entry of AI into the practice of medicine was automated systems for reading ECGs, which were first applied in the 1970s and became routine in the 1980s.
2. Deep neural networks (DNNs) are the driving force behind AI innovations in health. The DNN era was made possible by four components: massive data sets, dedicated graphics processing units (GPUs), cloud computing, and open-source algorithmic development (a brief code sketch of what these open-source frameworks make possible appears after this list).
3. Deep medicine incorporates three components:
a. Deeply define the medical essence of an individual, including medical, social, behavioral, and family histories, along with biology
b. Deep learning involving pattern recognition and machine learning as well as access to virtual health coaches
c. Deep empathy and connection between patients and clinicians (Topol would argue this is the most important of the three).
4. In contrast, shallow medicine is arguably where much of the practice of medicine is today. Patients exist in a world of insufficient data, time, context, and presence.
5. Most of the published deep learning examples represent only in silico (computer-based) validation, as opposed to prospective clinical trials involving real patients. This is an important distinction!
6. Radiology, pathology, and dermatology practices all involve pattern recognition, a task at which AI excels. Topol believes that eventually all medical scans will be interpreted by machines. The machines will also produce an initial draft of the scan report, and the radiologist will sign off and make the finding official. An interesting aside: it was noted that pigeons can discriminate between complex visual stimuli. A 2015 study showed that pigeons could be trained to read x-ray images and that flock-sourced readings were remarkably accurate!
7. It seems likely that machines will outperform humans on specific, narrowly focused tasks, suggesting that narrow AI will take hold.
8. For many clinicians, workflow can improve through faster and more accurate reading of scans and slides, with narrow AI catching things humans would miss, or by eliminating the keyboard during a clinic visit, improving communication and presence.
9. Virtual medical assistants show promise, but no randomized controlled trials have shown improved clinical outcomes. For now, the products have utilized small retrospective or observational studies to demonstrate their worth.
10. We cannot realize the full potential of deep medicine unless we have a virtual medical assistant helping us out. Neither patient nor doctor is going to be able to process all of the continually expanding medical and biological data.
11. The electronic health record is a narrow, incomplete, error-laden view of an individual's health. This situation presents a significant bottleneck for the virtual medical assistant of the future.
12. The contrast between the levels of self-driving cars (Level 0/No Automation to Level 5/Full Automation) and the practice of AI medicine is worthy of consideration. Topol believes that while Level 4 (High Automation) may be achievable for vehicles under ideal driving conditions, it is unlikely that AI medicine can move beyond Level 3 (Conditional Automation). He asserts that patients would never tolerate a lack of oversight by human clinicians across all conditions all the time.
13. In summation, Topol contends that the potential gains of deep medicine can serve to bring back "real medicine" which in his view, consists of presence, empathy, trust, caring, and being human.
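
To make point 2 concrete, here is a minimal sketch (my own illustration, not an example drawn from the book) of how an open-source deep learning framework such as PyTorch lets anyone define a small image classifier of the kind used for medical scans and run it on a GPU when one is available:

import torch
import torch.nn as nn

# A small convolutional network of the kind used for medical image
# classification (e.g., flagging a chest x-ray as normal or abnormal).
class TinyScanClassifier(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # single-channel (grayscale) input
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 56 * 56, num_classes)  # sized for 224x224 inputs

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# GPUs (local or in the cloud) supply the compute; here we just use whatever is available.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = TinyScanClassifier().to(device)
scan = torch.randn(1, 1, 224, 224, device=device)  # stand-in for a real, de-identified scan
print(model(scan).softmax(dim=1))

The point is not this toy (and untrained) model, but how little code the open-source frameworks now require; the other ingredients in point 2, the massive data sets and the GPU and cloud capacity, still have to come from somewhere.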



Monday, April 29, 2019

Checklist to Evaluate the Internet of Health Things


Recently, I participated in the evaluation of a "smart bed" for a post-acute care entity here in St. Petersburg. After the evaluation was completed, it became apparent to me that we had no formal evaluation process we could use to ensure that any IoHT device that came our way would fulfill our requirements for patient safety and organizational fit. Therefore, I decided to prepare a checklist that could be used by any care provider to evaluate an IoHT device, be it a smartwatch, a portable EKG, or any one of the myriad wearables now coming to market. (A brief sketch of how the checklist responses might be tracked in code follows the list.)


                                      Checklist



1. Does use of the device/wearable fit within the organization's values and culture?

2. What is the impact on care and quality of life for patients, residents, and families?

3. Will it enhance caregiver productivity? How?

4. Is it HIPAA compliant?

5. Has it been cleared by the FDA? When? Approved by the FDA? When?

6. What are the privacy and security protections?

7. Is the device/wearable interoperable with the organization's electronic health record?

8. With respect to user experience: Is the device easy to understand and use? Is there an operating manual readily available? Does the manufacturer offer a help desk?

9. Can you access proof-of-concept studies?

10. What are the installation requirements, and how will ongoing maintenance and support be handled? How will upgrades/improvements be handled?

11. Is there a smartphone app for both iPhone and Android?

12. Is website access available for the device/wearable?

13. How does its performance compare with that of similar devices/wearables?

14. Are there daily or weekly use reports? Dashboards?

15. Does the device/wearable pass muster with state and federal laws and regulations?

16. Is the device Continua Certified? When? Note: The Continua Compliance and Certification programs help organizations validate both compliance and interoperability for personal connected health devices and services in accordance with the Continua Design Guidelines (CDG).
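
For teams that want to track their answers to this checklist across several candidate devices, here is a minimal sketch in Python of one way to record and review the responses (the field names and the pass/fail logic are my own invention, not an established standard):

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ChecklistItem:
    question: str
    answer: str = ""                          # free-text notes from the evaluation
    meets_requirement: Optional[bool] = None  # None means not yet assessed

@dataclass
class IoHTEvaluation:
    device_name: str
    items: List[ChecklistItem] = field(default_factory=list)

    def open_items(self) -> List[ChecklistItem]:
        # Items the evaluation team still needs to assess
        return [i for i in self.items if i.meets_requirement is None]

    def passes(self) -> bool:
        # A device "passes" only when every item has been assessed and met
        return all(i.meets_requirement for i in self.items)

evaluation = IoHTEvaluation(
    device_name="Smart bed (example)",
    items=[
        ChecklistItem("Fits the organization's values and culture?"),
        ChecklistItem("HIPAA compliant?"),
        ChecklistItem("FDA cleared or approved? When?"),
        ChecklistItem("Interoperable with the organization's EHR?"),
    ],
)
print(len(evaluation.open_items()), "items still to assess")

In practice, each of the sixteen questions above would become a ChecklistItem, and the open_items() list is a convenient agenda for the next evaluation meeting.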

Friday, March 8, 2019

Essential Building Blocks for Digital Health Technology Adoption

I choose to believe that all types of post-acute care facilities should take steps to incorporate digital health technologies into their settings. It will be a competitive advantage, to be sure. And it won't be particularly easy. However, the technologies have the potential to keep seniors living independently for extended periods, to support caregivers and families, and to improve the quality of care for patients and residents.

If you accept this premise, then I would suggest that you consider applying the following suggestions, which, in combination, are most likely to lead to a successful digital health technology initiative.

1. Secure commitment from the Board of Trustees and top management to support the initiative at the outset and throughout the implementation period.
2. Incorporate digital health technology experimentation and adoption in the entity's strategic plan.
3. Encourage a culture of innovation ("Failing your way to success!"). I worry that a process-heavy or top-down culture may scuttle an otherwise successful technology effort.
4. Install enterprise-wide Wi-Fi, and try to identify and eliminate dead zones as much as possible. Having Wi-Fi in common areas only just won't cut it.
5. Identify product champions in management, nursing, and information technology, and give them the resources and recognition necessary to get the job done.
6. Conduct extensive research on alternative technologies, and select the one that is most likely to be accepted in your particular facility. The first technology chosen might be voice, telehealth, or care robots. Don't try to pilot more than one technology at a time. Do one well before considering another technology. Consider that, over time, you'll be adopting a number of new or improved technologies in support of your mission.
7. Once you decide to pilot a technology, be sure to fund it adequately and establish metrics that include customer satisfaction and improved clinical outcomes.
8. Be sure to include a guided orientation to the new technology, and consider a "Help Desk" to support everyday users. Often, seniors are unsure of how to enable or disable an Alexa skill, or how to pair Alexa with wireless lighting, air conditioning, or security devices.
9. Encourage residents to chart their own way with the technologies and to share their stories of success or failure. Tech-savvy seniors may be more numerous than you might imagine.
10. Seek out academic partners to provide guidance and support for your initiatives. It is in everybody's best interest to ensure that the healthcare leaders of tomorrow are well versed in advanced technologies!

I hope that these suggestions are useful. I welcome your critical commentary.



Thursday, March 7, 2019

Brookings Institution 2018 Survey on Robots

The survey was undertaken by researchers at the Brookings Institution through an online U.S. national poll of 2,021 adult internet users between June 4 and 6, 2018. Here are some of the key survey findings, followed by my observations.

1. The survey asked how likely it is that robots will take over most human activities within the next 30 years. 19% felt this was very likely, 33% believed it was somewhat likely, 23% felt it was not very likely, and 25% were not sure.

The headline for the Brookings post claims "52% believe robots will perform most human activities in 30 years." Combining the very likely (19%) and somewhat likely (33%) responses into a broad affirmation of the 30-year scenario seems to me to be a bit overstated. I honestly don't think the people I encounter have fixed ideas about robots. Rather, what I witness is ambiguity and a bit of anxiety and awkwardness when confronting a care robot.


2. Thirty-two percent believe the U.S. government should set up a Federal Robotics Commission to regulate robot development and usage, compared to 29 percent opposed and 39 percent who were unsure. 

I am not seeing a significant mandate for Federal governmental action in these findings.

3. Sixty-one percent said they were uncomfortable with robots, while only 16 percent felt comfortable with robots and 23 percent were unsure. When asked how worried they were about robots, 61 percent said they were unworried, while 29 percent were worried and 22 percent were not sure.

How are we supposed to reconcile the finding that 61% of respondents are uncomfortable with robots, with the finding that 61% are unworried about robots? It doesn't make sense to me.

4. Thirty-eight percent felt robots would make their lives easier in the next five years, while 17 percent felt their lives would become harder and 45 percent did not know.

I believe the 45 percent of respondents who said they didn't know is the more useful finding. If someone has never interacted with a care robot, for example, how are they supposed to judge whether it will make their life easier? Today, most Americans have had little or no personal interaction with care robots, or robots of any kind.

5. When asked how common they thought robots would become over the next five years, 13 percent said very common, 32 percent said somewhat common, 26 percent felt they would not be very common, and 29 percent did not know.

I believe that care robots won't be commercially viable for three years or more, putting me in the "not very common" group. For now, pet robots are being used in a number of post-acute care settings, and the Front Porch system has published a definitive study on the positive clinical impact of a robot pet called PARO. Kaiser Permanente is making a care robot named MABU available to some of its members to help manage their chronic diseases. A nurse assistant robot called MOXI by Diligent Robotics is being piloted in Dallas and Austin hospitals. Interesting developments, to be sure, but far from widespread acceptance.

6. The survey asked about the kinds of robots that would interest respondents. Twenty percent were interested in robots that would help them clean house, 17 percent wanted robots that would provide home security, and only 9 percent were interested in a robot that helps to care for a child or aging relative.

To me, the key finding is the 9% who were interested in a robot that cares for a child or an aging relative. It doesn't suggest much market demand for robots to serve children or aging clients. Looking at a more nuanced breakdown, we observe the following.


6a. How interested are you in having a robot that helps care for a child or aging relative?


  • 71% very disinterested
  • 13% somewhat disinterested
  • 4% somewhat interested
  • 5% very interested
  • 7% don’t know / no answer

To me, the finding of 71% very disinterested is a stark reminder that while we may be willing to have robots wash dishes and clean the house, we draw the line at robots providing direct care. This distinction was made clear to me during a recent lecture when an RN in the audience was adamant that no robot could ever deliver compassionate care to an aging client!

7. The survey inquired how much people would pay for a robot that handles routine chores. Forty-two percent said they would pay $250 or less, 10 percent said they would pay between $251 and $500, 3 percent said they would pay between $501 and $750, 3 percent indicated they would pay between $751 and $1,000, and 3 percent were willing to pay more than $1,000. Thirty-nine percent did not provide a figure.

The price points of care robots that I cover range from $100 for a Hasbro Joy for All pet to $18,000 for a robot nursing assistant, with the average purchase price between $500 and $1,000, before adding monthly subscription fees. Two robots, JIBO and KURI, went out of business in 2018 because their price points were high ($700 to $900) and the units offered questionable value. There clearly is a sizeable gap between what respondents wish to pay for a care robot and what the vendors are now asking. And you can buy an Amazon or Google voice-activated smart display for $250, with no subscription fees. These devices present significant competition for care robots. Finally, there isn't a care robot in this country with a commanding market position enjoying a first-mover advantage.


Source: https://www.brookings.edu/blog/techtank/2018/06/21/brookings-survey-finds-52-percent-believe-robots-will-perform-most-human-activities-in-30-years/

Sunday, March 3, 2019

Digital Health Technology Lecture Evaluation

My lectures are designed for active and curious adults who want to better understand and possibly apply digital health technologies to help them age in place. Their feedback, whether from questions during the lecture, conversations following the talk, or program evaluations, is vital to the ongoing enrichment of the material.

Immediately following a lecture, I seek out a quiet spot and prepare program notes. This process continues over the next few days until I have captured all of the relevant feedback that I can recall. I then review the notes, highlighting key points and insights, and refer back to them before I give another lecture on the same topic.


Here are ten things I look for as I examine my post-lecture notes:

1. Identify the slides that clearly connected with the audience and those that did not. I'll continue to use the former, and remove or edit the latter. For example, a big hit with the audience is attributing a Myers-Briggs rating to Alexa. The latest gadgets from the annual Consumer Electronics Show (CES), however, don't seem to have much traction.
2. Did the device demos work as intended? The most effective demos involve voice-activated devices, virtual reality viewers, and robot pets like the Hasbro Joy for All cat and dog. I have had less success with augmented reality and wearables (e.g., Apple Watch) demos.
3. The differences between digital health technologies are often blurred, and I try to take note of technology spillovers. For example, many care robots use voice activation, and telemedicine may well be delivered through Amazon or Google visual display devices.
4. I try to take note of repeated audience concerns about using digital health technologies. In my experience, the top three concerns are privacy/security, fall prevention, and overcoming loneliness.
5. Often members of the audience are looking for guidance in using the devices. I try to offer suggestions when appropriate. For example, I suggest that the introduction of voice-activated devices in a post-acute setting begin with an orientation, a "help desk" to assist with such concerns as enabling/disabling Alexa skills, and the formation of a Voice club or an Alexa club.
6. I encourage audience members to tell their stories about their experience with the technologies. One person recently noted that she gave Amazon Echo Dots to her seven grandchildren, and they really enjoyed them. A more common response is that attendees rely on their grandchildren to help them understand and manage the devices.
7. Attendees have a way of asking questions that you don't anticipate: How do you trigger a fall alert when you are unconscious? What happens when there is a power outage? How can robots give compassionate care to the frail elderly?
8. Have I given equal weight to the promoters and the critics of these devices? Being overly positive or overly negative upsets the balanced view needed by prospective consumers of the devices.
9. Did we look ahead three years or more and project how the devices might be used by the audience members? After all, we are in the very early stages of device adoption.
10. Did I allow enough time at the close of the talk for direct audience feedback and discussion?




Digital Health Technologies Lecture Preparation

John Wooden, the legendary UCLA basketball coach, famously cautioned that failing to prepare is preparing to fail. Therefore, I prepare. I have been following a pre-lecture routine for almost 20 years. Briefly, let me outline the steps I take prior to giving a lecture.
1. Over three separate days prior to the talk, I review the slides in presentation mode and also carefully review my lecture notes. This gives me a chance to make any last-minute edits to the material, which I often do!
2. I check with the lecture sponsor to learn the headcount for the lecture, and print my own handouts accordingly.
3. Ideally, I'd like to know the layout of the room, wireless availability and password access, seating arrangements, the location of the podium and microphones, and access to outlets for my demos. I try to make a site visit prior to the talk to check these things out.
4. I always bring a backup computer, projector, microphone, and power strip. Many times, the technology components in the lecture space are inadequate, so you need a failsafe support system. And, as a last resort, I am always prepared to give the lecture without using slides.
5. I arrive at least an hour before my lecture is scheduled to begin. Problems can then be identified and corrected well before the students arrive.
6. For students who arrive early, I usually provide an "early bird special" showcasing the technologies that will be discussed that day. For example, I conduct a face-off between the Amazon Echo Show and the Lenovo Smart Display with Google Assistant, or encourage attendees to try out several virtual reality viewers.
7. I prefer to handle my own introduction, and to keep it brief.
8. Attendees are encouraged to ask questions at any time. It keeps the audience engaged and makes them part of the presentation.
9. Lastly, I remind myself to repeat questions from the audience, and to check on the following: Can you hear me? Is the pace of delivery OK? Can you see and read the slides easily?

These steps have served me well over the years, and I commend them to you.

Tuesday, February 19, 2019

Digital Health Technologies Teaching Toolkit


I first began to teach a class called "Emerging Technologies Transforming Home Healthcare" in the Fall semester of 2017. It was offered to registrants at the Osher Lifelong Learning Institute (OLLI) at Eckerd College and at the University of South Florida.

I decided, from the outset, to include aging tech devices as an integral part of the lecture for illustrative purposes, and also to give attendees a chance to directly engage with the devices. My impression, based on immediate feedback and subsequent course evaluations, is that the attendees valued access to the devices and associated apps.

I'd like to share my inventory of devices and apps which proved to be successful supplements to the teaching material.

For Voice-Activated Devices: The Amazon Echo Show, Lenovo Smart Display with Google Assistant, and Google Home are featured. "Competitions" between the Amazon Show and the Google Smart Display were especially effective. Start by asking each device to introduce itself.

For Robot Pets: Hasbro's Joy for All pet dog (Golden Pup) and pet cat (Orange Tabby) are consistent hits with the attendees. We distribute the robots at the start of class, and let the participants play with them during and after class.

For Robot Companions: Anki's Vector is always entertaining, whether it's playing blackjack or giving a fist bump. Vector now features Alexa access as well, so you can make any Alexa query you wish. I had access to Catalia Health's MABU care manager for a year, and attendees enjoyed using either voice or touch to respond to the robot's prompts (How are you feeling today? Did you take your medications?).

For Virtual Reality: The Zeiss VR One Plus viewer, using a smartphone together with the Orbulus VR app, is a great place to begin the VR experience. The Oculus Go by Facebook is also offered for a more comprehensive and engaging VR experience.

For the Internet of Things: The Roomba 675 vacuuming robot is fun to use, directed either by a smartphone app or by voice commands in conjunction with Alexa.

For the Internet of Health Things: Use the Apple Watch (Series 4) together with an iPhone for its Health and Watch apps. You can demonstrate how the EKG feature of the watch works. I share the story of my bike accident and the fall alert the Watch subsequently triggered. The Kardia portable EKG device and associated app are interesting alternatives to the Apple Watch. Eargo hearing aids are also presented as a novel approach to dealing with hearing loss.

For the Visually Impaired: Demos using Microsoft's Seeing AI (free, for iPhones only) and the KNFB Reader (costing $100 and compatible with iPhone and Android) feature reading handwriting, scanning bar codes, and identifying money.

For Augmented Reality: Amazon's AR View within the Amazon app and the IKEA Place app showcase an AR overlay of furniture in your home setting. The WANNAKICKS app is a great way to use AR when deciding which sneakers to buy.

For Virtual Care: I encourage attendees to download Ada Health's free symptom-checking software, try it out, and share their impressions in class. I also ask whether anyone has used the BayCare Anywhere app or Tampa General's Virtual Care app, and invite them to share their impressions.

Please let me know if you have any questions or comments.