
Yale Psychiatry Grand Rounds: November 1, 2016

November 06, 2019
  • 00:00We're going to go ahead and get started.
  • 00:03Today we're really fortunate to have Dr. Sherry McKee present, and in fact she stepped in to fill a great need that we had for grand rounds today, and so she has really contributed in major ways to being able to do this talk, and we thank her for that. Thank you very much, Sherry. Dr. McKee is a professor of psychiatry in the School of Medicine. She directs the Yale pharmacology laboratory and she's the clinical director of the forensic
  • 00:34Drug diversion clinic.
  • 00:36She received her PhD from the University of Western Ontario, and subsequent to that she completed her fellowship training in the Department of Psychiatry at Yale, where she has worked ever since.
  • 00:47At the national level, she's held a number of important positions, including president of the Society of Addiction Psychology of the American Psychological Association. She was a member of the board of directors for the Research Society on Alcoholism. She's currently associate editor of Psychology of Addictive Behaviors, and she's been a standing member of the Clinical Treatment and Health Services Research Subcommittee of the NIAAA, which funds so many of our grants.
  • 01:15In her own work, she directs a very translational program of research focused on treatment development for addictive disorders, with an emphasis on women and, more recently, criminal justice populations. She's also, throughout her career, had a major focus on tobacco cessation and alcohol use disorders.
  • 01:36In this work, she's directed large NIH funded efforts focused on developing effective medications for addictive behaviors.
  • 01:44And with this work having a focus on sex differences and how women and men might respond to treatment differently.
  • 01:53Of particular relevance to her talk today, we're going to hear about some of her very current work that's been federally funded to develop just-in-time interventions using wearable biosensors for smoking cessation and alcohol use disorders. So with this, please join me in welcoming Dr. McKee. Thank you.
  • 02:22Alright. Thank you for that introduction Stephanie.
  • 02:27So first my disclosures. I'm going to be talking a lot about a company I developed called Lumme. I am a founder, stockholder, consultant, subcontractor, and on the board of directors. So, just to say that I have a very complicated conflict management plan for all of this, and I've also listed some other conflicts here on this slide.
  • 02:51So for the outline today, I'm going to touch on what just-in-time interventions are and how we can use biosensors and machine learning algorithms to support the development of such interventions. Then I'm going to talk about my current progress in developing just-in-time interventions; we've had two grants funded so far, so I'll be talking about the results from both of those, and then finish up with our next steps. As Stephanie mentioned, I got the invitation from John to do this on Sunday morning.
  • 03:21And up to this point, I'd only presented a 12-minute talk on these data, so I've had to flesh this out a little bit. That being said, you might get out a bit early for lunch.
  • 03:36OK.
  • 03:37Just-in-time interventions are interventions that are designed to provide the right type and the right amount of support at the right time, typically in real time, by adapting to an individual's ever-changing internal and external states.
  • 03:56This article I put up here has a very nice outline of how to go about developing such interventions. They talk about these factors that go into understanding and developing a just-in-time intervention. A distal outcome is the ultimate goal you want to achieve; so if it's treatment for addiction, you want to support abstinence and prevent relapse. A proximal outcome
  • 04:23is going to be more of a short-term outcome that the intervention is designed to achieve, like craving reduction perhaps.
  • 04:33And these proximal outcomes are, of course, crucial as mediators in the pathway to achieving your ultimate outcomes.
  • 04:45Tailoring variables
  • 04:48are variables
  • 04:52where information concerning the individual is used for individualization, so to decide how or when to intervene.
  • 05:04Decision points are points
  • 05:08in time at which an intervention decision must be made.
  • 05:12You have intervention options that can be chosen from; often these interventions are multifaceted.
  • 05:20And then you have decision rules.
  • 05:24And decision rules are a way to operationalize the adaptation by specifying which intervention option to offer, for whom, when, and under which experiences and contexts. So the decision rules really link the intervention options and the tailoring variables in a systematic way.
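As a concrete illustration, a decision rule of this kind (tailoring variables mapped to intervention options at each decision point) can be sketched in a few lines. Every variable name, threshold, and message category below is hypothetical, purely for illustration, not taken from any actual platform:

```python
# Minimal sketch of a just-in-time decision rule: at each decision point,
# tailoring variables (here: a predicted craving risk and user availability)
# are mapped to one of several intervention options. All names and
# thresholds are illustrative assumptions.

def decide_intervention(risk: float, is_driving: bool,
                        messages_sent_today: int, daily_cap: int = 4) -> str:
    """Return which intervention option (if any) to deliver now."""
    if messages_sent_today >= daily_cap:
        return "none"            # respect the daily message budget
    if is_driving:
        return "none"            # user not available to receive support
    if risk >= 0.7:
        return "skill_message"   # high predicted risk: coping-skill content
    if risk >= 0.4:
        return "motivational_message"
    return "none"                # low risk: do not interrupt

print(decide_intervention(0.8, False, 1))  # high risk, user available
print(decide_intervention(0.8, True, 1))   # high risk, but user is driving
```

The point of writing it this way is that the rule is explicit and auditable: the tailoring variables are the function's inputs, the decision points are the times it is called, and the intervention options are its possible return values.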
  • 05:44Now to achieve all of this, of course, you have to first detect the behavior, then ideally predict the behavior, and then intervene. And obviously the availability of sensing technologies underpins the success of a just-in-time intervention.
  • 06:08So, in the US there are 257 million smartphone users, and these are some of the sensors that are available in most smartphones.
  • 06:19And these sensors have been used in many interesting ways to develop just in time interventions.
  • 06:27So this intervention here, looking at a smartphone application to support recovery from alcohol use disorders, used GPS location to send alerts to patients when they neared a high-risk location, for example a bar that they used to frequent.
  • 06:49Now, when you start to combine data streams from various sensors, then you're talking about a digital phenotype.
  • 06:57And this is a quote from Tom Insel from a recent commentary paper.
  • 07:02Who would have believed that patterns of typing and scrolling could reveal individual fingerprints of performance, capturing our neurocognitive function continuously in the real world?
  • 07:13And then he goes on to say: could anyone have foreseen the revolution in natural language processing and artificial intelligence that is allowing voice
  • 07:22and speech collected on a smartphone to become a possible early warning sign for serious mental illness?
  • 07:29So this is a schematic from his article, again showing the various data streams.
  • 07:39Like I talked about: the smartphone sensors; how somebody interacts with the keyboard, so reaction time, attention, memory; and then voice and speech analysis from when people are speaking on the phone.
  • 07:53And these data streams are combined to tell us something about behavior, cognition, and mood, to be used in diagnosis, monitoring for remission or relapse, and risk prediction.
  • 08:08So this is a study that used such an approach: 10 patients monitored over several weeks, and
  • 08:18with their digital phenotypes simply collected from their smartphones, they were able to recognize manic and depressive states in individuals with bipolar disorder, and were able to do this with a precision of 97%.
  • 08:36Now you can integrate multiple sensors to have a digital phenotype, and then, to add another layer of complexity, you can process that data through a machine learning algorithm to improve detection of your behavior. So before I started this work, this was my understanding of machine learning algorithms, and it's still pretty close to where it is right now. But very simply, these are programs, math and logic (we know about math and logic), that adjust themselves to perform better as
  • 09:07they are exposed to more data. So in essence, they learn. Part of the machine learning means that those programs change how they process data over time, much like we learn how to process data over time and our behavior changes as a result.
  • 09:23There are essentially 3 types of machine learning algorithms: what's called supervised learning, unsupervised learning, and reinforcement learning, and these are just some examples of the types of applications that these different algorithms are used for. So in supervised learning, you train the machine learning algorithm using data which is well labeled. In unsupervised learning, you do not need to supervise the model; all the data is just unlabeled and dumped into
  • 09:54the algorithm.
  • 09:55And with reinforcement learning, the operator interacts with the environment to extract outputs or make decisions, so it's much more interactive.
  • 10:09So for the types of machine learning algorithms I'll be talking about later, we do supervised learning with classification algorithms.
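To make the "train on well-labeled data" idea concrete, here is a toy supervised classifier: a nearest-centroid rule that learns one centroid per label from labeled examples, then assigns new points to the closest centroid. The two-dimensional data here are made up; real gesture features are far higher-dimensional and the real system uses more sophisticated classifiers.

```python
# Toy supervised classification: learn one centroid per class from
# labeled examples, then label new points by the nearest centroid.
from collections import defaultdict
import math

def train_centroids(examples):
    """examples: list of (feature_vector, label); returns label -> centroid."""
    sums, counts = defaultdict(lambda: None), defaultdict(int)
    for x, y in examples:
        sums[y] = list(x) if sums[y] is None else [a + b for a, b in zip(sums[y], x)]
        counts[y] += 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(centroids, x):
    # assign x to the class whose centroid is closest in Euclidean distance
    return min(centroids, key=lambda y: math.dist(centroids[y], x))

labeled = [((0.1, 0.2), "smoking"), ((0.2, 0.1), "smoking"),
           ((0.9, 0.8), "other"), ((0.8, 0.9), "other")]
model = train_centroids(labeled)
print(predict(model, (0.15, 0.15)))  # → smoking
```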
  • 10:20So.
  • 10:22This is one article that uses a machine learning algorithm to predict smoking urges. So in addition to GPS, participants in this study reported on their smoking behavior, urges to smoke, and their mood. To be able to train an algorithm, you have to have something called ground truth; in this case, it was their real-time urge reports. And after training this particular algorithm, they were able to predict smoking urges with an accuracy of 86%.
  • 10:51Real-time detection of stress is still a holy grail that hasn't yet been achieved; it's very difficult to do. This particular study used
  • 11:03digital phenotype data from the mobile phone, weather conditions, and other information about individual traits, and detected stress with 73% accuracy. So not so great; you want to see those numbers up in the high 90s.
  • 11:22There have also been a number of studies that have then used machine learning algorithms to predict behavior. So this study looked at short-term mood alterations in those with depression.
  • 11:35This study looked at dietary lapses in weight loss.
  • 11:40And this study didn't use any smartphone data, but used 268 predictors that they fed into the algorithm, including things like sociodemographics, health status, patterns of opioid use,
  • 12:00and practitioner-level and regional-level factors, and across a 3-month span were able to predict future overdose episodes with 92% specificity.
  • 12:18So for the talk today, I'm going to focus on a gesture detection platform that I've been working to develop. So we have gesture detection in addition to machine learning algorithms, and both of these have made it possible to wirelessly and passively detect smoking, drinking, and eating in real time.
  • 12:41And with such hardware and software innovations, we're able to build treatment platforms that are seamlessly integrated into users' lives.
  • 12:51So smoking, drinking, and eating all involve characteristic hand-to-mouth gestures, where each gesture type is associated with a unique hand movement in 3-dimensional space.
  • 13:01When you consider smoking behavior: you raise your hand to your mouth, you take a puff, and you lower your hand, and you do this 8 to 10 times per cigarette.
  • 13:18So we've developed this hand-to-mouth tracking system, which again tracks gestures in 3-dimensional space. The system uses the gyroscopes and accelerometers that are available in any smartwatch, and we've developed a machine learning algorithm to detect these activities in real time without requiring any active input from a user.
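The sensing step behind a system like this is typically windowed feature extraction: slide a window over the raw accelerometer stream and reduce each window to simple statistics that a trained classifier can consume. The window length and feature set below are assumptions for illustration, not the platform's actual pipeline:

```python
# Sketch of windowed feature extraction from a 3-axis accelerometer stream:
# each non-overlapping window is summarized by mean, standard deviation,
# and range per axis. Window size and features are illustrative assumptions.
import statistics

def window_features(samples, window=5):
    """samples: list of (x, y, z) accelerometer readings.
    Yields one 9-element feature vector per non-overlapping window."""
    for i in range(0, len(samples) - window + 1, window):
        chunk = samples[i:i + window]
        feats = []
        for axis in range(3):
            vals = [s[axis] for s in chunk]
            feats.extend([statistics.fmean(vals),        # mean
                          statistics.pstdev(vals),       # spread
                          max(vals) - min(vals)])        # range
        yield feats

readings = [(0.0, 0.1, 9.8), (0.1, 0.1, 9.7), (0.3, 0.2, 9.5),
            (0.6, 0.4, 9.0), (1.0, 0.6, 8.4)]
for f in window_features(readings):
    print([round(v, 2) for v in f])
```

A classifier like the one sketched earlier would then be trained on these feature vectors, with the video-derived labels serving as ground truth.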
  • 13:46So we've had two grants funded to support this work, and this work started with a conversation I had with a colleague at the University of Massachusetts named Deepak Ganesan. He's a professor in computer science, and he had developed this gesture recognition platform, and I thought that would be really interesting to look at for smoking, and we thought that
  • 14:10it wasn't quite ready for R01 funding, and we thought R21 funding wasn't enough, so we decided to go for SBIR funding. You know, working on soft money requires you to be a little entrepreneurial.
  • 14:23So this was a new experience for me, applying for SBIR funding. And one of the fun things you have to do to get SBIR funding is you have to form a company, which we did, and his two recently graduated students became the two employees that a company needs to have for SBIR funding.
  • 14:42So, in the first step of developing this work, in the first study, we had 53 volunteers and we videotaped them as they were eating, drinking, and smoking.
  • 14:53And the videotape operated as the ground truth, and we used this to train the gesture recognition system. Additionally, these volunteers wore the system out in the real world, so then we could also collect data about what tens of thousands, if not hundreds of thousands, of other hand gestures look like over the course of the day, while they were engaging in other activities like driving, shopping, walking, and so on.
  • 15:20So with this first study we were able to discern that the algorithm was 92 to 98% accurate across these behaviors, representing high precision.
  • 15:33The second study that we did, we put the watch and
  • 15:40the smartphone app on 23 adult daily smokers, and they wore the system for 4 weeks. When the system remotely detected that they were smoking, they received a prompt on their smartphone (are you smoking? yes/no) and they responded. In addition to their smoking behavior, we also tracked other contextual behaviors that we could collect from the smartphone, i.e., sort of a digital phenotype if you will. So we collected time of day, GPS location, and activities; so your smartphone knows whether you're
  • 16:13walking, driving, or sitting still. And we also collected social milieu, which your smartphone can also do; so we could detect all the surrounding Bluetooth in your environment and get an idea of what your social network looked like.
  • 16:29So the results of this demonstrated that users were highly compliant: the majority recorded more than 12 hours a day, they were highly compliant with the prompts, and the system detected 13 cigarettes a day, which was very consistent with their baseline data.
  • 16:47This here shows a heat map of the time of day of smoking. Each color represents the likelihood of smoking: green is no smoking, and as it moves towards red, it's more smoking. This starts at midnight and runs through the 24-hour day, and each subject recorded over 12 hours of smoking data, as you can see.
  • 17:09And demonstrated very individual patterns of smoking behavior.
  • 17:14With the GPS data we learned that smoking locations ranged up to 52 over the four-week period, with a median of 15, and understandably, most of their smoking occurred in either the home or the work environment.
  • 17:32And tracking activity, smoking while driving was pretty common.
  • 17:37And not surprisingly, social context was not that predictive of smoking behavior.
  • 17:46And this is probably the most important finding from this study: we discovered that 2 weeks of data was sufficient to saturate our knowledge about somebody's smoking patterns. So we learned all that we could after a two-week period. So then what we did is we used that first 2 weeks to predict the second 2 weeks.
  • 18:03And when we did that, we learned that we could predict future smoking with a 6-minute prediction window. So we can tell 6 minutes in advance when somebody is going to smoke, and this is an ideal prediction window to develop a just-in-time intervention.
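The "learn from weeks 1–2, predict weeks 3–4" idea can be sketched with a very simple baseline: count how often smoking occurred in each 6-minute slot of the day during the learning period, then score any future slot by its historical frequency. The slot size matches the 6-minute window mentioned above, but the threshold-free frequency model is purely illustrative; the real system trains a machine learning model over many contextual features, not just time of day.

```python
# Toy time-of-day predictor: learn per-slot smoking rates from a two-week
# history, then score future moments by the rate of their slot. All data
# below are synthetic; the real model uses many more contextual features.
SLOT_MIN = 6  # 6-minute slots, matching the study's prediction window

def learn_slot_rates(events_min, days):
    """events_min: minutes-past-midnight of each observed cigarette."""
    counts = {}
    for m in events_min:
        slot = m // SLOT_MIN
        counts[slot] = counts.get(slot, 0) + 1
    return {slot: c / days for slot, c in counts.items()}

def smoking_probability(rates, minute_of_day):
    return rates.get(minute_of_day // SLOT_MIN, 0.0)

# Two weeks of a smoker who lights up around 7:00 most mornings:
history = [420, 421, 419, 423, 420, 418, 422, 421, 420, 419]
rates = learn_slot_rates(history, days=14)
print(smoking_probability(rates, 420) > smoking_probability(rates, 600))  # → True
```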
  • 18:20So I then designed a behavioral platform to wrap around this prediction window.
  • 18:27So I wrote, painfully, 650 text messages (I can't tell you how painful that was; I was eking out 10 a day at the end) that were either skill-based or motivational in nature, and then of course we coded them. They were also coded for whether their content was specific to a time of day, a location, an activity, or a mood, and there were also messages that were tied to a specific point in the quit attempt, so either early on in a quit attempt or later on.
  • 19:01And the idea is to provide the correct message at the correct time. There was also a relapse prevention protocol that was provided when smoking was detected during the quit phase.
  • 19:16So the design is that first the user would wear this system for a two-week period: they would wear the watch, download the app, and what the system does is passively and remotely record their smoking behavior during this two-week period, to learn all about their smoking behavior. It's also collecting this context information: the time of day, activity, GPS, and Bluetooth.
  • 19:41What the user experiences during this period, though, is a countdown to quit.
  • 19:48So they're getting a quit plan reminder every day. It's very brief; it takes them less than 5 minutes to look at. So this is 2 days left to quit: giving them some suggestions of how to deal with their upcoming quit day, and a suggestion for how they might spend their day getting the smoke smell out of their house.
  • 20:13There's also a smoking diary that's available, and this is sort of a key part of being able to tailor the messages later. So what the diary does is it auto-records when the system detects a cigarette, and the user can either indicate that that was accurate or inaccurate. They can also self-log cigarettes; so if they have the watch off, when they're charging the battery, they can self-log a cigarette. And for all of these cigarettes, they can add additional tags. So
  • 20:45in this particular case, the person was drinking coffee, bored, alone, and at home.
  • 20:52They can also track time of day as well as GPS location during this two-week wait period,
  • 20:59our pre-quit period. And then once they hit their quit day, there is a six-week quit period, or treatment period, and during this time the algorithm is executed every 15 minutes to evaluate the probability of upcoming smoking. The system is designed to spread the messages evenly across the day and evening, and we decided to push four personalized messages per day, with the idea that, looking at the research, this seems to be the optimal number of messages.
  • 21:29We didn't want to go too many and have people getting irritated such that they're not looking at them.
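Delivery logic of this kind, evaluate risk every 15 minutes, cap the day at four messages, and space messages out, can be sketched as a small scheduling loop. The risk threshold and minimum gap here are assumptions chosen for illustration:

```python
# Sketch of the delivery rule: the model is evaluated every 15 minutes; a
# message is pushed only if predicted risk is high, the daily budget of four
# messages isn't spent, and enough time has passed since the last message.
DAILY_BUDGET = 4
MIN_GAP_TICKS = 8       # at least 2 hours between messages (8 x 15 min)
RISK_THRESHOLD = 0.6    # illustrative cutoff, not the platform's actual value

def schedule(risks):
    """risks: predicted smoking probability at each 15-minute tick.
    Returns the tick indices at which a message would be pushed."""
    sent, last = [], -MIN_GAP_TICKS
    for t, r in enumerate(risks):
        if len(sent) == DAILY_BUDGET:
            break
        if r >= RISK_THRESHOLD and t - last >= MIN_GAP_TICKS:
            sent.append(t)
            last = t
    return sent

day = [0.2] * 20 + [0.9] * 20 + [0.3] * 20 + [0.8] * 36  # 96 ticks = 24 h
print(schedule(day))  # → [20, 28, 36, 60]
```

Note how the spacing rule keeps the messages from bunching up inside a single high-risk stretch, which is what "spread evenly across the day" amounts to operationally.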
  • 21:35So again, this content is actually pushed: they see it on their watch and they see it on their phone as a notification, so they don't actually have to interact with the app to get the intervention, which is very different from most smartphone apps.
  • 21:49So this is what the landing page looks like during the post-quit period. They can upload a motivational picture for themselves, they can get an instant tip, there's a running count of their time since last cigarette and a running count of how much money they're saving, and then they see all the menu options and the connection status of the watch.
  • 22:11We also allow for supports to be listed in here, and people can call if they're struggling. We also have other notifications and information that can get pushed to them if they indicate that they want more. So we had this feature that allows people to write content at the start, when they set up their quit plan; they're asked: if you're struggling, what would you want to tell yourself?
  • 22:40So we had some interesting responses, so this one.
  • 22:43My dad died because he smoked, and I had to make the decision to turn off his life support. I do not ever want my children or grandchildren to make decisions like this, or see me in that condition.
  • 22:53So a lot of them were focused on health and family, but we also had a number that looked like this.
  • 22:59Yeah.
  • 23:04So again the primary purpose of what we're trying to achieve here is to send the right message at the right time.
  • 23:13So during the pre-quit period, if we had a user who was waking up in the morning and immediately grabbing their coffee and cigarettes, then during the quit period they would receive a message like this just a few minutes before that event would have taken place, identifying coffee as a trigger for their smoking and providing alternative suggestions for them at that moment in time.
  • 23:36Similarly, if we have somebody who would leave work, immediately go to their car, and light a cigarette: right before that would have happened, they would get a message identifying driving as a trigger and suggesting that they listen to some music instead.
  • 23:51So after the intervention was developed.
  • 23:54We then did a randomized clinical trial of the platform, and we decided to compare it to an active control, and this was the NCI's platform, SmokefreeTXT.
  • 24:04We chose this because it has a very similar structure to what we had developed: it pushes about four texts per day, there's a two-week countdown phase and a six-week treatment phase, and this intervention has shown efficacy in and of itself. These are essentially phase 4 results from a phase 4 study comparing the intervention in real-world quitters to no behavioral support whatsoever.
  • 24:33So for the study, we recruited 141 adult daily smokers. We recruited these individuals from across the United States using web-based recruitment. They completed initial screening online, then they were phone screened by a research assistant. The criteria were quite broad: smoking more than 5 cigarettes per day, adult, had to have an Android phone and access to the Internet, and no evidence of significant medical or mental illness.
  • 25:02They were randomized to one of the two conditions. Participants used their own phone; we mailed them a smartwatch and they downloaded the app.
  • 25:12And they also completed questionnaires online every 2 weeks: a timeline follow-back assessing cigarette use, with craving, withdrawal, and mood as secondary outcomes.
  • 25:23In terms of baseline variables, the two conditions were well matched, with no significant differences. They were about 40 years of age, primarily women, white, and college educated, smoking about 14 cigarettes a day, had moderate levels of nicotine dependence, and were also motivated to quit smoking.
  • 25:45Retention was good. We had about 10% non-starters; of the starters, about 90% completed treatment, and of the treatment completers, about 90% gave us one-month follow-up data.
  • 26:02For the smoking diaries, the subjects logged about 8,400 triggers over the course of the study. The median was about 92 per subject, but they ranged all the way up to 682; that was a very conscientious subject. And for those of you that do smoking research, you'll recognize the top 5 triggers that people logged, which I've ranked here; this is pretty consistent with what we see in the literature with regard to
  • 26:32triggers.
  • 26:34The top-ranked ones were watching TV, stressed, home, and alone.
  • 26:43The primary outcome we evaluated was end-of-treatment point prevalence. So for somebody to be counted abstinent, there had to be no self-reported smoking over the past 7 days, and also the biosensor could not have detected any smoking over that same period; otherwise the subject was censored as smoking.
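This dual criterion, no self-report and no sensor detection over the same 7-day window, is easy to state in code. The field names below are illustrative, not the study's actual data schema:

```python
# Sketch of the abstinence criterion: a subject counts as abstinent at end
# of treatment only if there is no self-reported smoking AND no
# biosensor-detected smoking in the final 7-day window.

def point_prevalence_abstinent(self_report_days, sensor_days, window=7):
    """self_report_days: per-day self-reported cigarette counts (most recent last).
    sensor_days: per-day biosensor-detected cigarette counts."""
    return (sum(self_report_days[-window:]) == 0
            and sum(sensor_days[-window:]) == 0)

# Self-report says quit, but the sensor caught two cigarettes on day 5:
print(point_prevalence_abstinent([0] * 7, [0, 0, 0, 0, 2, 0, 0]))  # → False
print(point_prevalence_abstinent([0] * 7, [0] * 7))                # → True
```

The conjunction is the whole point: biosensor data lets the trial censor subjects whose self-report and detected behavior disagree, which a self-report-only trial cannot do.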
  • 27:11So this is the primary outcome, again intent-to-treat, so everybody was included in this analysis.
  • 27:17And it demonstrates that the Lumme platform performed better than the NCI SmokefreeTXT platform, increasing the odds of quitting by about 2.24 times. And to our knowledge, this is the first demonstration of a just-in-time intervention for smoking cessation.
  • 27:38I also did a sensitivity analysis, essentially factoring out those who did not use the system sufficiently (those who didn't keep the battery charged, for example) and also cases where the biosensor outcome could not be determined with high confidence. And when the user and the biosensor are performing perfectly, we get an odds ratio of over 4 for the quit outcome.
  • 28:04For the one-month outcome, this is based on self-report data, because we stopped the gesture recognition at the end of treatment, and we still see a pretty decent odds ratio. It's not significant, and likely a power issue at this point in time.
  • 28:25When we look at engagement, this is important. Again, as I mentioned before, there are literally thousands of smartphone applications out there for smoking cessation, and some of them are designed really well, but where their downfall is, is that they require user engagement. With our intervention, again, the behavior is assessed passively, so it doesn't require any input, and also the intervention is delivered as a notification, so the user need not ever
  • 28:56interact with the app. But nonetheless they do, so we can see quite a stunning difference in the mean number of interactions during the study. And like I said, the messages are pushed to the watch and to the cell phone, and users can read the message without ever clicking on it, but they still clicked on it quite often, over 45%.
  • 29:19With regard to our secondary outcomes, there was no difference in craving, withdrawal, or mood, and that surprised me a little bit. But, you know, I was thinking that in the future we should be assessing these measures sort of in the moment, as people are getting pushed content, and this kind of design
  • 29:42is used to develop just-in-time interventions: they're called micro-randomized trials. This is where individuals are randomized hundreds or thousands of times over the course of the study; you have the same individuals, and you just keep randomizing them to different components. And these kinds of trials are able to help you sort of pull apart what the effective components of your intervention are.
  • 30:11So the system generates 500,000 data points per person per day, so we now have a data set of over 4 billion data points. Over the course of the study there were 12,000 interventions logged, about 174 per participant, within the Lumme condition overall. They found the messages to be helpful and well timed, and in terms of primary content, you can see what the system pushed to the subjects: they were primarily skill-based, and you can look at sort of the target content of the messages.
  • 30:46We also collected a lot of feedback.
  • 30:50Some was good so these are some of the quotes from subjects.
  • 30:58finding the application helpful to them in quitting smoking. On the bad side, there were some comments about the accuracy of prediction, particularly early on. So what that told us is we needed to manage the subjects' expectations about the machine learning algorithm: it improves over time, and it improves over time individualized to the subject. So we need to tell the subjects that we need them to help train the algorithm.
  • 31:28So if you think about it, like I said, you're doing tens of thousands of gestures with your hand every day, and what you're asking the algorithm to do is pick this out several times a day from the field of everything that you do with your hands over the course of the day. And of course, there are always comments about battery life.
  • 31:46So to summarize: the platform is accurate; we're able to predict smoking in advance of actual smoking; our preliminary results are positive. Again, the application requires no user input other than wearing the watch and downloading the app to the phone. It's seamlessly integrated into their lives; they need never show up for an office-based appointment. It delivers a personalized intervention in real time, and it's inexpensive and scalable. And in time, we've received a patent
  • 32:17for the gesture-based recognition system.
  • 32:21We've received some recognition. The Office of Cooperative Research asked me to present at the Yale Innovation Summit; this is done at the medical school every year, and it's kind of an interesting event. I did a kind of Shark Tank 5-minute presentation to investors, and I got my big Publishers Clearing House check for the most innovative solution, and I'm going to use this money to help defray the cost of a spa day for my staff. We've also
  • 32:51been invited to NCI's investor initiatives, where they select certain grantees to attend, and we've had the opportunity to do that also.
  • 33:04So, in terms of next steps for this, we're going to continue to analyze our 4 billion data points. There's another SBIR grant pending to extend this quit platform to a reduction platform; the majority of smokers don't
  • 33:20actually want to quit now, and a lot of them want to quit through reduction, so we're going to tailor the platform for that outcome. And we've also just completed a market study with a big pharma partner, and hopefully that will continue moving forward.
  • 33:38We are also developing the platform for drinking.
  • 33:41So the challenge here is: we can detect a drinking gesture, but how can we tell if somebody is drinking alcohol or something non-alcoholic?
  • 33:52So what we decided to do here is to pair our gesture recognition system with a transdermal alcohol sensor, in this case the BACtrack Skyn.
  • 34:02Now the challenge with transdermal alcohol sensors is that they have delayed detection of drinking: somebody has to drink alcohol, and it has to be metabolized before it comes out through the skin and the sensor has a positive reading.
  • 34:17So the aim of this grant was to develop an integrated system and to be able to merge those two data streams. We had participants with alcohol use disorder take part; we videotaped them drinking over a 2-hour period. We provided them with a bolus dose of their favorite alcoholic beverage, and they could drink to a max level of 0.12 grams per deciliter, and we collected blood alcohol levels every 15 minutes as ground truth.
  • 34:49So this just represents what we're collecting for the gesture recognition piece of it: it records when they start drinking, and then of course it's recording drinking gestures and all the other available contextual variables, and then you have an individual drinking episode.
  • 35:05So the goal here, or the trick, was: at some point during the drinking episode, the transdermal detection will go off, so we needed to sort of backtrack and figure out what this interval was, so that in the future, when we are developing our prediction algorithm, we know how far back to go to grab those
  • 35:27prediction variables that feed into our algorithm.
  • 35:34So this is what a drinking gesture looks like.
  • 35:37So what you have here is somebody picking up a glass.
  • 35:41And then the wrist rotation to take a sip; this is the sip. Then they're putting down the glass and they're lowering the glass.
  • 35:49And.
  • 35:51We evaluated this, like I said, over the 2-hour period as people became intoxicated. So you can imagine that the gestures change as people get intoxicated, OK: they become larger,
  • 36:03actually.
  • 36:05And we used a cost-sensitive random forest classification algorithm for this data, and, with the outcome of intoxication, we were able to detect these gestures with 95% accuracy and a 0.01% false positive rate.
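The "cost-sensitive" part can be illustrated with a much simpler single-score model than a random forest: instead of classifying at a fixed cutoff, pick the cutoff that minimizes expected cost when false positives are penalized more heavily than false negatives, as you would want when a false "drinking" alarm needlessly interrupts the user. The costs and scores below are made up for illustration:

```python
# Sketch of cost-sensitive thresholding: choose the decision cutoff that
# minimizes total misclassification cost under asymmetric costs.
# Data, scores, and cost values are illustrative assumptions.

def best_threshold(scored, fp_cost=10.0, fn_cost=1.0):
    """scored: list of (score, true_label) with label 1 = drinking gesture.
    Returns the cutoff (from observed scores) with minimal total cost."""
    def cost(th):
        fp = sum(1 for s, y in scored if s >= th and y == 0)  # false alarms
        fn = sum(1 for s, y in scored if s < th and y == 1)   # missed sips
        return fp * fp_cost + fn * fn_cost
    candidates = sorted({s for s, _ in scored}) + [1.1]  # 1.1 = "never fire"
    return min(candidates, key=cost)

data = [(0.2, 0), (0.4, 0), (0.55, 0), (0.6, 1), (0.7, 1), (0.9, 1)]
print(best_threshold(data))  # → 0.6
```

A cost-sensitive random forest applies the same principle inside the ensemble, weighting the classes asymmetrically during training rather than just at the final threshold.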
  • 36:26This.
  • 36:38I just got a message on my screen that PowerPoint stopped responding.
  • 36:45OK, it's back.
  • 36:47All right. So here this data is the blood alcohol level graphed over a period of 500 minutes, and this is the transdermal alcohol sensor. So again, they were drinking for the first 120 minutes, and you can see by the end of the drinking they're at a pretty decent blood alcohol level.
  • 37:15So the goal here was, through some sophisticated
  • 37:20curve fitting, we were able to match the two curves and to figure out how to do that individual prediction. And on average that delay, so you can see here the delay between when the blood alcohol starts to go up and when the transdermal measure starts to go up, is about 29 minutes. So now that we've worked out that delay, we can use this information to look back in the data stream to understand the contextual information that was present
  • 37:51At the start of a drinking episode when we start to train a prediction algorithm.
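The delay-estimation idea above can be sketched in a few lines. The talk describes sophisticated curve fitting; this much simpler illustrative version just finds the lag that maximizes the correlation between the blood alcohol curve and the transdermal curve. The curve shapes and the one-sample-per-minute assumption are made up for the example.

```python
# Illustrative sketch: estimate the BAC-to-transdermal delay by finding
# the lag that best aligns the two curves.
import numpy as np

def estimate_lag_minutes(bac, tac, max_lag=120):
    """Return the shift (in samples, ~minutes) of `tac` that best aligns
    it with `bac`, assuming one sample per minute."""
    best_lag, best_corr = 0, -np.inf
    for lag in range(max_lag + 1):
        # shift tac back by `lag` minutes and correlate with bac
        c = np.corrcoef(bac[: len(bac) - lag], tac[lag:])[0, 1]
        if c > best_corr:
            best_lag, best_corr = lag, c
    return best_lag

# Synthetic example: a smooth BAC curve and a transdermal curve that
# lags it by 29 minutes, echoing the average delay reported in the talk.
t = np.arange(500)                       # minutes
bac = np.exp(-((t - 120) / 80.0) ** 2)   # smooth rise-and-fall curve
tac = np.roll(bac, 29)                   # transdermal lags by 29 minutes
tac[:29] = 0.0

print(estimate_lag_minutes(bac, tac))    # → 29
```

Once the lag is known, the prediction algorithm can look that many minutes back in the sensor stream to grab the contextual variables present at the start of the drinking episode.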
  • 37:57So, in terms of next steps, we're going to continue developing the prediction algorithm, to use as a treatment platform for those with alcohol use disorders. We're also going to do longitudinal assessment of natural drinking behaviors, which I think will be interesting, and we're considering the research applications of this system. So, Walter Roberts, raise your hand.
  • 38:21Walter has received K23 funding from NIAAA, essentially to develop this platform to be used in medication trials.
  • 38:32And we've completed a paper outlining that approach. So essentially the idea is to take very precise laboratory measurements of a medication response during a drinking episode and move those into the person's natural environment. So what Walter is going to be doing is figuring out how to do prompted assessments with this system at different points on a blood alcohol curve: when drinking is detected, during the ascending limb,
  • 39:04at binge levels, and then during the descending limb. And you can see the advantages of doing this: when we do laboratory assessments of a medication, it's one drinking episode, where we're assessing in a relatively artificial environment. This allows that kind of testing to move out into a person's naturalistic environment, and you're able to collect this data over multiple drinking episodes.
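As a sketch of what "prompted assessments at different points on a blood alcohol curve" might look like in software: given a streaming transdermal estimate, classify each new reading into a phase (ascending limb, binge level, descending limb) so the phone could trigger an assessment prompt. The threshold values, phase labels, and class design are hypothetical illustrations, not the actual study protocol.

```python
# Hypothetical sketch: deciding which drinking-episode phase a new
# transdermal reading falls in, to drive prompted assessments.
from dataclasses import dataclass, field

BINGE_LEVEL = 0.08  # illustrative threshold (g/dL)

@dataclass
class LimbTracker:
    history: list = field(default_factory=list)

    def update(self, estimate: float) -> str:
        """Return the drinking-episode phase for the newest reading."""
        prev = self.history[-1] if self.history else 0.0
        self.history.append(estimate)
        if estimate < 0.01:
            return "no drinking detected"
        if estimate >= BINGE_LEVEL:
            return "binge level: prompt assessment"
        if estimate > prev:
            return "ascending limb: prompt assessment"
        return "descending limb: prompt assessment"

# Simulated readings rising past the binge threshold and coming back down.
tracker = LimbTracker()
readings = [0.0, 0.03, 0.06, 0.09, 0.07, 0.04]
phases = [tracker.update(r) for r in readings]
print(phases)
```

A real system would smooth the sensor stream and debounce the phase changes before prompting, but the point is that each phase of the curve becomes a distinct trigger for an in-the-moment assessment.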
  • 39:28So we have other biosensor projects we're developing. I've primarily talked about using sensors that are available in smartwatches and smartphones, which has the advantage of being easy to disseminate, because everybody has a smartphone, and those kinds of interventions are also cost-effective and easy to scale. But there's also a whole world of new biosensors out there, and the technology is constantly changing. So you have things like contact lenses that assess glucose levels.
  • 40:01You have eyeglasses that do eye tracking.
  • 40:06You have sensors that do bite counts for eating; all sorts of sensors are now being put in your mouth, embedded in your teeth, put into retainers.
  • 40:16You have sensors that assess interstitial fluid: there's a little microneedle array that pierces the skin, and I know there's some work on that for real-time alcohol detection. There's also a whole range of clothing being developed that can assess things like sleep patterns and sweat during sleep.
  • 40:38So we have a couple of projects just starting that are going to be looking at ambulatory and passive monitoring of craving and stress states, and also of executive functioning, and we're going to be doing that within the P01 and a pending U54 that will continue this work.
  • 41:01So, like I said earlier, we've been funding this work through SBIR grants, and this started as a partnership between us here and UMass. Just to give you an idea of what's happened with the company, because that's kind of taken on a life of its own: we had the first grant in 2015 and the second grant in 2017, we completed the validation study that I just presented in 2018, and we patented the technology.
  • 41:31So this year, we've partnered with pharma, like I said; I hope that goes forward. We have paid pilots just starting, and employer insurance programs. We're looking at the issue of FDA clearance; these kinds of platforms sometimes fall under a gray area for the FDA. For mobile applications, the FDA has discretion over whether they want to evaluate them or not, so we're looking at a path toward FDA clearance. Like I said, there's a
  • 42:01grant pending, and this is an interesting mechanism, again on the SBIR side: it provides matching funds to investor investment. We've closed the seed round from investors in the company and will be collecting Series A investment next year.
  • 42:19So with that, I will finish by acknowledging my staff. I wanted to say hi to Megan; Megan and Sabrina collected the smoking data that I presented today. I wanted to acknowledge Paula, Andrew, and Walter, who helped with the alcohol data, and Terra Lynn McKenzie, who helped with the P01.
  • 42:39And my collaborators at UMass. I'll stop there and take questions. Thanks.