Understand the complexities of using social media as a healthcare professional. Join us as Dr. Damian Keter shares expert insights on how to ethically navigate the digital space while maintaining transparency and integrity. Learn how to assess credible sources, avoid misinformation, and build an online presence that benefits both patients and clinicians.
Key Topics:
Navigating biases in social media
Avoiding conflicts of interest in healthcare content
Balancing transparency with personal branding
Social media's impact on patient education and engagement
00:00:02 --> 00:00:02 All right.
00:00:03 --> 00:00:05 Well, welcome to the show, Damian Keter.
00:00:05 --> 00:00:06 We are so glad to have you
00:00:06 --> 00:00:10 with us today on our AAOMPT Hands-On, Hands-Off podcast.
00:00:11 --> 00:00:12 Thank you for joining us,
00:00:12 --> 00:00:13 first and foremost.
00:00:13 --> 00:00:14 We appreciate it.
00:00:15 --> 00:00:16 I would like to go ahead and
00:00:16 --> 00:00:18 just welcome our audience
00:00:18 --> 00:00:19 to our show today.
00:00:20 --> 00:00:21 And today we're diving into
00:00:21 --> 00:00:22 a topic about how we engage
00:00:23 --> 00:00:24 with our patients, our colleagues,
00:00:24 --> 00:00:25 and the broader healthcare
00:00:25 --> 00:00:27 community through social media.
00:00:28 --> 00:00:30 So today our guest is Damian Keter,
00:00:30 --> 00:00:31 who has not only leveraged
00:00:31 --> 00:00:33 social media platforms,
00:00:33 --> 00:00:35 for professional growth,
00:00:35 --> 00:00:36 but has also studied it in
00:00:36 --> 00:00:39 terms of better understanding its impact.
00:00:39 --> 00:00:41 And so Damian is a clinician-researcher.
00:00:42 --> 00:00:43 He's a dad.
00:00:43 --> 00:00:44 He's an active outdoorsman.
00:00:44 --> 00:00:46 I think he likes hunting.
00:00:46 --> 00:00:48 And he's a fellow Ohioan,
00:00:49 --> 00:00:50 which I have a special
00:00:50 --> 00:00:51 place in my heart for.
00:00:51 --> 00:00:52 So that is awesome.
00:00:53 --> 00:00:56 And he's actively engaged in OMPT research,
00:00:56 --> 00:00:57 which is why we are so
00:00:58 --> 00:00:58 excited to have him on the
00:00:58 --> 00:00:59 show with us today.
00:00:59 --> 00:01:01 So maybe you have read or
00:01:01 --> 00:01:02 listened to some of his work.
00:01:02 --> 00:01:05 So specifically, in today's episode,
00:01:05 --> 00:01:06 we want to explore the
00:01:06 --> 00:01:08 upside and the challenges
00:01:08 --> 00:01:09 of social media in our
00:01:09 --> 00:01:10 field and how it can be a
00:01:11 --> 00:01:13 tool for connection,
00:01:13 --> 00:01:15 education and advocacy.
00:01:15 --> 00:01:21 But maybe there are some ethical boundaries that we also need to explore so we don't cross lines.
00:01:22 --> 00:01:23 From success stories to
00:01:23 --> 00:01:24 potential pitfalls and
00:01:24 --> 00:01:25 valuable lessons along the way,
00:01:25 --> 00:01:29 Damian has some great insights that he is going to share with us today.
00:01:29 --> 00:01:32 So without further ado, let's jump in.
00:01:32 --> 00:01:32 Welcome, Damian.
00:01:32 --> 00:01:33 We're glad you're here.
00:01:34 --> 00:01:35 Thank you, Megan.
00:01:35 --> 00:01:36 Thank you for having me.
00:01:36 --> 00:01:37 I'm very excited to be here.
00:01:38 --> 00:01:38 I do have to give a little
00:01:38 --> 00:01:39 disclosure as I am a
00:01:39 --> 00:01:42 federal employee that my
00:01:42 --> 00:01:43 opinions today are my own.
00:01:43 --> 00:01:44 They do not represent those
00:01:44 --> 00:01:45 of the United States
00:01:45 --> 00:01:46 government or the
00:01:46 --> 00:01:47 Department of Veteran Affairs.
00:01:47 --> 00:01:49 So now that we got that out of the way,
00:01:50 --> 00:01:51 I'm really looking forward to this talk.
00:01:52 --> 00:01:53 Well, you're the guy.
00:01:53 --> 00:01:55 So I wanted to have this
00:01:55 --> 00:01:58 conversation because when I
00:01:58 --> 00:02:00 was coming into this
00:02:00 --> 00:02:02 president role of AAOMPT,
00:02:03 --> 00:02:04 right before I won the Manila Award,
00:02:05 --> 00:02:09 we really needed to be challenged about how we engage in social media.
00:02:09 --> 00:02:11 And so I've been diving in
00:02:11 --> 00:02:12 and I saw that you
00:02:12 --> 00:02:13 published an article in
00:02:13 --> 00:02:14 twenty twenty three,
00:02:15 --> 00:02:19 which was "Credible or Questionable? Assessing Quality of Evidence on Social Media."
00:02:19 --> 00:02:31 And in it you state that social media platforms can open the door for misinformation that challenges evidence-based practice and that can potentially influence patient care.
00:02:31 --> 00:02:33 So I would love to dive into
00:02:33 --> 00:02:34 this a little further.
00:02:34 --> 00:02:34 I mean,
00:02:34 --> 00:02:35 I think we'll probably hit
00:02:35 --> 00:02:36 different tangents and
00:02:36 --> 00:02:38 facets of the conversation.
00:02:38 --> 00:02:40 But I can't wait to kind of
00:02:40 --> 00:02:41 hear your experience with
00:02:41 --> 00:02:43 this and maybe where did
00:02:43 --> 00:02:45 you start to find a passion
00:02:45 --> 00:02:46 to study the topic of social
00:02:46 --> 00:02:49 media as it influences practitioners,
00:02:49 --> 00:02:50 especially in healthcare?
00:02:51 --> 00:02:52 Yeah, absolutely.
00:02:52 --> 00:02:52 Thank you.
00:02:52 --> 00:03:00 And it's been a passion of mine. I think I was in an interesting generation, because I technically am a millennial.
00:03:01 --> 00:03:03 I'm on the far end of millennials.
00:03:03 --> 00:03:05 But because of that,
00:03:05 --> 00:03:06 I got to kind of see both
00:03:06 --> 00:03:07 sides of the coin.
00:03:07 --> 00:03:08 And what I mean by that is, you know,
00:03:09 --> 00:03:11 I remember the time before
00:03:11 --> 00:03:13 that connection, before cell phones,
00:03:13 --> 00:03:13 you know,
00:03:13 --> 00:03:14 just connecting you to
00:03:14 --> 00:03:15 everything instantly.
00:03:15 --> 00:03:16 And then I saw the
00:03:16 --> 00:03:18 transition towards that.
00:03:18 --> 00:03:18 So, you know,
00:03:18 --> 00:03:19 I remember when Facebook
00:03:19 --> 00:03:20 first came out and I
00:03:20 --> 00:03:22 remember before it and then after it.
00:03:22 --> 00:03:23 And so, you know,
00:03:23 --> 00:03:24 I got to kind of see how
00:03:25 --> 00:03:38 things changed in real time. I was in the generation using those things; a lot of these individuals were in college and things like that when they came out. So it was a really interesting thing to see the transition of it.
00:03:38 --> 00:03:39 But with that,
00:03:39 --> 00:03:40 I got to really see some of
00:03:40 --> 00:03:42 these huge benefits that
00:03:42 --> 00:03:43 are just awesome.
00:03:44 --> 00:03:45 as well as we know some
00:03:45 --> 00:03:47 significant challenges in that domain.
00:03:47 --> 00:03:48 And so for me,
00:03:48 --> 00:03:49 I think that's really what
00:03:49 --> 00:03:51 sparked the interest in it.
00:03:52 --> 00:03:53 Moving beyond that,
00:03:53 --> 00:03:54 we've really got to discuss
00:03:54 --> 00:03:55 this a lot in our residency
00:03:55 --> 00:03:57 program here at the VA,
00:03:57 --> 00:03:59 talking about how learners, you know,
00:03:59 --> 00:04:00 they don't learn the same
00:04:00 --> 00:04:01 way they used to.
00:04:01 --> 00:04:04 And as we progress the way we teach things,
00:04:04 --> 00:04:05 we have to appreciate the
00:04:05 --> 00:04:07 different understanding and
00:04:07 --> 00:04:08 leverage of some of these
00:04:08 --> 00:04:10 social media platforms for
00:04:10 --> 00:04:11 learning and things like that.
00:04:12 --> 00:04:13 And I think as I've seen
00:04:13 --> 00:04:14 that in our classes of
00:04:15 --> 00:04:16 residents and even students
00:04:16 --> 00:04:17 coming through the clinic,
00:04:17 --> 00:04:19 it really has emphasized to me, yes,
00:04:19 --> 00:04:20 this is something that's interesting,
00:04:20 --> 00:04:21 but it's also very
00:04:21 --> 00:04:22 important for us to jump on
00:04:22 --> 00:04:24 board and make sure we
00:04:24 --> 00:04:25 utilize this stuff and
00:04:25 --> 00:04:26 understand this stuff for
00:04:27 --> 00:04:29 the future learners, for the future PTs.
00:04:29 --> 00:04:30 Absolutely.
00:04:30 --> 00:04:33 So analog youth, digital adulthood.
00:04:33 --> 00:04:35 So you're not a dinosaur.
00:04:35 --> 00:04:38 You have an appreciation for both sides,
00:04:38 --> 00:04:39 which is great.
00:04:39 --> 00:04:40 I resonate with that.
00:04:40 --> 00:04:42 So just so people know that
00:04:42 --> 00:04:45 I am not a millennial, close but not.
00:04:46 --> 00:04:47 And so I appreciate that.
00:04:48 --> 00:04:49 I think that that gives you
00:04:49 --> 00:04:50 a really unique perspective
00:04:51 --> 00:04:52 that you can appreciate
00:04:52 --> 00:04:53 where students are nowadays
00:04:53 --> 00:04:55 with all the information
00:04:55 --> 00:04:56 and the processing and digesting.
00:04:57 --> 00:05:25 But you have a different kind of awareness of it, because you didn't really grow up during that time when they were having all of those challenges of how much to process, right? So now we're in this, I would say, different age, right? Not only do you have so much to process, but you have to check the stories, you have to credibility-check, we have to figure out what is truthful versus maybe self-promotion, you know. And I think the old word used to be gurus, right?
00:05:25 --> 00:05:26 Like somebody just had a
00:05:26 --> 00:05:29 facet or a style that they promoted,
00:05:29 --> 00:05:31 but it wasn't really always
00:05:31 --> 00:05:31 evidence-based.
00:05:31 --> 00:05:33 And I certainly can see that
00:05:33 --> 00:05:35 even occurring on social media,
00:05:35 --> 00:05:36 but with a larger audience
00:05:36 --> 00:05:39 now between YouTube and
00:05:39 --> 00:05:40 some of the social media platforms,
00:05:40 --> 00:05:44 such as Facebook, Instagram, I mean,
00:05:44 --> 00:05:45 there's all different types.
00:05:45 --> 00:05:47 I mean, we can get into the, you know,
00:05:48 --> 00:05:49 the even younger days now,
00:05:49 --> 00:05:50 how some of those
00:05:50 --> 00:05:51 individuals are connecting, but
00:05:52 --> 00:05:53 I would love to hear some of
00:05:53 --> 00:05:57 your positive experiences that you've had, maybe,
00:05:57 --> 00:05:58 or heard through social
00:05:58 --> 00:06:01 media in our community, healthcare,
00:06:01 --> 00:06:03 physical therapy, or specific to OMPT.
00:06:03 --> 00:06:04 And then we'll talk about some pitfalls,
00:06:05 --> 00:06:05 but let's focus.
00:06:05 --> 00:06:07 What have been the positives
00:06:07 --> 00:06:08 that you have seen when
00:06:08 --> 00:06:09 using social media?
00:06:10 --> 00:06:11 Yeah, thank you, Megan.
00:06:11 --> 00:06:12 I think there's a lot there.
00:06:12 --> 00:06:14 I mean, you see it every day.
00:06:14 --> 00:06:15 You know, if you're active on social media,
00:06:15 --> 00:06:17 you see this collaboration
00:06:18 --> 00:06:19 and reach every single day.
00:06:20 --> 00:06:20 To me,
00:06:20 --> 00:06:23 that's really the biggest benefit of
00:06:23 --> 00:06:24 social media overall is
00:06:24 --> 00:06:25 this idea that you're able
00:06:25 --> 00:06:29 to get information to and from, you know,
00:06:30 --> 00:06:32 far reaches of the
00:06:32 --> 00:06:34 publications and all those things.
00:06:34 --> 00:06:36 There's so many journals
00:06:36 --> 00:06:37 that are publishing
00:06:37 --> 00:06:39 relevant information to us.
00:06:39 --> 00:06:41 There's no way as a researcher,
00:06:41 --> 00:06:41 you're able to keep up with
00:06:41 --> 00:06:42 all that stuff.
00:06:42 --> 00:06:44 But if you find some other
00:06:44 --> 00:06:45 researchers that follow
00:06:45 --> 00:06:46 similar interests to you,
00:06:46 --> 00:06:47 you have other people
00:06:47 --> 00:06:48 siphoning through some of
00:06:48 --> 00:06:50 that and feeding some of that to you.
00:06:50 --> 00:06:51 And so it's really
00:06:51 --> 00:06:52 beneficial to see some of
00:06:52 --> 00:06:53 this information.
00:06:53 --> 00:06:54 Furthermore, though, like I said,
00:06:54 --> 00:06:55 that collaboration,
00:06:56 --> 00:06:57 really being able to not
00:06:57 --> 00:06:58 only read an article,
00:06:58 --> 00:07:01 but ask questions about it or, you know,
00:07:01 --> 00:07:03 talk directly with the author about this.
00:07:03 --> 00:07:05 These aren't things you could do twenty,
00:07:05 --> 00:07:06 thirty years ago.
00:07:06 --> 00:07:08 I mean, even in the time of, yes,
00:07:08 --> 00:07:09 we had email, yes,
00:07:09 --> 00:07:10 there were corresponding
00:07:11 --> 00:07:12 authors listed on emails,
00:07:12 --> 00:07:13 a lot of times people
00:07:13 --> 00:07:16 didn't want to ask them simple questions.
00:07:16 --> 00:07:17 You didn't want to just ask them, hey,
00:07:17 --> 00:07:19 what are your thoughts on this?
00:07:19 --> 00:07:20 Those were things that
00:07:20 --> 00:07:21 you're comfortable asking
00:07:21 --> 00:07:22 in kind of an informal
00:07:22 --> 00:07:24 social media question,
00:07:24 --> 00:07:25 but it's not something you
00:07:25 --> 00:07:27 would write an email to a
00:07:27 --> 00:07:28 world-renowned researcher on, typically.
00:07:29 --> 00:07:30 And so I think it really
00:07:30 --> 00:07:31 opens the door for this
00:07:32 --> 00:07:33 connection with the authors,
00:07:33 --> 00:07:34 which is great.
00:07:34 --> 00:07:36 And my personal experience with this,
00:07:36 --> 00:07:55 you know, I have a good example of this that actually occurred with me earlier this year. I saw on Twitter, now X, I should say, that Nathan Hutting posted a little thing that he was coming to the Ohio State Conference.
00:07:56 --> 00:07:56 Now,
00:07:56 --> 00:07:57 I had no intention of going to the
00:07:57 --> 00:07:58 Ohio State Conference at
00:07:58 --> 00:08:00 the time because I just had
00:08:00 --> 00:08:01 too much stuff going on.
00:08:01 --> 00:08:02 And it was right around my birthday.
00:08:02 --> 00:08:04 But regardless,
00:08:04 --> 00:08:06 I saw that he was coming to Ohio.
00:08:06 --> 00:08:07 And I said, this guy's a big deal.
00:08:07 --> 00:08:08 I love his work.
00:08:09 --> 00:08:11 You know, why is he coming to Ohio?
00:08:11 --> 00:08:13 Hey, no one goes to Ohio, right?
00:08:14 --> 00:08:16 But I said, he's coming to Ohio.
00:08:16 --> 00:08:18 So I reached out to Ken Learman,
00:08:18 --> 00:08:19 another Ohioan, a fellow Ohioan.
00:08:19 --> 00:08:21 I said, Ken, get in the car.
00:08:22 --> 00:08:23 We're going to Columbus.
00:08:23 --> 00:08:24 We're going to go to this conference.
00:08:25 --> 00:08:26 And we ended up going to
00:08:26 --> 00:08:28 this conference in Columbus, Ohio,
00:08:28 --> 00:08:30 where we actually ran into yours truly,
00:08:30 --> 00:08:33 Megan Battles, at that conference.
00:08:33 --> 00:08:36 And we went just for Nathan's talk.
00:08:36 --> 00:08:37 And it was really
00:08:37 --> 00:08:38 interesting because we went
00:08:38 --> 00:08:38 in the morning.
00:08:38 --> 00:08:39 We're sitting there eating breakfast.
00:08:40 --> 00:08:41 And I had seen his pictures, again,
00:08:41 --> 00:08:42 only because of social media.
00:08:43 --> 00:08:44 There's Nathan standing over
00:08:44 --> 00:08:46 in the corner just eating breakfast,
00:08:46 --> 00:08:48 kind of hanging out, looking around.
00:08:48 --> 00:08:48 And so I went over there.
00:08:48 --> 00:08:49 I said, are you Nathan?
00:08:49 --> 00:08:49 Yeah.
00:08:50 --> 00:08:51 He's like, yeah.
00:08:51 --> 00:08:52 And so we got to chatting
00:08:52 --> 00:08:53 and it was great because we
00:08:53 --> 00:08:55 ended up going to his talk.
00:08:55 --> 00:08:56 We ended up having lunch with them.
00:08:57 --> 00:08:58 And then this led to the idea of, hey,
00:08:58 --> 00:08:59 we should write a paper together.
00:09:00 --> 00:09:00 And then we did.
00:09:00 --> 00:09:02 We collaborated.
00:09:02 --> 00:09:03 We wrote one paper together,
00:09:03 --> 00:09:06 which has been published in JOSPT Open, on person-centered manual therapy.
00:09:07 --> 00:09:08 And we actually have another
00:09:08 --> 00:09:09 one in peer review on social media.
00:09:10 --> 00:09:15 And so this collaboration would never have happened if it wasn't for social media.
00:09:15 --> 00:09:15 You know,
00:09:15 --> 00:09:16 me seeing that he happened to be
00:09:16 --> 00:09:17 presenting
00:09:18 --> 00:09:19 at the state conference.
00:09:19 --> 00:09:20 And the funny thing about
00:09:20 --> 00:09:20 that is when I asked him
00:09:21 --> 00:09:22 why he was at the Ohio State Conference,
00:09:23 --> 00:09:24 he just said, I don't know,
00:09:24 --> 00:09:25 it seemed like Ohio was a
00:09:25 --> 00:09:27 cool place to check out.
00:09:27 --> 00:09:28 He had no other reason to be there.
00:09:29 --> 00:09:31 Ohio is a cool place to check out.
00:09:31 --> 00:09:31 That's right.
00:09:31 --> 00:09:31 It is.
00:09:35 --> 00:09:36 Oh, that's great.
00:09:36 --> 00:09:37 Well, and I and I appreciate that,
00:09:38 --> 00:09:38 you know,
00:09:38 --> 00:09:39 and I think what's nice now is
00:09:39 --> 00:09:40 like on social media,
00:09:41 --> 00:09:42 you can follow people.
00:09:42 --> 00:09:44 And again, it hits your feed, right?
00:09:44 --> 00:09:52 You have the opportunity to not have to work hard to keep up if you want to follow researchers or see what they share. Right.
00:09:53 --> 00:10:03 So, again, infographics: those didn't exist years ago, but they're widely shared now, and they help you really translate what you're seeing in research into practice.
00:10:03 --> 00:10:04 And so,
00:10:04 --> 00:10:07 so much of this is good information
00:10:07 --> 00:10:09 sharing when done well.
00:10:09 --> 00:10:10 And I think that that's what
00:10:10 --> 00:10:11 you're expressing is you
00:10:11 --> 00:10:12 get to touch base with
00:10:13 --> 00:10:15 researchers who are living that.
00:10:15 --> 00:10:16 And I would even say
00:10:16 --> 00:10:17 clinicians who are really
00:10:17 --> 00:10:18 presenting and doing the
00:10:18 --> 00:10:20 things that highlight the
00:10:20 --> 00:10:22 hard work and growth that
00:10:22 --> 00:10:28 they have built as professionals. And what a unique way to collaborate and then connect even in person.
00:10:28 --> 00:10:30 So hybrid is one of my
00:10:30 --> 00:10:32 favorite conversations for another day.
00:10:32 --> 00:10:33 But in person and then
00:10:33 --> 00:10:35 virtual and then it comes
00:10:35 --> 00:10:36 together in a really great,
00:10:36 --> 00:10:37 great collaboration.
00:10:37 --> 00:10:38 So what a great positive
00:10:40 --> 00:10:42 success story for a good use of that.
00:10:42 --> 00:10:44 So now I want to change
00:10:44 --> 00:10:45 directions a little bit,
00:10:45 --> 00:10:46 and I certainly think you
00:10:46 --> 00:10:48 could probably speak to this as well.
00:10:48 --> 00:10:49 But what are some of the
00:10:49 --> 00:10:50 challenges or common
00:10:50 --> 00:10:51 pitfalls that you or others
00:10:51 --> 00:10:53 have faced while navigating
00:10:53 --> 00:10:55 social media in this space,
00:10:55 --> 00:10:58 maybe as a researcher, highly recognized,
00:10:58 --> 00:10:59 or maybe just as a
00:10:59 --> 00:11:01 clinician navigating it?
00:11:02 --> 00:11:03 Yeah,
00:11:03 --> 00:11:05 this is a challenging topic because I
00:11:05 --> 00:11:06 think that there's, you know,
00:11:06 --> 00:11:10 the common terms that always come up are misinformation and disinformation.
00:11:10 --> 00:11:12 I think we all know that
00:11:12 --> 00:11:13 there's a lot of stuff out
00:11:13 --> 00:11:15 there that's just total and utter BS,
00:11:15 --> 00:11:15 right?
00:11:15 --> 00:11:16 We know that there's a lot
00:11:16 --> 00:11:17 of stuff out there that's
00:11:17 --> 00:11:18 just not accurate.
00:11:19 --> 00:11:20 The challenge isn't even
00:11:20 --> 00:11:21 from this misinformation just
00:11:21 --> 00:11:22 being out there.
00:11:22 --> 00:11:23 Yes, that's a problem, right?
00:11:23 --> 00:11:25 But we know that that's
00:11:26 --> 00:11:27 something that can be addressed.
00:11:27 --> 00:11:29 The challenge comes from all
00:11:29 --> 00:11:30 the stuff that surrounds
00:11:30 --> 00:11:31 the misinformation to me.
00:11:31 --> 00:11:32 And one of these things is
00:11:33 --> 00:11:34 the way that it's often presented.
00:11:35 --> 00:11:35 Right.
00:11:35 --> 00:11:36 A lot of times this
00:11:36 --> 00:11:38 misinformation comes from
00:11:38 --> 00:11:39 sources that have a lot of
00:11:40 --> 00:11:41 time to dedicate to this.
00:11:41 --> 00:11:44 OK, I'm a full-time clinician.
00:11:44 --> 00:11:46 All my research is on my own time.
00:11:47 --> 00:11:55 You know, everyone in this area of study that's highly involved in OMPT has a lot of things they're involved in,
00:11:55 --> 00:11:56 whether it's teaching, research, you know,
00:11:56 --> 00:11:57 they have all of these
00:11:57 --> 00:11:58 really important things
00:11:59 --> 00:12:00 where typically social
00:12:00 --> 00:12:01 media is kind of a side
00:12:02 --> 00:12:04 thing they do, logging in when they can. Whereas when you see a lot of
00:12:05 --> 00:12:06 these individuals that are
00:12:06 --> 00:12:09 posting just utter BS,
00:12:09 --> 00:12:10 the real misinformation
00:12:10 --> 00:12:12 stuff that we all can look at and say,
00:12:12 --> 00:12:13 this is misinformation.
00:12:14 --> 00:12:39 they're individuals where this is their thing. They don't have all of these other things they're doing to occupy their time; they have a lot of time to invest in this. And it's really unfortunate for people like us, because we log in, you know, to some of these social media platforms a couple times a day, maybe for short periods of time while we're doing something else, whereas these individuals might be sitting in front of that computer screen all day. And with that, it makes it challenging, because they're firing this stuff non-stop.
00:12:40 --> 00:12:41 And they're able to continue
00:12:41 --> 00:12:43 to just try to support
00:12:43 --> 00:12:44 their own thoughts and
00:12:44 --> 00:12:45 beliefs and some of these biases.
00:12:46 --> 00:12:49 And it makes it really challenging.
00:12:49 --> 00:12:49 On top of that,
00:12:50 --> 00:12:50 there's the way that
00:12:50 --> 00:12:52 they're able to communicate, right?
00:12:53 --> 00:12:55 I work for the United States government.
00:12:56 --> 00:12:57 Most people that work in
00:12:58 --> 00:12:59 different OMPT settings
00:13:00 --> 00:13:01 work for high-level
00:13:01 --> 00:13:03 institutions or different
00:13:03 --> 00:13:04 academic institutions.
00:13:05 --> 00:13:07 They can't go on social media and just
00:13:08 --> 00:13:10 curse and, you know,
00:13:10 --> 00:13:12 discuss things in an inappropriate way.
00:13:12 --> 00:13:14 And so they have to be
00:13:14 --> 00:13:15 careful about how they're
00:13:15 --> 00:13:16 presenting this information.
00:13:17 --> 00:13:18 And that's a challenge when
00:13:18 --> 00:13:19 you have individuals who
00:13:19 --> 00:13:24 might not have those same, you know, handcuffs on them, who might be able to just speak
00:13:24 --> 00:13:25 more frequently on things
00:13:25 --> 00:13:26 and get a little bit more
00:13:26 --> 00:13:27 aggressive with things.
00:13:27 --> 00:13:28 And so it's really
00:13:28 --> 00:13:29 challenging because you see
00:13:29 --> 00:13:30 some of these individuals
00:13:30 --> 00:13:31 that present this very
00:13:31 --> 00:13:33 typical narcissistic type
00:13:33 --> 00:13:34 of communication.
00:13:34 --> 00:13:49 And there have been a couple of studies on this showing that. A lot of times these social media personalities present this narcissistic type of information and way of communicating on things that people in professional academic settings won't put up with.
00:13:49 --> 00:13:51 They'll just remove themselves from that.
00:13:51 --> 00:13:53 So now we're in a situation
00:13:53 --> 00:13:53 where we have these
00:13:53 --> 00:13:55 individuals who are posting
00:13:55 --> 00:13:56 this misinformation,
00:13:56 --> 00:13:57 who are presenting more of
00:13:57 --> 00:13:59 it because they have the time,
00:13:59 --> 00:14:00 and then presenting it in
00:14:00 --> 00:14:01 more aggressive ways
00:14:01 --> 00:14:02 because they don't have
00:14:02 --> 00:14:03 these same kind of ethical
00:14:03 --> 00:14:05 constraints limiting what
00:14:05 --> 00:14:06 they're able to say.
00:14:06 --> 00:14:09 And so it floods things a bit.
00:14:09 --> 00:14:10 And so it really does make
00:14:10 --> 00:14:12 things challenging for us
00:14:12 --> 00:14:14 as consumers of that
00:14:14 --> 00:14:15 information to kind of siphon through.
00:14:16 --> 00:14:17 Absolutely.
00:14:17 --> 00:14:19 And it becomes polarizing, right?
00:14:19 --> 00:14:21 Do you engage in that?
00:14:21 --> 00:14:28 Do you give feedback to, you know, that individual? Most likely not, at times, right?
00:14:28 --> 00:14:29 Unfortunately,
00:14:29 --> 00:14:31 that's usually when people withdraw.
00:14:31 --> 00:14:33 And so then the individuals
00:14:33 --> 00:14:36 have an even bigger platform because there's no pushback.
00:14:36 --> 00:14:38 And so we have seen this, right?
00:14:38 --> 00:14:39 And I would say, I mean,
00:14:40 --> 00:14:41 for those who are listening,
00:14:41 --> 00:14:43 manual therapy has swung the pendulum.
00:14:44 --> 00:14:45 over a short period of time,
00:14:45 --> 00:14:47 whether you use it or don't use it.
00:14:47 --> 00:14:49 And I think you can see this
00:14:49 --> 00:14:51 is a testimony to some of
00:14:52 --> 00:14:54 the social media platforms
00:14:54 --> 00:14:56 and big individuals who
00:14:57 --> 00:15:00 feel like that's a platform that they want to speak towards.
00:15:00 --> 00:15:04 And so I hear some of the challenges,
00:15:04 --> 00:15:04 but definitely the
00:15:05 --> 00:15:06 misinformation or
00:15:06 --> 00:15:07 disinformation is something
00:15:08 --> 00:15:10 that I believe our next
00:15:11 --> 00:15:12 generation of practitioners
00:15:12 --> 00:15:14 have to navigate through
00:15:14 --> 00:15:16 because it makes it really
00:15:16 --> 00:15:19 hard to know what's truthful, right?
00:15:20 --> 00:15:22 I mean, we see this on our TVs every night,
00:15:22 --> 00:15:23 what's truthful, what's not.
00:15:23 --> 00:15:25 And it's so hard to kind of
00:15:25 --> 00:15:29 navigate that balance, especially for a younger version of
00:15:29 --> 00:15:30 yourself who's just now
00:15:30 --> 00:15:32 coming into the PT profession,
00:15:32 --> 00:15:34 trying to digest as much as
00:15:34 --> 00:15:35 you can so you can give
00:15:35 --> 00:15:36 your best patient care.
00:15:36 --> 00:15:37 Navigating it is really hard.
00:15:38 --> 00:15:39 So I appreciate that.
00:15:39 --> 00:15:40 It's hard.
00:15:40 --> 00:15:42 It's so hard.
00:15:42 --> 00:15:44 So that takes me to our next
00:15:44 --> 00:15:45 spot where I wanted to ask
00:15:45 --> 00:15:47 you with all of the amount
00:15:47 --> 00:15:49 of information that's online,
00:15:49 --> 00:15:51 how do you suggest that
00:15:51 --> 00:15:53 OMPT practitioners,
00:15:53 --> 00:15:55 those who are listening in our audience,
00:15:55 --> 00:15:57 stay evidence-based and try
00:15:57 --> 00:15:58 to avoid some of this
00:15:58 --> 00:16:00 misinformation that we're talking about?
00:16:01 --> 00:16:02 Yeah, thank you for that,
00:16:02 --> 00:16:03 because I think that's important.
00:16:04 --> 00:16:04 You know,
00:16:04 --> 00:16:05 we're going to get this information.
00:16:05 --> 00:16:07 It's how do you kind of siphon through this?
00:16:07 --> 00:16:09 And I think one of the
00:16:09 --> 00:16:10 things I really like to
00:16:10 --> 00:16:12 have some of our residents
00:16:12 --> 00:16:13 and some young learners
00:16:13 --> 00:16:14 reflect on for everything,
00:16:14 --> 00:16:18 including the social media stuff, is: why do they feel that way?
00:16:18 --> 00:16:20 What is the other side of this?
00:16:20 --> 00:16:20 Right.
00:16:20 --> 00:16:22 We all see things through
00:16:22 --> 00:16:23 our own personal biases, which is OK.
00:16:23 --> 00:16:24 You can't get rid of those.
00:16:24 --> 00:16:25 Right.
00:16:25 --> 00:16:27 But it's important when you
00:16:27 --> 00:16:28 look at these things to say,
00:16:28 --> 00:16:30 why do you think that this
00:16:30 --> 00:16:32 person sees things this way?
00:16:32 --> 00:16:33 So you can see maybe some of
00:16:33 --> 00:16:35 the rationale they have
00:16:35 --> 00:16:36 behind some of these things.
00:16:36 --> 00:16:37 And it's really interesting
00:16:37 --> 00:16:39 because a lot of times when you do that,
00:16:39 --> 00:16:42 you're able to dive into
00:16:42 --> 00:16:44 some of their tactics and
00:16:44 --> 00:16:45 things along those lines
00:16:45 --> 00:16:46 just by assessing that kind
00:16:46 --> 00:16:47 of other side of the coin.
00:16:47 --> 00:16:48 Yes, this is how I see things.
00:16:49 --> 00:16:50 I see this as total BS.
00:16:50 --> 00:16:53 They're presenting this way, but why?
00:16:53 --> 00:16:55 What is the rationale behind that?
00:16:55 --> 00:16:56 And I think a lot of times
00:16:57 --> 00:16:58 you can get that by looking
00:16:58 --> 00:17:00 at the source itself.
00:17:00 --> 00:17:01 And that's why I wrote that
00:17:01 --> 00:17:04 JOSPT blog you were talking about,
00:17:04 --> 00:17:05 that credible or questionable,
00:17:05 --> 00:17:06 that's where that came from,
00:17:06 --> 00:17:07 is looking at this,
00:17:07 --> 00:17:09 assessing the source this is coming from.
00:17:10 --> 00:17:11 And one of the things I did
00:17:11 --> 00:17:12 for that is put together
00:17:12 --> 00:17:13 this little algorithm,
00:17:13 --> 00:17:15 which was kind of comical,
00:17:15 --> 00:17:16 but it's this idea of you
00:17:16 --> 00:17:19 get this information presented to you.
00:17:19 --> 00:17:20 You have to establish where
00:17:20 --> 00:17:20 this is coming from.
00:17:21 --> 00:17:23 Now, is this person a content expert,
00:17:23 --> 00:17:23 right?
00:17:23 --> 00:17:24 Because if
00:17:24 --> 00:17:25 this is just an article
00:17:25 --> 00:17:26 that's presented to you and
00:17:26 --> 00:17:27 somebody says interesting
00:17:28 --> 00:17:30 read and puts it on social media, right,
00:17:30 --> 00:17:31 they're opening it up for
00:17:31 --> 00:17:32 your interpretation.
00:17:32 --> 00:17:32 That's great.
00:17:33 --> 00:17:34 I would say at that point,
00:17:35 --> 00:17:36 they're still allowing you
00:17:36 --> 00:17:38 to assess that evidence at
00:17:39 --> 00:17:40 its level of evidence based
00:17:40 --> 00:17:41 on the study design.
00:17:41 --> 00:17:43 As soon as that person puts
00:17:44 --> 00:17:46 a twist or spin on that, saying,
00:17:46 --> 00:17:47 this article is total BS
00:17:48 --> 00:17:49 because it doesn't agree with blah, blah,
00:17:49 --> 00:17:50 blah,
00:17:50 --> 00:17:53 or they did this the wrong way because A,
00:17:53 --> 00:17:53 B, C,
00:17:54 --> 00:17:55 D. As soon as they provide that
00:17:55 --> 00:17:55 twist,
00:17:56 --> 00:17:57 that regresses it a bit
00:17:58 --> 00:17:58 to expert opinion.
00:18:00 --> 00:18:01 And even then, we have to ask ourselves,
00:18:01 --> 00:18:03 is this person an expert?
00:18:03 --> 00:18:04 Because anyone who's
00:18:04 --> 00:18:06 published knows you have an
00:18:06 --> 00:18:07 editor and a couple of peer
00:18:07 --> 00:18:09 reviewers who are looking
00:18:09 --> 00:18:10 at your study before
00:18:10 --> 00:18:13 it's published who are actual experts.
00:18:13 --> 00:18:15 So you have people who are very,
00:18:15 --> 00:18:15 very qualified,
00:18:15 --> 00:18:16 who have already looked at
00:18:16 --> 00:18:18 these studies in most cases.
00:18:19 --> 00:18:20 to review them and
00:18:20 --> 00:18:22 make sure it is, you know,
00:18:22 --> 00:18:24 done in an appropriate way.
00:18:24 --> 00:18:26 And so then you get to the
00:18:26 --> 00:18:28 point where maybe
00:18:28 --> 00:18:28 it is a real expert
00:18:29 --> 00:18:30 offering the opinion. Then
00:18:31 --> 00:18:31 you have to assess:
00:18:32 --> 00:18:33 does this person have any
00:18:33 --> 00:18:34 conflict of interest, right?
00:18:35 --> 00:18:37 We, as authors, we have to disclose that.
00:18:37 --> 00:18:39 When I submit a paper, big, you know,
00:18:40 --> 00:18:40 big disclosure.
00:18:40 --> 00:18:42 Are there any conflicts of interest?
00:18:42 --> 00:18:43 Are you making any money by
00:18:43 --> 00:18:44 this opinion you're presenting?
00:18:45 --> 00:18:47 That's not present in social media.
00:18:47 --> 00:18:49 And so it makes things challenging, right?
00:18:50 --> 00:18:51 Because you have individuals
00:18:51 --> 00:18:53 for whom that is their livelihood.
00:18:53 --> 00:18:54 Right. You have to ask
00:18:54 --> 00:18:56 yourself: is this person
00:18:56 --> 00:18:58 relevant, and that's
00:18:58 --> 00:19:00 why people follow them on
00:19:00 --> 00:19:01 social media, or is this
00:19:01 --> 00:19:03 person only relevant
00:19:03 --> 00:19:04 because of their social
00:19:04 --> 00:19:05 media? And unfortunately we
00:19:05 --> 00:19:06 have a lot of these folks
00:19:06 --> 00:19:08 that are only relevant in
00:19:08 --> 00:19:10 the profession because of
00:19:10 --> 00:19:11 these
00:19:11 --> 00:19:12 personalities they've
00:19:12 --> 00:19:14 created on social media.
00:19:14 --> 00:19:16 And so you take this, and if you
00:19:16 --> 00:19:19 say this person maybe is selling courses,
00:19:19 --> 00:19:21 is selling products, a
00:19:21 --> 00:19:22 lot of times it's obvious.
00:19:22 --> 00:19:23 You'll see: this
00:19:23 --> 00:19:24 doesn't agree with, you know,
00:19:24 --> 00:19:26 A, B, C, D. OMPT is not
00:19:26 --> 00:19:27 evidence-based. There's no
00:19:27 --> 00:19:28 reason you
00:19:28 --> 00:19:29 should receive training in
00:19:29 --> 00:19:31 it. By the way, here are some
00:19:31 --> 00:19:32 courses I'm doing, you know,
00:19:32 --> 00:19:33 follow me on Instagram,
00:19:33 --> 00:19:35 follow me on Twitter. Right?
00:19:35 --> 00:19:36 And so it's usually
00:19:36 --> 00:19:37 followed up with something
00:19:37 --> 00:19:39 that leads to direct financial gain.
00:19:39 --> 00:19:41 OK, but they don't have to disclose that.
00:19:41 --> 00:19:42 But a lot of times you're
00:19:42 --> 00:19:44 able to work through and kind of see,
00:19:44 --> 00:19:45 you know,
00:19:45 --> 00:19:46 the conflicts of interest
00:19:46 --> 00:19:48 that they have. On top of that,
00:19:48 --> 00:19:49 really dive into the references.
00:19:49 --> 00:19:50 And that's what I really
00:19:50 --> 00:19:52 like to promote for individuals
00:19:52 --> 00:19:53 going on social media.
00:19:54 --> 00:19:55 Yes, if somebody shares this article,
00:19:56 --> 00:19:57 you can glance at their opinion,
00:19:58 --> 00:19:59 but read that article yourself.
00:20:00 --> 00:20:02 You know, it's like abstracts in studies.
00:20:02 --> 00:20:03 They've done reviews on this.
00:20:03 --> 00:20:03 Right.
00:20:03 --> 00:20:04 And what we have found is
00:20:04 --> 00:20:05 that individuals are able
00:20:05 --> 00:20:07 to spin abstracts even to
00:20:07 --> 00:20:09 make them sound like they
00:20:09 --> 00:20:10 support their beliefs.
00:20:11 --> 00:20:12 And so do you think people
00:20:12 --> 00:20:13 on social media can do
00:20:13 --> 00:20:15 similar things by spinning
00:20:15 --> 00:20:16 their idea of what the results show?
00:20:17 --> 00:20:17 Yeah.
00:20:18 --> 00:20:20 So read it, read it yourself, pull it up,
00:20:20 --> 00:20:22 look at it, you know, send it,
00:20:22 --> 00:20:24 send an email or a message
00:20:24 --> 00:20:24 to the author.
00:20:25 --> 00:20:25 If you have questions,
00:20:26 --> 00:20:26 don't just listen to
00:20:27 --> 00:20:28 somebody else's opinion.
00:20:29 --> 00:20:30 Well,
00:20:30 --> 00:20:32 and I love the algorithm in your paper.
00:20:32 --> 00:20:34 So I will say I'm a little
00:20:34 --> 00:20:35 bit of a self-proclaimed geek,
00:20:35 --> 00:20:36 so you can say it back to
00:20:36 --> 00:20:37 me and it's totally okay.
00:20:38 --> 00:20:40 But I really appreciate the
00:20:40 --> 00:20:41 way that this article kind
00:20:41 --> 00:20:42 of breaks that down.
00:20:42 --> 00:20:43 And I think, you know,
00:20:44 --> 00:20:45 I always think about flow charts.
00:20:45 --> 00:20:46 I like structure,
00:20:46 --> 00:20:48 so that's not lost on people who know me,
00:20:48 --> 00:20:51 but I like a way that helps
00:20:51 --> 00:20:53 somebody navigate through a
00:20:53 --> 00:20:56 lot of information and a process helps.
00:20:56 --> 00:20:57 And I think that algorithm
00:20:57 --> 00:20:59 is really helpful where, you know,
00:20:59 --> 00:21:00 I would say a flow chart, right?
00:21:00 --> 00:21:02 It's guiding you through,
00:21:02 --> 00:21:04 asking you to ask yourself questions,
00:21:05 --> 00:21:07 a reflective opportunity to say,
00:21:08 --> 00:21:10 where is the information coming from?
00:21:11 --> 00:21:13 How do they gain and benefit from it?
00:21:14 --> 00:21:16 And, you know, when you're stepping back,
00:21:16 --> 00:21:18 what ways can you either
00:21:18 --> 00:21:20 prove or validate that information?
00:21:21 --> 00:21:21 And again,
00:21:22 --> 00:21:23 I think you're spot on having
00:21:23 --> 00:21:25 opportunities in really positive ways,
00:21:25 --> 00:21:26 like having you on this
00:21:26 --> 00:21:29 show and having some of our podcasts.
00:21:29 --> 00:21:29 You know,
00:21:29 --> 00:21:31 we hope to be able to highlight
00:21:31 --> 00:21:33 that these individuals not only write it,
00:21:33 --> 00:21:33 they believe it,
00:21:33 --> 00:21:35 but then they show the
00:21:35 --> 00:21:37 research, right, about what
00:21:37 --> 00:21:38 they're doing and how
00:21:38 --> 00:21:39 they're furthering the study.
00:21:39 --> 00:21:40 And I love the fact that we
00:21:40 --> 00:21:43 start with questions, not with a bias.
00:21:43 --> 00:21:45 We start with questions, and often
00:21:45 --> 00:21:47 we undo our bias, right?
00:21:48 --> 00:21:48 Like, oh,
00:21:48 --> 00:21:50 I didn't know that was going to happen.
00:21:50 --> 00:21:51 That's the best part about
00:21:51 --> 00:21:52 some of the research that's come out.
00:21:52 --> 00:21:53 So again,
00:21:53 --> 00:21:55 I appreciate kind of your
00:21:55 --> 00:21:57 thoughts on driving our new
00:21:57 --> 00:21:59 clinicians in a way to kind
00:21:59 --> 00:22:00 of handle some of the
00:22:00 --> 00:22:01 misinformation and
00:22:01 --> 00:22:03 definitely the vast amount
00:22:03 --> 00:22:04 of information that's out there.
00:22:04 --> 00:22:06 So that is awesome.
00:22:06 --> 00:22:08 So I'm going to change gears again.
00:22:08 --> 00:22:10 And instead of talking about clinicians,
00:22:11 --> 00:22:13 I want to talk about how can we start to
00:22:14 --> 00:22:16 connect our community and our patients.
00:22:17 --> 00:22:18 So we are talking about
00:22:18 --> 00:22:19 having students navigate this,
00:22:20 --> 00:22:21 but what about patients who
00:22:21 --> 00:22:23 are just starting out? You know,
00:22:23 --> 00:22:25 they have no contacts.
00:22:25 --> 00:22:26 They don't have this
00:22:26 --> 00:22:28 resource that lets you dive
00:22:28 --> 00:22:29 in and navigate in the same way.
00:22:30 --> 00:22:31 So what learning lessons do
00:22:31 --> 00:22:32 you have there from using
00:22:32 --> 00:22:33 social media to connect
00:22:33 --> 00:22:35 with health professionals
00:22:35 --> 00:22:36 in the community outside of
00:22:36 --> 00:22:37 our own scope?
00:22:37 --> 00:22:39 And then what about patients?
00:22:41 --> 00:22:42 Yeah, I think
00:22:42 --> 00:22:43 assessment of health
00:22:43 --> 00:22:44 literacy is very important.
00:22:44 --> 00:22:45 Obviously, you know, you get
00:22:45 --> 00:22:46 these individuals, whether
00:22:46 --> 00:22:47 it's, you know, different
00:22:47 --> 00:22:49 health professions or
00:22:49 --> 00:22:50 individuals at
00:22:50 --> 00:22:51 different, um,
00:22:51 --> 00:22:52 different levels of
00:22:52 --> 00:22:53 education as far as
00:22:53 --> 00:22:54 patients go, and they're all
00:22:54 --> 00:22:56 at different levels. And so,
00:22:56 --> 00:22:56 you know, I like,
00:22:57 --> 00:22:58 I like sharing information
00:22:58 --> 00:22:59 with these individuals,
00:22:59 --> 00:23:01 but it makes it very
00:23:01 --> 00:23:02 challenging because we know
00:23:02 --> 00:23:03 that when it comes to
00:23:04 --> 00:23:05 concepts like pain neuroscience education,
00:23:05 --> 00:23:06 some of these things,
00:23:06 --> 00:23:08 it needs to be catered for
00:23:08 --> 00:23:09 the individual in front of you.
00:23:09 --> 00:23:10 And so that's one of the big
00:23:10 --> 00:23:12 challenges in social media
00:23:12 --> 00:23:13 is a lot of times it's
00:23:13 --> 00:23:15 catered for the average.
00:23:15 --> 00:23:17 And I think a good example
00:23:17 --> 00:23:18 of this is how this
00:23:18 --> 00:23:19 information can be
00:23:19 --> 00:23:20 presented in a way that's
00:23:21 --> 00:23:24 really not optimal for us.
00:23:24 --> 00:23:25 I'm assuming a lot of
00:23:25 --> 00:23:26 individuals have seen how
00:23:27 --> 00:23:29 Stuart McGill did a recent
00:23:29 --> 00:23:30 Huberman podcast.
00:23:30 --> 00:23:32 And I love Stuart's work,
00:23:32 --> 00:23:33 and I think he's done some
00:23:33 --> 00:23:34 great things for our profession.
00:23:34 --> 00:23:35 But a lot of his concepts
00:23:36 --> 00:23:37 are a bit outdated from
00:23:37 --> 00:23:38 what we understand about
00:23:38 --> 00:23:39 low back pain specifically.
00:23:40 --> 00:23:41 And I think we all saw that.
00:23:42 --> 00:23:43 And in this
00:23:43 --> 00:23:46 large podcast type of thing he did,
00:23:46 --> 00:23:47 geared largely at patients,
00:23:50 --> 00:23:51 it
00:23:51 --> 00:23:53 really highlighted some of the
00:23:53 --> 00:23:55 things you don't want to do.
00:23:55 --> 00:23:56 You don't want to
00:23:56 --> 00:23:57 present things in a way
00:23:57 --> 00:23:59 that creates any type of nocebo-based,
00:23:59 --> 00:24:00 you know, fear of movement, and
00:24:00 --> 00:24:03 this idea of anatomical, pathological,
00:24:03 --> 00:24:03 you know,
00:24:03 --> 00:24:05 terrible things causing these
00:24:05 --> 00:24:06 pain complaints, all of
00:24:06 --> 00:24:07 these things that are just
00:24:07 --> 00:24:08 outdated models.
00:24:10 --> 00:24:11 On the counter side of things,
00:24:12 --> 00:24:13 Peter O'Sullivan did a
00:24:13 --> 00:24:14 recent podcast on the same exact topic,
00:24:15 --> 00:24:17 which was the polar opposite,
00:24:17 --> 00:24:18 which was really geared
00:24:18 --> 00:24:19 towards what we do understand about pain,
00:24:20 --> 00:24:21 what we do understand about
00:24:21 --> 00:24:21 how you treat it and how
00:24:21 --> 00:24:23 you manage it in these things.
00:24:23 --> 00:24:23 Again,
00:24:23 --> 00:24:25 geared a little bit towards patients
00:24:25 --> 00:24:27 as well as towards clinicians,
00:24:27 --> 00:24:29 but in a way that really highlights the
00:24:30 --> 00:24:31 the variability in it.
00:24:31 --> 00:24:32 And it's not that
00:24:32 --> 00:24:33 something's out of place or
00:24:33 --> 00:24:35 that this muscle is weak,
00:24:35 --> 00:24:36 because, you know,
00:24:36 --> 00:24:37 maybe for one individual patient,
00:24:37 --> 00:24:38 that might be a problem,
00:24:39 --> 00:24:40 but to paint that across every patient,
00:24:40 --> 00:24:42 it's a huge challenge.
00:24:42 --> 00:24:43 And so I think,
00:24:44 --> 00:24:46 from a patient education standpoint,
00:24:46 --> 00:24:46 that's the
00:24:46 --> 00:24:48 challenge: how do I
00:24:48 --> 00:24:51 educate these individuals individually,
00:24:51 --> 00:24:51 you know,
00:24:51 --> 00:24:53 in a way that's beneficial to
00:24:53 --> 00:24:55 them based on their specific needs.
00:24:55 --> 00:24:57 And so, to be honest with you,
00:24:57 --> 00:24:58 I don't use social media as
00:24:58 --> 00:25:00 much for patient, you know,
00:25:00 --> 00:25:02 information or education,
00:25:02 --> 00:25:03 for that reason, because
00:25:03 --> 00:25:05 I think of it from a
00:25:05 --> 00:25:07 person-centered care standpoint.
00:25:07 --> 00:25:08 How do you make it individualized?
00:25:08 --> 00:25:08 Right.
00:25:09 --> 00:25:10 You know, and so I don't do that as much,
00:25:10 --> 00:25:11 but I think it's important.
00:25:12 --> 00:25:13 I just haven't gotten
00:25:13 --> 00:25:14 involved much in
00:25:14 --> 00:25:16 that realm of things.
00:25:16 --> 00:25:18 Well, and it's super tricky.
00:25:18 --> 00:25:19 And I can tell you that even
00:25:19 --> 00:25:20 with your websites,
00:25:20 --> 00:25:21 you have to be so
00:25:21 --> 00:25:23 thoughtful about what words
00:25:23 --> 00:25:23 you're putting out there
00:25:23 --> 00:25:24 and how you're leading
00:25:24 --> 00:25:26 people and what you're linking them to.
00:25:26 --> 00:25:28 But even just keeping them up to date,
00:25:28 --> 00:25:29 it makes it really difficult.
00:25:29 --> 00:25:31 So I continue to challenge
00:25:31 --> 00:25:32 people to think about, you know,
00:25:33 --> 00:25:34 if an end in mind is
00:25:34 --> 00:25:35 patient-centered care,
00:25:35 --> 00:25:36 what resources are we
00:25:36 --> 00:25:37 providing to that end body,
00:25:38 --> 00:25:39 that end group,
00:25:39 --> 00:25:42 and really being strategic
00:25:42 --> 00:25:43 about the way we interact
00:25:43 --> 00:25:44 with them throughout these
00:25:44 --> 00:25:46 social media platforms.
00:25:46 --> 00:25:46 in thoughtful,
00:25:47 --> 00:25:48 intentional ways without
00:25:48 --> 00:25:51 trying to steer them or misguide them.
00:25:51 --> 00:25:52 It really should be what is
00:25:52 --> 00:25:54 in the best intention,
00:25:54 --> 00:25:55 which is to hopefully get
00:25:55 --> 00:25:57 them to the right care, right?
00:25:57 --> 00:25:58 And get them the best outcome.
00:25:59 --> 00:26:00 So, you know,
00:26:00 --> 00:26:02 I think people always have two sides,
00:26:02 --> 00:26:03 just like you said, Stuart McGill,
00:26:03 --> 00:26:04 and then you turn around
00:26:04 --> 00:26:05 and you think Peter O'Sullivan,
00:26:05 --> 00:26:07 but those are two different
00:26:07 --> 00:26:08 philosophical approaches.
00:26:09 --> 00:26:10 Maybe one's right.
00:26:10 --> 00:26:11 Maybe one's wrong.
00:26:11 --> 00:26:12 Maybe both are right.
00:26:12 --> 00:26:13 Maybe both are wrong.
00:26:14 --> 00:26:14 I don't know that we know,
00:26:14 --> 00:26:15 but we certainly can at
00:26:15 --> 00:26:17 least explore philosophies
00:26:18 --> 00:26:19 and they're not out there
00:26:19 --> 00:26:20 trying to get that self gain.
00:26:20 --> 00:26:21 They are seeking
00:26:21 --> 00:26:22 and researching and doing
00:26:22 --> 00:26:23 some of this work.
00:26:23 --> 00:26:24 And you certainly can
00:26:24 --> 00:26:25 connect with those
00:26:25 --> 00:26:26 individuals to talk with
00:26:26 --> 00:26:27 them about their studies
00:26:27 --> 00:26:28 and the work they're doing.
00:26:28 --> 00:26:28 Right.
00:26:29 --> 00:26:31 So I think the other piece,
00:26:31 --> 00:26:32 and you kind of hit this a little bit,
00:26:32 --> 00:26:34 but you know,
00:26:34 --> 00:26:35 the individual that you
00:26:35 --> 00:26:36 were just speaking about on
00:26:36 --> 00:26:37 the Huberman Lab is
00:26:37 --> 00:26:39 definitely a healthcare professional who,
00:26:39 --> 00:26:41 you know, has a personal brand.
00:26:42 --> 00:26:43 I mean, they have a personal brand.
00:26:43 --> 00:26:45 And so how do you,
00:26:45 --> 00:26:48 as a healthcare provider, you know,
00:26:48 --> 00:26:49 use social media to
00:26:50 --> 00:26:52 build transparency and
00:26:52 --> 00:26:53 integrity while you're
00:26:53 --> 00:26:54 still trying to promote
00:26:54 --> 00:26:55 what you're doing?
00:26:55 --> 00:26:57 I think this is a tricky space.
00:26:57 --> 00:26:59 And what advice would you
00:26:59 --> 00:27:00 give to other professionals
00:27:00 --> 00:27:01 trying to navigate it?
00:27:05 --> 00:27:06 Yeah, I think, first and foremost,
00:27:06 --> 00:27:09 know your worth and know your value.
00:27:09 --> 00:27:12 And I think a lot of
00:27:12 --> 00:27:13 times we take for granted
00:27:13 --> 00:27:16 how much patients and other
00:27:16 --> 00:27:17 people listen to us.
00:27:18 --> 00:27:20 And so, especially, what
00:27:20 --> 00:27:22 does the term doctor mean to people?
00:27:22 --> 00:27:22 Right.
00:27:23 --> 00:27:24 And my wife, after I
00:27:24 --> 00:27:25 earned
00:27:25 --> 00:27:26 two doctorate degrees,
00:27:26 --> 00:27:26 likes to remind me
00:27:26 --> 00:27:28 frequently that Dr. Seuss
00:27:28 --> 00:27:29 is also a doctor.
00:27:30 --> 00:27:32 You know, but at the end of
00:27:32 --> 00:27:34 the day, if you
00:27:34 --> 00:27:35 have doctor in front of
00:27:35 --> 00:27:37 your name on a social media
00:27:37 --> 00:27:38 platform or something along
00:27:38 --> 00:27:40 those lines, keep in mind
00:27:40 --> 00:27:41 patients aren't able to
00:27:41 --> 00:27:43 decide, or other people may
00:27:43 --> 00:27:45 not be able to decide, what
00:27:45 --> 00:27:46 your doctorate is in or
00:27:46 --> 00:27:49 what area you study. So know
00:27:49 --> 00:27:50 your area of expertise,
00:27:51 --> 00:27:52 speak to your area of
00:27:52 --> 00:27:54 expertise, and defend it
00:27:54 --> 00:27:55 when you know the material.
00:27:56 --> 00:27:59 But at the same time, be
00:27:59 --> 00:28:00 careful stepping out of
00:28:00 --> 00:28:02 that, okay? Because, you know, I'm not
00:28:02 --> 00:28:04 going to do a
00:28:04 --> 00:28:06 literature review on the
00:28:06 --> 00:28:08 treatment of diabetes and
00:28:08 --> 00:28:10 go try to, you know, educate
00:28:10 --> 00:28:11 people on diabetes using
00:28:11 --> 00:28:13 doctor in front of my name tomorrow.
00:28:14 --> 00:28:15 That's not my area of specialty.
00:28:16 --> 00:28:17 I haven't done research there.
00:28:17 --> 00:28:19 I haven't done anything there.
00:28:19 --> 00:28:21 And so that doctor title
00:28:21 --> 00:28:22 still carries you.
00:28:22 --> 00:28:23 And I think a lot of times
00:28:23 --> 00:28:24 people on social media,
00:28:24 --> 00:28:25 you'll see it where they'll
00:28:25 --> 00:28:27 use doctor and all of these initials.
00:28:27 --> 00:28:28 And keep in mind that
00:28:28 --> 00:28:30 patients or other providers
00:28:30 --> 00:28:31 might not know the
00:28:31 --> 00:28:32 relevance of those things.
00:28:33 --> 00:28:34 They might see the whole alphabet
00:28:35 --> 00:28:36 soup there and think that
00:28:36 --> 00:28:38 this person is an expert in everything.
00:28:38 --> 00:28:39 This is a person that should
00:28:39 --> 00:28:41 be giving me advice on what
00:28:41 --> 00:28:42 vitamins to take.
00:28:43 --> 00:28:44 when that individual might
00:28:44 --> 00:28:45 have no training in that
00:28:45 --> 00:28:46 area of expertise. And so I
00:28:46 --> 00:28:48 think knowing your value,
00:28:48 --> 00:28:49 both the positives, what you
00:28:49 --> 00:28:51 do have to offer, and the
00:28:51 --> 00:28:52 negatives, the areas that
00:28:52 --> 00:28:54 you probably shouldn't be
00:28:54 --> 00:28:56 speaking to, right? And I
00:28:56 --> 00:28:57 think that's
00:28:57 --> 00:28:58 really an important thing
00:28:58 --> 00:28:59 to keep in mind. The other
00:28:59 --> 00:29:00 thing is following ethical
00:29:01 --> 00:29:02 standards. Again, we don't
00:29:02 --> 00:29:04 have conflict of interest
00:29:04 --> 00:29:05 disclosures in social media. I don't
00:29:05 --> 00:29:06 have to sit there and tell
00:29:06 --> 00:29:08 people, oh, by the way, you
00:29:08 --> 00:29:09 know, I'm making money off
00:29:09 --> 00:29:11 of this. But simply put,
00:29:12 --> 00:29:14 you should do what's right, right?
00:29:14 --> 00:29:15 You should do what's right
00:29:16 --> 00:29:17 because you know that it's
00:29:17 --> 00:29:18 not right to try to trick
00:29:18 --> 00:29:20 people and give them bad information,
00:29:20 --> 00:29:21 right?
00:29:21 --> 00:29:23 If we know that manual
00:29:23 --> 00:29:26 therapy is shown to lead to
00:29:26 --> 00:29:29 improved downstream outcomes,
00:29:30 --> 00:29:31 decreased healthcare costs,
00:29:31 --> 00:29:34 decreased opiate utilization downstream,
00:29:34 --> 00:29:35 all of these things,
00:29:36 --> 00:29:38 and you're sitting here saying, you know,
00:29:38 --> 00:29:39 you shouldn't use manual therapy.
00:29:40 --> 00:29:41 No one should be trained in it.
00:29:41 --> 00:29:41 You shouldn't educate...
00:29:42 --> 00:29:43 You, providing that
00:29:44 --> 00:29:45 information, are going to
00:29:45 --> 00:29:47 negatively impact patients
00:29:47 --> 00:29:48 down the road because of that,
00:29:49 --> 00:29:50 because of that trickle
00:29:50 --> 00:29:51 down effect of what you're
00:29:51 --> 00:29:52 telling some of these providers.
00:29:52 --> 00:29:54 So you have to do the right
00:29:54 --> 00:29:55 thing and think about is
00:29:55 --> 00:29:56 this information that I'm
00:29:56 --> 00:29:58 providing really the best we know,
00:29:58 --> 00:29:59 really what's best for
00:29:59 --> 00:30:01 these providers to hear and
00:30:01 --> 00:30:03 for their patients to, you know,
00:30:03 --> 00:30:06 downstream to be influenced by.
00:30:06 --> 00:30:08 Or is it what's best for
00:30:08 --> 00:30:09 me and my own financial
00:30:09 --> 00:30:11 gain, and perhaps patting
00:30:11 --> 00:30:13 myself on the back for my own biases?
00:30:13 --> 00:30:14 And so, you know,
00:30:14 --> 00:30:16 I think really appreciating
00:30:16 --> 00:30:16 that is important.
00:30:17 --> 00:30:18 Providing your sources,
00:30:19 --> 00:30:21 making sure you give people
00:30:21 --> 00:30:23 the ability to look things up themselves.
00:30:23 --> 00:30:25 Again, give them that opportunity.
00:30:25 --> 00:30:26 Don't try to tell people how
00:30:26 --> 00:30:29 they should feel about things, okay?
00:30:29 --> 00:30:30 Give them the opportunity to say,
00:30:30 --> 00:30:32 this is an article, interesting review.
00:30:32 --> 00:30:33 Here's my thoughts on it.
00:30:33 --> 00:30:34 Let me know what you think.
00:30:34 --> 00:30:34 Right.
00:30:34 --> 00:30:36 Open that door for them to
00:30:36 --> 00:30:38 dive deeper into some of this stuff.
00:30:39 --> 00:30:40 Don't be afraid to admit
00:30:40 --> 00:30:41 when you're wrong.
00:30:41 --> 00:30:42 One of the things I love
00:30:42 --> 00:30:44 telling people is challenge your biases.
00:30:45 --> 00:30:46 We all have them.
00:30:46 --> 00:30:47 You cannot make them go away.
00:30:47 --> 00:30:48 Be aware of them,
00:30:48 --> 00:30:49 become friends with them
00:30:50 --> 00:30:51 and challenge them.
00:30:51 --> 00:30:52 Okay.
00:30:52 --> 00:30:53 If you find a paper that
00:30:53 --> 00:30:54 doesn't agree with what you think,
00:30:56 --> 00:30:57 look at it and say,
00:30:57 --> 00:30:58 what did they do here?
00:30:58 --> 00:31:00 Is it that they did
00:31:00 --> 00:31:01 something wrong in this
00:31:01 --> 00:31:02 study that I don't agree with?
00:31:03 --> 00:31:04 Or do I need to change the
00:31:04 --> 00:31:05 way I'm looking at this?
00:31:05 --> 00:31:06 Because again,
00:31:06 --> 00:31:08 there's different opinions on things.
00:31:08 --> 00:31:09 Just like you were saying, Megan,
00:31:09 --> 00:31:12 there's this idea of it's
00:31:12 --> 00:31:13 just different philosophies
00:31:13 --> 00:31:14 looking at things oftentimes.
00:31:15 --> 00:31:16 And so oftentimes it's not a
00:31:16 --> 00:31:16 right or wrong.
00:31:17 --> 00:31:18 It's just different sides of
00:31:18 --> 00:31:19 the coin looking at things
00:31:19 --> 00:31:20 in a different way.
00:31:21 --> 00:31:22 And then opening your mind
00:31:22 --> 00:31:24 to things outside your comfort zone.
00:31:24 --> 00:31:25 Being able to dive into some
00:31:25 --> 00:31:26 of this stuff and just
00:31:26 --> 00:31:28 learn a little bit every day.
00:31:29 --> 00:31:30 I always tell our residents this.
00:31:31 --> 00:31:31 Every day,
00:31:31 --> 00:31:32 we're either getting a little
00:31:32 --> 00:31:35 bit smarter or a little bit stupider.
00:31:36 --> 00:31:38 You have to learn some new stuff.
00:31:39 --> 00:31:40 You have to open your mind to say,
00:31:40 --> 00:31:41 I don't just want to keep
00:31:41 --> 00:31:43 reinforcing what I already know.
00:31:43 --> 00:31:45 I actually want to learn some new stuff.
00:31:45 --> 00:31:46 I want to dive into some new
00:31:46 --> 00:31:48 stuff and investigate some
00:31:48 --> 00:31:49 other people's thoughts on these things.
00:31:50 --> 00:31:50 Well,
00:31:50 --> 00:31:53 and I think what you allude to in
00:31:53 --> 00:31:55 that is that research is hard.
00:31:55 --> 00:31:57 I think research is so hard.
00:31:57 --> 00:32:00 And when you see different philosophies,
00:32:01 --> 00:32:03 people may not be bashing
00:32:03 --> 00:32:04 somebody else's approach,
00:32:04 --> 00:32:06 but rather maybe they had a
00:32:06 --> 00:32:07 different subset of
00:32:07 --> 00:32:08 patients in that experience.
00:32:08 --> 00:32:10 Maybe they had more females,
00:32:10 --> 00:32:11 maybe they had more males,
00:32:11 --> 00:32:13 maybe they had a different ethnicity,
00:32:13 --> 00:32:14 maybe they had a higher chronicity.
00:32:15 --> 00:32:15 I mean,
00:32:15 --> 00:32:17 there's so many variables that go
00:32:17 --> 00:32:18 into research
00:32:19 --> 00:32:20 that I feel like that's why
00:32:20 --> 00:32:21 it gives it more merit
00:32:21 --> 00:32:22 rather than it being a
00:32:22 --> 00:32:24 myopic singular approach.
00:32:25 --> 00:32:27 It allows you to think a
00:32:27 --> 00:32:28 little bit more open and
00:32:28 --> 00:32:29 broad and I love that.
00:32:30 --> 00:32:31 I've also seen on Twitter,
00:32:32 --> 00:32:33 sorry, X,
00:32:33 --> 00:32:35 is that you see somebody will
00:32:35 --> 00:32:36 say that and then they'll
00:32:36 --> 00:32:37 do a thread with another
00:32:37 --> 00:32:38 link or resource and
00:32:38 --> 00:32:40 another link or resource.
00:32:40 --> 00:32:41 I love seeing that because
00:32:42 --> 00:32:45 it keeps you engaged as a follower or reader.
00:32:45 --> 00:32:46 I'm hearing what they're
00:32:46 --> 00:32:47 saying, and then they
00:32:47 --> 00:32:48 provide me another link or
00:32:49 --> 00:32:52 resource to go to, because X is limited
00:32:52 --> 00:32:53 to so many characters.
00:32:53 --> 00:32:54 You can't put all your
00:32:54 --> 00:32:55 information in there.
00:32:55 --> 00:32:58 It certainly is a good tactic,
00:32:58 --> 00:32:59 I would say,
00:32:59 --> 00:33:01 to try to make sure that
00:33:01 --> 00:33:04 you're being as transparent as possible.
00:33:05 --> 00:33:06 I found that to be really
00:33:06 --> 00:33:08 helpful as a user.
00:33:08 --> 00:33:10 Um, I'm not
00:33:10 --> 00:33:12 predominantly on X, but,
00:33:12 --> 00:33:13 you know, certainly I like
00:33:13 --> 00:33:15 to follow a few people, so I
00:33:15 --> 00:33:17 hear you there. That's great.
00:33:17 --> 00:33:18 Well, is there any other
00:33:18 --> 00:33:20 advice you have as we wrap
00:33:20 --> 00:33:21 up the show? If not, I'll go
00:33:21 --> 00:33:23 ahead and kick us into some closing.
00:33:24 --> 00:33:25 No, I mean, again,
00:33:25 --> 00:33:27 I just thank you for having me on.
00:33:27 --> 00:33:29 We are actually doing a talk.
00:33:30 --> 00:33:31 You, me,
00:33:31 --> 00:33:33 and Chad Cook are doing a talk on
00:33:33 --> 00:33:35 this topic at AM.
00:33:35 --> 00:33:36 So anyone at the conference,
00:33:36 --> 00:33:36 please come see us.
00:33:36 --> 00:33:37 We did a really interesting
00:33:37 --> 00:33:39 narrative review on social
00:33:39 --> 00:33:41 media content related to
00:33:41 --> 00:33:42 musculoskeletal rehab.
00:33:42 --> 00:33:44 And so it's currently in peer review.
00:33:44 --> 00:33:45 We're really excited for the
00:33:45 --> 00:33:45 results of it.
00:33:46 --> 00:33:46 We're looking forward to
00:33:46 --> 00:33:47 discussing some of those
00:33:47 --> 00:33:48 with you guys at AM.
00:33:49 --> 00:33:51 Otherwise, thank you for having me.
00:33:51 --> 00:33:53 Oh, Damian, always a pleasure.
00:33:53 --> 00:33:54 And sincerely,
00:33:54 --> 00:33:56 we can't wait to have you at conference.
00:33:56 --> 00:33:57 I can't wait to see all of
00:33:57 --> 00:33:59 our listeners who are able to go.
00:33:59 --> 00:34:00 And, you know,
00:34:00 --> 00:34:01 this pretty much wraps up
00:34:01 --> 00:34:03 our insightful
00:34:03 --> 00:34:04 conversation on social media.
00:34:04 --> 00:34:06 And I have to say thank you, Damian,
00:34:06 --> 00:34:08 for sharing your expertise
00:34:08 --> 00:34:08 and wisdom
00:34:09 --> 00:34:11 on how we navigate social media,
00:34:11 --> 00:34:13 especially as OMPT practitioners,
00:34:13 --> 00:34:14 thinking about the variety
00:34:14 --> 00:34:16 of audiences that we need to reach.
00:34:16 --> 00:34:17 And although we know it
00:34:17 --> 00:34:20 offers tremendous opportunity for engagement,
00:34:20 --> 00:34:21 growth, and communication,
00:34:22 --> 00:34:23 it's so important to
00:34:23 --> 00:34:26 approach it with care, transparency,
00:34:26 --> 00:34:27 and that solid ethical
00:34:27 --> 00:34:28 foundation like you were mentioning.
00:34:29 --> 00:34:31 So for those listeners,
00:34:31 --> 00:34:32 I hope this episode has
00:34:32 --> 00:34:33 given you some food for thought
00:34:33 --> 00:34:34 on how to use social media
00:34:34 --> 00:34:36 effectively and responsibly
00:34:36 --> 00:34:38 in your professional journey,
00:34:38 --> 00:34:39 especially maybe as newer,
00:34:39 --> 00:34:40 younger clinicians.
00:34:41 --> 00:34:42 Don't forget to follow AAOMPT
00:34:42 --> 00:34:44 on our social media channels.
00:34:44 --> 00:34:44 Of course,
00:34:44 --> 00:34:46 I have to say that because, obviously,
00:34:46 --> 00:34:48 you're listening to a podcast.
00:34:48 --> 00:34:50 So we hope that you'll find
00:34:50 --> 00:34:52 that we're bringing fresh ideas,
00:34:52 --> 00:34:52 that we're going to
00:34:52 --> 00:34:53 challenge our assumptions
00:34:54 --> 00:34:55 and always try to move AAOMPT
00:34:55 --> 00:34:56 forward in the right
00:34:56 --> 00:34:57 direction to do the right
00:34:57 --> 00:34:59 thing for our patients.
00:34:59 --> 00:35:01 So thank you, Damian, for your time.
00:35:01 --> 00:35:03 And to our audience,
00:35:03 --> 00:35:04 we'll see you on the next Hands On,
00:35:05 --> 00:35:05 Hands Off.

