Have you ever felt overwhelmed by the constant waves of innovation in our field?

How do we, as statisticians, manage the hype surrounding new trends like AI and machine learning?

Today, I’m thrilled to have Kaspar back on the show! We explore these recurring waves of innovation that promise to revolutionize our industry—from data mining and big data to real-world evidence and, most recently, AI and machine learning. These trends often come with high expectations, pressuring us to validate our value amid the hype.

We discuss the pressures we face within organizations, the importance of rigorous and honest assessments, and strategies for building trust and reputation.

Join us as we navigate the complexities of managing hype and maintaining our vital role in drug development.

Key Points of the Episode:
  • Innovation Waves: Data mining, big data, real-world evidence, AI, machine learning.
  • Hype Management: Handling initial excitement, realistic expectations.
  • Pressure on Statisticians: Validation of value, skepticism from stakeholders.
  • Rigor and Honesty: Importance of rigorous assessments, clear assumptions.
  • Building Trust: Establishing credibility within organizations.
  • External Pressure: Influence of consultants and external vendors.
  • Practical Examples: Futility analysis, real-world data usage.
  • Proactive Approach: Staying ahead of trends, internal and external communication.
  • Reputation and Networking: Internal trust, external credibility.
  • Implementation Challenges: Balancing innovation with practical application.
  • Educational Initiatives: Internal seminars, panel discussions, expert invitations.
  • Simple Solutions: Focusing on fundamental, proven methods.

If you enjoyed this episode, share it with your friends and colleagues. Your support helps us reach more statisticians and data scientists, keeping us all informed and proactive in our work. Don’t forget to subscribe for more episodes featuring expert advice and industry insights.

Transform Your Career at The Effective Statistician Conference 2024!

  • Exceptional Speakers: Insights from leaders in statistics.
  • Networking: Connect with peers and experts.
  • Interactive Workshops: Hands-on learning experiences with Q&A.
  • Free Access: Selected presentations and networking.
  • All Access Pass: Comprehensive experience with recordings and workshops.
Register now!

Never miss an episode!

Join thousands of your peers and subscribe to get our latest updates by email!

Get the shownotes of our podcast episodes plus tips and tricks to increase your impact at work to boost your career!

We won’t send you spam. Unsubscribe at any time.

Learn on demand

Click on the button to see our Teachable Inc. courses.


Kaspar Rufibach

Expert Biostatistician at Roche

Kaspar is an Expert Statistical Scientist in Roche’s Methods, Collaboration, and Outreach group and is located in Basel.

He does methodological research, provides consulting to Roche statisticians and broader project teams, gives biostatistics training for statisticians and non-statisticians internally and externally, mentors students, and interacts with external partners in industry, regulatory agencies, and the academic community in various working groups and collaborations.

He co-founded and co-leads the European special interest group “Estimands in oncology” (sponsored by PSI and EFSPI; the group also has status as an ASA scientific working group within the ASA biopharmaceutical section), which currently has 39 members representing 23 companies, 3 continents, and several health authorities. The group works on various topics around estimands in oncology.

Kaspar’s research interests are methods to optimize study designs, advanced survival analysis, probability of success, estimands and causal inference, estimation of treatment effects in subgroups, and general nonparametric statistics. Before joining Roche, Kaspar received training and worked as a statistician at the Universities of Bern, Stanford, and Zurich.

More on the oncology estimand WG: http://www.oncoestimand.org
More on Kaspar: http://www.kasparrufibach.ch

Transcript

Managing Hype in Statistics

[00:00:00] Alexander: Welcome to another episode of The Effective Statistician, and today I have again Kaspar on the show. Hi Kaspar, how are you doing? 

[00:00:10] Kaspar: Hi Alexander, I’m fine, thanks for having me today. 

[00:00:13] Alexander: Yeah, and today we want to talk about something that is [00:00:20] worrying statisticians again and again. Since I’ve been in the industry, there have been these waves of innovation coming.

[00:00:34] And with each of these waves, there’s always this kind of hype at the beginning [00:00:40] that it will resolve everything. There was data mining, there was big data, there was real world evidence, and all kinds of different things. And nowadays it’s AI and machine learning, and, well, we can do everything with it.

[00:00:58] At one time, [00:01:00] I heard, oh yeah, now with Bayesian approaches, of course we can look into every patient and we don’t need multiplicity control anymore, and things like that. So is that the same perception that you have?

[00:01:15] Kaspar: Yeah, thanks for mentioning all these examples, and I share your perception that [00:01:20] it seems to come in waves. There are new innovations, which to a certain extent are innovations, but the potential impact they have is typically highly exaggerated at the beginning. Then statisticians get under pressure and [00:01:40] have to prove their value, or that maybe these innovations are not as innovative as decision makers initially think.

[00:01:51] And that may cause a lot of turmoil in organizations. That’s a little bit my impression.

[00:01:58] Alexander: How does that feel [00:02:00] in organizations? So let’s say with the new AI wave that is currently happening, where do you see that pressure or that hype within the organization coming from? Who jumps on this kind of hype, for example?

[00:02:18] Kaspar: I think it’s [00:02:20] twofold. One is that with the democratization of a lot of these tools, a lot of people can, as I call it, do stuff. I mean, we have seen this with predictive modeling, with all this open-source software. I mean, R has a stake in this. People can just [00:02:40] do stuff and get some results very quickly.

[00:02:44] But it’s not like, just because you can do stuff, the science behind it and the input you need to give, the knowledge you need to put in, the experience you need to put in just go away. That still needs to be [00:03:00] there. And that’s what statisticians provide. So this is one area: a lot of people can, as I call it, do stuff.

[00:03:10] And we also have seen this with real world data. People were just starting to generate control arms. And now we start to [00:03:20] understand how much effort that actually takes: you need to formulate the target trial, you need to find the right data, and you need to emulate the target trial.

[00:03:28] And then you will surface all these potential biases there are. It’s much, much more than just pulling a cohort of patients out of a database and then computing overall survival for [00:03:40] these patients, and a lot of people initially didn’t understand that tension. And unfortunately, it’s then often the statisticians who show up and say, well, be careful, wait a minute, and who are then perceived as kind of the roadblocks [00:04:00] and naysayers.

[00:04:01] Alexander: Yeah. 

[00:04:01] Kaspar: Yeah. The naysayers. And maybe to some extent we are, but I think as statisticians, we are trained to be rigorous, to be clear on our assumptions: what conclusions can we draw based on these assumptions and using this type of data? [00:04:20] And then we just try to put things in perspective, and maybe that perspective is then less exciting than what people think when they just, as I call it, do stuff.

[00:04:28] So that’s one aspect: everybody can run analyses, but running analyses is not science yet. And I think the other direction [00:04:40] this comes from is senior decision makers who, I don’t know from where, out of the press or through consultants, get these ideas in their head. Let’s take the most prominent current example: that with AI, you can gain a lot. And maybe you can.

[00:04:59] [00:05:00] I mean, all these examples you mentioned, data mining, big data, real world data, AI, Bayesian statistics, they all have a place in drug development. And it’s our task as statisticians and drug developers to find and define that place. And in my experience, it turns out that, [00:05:20] ultimately, the place these innovations take in drug development is much, much smaller than what we may initially believe.

[00:05:27] And this road from the initial expectation of what this can achieve to finally defining the place, that’s often a very [00:05:40] bumpy road and very challenging. I mean, I keep coming back to real world data. When this came up, of course, that had a place in drug development.

[00:05:49] It always had; we always had epidemiology. But then there was the perception, I think a lot of people thought, that with real world data we don’t need randomized trials anymore. We can [00:06:00] basically cut the cost of drug development by half because we just need to collect data on the experimental arm. And of course that perception could not be more wrong.

[00:06:10] And now we have, I think, defined more and more the place where real world data is useful in drug development, but it [00:06:20] didn’t replace many randomized trials. They still have to be run. And I think this phenomenon applies to a lot of these innovations.

[00:06:28] Alexander: Yeah, I think the external pressure is definitely a big one.

[00:06:33] Yeah, because one source, of course, is consultants. [00:06:40] They come with strategy consulting and whatnot, and they speak about, oh, you can leverage so much of the value from these kinds of new tools. And they get paid based on the [00:07:00] perceptions that they create. If they create a big perception of value, then they also get paid a lot.

[00:07:09] Yeah. But they never need to deliver on anything. They just say, well, book us for a strategy consulting around [00:07:20] AI, and that will help you decrease costs in R&D by X percent or so many millions. And then of course, people think, well, compared to that, what is the cost of the strategy consulting?

[00:07:36] That’s peanuts. So of course I will purchase it. [00:07:40] Yeah. The second thing is, I think, the external vendors. They, of course, can easily say, oh, purchase your AI, real world evidence, whatsoever solution from us, and that solves your problem. And there, of course, people [00:08:00] don’t directly need to go through statistics.

[00:08:03] Yeah, people could directly outsource it. Whoever wants to do that just needs to have maybe their own budget or their own function. If they’re high enough in the organization, they [00:08:20] can more or less go to any vendor and say, hey, I want to purchase this kind of AI, real world evidence, whatsoever solution from you.

[00:08:28] Because, well, that sounds super promising, and I don’t like these naysayers from the stats side, or I don’t even ask them, or I don’t even know [00:08:40] them. And so they purchase those things.

[00:08:43] Kaspar: I mean, we’re statisticians. Maybe we should look at the evidence. And one piece of evidence stands out for me.

[00:08:49] And I’m just guessing, but I think we can find real evidence for that: costs for clinical trial development are just increasing over time. So whoever [00:09:00] is excited about all these innovations, think back 10 or 15 years: how many of these innovations came your way with the promise to reduce the costs of clinical trials?

[00:09:10] So why is the cost of clinical trials at the same time still always increasing? And then, as a follow-up question to that, ask yourself: how many [00:09:20] of these innovations have really left a big stamp on drug development? How much are we doing drug development differently compared to 10 or 15 years ago?

[00:09:32] Really differently? Which aspects are we really doing differently? I don’t think you find so many. And then the other [00:09:40] aspect, something that actually frustrates me with all these innovations: I have the impression we lose sight of the simple things that maybe we should do right first. As a comparison, I think today in [00:10:00] an average trial, the time between clinical cutoff date and snapshot date is rather longer than when I started 12 years ago.

[00:10:08] Yeah. So how does that go together? And if we now again believe that AI will help us cut these timelines, it will [00:10:20] not, in my opinion. It may add a little bit of efficiency gain here and there, but ultimately these are human activities, and it’s very hard to automate them or to replace them with AI.

[00:10:35] In some sense. But yeah, maybe I’m again just a naysayer here, not [00:10:40] able to think outside of the box and not buying into these innovations enough. I don’t know. I mean, time will tell.

[00:10:47] Alexander: I think what you said about let’s do the simple things first is a great way to showcase how statisticians can have an influence.

[00:10:58] And I love your [00:11:00] example of the futility analysis that you talked about at the last conference of The Effective Statistician. It’s a very, very simple tool. Okay, there are lots of misunderstandings about it, but once you get away from all of these misunderstandings, [00:11:20] you can actually have a pretty significant impact on your overall portfolio with not a lot of work.

[00:11:29] Yeah. And you don’t need any AI for that. Other things, yeah: Bayesian statistics. How many early [00:11:40] phase oncology studies still have 3+3 designs? When you think about it, there is definitely a lot you can do in these spaces. The problem with innovation, it is very often said, is that there’s a lot of research but [00:12:00] not enough

[00:12:01] implementation with it. So before you jump on something new, I think you can gain a lot more from things that have proven to be effective, and from implementing them consistently. [00:12:20] I think the return on investment is much more predictable and much higher.

[00:12:27] Kaspar: No, I couldn’t agree more. And the example of the futility analysis you bring up is really the case in point, because, and I shared this at The Effective Statistician conference, if you add [00:12:40] a futility analysis to your trial,

[00:12:44] then, if the null hypothesis is true, if your drug has no effect, you can expect an average sample size reduction of something between 15 and 30%, depending on your setup. So we are not just [00:13:00] talking about small things here. And this is a very well understood tool. Statisticians know how to implement it.

[00:13:06] It just suffers from some skepticism and wrong perceptions on the part of our stakeholders, and that’s why it’s not implemented as much as it could be. And this is very simple. [00:13:20] It’s very transparent. It’s no black box. You don’t need an external vendor to implement it. You can just do it.
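The magnitude of the saving Kaspar describes can be checked with a short simulation. The sketch below is a minimal, hypothetical setup rather than any specific trial design: a single futility look at 50% information that stops the trial whenever the interim z-statistic falls below zero. Under the null hypothesis this halts about half of all trials at the halfway point, so the expected sample size drops by roughly 25%, squarely within the 15 to 30% range mentioned above.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

n_max = 200          # planned maximum sample size of the trial
interim_frac = 0.5   # futility look after 50% of the information
futility_bound = 0.0 # stop for futility if the interim z-statistic < 0
n_sim = 100_000      # number of simulated trials

# Under the null hypothesis the interim z-statistic is standard normal
z_interim = rng.standard_normal(n_sim)
stopped = z_interim < futility_bound

# Sample size actually used in each simulated trial
n_used = np.where(stopped, interim_frac * n_max, n_max)

saving = 1 - n_used.mean() / n_max
print(f"P(stop for futility under H0): {stopped.mean():.3f}")
print(f"expected sample-size reduction under H0: {saving:.1%}")
```

With a stricter futility boundary (stopping only when the interim z-statistic is well below zero) or a different look time, the stopping probability and hence the saving shrink or grow, which is how real designs land anywhere in that 15 to 30% band.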

[00:13:26] And still we’re not doing it sufficiently, and instead we’re trying to use AI to cut the writing of a clinical study report. I mean, I don’t know, writing a clinical study report takes maybe two [00:13:40] months. Thereof, generation of the first draft is maybe one week by a medical writer, and then it’s seven weeks of discussion in the team.

[00:13:47] How do we phrase it? So, okay, AI can maybe help with the first draft, so you can cut this first week to two days. But the seven weeks after, where the team discusses the whole thing, [00:14:00] will not be shortened by AI. And yeah, I sometimes struggle to really see the value. And then you mentioned this other piece, which is a pet peeve of mine, this implementation piece.

[00:14:11] When you say, okay, sometimes it lacks implementation: that’s where the statisticians need to do their homework as well. I mean, we should not just complain that [00:14:20] futility analyses are not implemented more broadly and consistently in, say, all phase three trials or whatever. We also need to do our homework, work on our stakeholders, work on our messaging.

[00:14:31] I keep saying: don’t show up with 30 slides in font size 11 with all the caveats and assumptions. Go there with three slides, [00:14:40] have your punchlines, and clarify the assumptions in the background. You need to be confident that they’re met, and then maybe you can move the needle there.

[00:14:49] Alexander: Yeah. When I talk about influencing others, I say: don’t just rely on the facts. Speak about [00:15:00] what’s in it for them, drive emotions. That is so important. And last but not least, one really critical thing: people need to trust in you and your organization. When all this data science hype came [00:15:20] up a couple of years ago, I saw Steven Rupert talking about this, and he was saying that by the time a hype like this comes, you need to already have a good reputation within your company.

[00:15:39] [00:15:40] If you’re not regarded as a trustworthy partner, people will not believe you when you say that these external things are overhyped. So you need to build your trust and your reputation before the hype comes. And you [00:16:00] know, the question is not whether there will be another hype.

[00:16:03] The question is when there will be another hype. You need to build trust in your organization and in your individual people before the hype comes. And that, by the way, will also help you [00:16:20] implement all these, let’s say, lower-hanging fruits, like a futility analysis, much more consistently. So think about the branding of your biostats department, of your data science department, whatever you call it, wherever you’re [00:16:40] working.

[00:16:41] So reputation is absolutely critical. 

[00:16:44] Kaspar: It’s a very good point you bring up. And it reminds me that in June, there is the EFSPI stats leaders meeting, which happens every year. This year it happens in Switzerland, and the EFSPI statistical methodology leaders have been invited to run a session there, [00:17:00] where we discuss innovation and the commercialization of innovations in organizations. Before the event, we sent out a questionnaire.

[00:17:08] And one of the questions was: if some fancy startup approaches your chief medical officer with some new quantitative method, promising to cut, I don’t know, costs [00:17:20] by half or whatever these promises typically are, who does your CMO approach? Is it the data science head in your organization?

[00:17:27] Or does he or she just believe that startup and push this into the organization? And I think that relates exactly to the point you’re making. Is there trust in the internal data [00:17:40] science and statistical capabilities, that they will provide an honest and rigorous assessment of the method and that they’re capable of doing that?

[00:17:49] Or is there no such trust? Then, as a data science organization, you’re left out, and that’s not where we want to be as statisticians and as a data science organization. So you have to [00:18:00] build this trust in, say, quiet times, when no such things are coming your way, so that you have it once these things do come your way and you can make this rigorous and honest assessment for the organization.

[00:18:14] Alexander: Yeah, completely agree. So, a couple of things: what do you think you can do to improve that reputation beforehand, so you don’t get into this challenge?

[00:18:28] Kaspar: I mean, act consistently as a rigorous voice, which is not just saying no to everything because it’s [00:18:40] maybe complicated or because you need to look into new things. Always try to make an honest assessment and provide your expertise in that way, and then build that reputation over time. But I think we should not underestimate that it takes time. And, I mean, that’s what you said:

[00:18:59] You need to [00:19:00] take the other side seriously. They are also under pressure; they need to justify why they’re not doing things, why they are doing things, and you can help guide that decision. Yeah, these are all kind of lofty comments. But it [00:19:20] boils down to the question: how can you build that reputation?

[00:19:23] And I think it’s by consistently being honest, by consistently being rigorous. Also, and that’s maybe a bit unfortunate, sometimes I feel you’re listened [00:19:40] to more in your own organization when you come from outside, when people see you comment on a panel with FDA folks, for example. Try to get in there.

[00:19:51] So that’s this outreach piece, this collaboration piece as well. If you work together across companies, and then six or seven companies [00:20:00] write a white paper, maybe that carries more weight than when you internally try to push for something or build that reputation. So that all goes together. And I think there is no standard recipe.

[00:20:13] I think it takes time and effort to build that [00:20:20] credibility. And even if you have that credibility, depending on who’s on the other side and how much they trust statisticians and data science, it’s still no guarantee that you are the go-to person for them. But I don’t think there’s much more you can do.

[00:20:34] And Steve, of course, is a template example for that and [00:20:40] for his influence. But there are other such figures, and it’s not like they show up one day and then the next week they have this role. I think that builds over years and years through multiple activities: being involved in the scientific discourse, publications, attending [00:21:00] conferences, speaking at conferences, speaking in panel discussions, doing webinars, recording The Effective Statistician podcasts, you name it.

[00:21:08] All these activities help to build a certain reputation, first in your field and then hopefully, at some point, beyond your field. But I [00:21:20] don’t think there is a secret recipe or secret sauce here. It’s just hard work.

[00:21:25] Alexander: Well, there are a couple of things that you just need to know. So it is a lot about building trust, networking, building your reputation, and definitely having an [00:21:40] external network and an external reputation that you can

[00:21:43] pull from; that will hugely increase your internal trust. Another thing is to also spearhead these innovations internally. So [00:22:00] never think, oh, that’s not cool, or we’re skeptical about it. Leverage things internally, educate internally. You need to be on top of these things.

[00:22:14] Yeah, so that you have already [00:22:20] communicated first within the organization. That’s the big advantage of being the first mover here in terms of communication: if you have already set the frame, if you already have the story, then people will be much less [00:22:40] susceptible to external stories and external hype.

[00:22:43] However, if you just always stick to “we have always done it this way”, then of course you set yourself up for being made redundant, for being put into the “statisticians are just those who [00:23:00] do the tables for the clinical trials” niche. So I think being innovative, always on the lookout for how you can improve things, how you can leverage these waves rather than being hit by them, is very, very important.

[00:23:19] Kaspar: I agree with [00:23:20] that. This being proactive instead of reactive is very, very important. And I think what one should not underestimate is that an organization needs to provide, at least to a certain extent, resources for that, or incentivize and encourage people to stay abreast, even if it’s just [00:23:40] following the statistical literature to see what may be coming, so that you can make an assessment.

[00:23:47] One of my favorite examples is estimands. For a long time, there were lots of discussions happening, and there was no requirement from health authorities yet. But if you started [00:24:00] early on to bring your organization in this direction and to generate at least understanding, if not excitement, for the idea that this concept will actually help us do drug development, you were prepared.

[00:24:15] Then you were ready when, eventually, [00:24:20] regulators started to request it. And there are other such examples. And I think organizations also need to be robust enough, because sometimes you invest time into topics that do not become important. I mean, the more senior people you have, the more they have a nose for what will maybe be important [00:24:40] moving forward.

[00:24:40] But sometimes you can also be wrong, and then you’ve invested time into something that turned out not to be so relevant. You never really know; you have no certainty. But if you want to stay ahead of the curve, you need to allow at least [00:25:00] some people some time to try to identify what will be important moving forward, so that you can be proactive when something comes from outside and you can very quickly develop an opinion on it, or already have that opinion and the expertise.[00:25:20]

[00:25:20] Alexander: Yep, completely agree. If all of your people are already 100 percent booked with all the standard stuff, then it’s pretty hard to be innovative and think ahead. You need to invest time so that you can get time back later on. [00:25:40] And this time investment is really worth it. And when you do these things, I think one important thing is that you have some kind of forum, platform, or communication channel within your organization via which you can shape the organization, via which [00:26:00] you can show how innovative you are, how you bring in external experts, how you collaborate with others.

[00:26:10] That is really, really fundamental to have, because this kind of channel is how you build your reputation. [00:26:20] Of course, all the one-to-ones your statisticians and data scientists have are really important, and I would never underestimate those. But you also need to have some kind of internal communication channels so that you can do all these kinds of things.

[00:26:39] Kaspar: Yes. [00:26:40] And I can mention Roche as an example: we have internal seminars, of course, where people can present, and we also try to generate an atmosphere where you don’t need conference-style presentations in which everything is perfect. You can just come and maybe also say, well, I got until here and then I got stuck.

[00:26:57] And what do people think? [00:27:00] We’re not always successful in this, but the idea is also to have these forums where you can just have a scientific discussion about things and seek input from your colleagues so that you can learn together. I think that is a very important piece. It generates the confidence in people that they can also learn new things [00:27:20] and that they’re not overwhelmed by something coming from outside; they understand, well, it’s maybe a little bit outside of what I know, but maybe it’s not so far outside.

[00:27:29] And maybe a few of us together can figure out what this thing is really doing. And then you can stay on top of the [00:27:40] developments rather efficiently. 

[00:27:41] Alexander: Yes, you definitely need these internal statistics and data science things, and you also need the external-facing seminars and trainings and things like that.

[00:27:54] Presentations where you as a biostatistics [00:28:00] organization present your expertise and your collaborations to others. So when you see a new hype around X coming in, invite someone who is an expert in X and have a panel discussion [00:28:20] internally, or something like this. That will give you a lot of visibility and a lot of credibility, and it will prevent, or at least minimize the risk, that people will do things without you, yeah?

[00:28:37] Things that you then need [00:28:40] to correct, resetting expectations, which usually means lowering expectations, and nobody wants to hear that. So be proactive in these kinds of things. It is absolutely fundamental.

[00:28:58] Thanks so much, Kaspar. [00:29:00] That was an awesome discussion about managing hypes. We mentioned a couple of these at the beginning. I don’t know which one will be next; I just know that maybe already next year or in two years, we’ll have the next big thing coming around. So [00:29:20] really invest in your reputation internally, in your trust, in your network, in your visibility. All these things

[00:29:28] will help you a lot. And as you said, also implement the things that already exist; that will also gain you a lot of reputation. [00:29:40] Any last thoughts on this topic?

[00:29:44] Kaspar: No. Thanks for having me and giving me the opportunity to share my thoughts. And yeah, just wrapping up:

[00:29:52] I think two points are important for me. Stay abreast of what’s going on and build that reputation, as you mentioned. And don’t [00:30:00] forget to do the simple things right. Sometimes we tend to miss the simple things just because we are too busy trying to do the fancy things. But the simple things, like futility analyses, can carry a lot of value, so also focus on those.

[00:30:16] Alexander: Thanks so much.

Join The Effective Statistician LinkedIn group

I want to help the community of statisticians, data scientists, programmers and other quantitative scientists to be more influential, innovative, and effective. I believe that as a community we can help our research, our regulatory and payer systems, and ultimately physicians and patients take better decisions based on better evidence.

I work to achieve a future in which everyone can access the right evidence in the right format at the right time to make sound decisions.

When my kids are sick, I want to have good evidence to discuss with the physician about the different therapy choices.

When my mother is sick, I want her to have access to the evidence and be able to understand it.

When I get sick, I want to find evidence that I can trust and that helps me to have meaningful discussions with my healthcare professionals.

I want to live in a world, where the media reports correctly about medical evidence and in which society distinguishes between fake evidence and real evidence.

Let’s work together to achieve this.