
eCOA Design Like Nike: How a Great Interface Can Improve the Clinical Trial Experience

February 26, 2016

Nike is one of the most valuable brands in the world, a status attributed to its deep knowledge of its customers and to the design of products that fit their needs: aesthetically, functionally, and emotionally.

Over in the life sciences industry, many sponsors are trying hard to bring their equivalent of customers, the patients, to center stage, and yet they still struggle to add value to the patient's clinical trial experience. As a result they continue to face the same challenges: Why do patients drop out? Why is it hard to recruit them? Why is compliance low? Why are some kinds of data so difficult to gather?

Electronic Clinical Outcome Assessments (eCOA) offer a dramatically improved experience for patients compared with other ways of gathering data. What lessons can be learned from the wider product design industry about making the user experience really meaningful?

Join Paul Margerison, Senior User Experience Designer at CRF Health, for this eye-opening exploration of the nuts and bolts of user experience design as it can be applied to clinical trials.

Full Transcript

[00:00]

MODERATOR

Good morning and good afternoon to everyone that’s joining us on the webinar today. We appreciate your time. This is of course CRF Health. And today’s webinar is on a very interesting topic, eCOA Design Like Nike, however you pronounce it, and what we’ll be talking about is, I think, the often overlooked way a great interface can improve the clinical trial experience.

And joining us today as your speaker is Paul Margerison, who is one of our user experience designers at CRF Health, and he’s based in London. Paul’s been with us for quite a while, and he plays the role of users’ champion. He looks to remind us that the diaries do end up in the hands of ordinary people who strongly desire them to be friendly and easy to use. And Paul brings quite a bit of experience, having worked on the design of products across many sectors, from news to telecoms and even medicine. Before joining CRF Health, he was the Head of Digital User Experience for the education and arts organization the British Council.

So without further ado I’ll hand it over to Paul, and Paul will take you through the webinar. If you do have questions, feel free to type them into the chat box. We do have some time at the end. And we will also be recording the webinar, so you will receive those slides with the recording sometime tomorrow. Thank you, Paul, take it away.

 

 PAUL MARGERISON

Thank you Naor. Good morning, good afternoon. Thanks for joining. Before I start I just want to give you an advance apology in case I start coughing. I do have a little cough, but I’ll try to stop that from distracting us from the content of the webinar, a webinar which is going to look at one of those ideas that is buzzing around in the clinical trials world but never gets defined in a very satisfactory way. And that is the idea of patient centricity. I’m a user experience designer, as Naor said. And if anybody’s wondering what a user experience designer does in the context of clinical trials, I’m going to spend about 45 minutes talking about that and trying to argue that it’s a profession right at the heart of the idea of patient centricity.

User experience design is, for many people, more associated with high-profile consumer goods like Nike, as Naor mentioned, and with the other iconic consumer products that you can see here on this slide. And there is no question that, as a discipline, it has blossomed in the commercial world rather than in academia or in medicine particularly. If you take Nike in the middle of this tableau here, a manufacturer of sports shoes, they’ve managed to develop a product which is, or at least a product which we believe to be, a blend of all those elusive properties I’ve listed here (comfort, lifestyle improvement, physical wellness, and so on) in the shape of a formerly humble running shoe. And the point that I’m trying to make here is that, when designing the products at Nike, the designers didn’t only think about how the shoe functions; they also thought about many other layers of its connection to the consumer and wrapped it all up into their design thinking. And my question is, why don’t we apply that level of multi-layered, consumer-centric thinking to designing products in a medical setting?

For example eCOA, electronic clinical outcome assessments. They’re one of the products made by the company that’s hosting the webinar, CRF Health. And humble though they are, I still believe there is room to apply a multi-layered approach to designing questionnaire interfaces. I chose those keywords to describe Nike quite carefully, because they could all be applied also to a well-designed eCOA. The qualities of comfort and lifestyle improvement and physical wellness, science, advanced manufacturing, and even aesthetic pleasure. And with the exception of science, these words are not often mentioned in the context of electronic diaries, I think you’ll agree. Listing those implies that there are layers in the design thinking that include usability certainly, but go way beyond it and take us into the territory of designing for patient centricity. And I’m going to describe some of those layers today as I see them.

[04:47]

So I started by saying that user experience design was at the heart of patient centricity. If I want to understand the different ways in which an eCOA solution might connect with users, then it would be helpful to have a formal definition of user experience design. And here’s one on the screen now. It’s a definition from the International Organization for Standardization, the ISO, who say that, “User experience design involves all the user’s emotions, beliefs, preferences, perceptions, physical and psychological responses, behaviours and accomplishments that occur before, during, and after the use,” which is a very, very wide field indeed. But that definition doesn’t actually mention anything to do with being user-friendly, because user friendliness comes generally as a by-product of getting all of the other things right. And next I’ll show you how user experience design was described by another authority, a certain gentleman called Peter Morville, who is one of the rock stars of the user interface design world. Twelve years ago, he came up with this diagram that illustrates many of the aspects that come together to make a great user experience in a designed product. It’s a honeycomb arrangement of all the salient qualities. I’ll read them out: the properties of being useful, desirable, accessible, credible, findable, and of course usable. And if a design has all of those then it becomes a valuable product, valuable for the gathering of data, and valuable to the patient, who is also a living person with things to do. So this diagram is just a list really; it doesn’t matter in which order we read the things. But it’s very useful, and I’ve decided to use it as the basis for describing how user experience design can improve the patient experience, and indeed the medical practitioner’s experience as well. I’ll look at each one of these formal properties and apply it to the discipline of designing electronic diaries for clinical trials, starting with Morville’s word, useful.

Useful. Here we go. It’s a strange word to apply to eCOA actually. A clinical trial is what it is; it’s not exactly useful to a patient, although at the same time it could probably be the most useful thing in the world to them. I’ve illustrated the idea of useful here with a picture of an old mobile phone, which we don’t actually use in any clinical trials (we never did, not this model anyway). The first commercial mobile phone was launched in 1983, and it was a novelty at first. But it didn’t take long at all before it moved beyond novelty into something which is useful. That’s to say, new forms of behavior evolved to make use of new capabilities. And then after that, new forms of behavior became normal forms of behavior, fitting in with the average person’s life patterns. And that’s actually quite a good definition of useful, a word which can be taken to mean “fulfills a need in a way that suits our personal circumstances.” So skipping forward to today, normal patterns of behavior have evolved so much that it’s actually hard to think of anything you can’t do using a mobile device of some kind. And my point here is that much of the world’s population now expects the convenience and the immediacy of doing things on a mobile device. Whatever they need to do, they expect to be able to do it on a mobile device.

And so back to the question of gathering patient data. Of all the studies that use clinical outcome assessments (COA), still fewer than 30% gather data on electronic devices. At least, that figure comes from an analysis of 2014 Phase 2-4 studies. Fewer than 30%. So the scientific world is still wrestling with questions about how accurate and how reliable the data gathered through devices like smartphones and tablets are. But at the same time we have a population that’s coming to expect to be able to do mundane things on a mobile device. And those expectations are something that we have to keep a firm eye on.

Because with any commercial product, designers always begin by looking at the people who are going to be using the thing that’s being designed. They get to know them intimately, and they get to know what those people expect. And those people are our patients, obviously. For that matter, they’re also our clinical site staff. So it’s important to study what those patients’ daily routines look like, what their sleeping patterns are, and how the condition they have affects their ability to participate. What are their feelings about other people knowing that they’re involved in a trial? What other electronic devices do they have? All of this kind of thing allows us to design accurately for different types of population, and to design something which turns out for them to be useful. It might sound like an obvious point, maybe it is, but still we often hear stories about clinical trials in which the study design seems to inconvenience the patients in their daily lives in an unnecessary way. That’s usefulness.

[10:40]

Back to Peter Morville’s honeycomb diagram there at the bottom right of the screen. He uses the word desirable. Desirable. And that really is a very strange word to apply to anything to do with clinical trials. I’ve put a quotation at the top of this slide. It’s from another very famous name in the world of user experience design, a man called Don Norman, who, incidentally (interesting fact), is purported to be the first person ever to have a job title with user experience in it, sort of the first user experience designer in the world. And that was at Apple, back in the early ‘90s. And he uses the word pleasant. Pleasant, to describe desirable objects. Pleasant things work better, meaning that if you have an object which appeals to the senses, appeals to the emotions, just by its formal appearance, you know, the way it looks and the way it feels and the way it interacts with you, then you’re likely to believe that it actually works better, which is both a radical idea and also kind of common sense as well. So I’ve illustrated that with this brand of very fashionable headphones that you might recognize. They’re really cool looking and they carry a lot of cachet. Very nice to wear. But I’m told by people in the know that the sound quality is nothing special. Yet the entire package is very pleasant, so it’s possible to perceive it as working better. Same sound quality, superior user experience. Same ePRO instrument, superior user experience. Compare these two interfaces here. If your ePRO setting has an artfully selected color palette, it’s more pleasant to look at. If it uses fonts that work well for the screen size and have a contemporary look, it’s more pleasant to read. If it lays out the elements of the questionnaire in a balanced spatial composition, it’s more pleasant to interact with. And none of those things affect the content at all. They don’t affect the content, but they affect the way that we experience the content. And they could even make the task of filling in a questionnaire, in an unexpected way, quite satisfying. Incidentally, the image on the right there is a shot of CRF Health’s new app interface. Pleasantness, then, can also be applied to the copywriting. Read what it says on these two screens, if you can. Obviously clinical questions need to be crystal clear in the way they’re expressed, that’s a given. But you also have guidance between the questions about how to complete the diary. And that can have a more human tone, a more encouraging tone, one that recognizes that the patient has accomplished something by filling in the diary. So if I can substitute the word pleasant for the word desirable in the honeycomb diagram, then it becomes a very effective quality in eCOA.

The next keyword there is accessible. On this slide you can see a picture of the Refreshabraille, manufactured by the American Printing House for the Blind. This is the best-selling mobile Braille display on the market today, and I think you can see how it works from the image. You have a mobile device attached to the Refreshabraille, which reads the screen and displays the words on the screen line by line as Braille characters. You can see there are little plastic pins that go up and down. Braille displays are a form of assistive equipment that helps people with disabilities to use screen-based devices, even touch-screen devices.

[14:58]

And Braille displays are not the only ones. There are many, many other types of assistive equipment out there. Here we’ve got some examples: screen readers, screen magnifiers of all sorts of different types, accessible keyboards, and something called a sip-and-puff device, which lets you control a cursor with only your mouth. And many other things, which are not publications for disabled people, not apps for disabled people, but devices that help disabled people to read and to navigate any content that anybody else can read and navigate. And they do tend to work very well, but they work better if the content designer has put some thought into accommodating the assistive equipment. And accommodating assistive equipment is not necessarily done in ways that other users would ever even notice. Accommodations tend to be little tweaks that go into the code, or perhaps a particular spatial relationship between the elements on the screen. Or perhaps they lie in structuring sentences in certain ways. Subtle changes that don’t fundamentally change the design but do make a big difference when you’re trying to navigate using assistive equipment. And those small acts of accommodation by designers have accumulated into a huge raft of best-practice knowledge, which has been documented by the World Wide Web Consortium in a set of accessibility guidelines.
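To give a flavour of what one of those little code tweaks might look like (a minimal sketch, not from the webinar; the function name, field names, and hint text are hypothetical), a screen-reader-friendly question ties its visible text and its guidance directly to the input control:

    // Sketch only, assuming a plain DOM rendering layer: associate a
    // question's text and hint with its input so assistive equipment
    // announces them together. All names here are illustrative.
    function renderNumericQuestion(
      container: HTMLElement,
      id: string,
      questionText: string,
      hintText: string
    ): void {
      const label = document.createElement("label");
      label.htmlFor = id;                 // ties the visible question text to the control
      label.textContent = questionText;

      const input = document.createElement("input");
      input.type = "number";
      input.id = id;
      input.setAttribute("aria-describedby", `${id}-hint`); // screen readers also read the hint

      const hint = document.createElement("p");
      hint.id = `${id}-hint`;
      hint.textContent = hintText;

      container.append(label, input, hint);
    }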

 

 

And here on the screen now you can see some fun facts relating to those guidelines. They’re very broad guidelines, but appended to them are effectively 66 narrower and very specific, context-specific checkpoints. There’s a grading system, so it’s possible to know how accessible your product is. And they’re so well known that people will often refer to them as statements of the law, when in fact they’re no such thing, though the law may well refer to them in countries where there is disability discrimination legislation. And they’ve become so well known, in fact, as guidelines that most commercial web offerings do now respect them. Most.

And here’s an example, taken at random: a large company website as it appears on a mobile phone. Absolutely nothing special about it, but it does present the information in ways which are flexible for people with different needs, as required by those World Wide Web Consortium guidelines. So you can see here, users are able to increase the font size. They can also decrease it, and strangely enough there are people who like that too.

Here’s another way that designers accommodate the needs of disabled users: the ability to choose one’s own color scheme, that’s to say the background color or the color of fonts. There are many different color combinations that can be more comfortable for people with different eye conditions, some of them looking really garish to people who don’t have that condition. So here, for example, you’ve got three very well known websites (eBay, the National Health Service of the UK, and the retailer Marks and Spencer) displayed as you’ve never seen them before. And I must stress, these are not colors that were designed by the site designers, but the site designers built in the flexibility that allows you to create your own color scheme. So, flexible font sizes and flexible color schemes, both mandated by the World Wide Web Consortium’s best practices. They seem quite obvious, but, simple things though they are, they are routinely ignored in most clinical trial questionnaires.
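As a rough illustration of how that flexibility can be built in (a sketch only, not taken from any of the sites shown; the property names and the example scheme are assumptions), font size and colour scheme can be driven by CSS custom properties that the user’s preferences overwrite:

    // Sketch only: user-selectable font size and colour scheme.
    type ColourScheme = { background: string; text: string };

    function applyDisplayPreferences(fontSizePx: number, scheme: ColourScheme): void {
      const root = document.documentElement.style;
      root.setProperty("--base-font-size", `${fontSizePx}px`);
      root.setProperty("--page-background", scheme.background);
      root.setProperty("--page-text", scheme.text);
    }

    // The stylesheet then refers to those properties, with sensible defaults:
    //   body { font-size: var(--base-font-size, 16px);
    //          background: var(--page-background, #ffffff);
    //          color: var(--page-text, #000000); }

    // Example: a high-contrast combination a user might choose.
    applyDisplayPreferences(22, { background: "#000000", text: "#ffff00" });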

This screen shows what could well be a typical eCOA screen from many studies. As it happens, it’s not a real one, it’s a mock-up; I didn’t want to criticize any designer or any sponsor or instrument owner. So this is a fake, but it’s a credible fake. And it’s quite neat. The color scheme is subtle. There’s a number picker, which is easy to use, and the design of it is quite elegant, as is the navigation. And even the font size is okay for most people to read. But it has design flaws that make comprehension difficult for people with disabilities, so it’s not very accessible. Here is the same thing, differently presented. So the font size in the version on the right is better.

[20:07]

Just as important as font size, there’s the technique of what we call chunking. A long, dense paragraph of text can be impenetrable to people with certain conditions, like concentration issues or reading difficulties. Breaking the paragraph up helps a lot, and that’s what we call chunking. Also, underlining to emphasize text, which you can see happening on the left, is a normal thing to do on paper, but it causes problems on a screen for visually impaired people. It makes the text harder to read, and it’s sometimes also mistaken for a hyperlink. These are the most obvious issues.

Here is an even more improved way of designing the same instrument. It’s more radical but very useful to people with partial sight. It would allow them to read the displayed text and at the same time hear an audio version of it, and thus use the native properties of the electronic medium to very good effect, no longer being just an electronic version of a paper thing.

But for some people, increasing the font size really means increasing it to a very great extent, like so. This version here has let go of the requirement to fit all of the content onto one screen without scrolling. It does retain the property of fitting responsively, responsively meaning the text wraps from line to line so that it always fits horizontally; nobody is going to have to scroll from left to right in that horrible way that you sometimes have to. And as in the earlier example, it should be possible for patients to specify the precise color combinations that they find most comfortable to read. All of this work is not a personalization gimmick. It could well be the difference between somebody being able to read and somebody being unable to read a questionnaire. And while these accessibility features do look like a radical departure (and I would be the first to acknowledge that there are strong forces pulling us back, making it difficult to go all the way with accessibility), I want to stress that this is standard practice in other commercial industries. Standard practice, and without these accommodations, websites and apps could be labeled as not accessible, and that’s a very serious allegation for them.
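For illustration only (a sketch under assumed names, not any actual implementation), the responsive wrapping just described amounts to a few style rules on the question text, so that even at a very large font size the patient only ever scrolls vertically:

    // Sketch only: let question text reflow to the screen width at any font
    // size, so there is never any left-to-right scrolling.
    function makeTextReflow(element: HTMLElement): void {
      element.style.maxWidth = "100%";            // never wider than the screen
      element.style.whiteSpace = "normal";        // allow wrapping from line to line
      element.style.overflowWrap = "break-word";  // very long words wrap too
      element.style.overflowX = "hidden";         // no horizontal scrolling
    }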

Going back to the next keyword shape in Peter Morville’s honeycomb diagram, which is credible. Credible. If you noticed this word in the diagram and haven’t seen it used before in the context of user experience design, it might look puzzling. So what does credible mean in this context? The illustration there explains it pretty well straightaway. When you first saw the image come up on your screen a few seconds ago, you probably glanced first of all at the logo, and depending on where you live you might have recognized it as belonging to a very famous bank, the Royal Bank of Scotland. But a few seconds after that you realize that it’s really an image of a phishing email, an email that’s pretending to come from the bank and putting pressure on the recipient to give away their security details. And you probably get these on a daily basis. Phishing emails are deeply sinister, but once you know that it’s a fraudulent email, it starts to look quite comical. There are really amateur attempts at graphic design in the email. There are several spelling mistakes. There are loads of errors of punctuation and grammar. And the most comical thing of all is that there are some plain bizarre turns of phrase when you come to read it. If you look at it closely, the initial trust generated by the bank’s logo disappears, and it’s revealed to be deeply unprofessional looking, totally lacking in credibility, and a good illustration of the effect that we need to be extremely careful about. Because losing credibility is so easy, and it doesn’t have to be through errors as blatant as these.

[24:44]

Let me give you some examples. Here is one screen from a very typical questionnaire given to a patient on a tablet device, asking three questions about the patient’s level of tiredness over the previous 24 hours. And they’re simple questions, and I think very few patients would fail to get the idea here of what to do. Is it user friendly? Well, insofar as patients can probably understand what to do, then yes, it’s user friendly. Is it the work of a highly professional study? Well, I notice that there are inconsistent things going on on the same screen, and that’s not a good sign. For example, if you look at the fonts themselves, some of them are in different sizes, and there don’t seem to be good reasons why. There are even different typefaces being used. If you look at the first line, which says “the past 24 hours,” and then the second line, which says “please think about your feelings,” they’re in different typefaces. And even the way that emphasis is given to important words, like “right now” on the first question and “past 24 hours” on the second, each of the three questions emphasizes important words in a different way. So it’s riddled with inconsistencies, which is a sure sign of something lacking professionalism, just as the phishing email was.

So I’ll magically correct those inconsistencies, like so, and now you can see that the page is much more consistent. It’s more consistent, but it’s still not obvious to me how this page is structured. You can see that there are three numerical rating scales, and it’s possible to understand this set of questions, but it’s quite difficult to take in at a single glance. And if we introduce some spacing, or padding, as we refer to it in the design trade, it becomes easier to read. It looks much better. The eye can go up and down the page, and you can take in at a glance what’s going on.

But with that extra padding, that extra spacing, it also becomes clear that things aren’t really lined up with each other. If you look at that first row of numbers, some of the numbers are not really aligned with the radio button that they’re supposed to refer to. And the radio buttons themselves don’t make a satisfying vertical sight line going up and down the page, as they would have if they’d been more carefully laid out. Even the questions are indented to different degrees. So things aren’t aligned very well, and we lose that neat sense of composition. But we also lose the ability to scan the page easily. So let’s see what it looks like if things are properly aligned. There. The alignment is much better. It looks much more professional. It has the credible gloss of something that belongs to an eminent scientific study.

However, it’s lacking in what we would refer to as visual hierarchy in its typography. Visual hierarchy is when typography is presented in different sizes and different weightings to help a user scan a page and to create visual landmarks, like this. There, now you can see. Somebody seeing this page for the first time immediately reads the title. The visual hierarchy means that at the top of the hierarchy you have a title. You very quickly see the numbering of the three questions, A, B, and C. And even the markers at the beginning and the end of each of those numerical rating scales stand out a lot better, with the markers underneath the 0s and the 10s. So this is now a fully functioning, scannable, understandable, professional-looking thing. There are no alignment errors, which might actually have caused bias in the answers by favouring some of the answers over others. And there’s a visual hierarchy, although I think the visual hierarchy can be taken further to make this page have a softer impact. In the notes on the left I describe this page as looking bureaucratic, and it needn’t look bureaucratic with the addition of some simple graphical highlights, like so.
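As a small sketch of how the alignment and visual hierarchy just described might be expressed in code (the class names are hypothetical, not those of the real questionnaire): type sizes step down from title to question to scale markers, and a fixed grid keeps each number of a 0-10 scale directly above its radio button:

    // Sketch only: typographic hierarchy plus an 11-column grid for a 0-10
    // numerical rating scale, injected as a stylesheet.
    const hierarchyStyles = `
      .questionnaire-title { font-size: 1.4em; font-weight: 700; margin-bottom: 0.8em; }
      .question-text       { font-size: 1em;   font-weight: 600; margin-bottom: 0.4em; }
      .scale-marker        { font-size: 0.85em; }
      .rating-scale        { display: grid; grid-template-columns: repeat(11, 1fr); row-gap: 0.3em; }
    `;

    const styleTag = document.createElement("style");
    styleTag.textContent = hierarchyStyles;
    document.head.appendChild(styleTag);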

[29:32]

This is my final version. Same content as all the previous ones, but more easily digested, and actually more credible looking, more professional looking. Just to stress the transformation, here are the before and after shots side by side. And if you think back to that fraudulent phishing email, it had the logo of a bank to establish its credentials, but the equivalent of a bank in our world, the clinical trials world, is the relationship that develops between a patient and a physician, or between a patient and the hospital. Those are the things that establish credibility for the patient. And now the doctors have entrusted the patient with an electronic questionnaire, and so the design of it ought to treat the patient in a similar way to the way in which the doctor would treat the patient. The doctor would explain things in a consistent way. The doctor would explain things in a way that was easy to understand. And the doctor would be approachable and calming. That’s what we need to strive for in a user interface. It’s too easy to lose goodwill, to lose credibility, through bad visual design, even though visual design has nothing at all to do with the scientific content of the questionnaire.

Moving on to our next keyword in the honeycomb diagram, it’s findable. Findable. I’m going to use a bit of artistic license here and replace the word findable with another closely related one, navigable, and I’ve just changed that on the honeycomb diagram there. Navigability refers to the quality of a structure that allows people to move around inside it, to easily get to the place where they want to go, or to easily find the thing that they want to find. An eDiary or eCOA ought to be a linear journey with no diversions at all. It should represent the ultimate in navigability, a straight line from A to B, just like the journey at an airport to your gate. I’ll start from the assumption that eDiaries normally get this right already, and they are good at devising the logical flow from question to question, and also from instrument to instrument. They shouldn’t allow patients to get lost along the way, and actually they rarely do. But something that lots of eCOA settings don’t do is think about a different aspect of navigability, which is the speed with which a journey is made. Is it going to be slow or fast? Is it going to be hard work or effortless? Obviously, an airport with travelators is easier to navigate with suitcases in your hand than an airport that doesn’t have them.

Here now is a sequence of two screens to be answered in order from left to right, as you can see here. It’s a very simple journey. The screens have something in common with nearly every other example I’ve shown so far: they carry navigation buttons at the bottom there, so if you want to move forward or move backward, the way that you do it is always the same. That’s very important for consistency. This happens to be the house style of CRF Health’s eDiaries. There’s no natural law to say that it has to be that way, but there is a kind of natural law to say that navigation has to be consistently placed. Beyond that, you navigate within a page by starting at the top and moving to the bottom. So far, so obvious. And I’m pretty sure that everybody on the call today would be able to answer all of these questions quite easily without getting confused about what they’re supposed to do. You might also agree that these screens have a decent professional gloss to them, like you would expect from an esteemed clinical trial. But these two screens display no fewer than five different question formats, and each individual question has to be navigated. Each individual question requires you to engage your brain so that you can understand how to do it. You can get through it quite easily, but it’s not effortless.

Now, I’d like you to look at the same sequence presented in a different way. These are actually the same questions. But the designer has found a way to format them so that they all appear to operate in the same way. And what’s more, if you pass from page to page, there’s no visual jump. The top button is always going to be in exactly the same place as the previous top button, the same vertical position. And so are the questions. So once your brain has figured out how to navigate the first page, there’s no more figuring out to do for the rest of the sequence. Now okay, this sequence has got five pages in it. The previous one only had two. And some people might argue that it’s better to condense questions onto fewer pages so that the questionnaire feels shorter. And that’s an understandable view, nobody wants patients to get tired or bored. But my argument is that, despite there being five screens here, the combined mental work required to read them, understand them, answer the questions, and move on is less than the required mental work for the shorter sequence that we just saw. So for me, what you see here is more navigable.
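As a sketch of that idea (hypothetical markup, not CRF Health’s actual template), keeping the question and the navigation buttons in exactly the same position on every page is simply a matter of rendering each page through one shared layout:

    // Sketch only: every page in the sequence uses the same template, so the
    // question area and the Back/Next buttons never move between pages.
    function renderPage(questionHtml: string): string {
      return `
        <main class="question-area">${questionHtml}</main>
        <nav class="page-navigation">
          <button type="button">Back</button>
          <button type="button">Next</button>
        </nav>`;
    }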

[35:35]

Here’s a photo I took in a museum in London. It shows the controls that a tram driver needed to operate in order to drive the tram. This is the entirety of the control panel that the driver needed to get to grips with. It’s here to illustrate the last of the keywords, one that a lot of people believe is the essence of user experience, though I would contest that. It’s the quality of usability, also known as user friendliness. It is the one quality that we cannot and may not let slide at all in the clinical trial setting, because if a patient has been recruited, we have to see to it that the patient can use the equipment. Usability is a very broad field; we could talk about it for a long time. But one thing that sums up much of what could be described as user friendliness is that precious quality of making a thing effortless.

And there’s another well-known star of the user experience world; his name is Steve Krug. And Steve Krug wrote a treatise on effortlessness, which I’ve illustrated here. It’s called Don’t Make Me Think, a classic work published back in 2000. If anybody fancies reading it, I do encourage you to do so, because it is very informative, it’s highly entertaining, and it’s also really short. It contains this important message: that an interface design can cause users to expend cognitive effort unnecessarily.

We’ve just been looking at one example. I’ll show you the linear sequences again. The one on the top forced the patient to figure out how to answer five differently formatted questions, which are not difficult to understand, but all of them are different and need to be worked out, whereas the journey at the bottom has just one thing to figure out. The journey at the top forces patients to spend mental effort understanding each question, even though that effort can be minute and hardly measurable. This is an example of a diary’s formal presentation forcing patients to briefly stop in their tracks and just make sure that they’re doing the thing right and understand what to do. A very brief break in the flow.

And there are many many other ways that this can happen. For example, look at this interface here. One of the questions is slightly out of line with the other questions. The patient may well stop in his tracks and say, does this mean this question is somehow of a different order to the others, look at it for a second, no I don’t think it is, I’ll move on.

Here’s another example. Two questions have the same kind of interactivity, and yet they look slightly different from each other. So the patient may stop and say, does this mean that the questions are in a different sort of category, different order of things, differently understood, no I don't think so, let’s carry on. A brief break in the flow.

Here’s another one. It happens when the vocabulary used is a little bit more complex, has more syllables, than a perfectly good alternative would have had. The word “modifications,” for example. The patient might stop, think, have I understood this correctly, yes I have, I’ll carry on. These and many, many more examples show patients having their brains engaged momentarily to make a tiny decision. And that’s a bad thing. It’s a bad thing because they may come to the wrong decision, they may tire before reaching the end of the questionnaire, and, more likely, they get the sense of not being the master of the equipment. The equipment is the master of them; they become the servant of the equipment. And in all cases it’s better that the patient’s brain power is spent answering the clinical questions and not figuring out how to use the interface.

[40:05]

So there, I’ve skimmed the surface of the different aspects of user experience design, all putting the patient at the centre of the design thinking. My thanks to Peter Morville for coming up with the honeycomb diagram. It was designed mainly for websites and applications and software, but it’s wholly applicable to clinical trials. I want to leave you with a challenge, something to think about. I design user interfaces, not PRO questionnaires, and I realize that I’m speaking to a lot of experts in PRO questionnaires. And I am a huge admirer of a good PRO questionnaire, because it does what I think is an amazing thing. It takes a phenomenon that is completely subjective, like how a patient feels, and turns it into something which is measurable, something which you can apply statistical analysis to. It’s very clever. Something subjective becoming something measurable. So I’ve spent the last 40 minutes or so describing many things that are completely subjective, the different aspects of user experience design, which I believe equate to the different aspects of patient centricity. And the challenge, then: if user experience design does serve as a good definition of patient centricity, then please can one of us come up with a PRO-like instrument to measure it?

 

 Thank you, ladies and gentlemen. If anybody would like to follow up on today’s material, then I’d be very happy to talk to you or exchange correspondence. Please vote right now, you can vote through your online interface. I hope you’ve enjoyed it. I will look forward to being in touch with many of you in the coming days and weeks. Meanwhile, I shall take questions.

 

 [Q&A section starts at 42:20]

 

 
