If you want a cheap laugh, go to the Mount Sinai medical system website and hope they ask you to complete an opinion survey. It’s stupid and ridiculous, deserving lots of snark. But try not to think about what it means or the underlying reality, or you might get kinda depressed. Like I did, because I went to the website to get something done: I needed a phone number. Sounds simple, right? Until you understand that I had already talked with someone at Mount Sinai, and that person gave me the wrong number. But I really needed the number -- I needed to make an appointment for a medical test that is crucial to my health. After a great deal of searching, I finally found what appeared to be the right number. Except it wasn't, as I discovered when I called.
This was part of my epic struggle to schedule an appointment -- something that I do with a couple clicks for my favorite restaurants, my cat at the animal hospital, or ... yes, my primary care provider. But at that powerhouse medical institution, Mount Sinai? Only the best people who really, really, really want an appointment are graciously granted one. See this for the story.
In this post, I'll confine myself to glancing at the carefully constructed Mount Sinai website and the extraordinary steps they are taking to ensure that it is the best it can be. It's clear they're in a race for the top with the health insurance companies on this subject; see this.
Major companies that build websites have a problem, a problem they share with lots of companies that build software. The executives in charge are required to say that they care about quality, and do everything in their power to track and improve it, along with important metrics involving customer satisfaction. They take concrete steps to measure quality, using the best firms out there to help them.
There's just a little problem: they can't get it done.
The Mount Sinai website
I recently encountered a typical example of hopeless executive incompetence while trying to get a simple phone number to schedule a visit to Mount Sinai Hospital in NYC – scheduling that any institution whose software had successfully made the wrenching transition to the 2000s would have enabled long ago. I tell the story of the scheduling adventure here.
It was a long slog to get my MRI appointment made, including a number of calls and emails. You might think that when a window popped up near the end of my ultimately unsuccessful trek through the Mount Sinai website to extract a simple phone number, I would ignore it. After all, the website is a carefully crafted, attractive-looking piece of useless fluff, impressive perhaps to the important people who are shown images of it in a PowerPoint presentation during some meeting, but in fact annoying, error-filled and generally useless to real people. Silly me: here I am thinking that the hoi polloi, the real people who have health issues, are the relevant people here – when in reality, it’s the executives, jockeying for ever-growing power, prestige and money among themselves.
If you’ve read any of my other posts on software quality, you may suspect that I’m a glutton for punishment. Your suspicions are correct. So I agreed to take the survey. When I left the site, I expected the survey to pop up, but it didn’t. After all, the request told me, in no uncertain terms, “it will pop up when you leave the site.” OK, I thought, your loss, not mine. But darn! The survey I recently got from my health insurance company was so juicy!! I would have loved to see who wins the race for most dysfunctional survey between a major provider and a major payer!
It turns out, I just needed to wait. Before long, the survey arrived in an email:
I was a bit surprised to get the request in this way, but OK, they’ve obviously got all my information, so fine. As usual, I hover over the link to make sure it’s legit. The URL was portal.gsight.net with some codes after. I quickly discovered this was a domain owned by the company that sent me the email, Greystone.net. Hmmm, who are they?
Wow, a whole company devoted to healthcare marketing and the internet! They must be really good! I wonder if they know about the web?
It appears they know about the Web. And look at this:
It’s a whole process to make the website great! Smart folks, those people at Mount Sinai, turning to professional specialists to figure out how well their website is serving their customers! Though I can only assume that Greystone has only recently been engaged, since the Mount Sinai website is, after all, a pretty-looking pile of stinking crap…
So let’s dig into this expert opinion survey. Click. Here’s where I land:
OMG!!! My jaw has hit the floor so heavily, I’ll probably be scarred for life. I wonder if I can sue Greystone to cover the costs of plastic surgery for my deformed, floor-mangled jaw??
Why is my jaw hurting? Because the link these consummate professionals sent me was to a completely generic landing page! There is this thing known as “deep linking,” in which the custom URL you click brings you right to the place in question. It’s widely used. The landing page knows who you are and why you’re there. I guess the folks at Greystone hired a bunch of interns for this project, ones who hadn’t gotten to that chapter in the “Internet Linking for Dummies” book. And no one with the slightest bit of experience, like the average internet user, had tried it out.
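For the curious, here's a minimal sketch of what deep linking involves. Everything in it -- the domain, the token store, the function names -- is hypothetical illustration, not anyone's actual code: the email generator mints a unique token per recipient, the landing page looks the token up, and the survey starts out already knowing who you are.

```python
import secrets

# Hypothetical in-memory store mapping survey tokens to respondent context.
# A real system would persist this server-side when the invitation email is sent.
survey_tokens = {}

def make_survey_link(respondent_id, pages_visited):
    """Mint a per-recipient deep link that encodes who is being surveyed."""
    token = secrets.token_urlsafe(16)
    survey_tokens[token] = {"respondent": respondent_id, "visited": pages_visited}
    return f"https://portal.example.net/survey?t={token}"

def resolve_token(token):
    """Look up the respondent's context so the survey can skip generic questions."""
    return survey_tokens.get(token)

link = make_survey_link("patient-1138", ["/find-a-doctor", "/radiology/phone"])
token = link.split("t=")[1]
context = resolve_token(token)
# The landing page now knows who arrived and which pages they visited,
# so "how did you get here?" never needs to be asked.
```

That's the whole trick: one token, one lookup. It has been standard practice in email campaigns for decades.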
After I gave the right answer, I was thrown into a completely generic survey about the website, utterly uninformed about who I was or any smidgen of knowledge about my site visit – putting the lie to the user tracking they supposedly do. Had they done elementary user tracking, they would have known who I was and which pages of the site I had visited. But no, they decided to ask completely generic questions.
Is this hard to do? Nope. For example, my bank, USAA, notices when I go to the “wire transfer” section of their website and then call them. A recorded voice says something like “I see you’ve recently visited the money transfer section of the USAA website; would you like to wire money today, David?” If I answer “yes,” they transfer me to the relevant department. Not hard! Maybe the Greystone.net interns will eventually get to that chapter.
The survey itself was endless, irrelevant awfulness.
Here’s an example of why the survey was awful:
If they had tracked me and made the survey specific, they would have known that I hadn’t filled out a form. Instead, they present me with a question about form-filling, and then require an answer. Most of the questions were like this. By the way, this question was roughly the twentieth -- and an answer is required for every single one. See the progress bar that says 25%? At some point it jumped to that, and then, question after question, it didn't change.
Then at the end, I was invited to give some input. Which I did in calm language, mentioning that it might be nice if the phone numbers on the site were, you know, correct numbers. Of course, since they hadn't deep-linked, they had no way of contacting me to get further information.
Just to be sure I wasn't completely nuts, I went onto the Mount Sinai website again. I got lucky -- an invitation to complete an opinion survey popped up again. I carefully chose "take it after leaving the site," and this time it worked, though in a remarkably clunky way, indicating that whoever built the code had flunked JavaScript 1.01. So I got the survey, and was blown away by seeing the very same "how did you get here" question I got from the email link. Any competent web programmer could have known how I got there, simply by looking at how the original URL was invoked. Clearly, performing this elementary task was beyond the collective genius of Greystone and Mount Sinai.
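How elementary is "looking at how the URL was invoked"? Here's a sketch. The `src` parameter name is made up for illustration -- the point is just that a tagged link carries its own origin, so the survey never has to ask:

```python
from urllib.parse import urlparse, parse_qs

def referral_source(url):
    """Classify how the respondent reached the survey from the URL itself.
    The 'src' query parameter is a hypothetical tag added when the link
    is generated (one value for email invitations, another for the pop-up)."""
    params = parse_qs(urlparse(url).query)
    src = params.get("src", ["unknown"])[0]
    return {"email": "email invitation",
            "popup": "site pop-up"}.get(src, "unknown")

print(referral_source("https://portal.example.net/survey?src=email&t=abc123"))
# -> email invitation
```

A few lines of standard-library parsing, and the "how did you get here" question disappears.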
Then as I went farther into the brain-dead survey, I discovered that it just didn't work. Look at this:
Look at the percent completion bar just below the black line under the Mt Sinai logo -- it's still at zero, even though I'm many questions into it, as you can see by the scroll bar on the right. Programming and QA 1.01. Fail. And of course, it was a survey designed so that no normal person would march all the way through to completion.
Discussion
There’s a concept in math and computing, and also in real life, called “recursion,” or sometimes self-reference. It’s a simple concept; it’s been around for literally thousands of years, as we know from fun statements ancient Greeks made involving lying Cretans. In this case, it applies to the question of the quality checkers: who checks the quality of the quality checkers?
The answer is evidently “no one.” The most basic principles in surveys, common sense but also proven by experience, are “keep it short” and “make every question matter.” We know these are the relevant principles because everyone who hasn’t failed the “survey 1.01” course knows that the most important metric to measure is drop-out rate. Of the people you invite, how many accept? Of those who accept, how far into the survey do they get before dropping out? What’s the completion rate? Any tracking along these lines would have shown minuscule completion rates. I’d love to have a recording of the executive meeting at Mount Sinai in which the survey results were presented, to see whether the issue was even raised.
But beyond that, let me ask: when was the last time you got a survey from Google? Or Amazon? Never, right? Another thing: have you read even a little bit about opinion polling -- about how it's long since been shown that people give one answer when asked, but then act differently? Moderately educated web professionals know that surveys are close to useless! That's why folks who know a little about websites watch what you do. If there's a lot of information on the site, they make it search-based, with lots of suggestions. They look for drop-outs.
Yes, I've made fun of how badly the survey was constructed and executed. It was the electronic equivalent of a paper survey from 50 years ago. Which makes sense, because the Mount Sinai website is the electronic equivalent of a glossy brochure from 50 years ago. That's the killer observation. Mount Sinai could make a huge advance by leaping forward to the state of the art of roughly 20 years ago. The very fact that they're using obsolete technology like surveys -- and on top of it doing it incompetently -- shows that they are clueless. It's the equivalent of using a steam-powered car instead of a gasoline-powered one, and being unable to run the steam-powered car competently. The right response here isn't "build a better survey" -- it's "use modern customer feedback techniques."
Conclusion
Well, it’s a wash. The hospital system opinion survey was pretty different from the health insurance one, but they each exemplified unique ways of being bad. I wonder how many dimensions of badness there are? The institutions I’ve had the pleasure of experiencing are clearly on the leaderboard of those most likely to get to the maximum. Neither of them has a clue about decades-old methods that are vastly superior for getting customer feedback than surveys, however well-constructed those surveys might be.
Postscript
Learning about the excellent survey work conducted by Greystone.net on behalf of Mount Sinai had an added dimension of amusement for me because I grew up near an institution named Greystone. Or more formally, Greystone Park Psychiatric Hospital.
It was, as my mother the R.N. called it when I was growing up, a “looney bin.” If someone said something she thought was dumb, she would say, “Did you just escape from Greystone?”