Ben Locwin, PhD, MBA, MS | 09.16.19
The problems and predicaments of autonomous vehicles and electric scooters in society are a proxy for drug manufacturing and clinical supply chain risks.
The only reason more people haven’t been seriously injured or killed by electric scooters and self-driving cars is simple probability. If more of these technologies were impinging upon pedestrians and motorists, the interaction effect would increase, and with it a direct and roughly linear rise in both injuries and fatalities.
Unfortunately, we don’t have such a direct line of sight into the adoption of, and commission of errors with, technology platforms used to manage supply chains. The behavioral similarities between these two seemingly disparate topics are many. Here’s why.
Your AI-controlled, autonomous vehicle is great…until it’s not
I think that we’re not fully ready to trust these technologies, at least on the side of the autonomous vehicles. It is for this reason, and the as-yet unsolved challenges of making the technology of autonomous driving fully functional, that these cars want you, the driver, to keep your hands on the steering wheel. If you think a bit more deeply on that point, it’s really ironic: The most fallible and error-prone part of every driving scenario is the human participating in the event. Big Autonomous Tech—a derivative of Big Pharma, or Big Data—is asking the human to stay engaged in the largely machine-controlled process to offset machine system error(?). Just remember that when we get this technology nailed, we’ll want humans as isolated and far away from the operation as practically possible.
Where the F**k did all of these electric scooters come from?
If you’ve been to any major metropolitan area in the past year, you’ve likely seen or had a run-in with electric scooters. They seem to have sprouted from the ether and their popularity has outstripped local legislation and the application of common sense.
Thad Moore, writing for The Post and Courier in Charleston, SC, had this to say about the unleashing of the scourge of scooters:
“The scooters arrived with little warning—no hype, no preview, no city approval.* When day broke on the weekend, they were just there.
Several dozen of them, actually scattered across the Charleston peninsula and West Ashley. They were parked on sidewalks on the Westside, and they were parked around Avondale. And before long, they were zipping around, whipping through the streets at 15 mph.”
Injuries and, yes, unfortunately, deaths have occurred. Charleston, Nashville, and other cities have since banned them.
Managing clinical supply chains
My friend, colleague, and industry deep-thinker, Scott Endicott, had the following to say when I asked him about the linkage between the behavioral principles of electric scooters and managing clinical supply chains: “Just applying technology, any technology, to your clinical supply chain is equivalent to just handing people an app to drive an electric scooter, who’ve never driven one, anywhere they want to go.”
This is salient extrospection, because if we externalize our view of risks as to how people behave at a macro level with electric scooters, there’s no reason to believe they’re operating differently in the adoption of technologies and platforms in the workplace.
Further, the probability (likelihood) that things will go wrong is fairly high, and the consequence (severity) is potentially very damaging. With electric scooters, there is already evidence that very bad things can and will happen when you put tools in the hands of people who don’t see the bigger picture. The same behavioral and mental heuristics are implicated in similarly bad decision making with drug manufacturing and clinical trial technology platforms. The stark difference is that errors committed with electric scooters sometimes lead to immediate and hazardous (injurious) consequences. When people commit errors in a similar vein with their clinical or commercial technologies and platforms (or any organizational platforms, really), they can often course-correct or obfuscate the issue so that it has no direct linkage back to their error, nor to untoward outcomes—injurious, maybe, only to their career trajectory.
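The likelihood-and-severity framing above is the standard qualitative risk-matrix arithmetic used in risk assessments. As a minimal illustrative sketch (the 1–5 scales, the example ratings, and the band cutoffs are my assumptions for illustration, not anything specified in this article):

```python
# Minimal qualitative risk-matrix sketch: score = likelihood x severity.
# Scales (1-5) and band cutoffs are illustrative assumptions only.

def risk_score(likelihood: int, severity: int) -> int:
    """Multiply likelihood by severity, each rated 1 (low) to 5 (high)."""
    if not (1 <= likelihood <= 5 and 1 <= severity <= 5):
        raise ValueError("ratings must be between 1 and 5")
    return likelihood * severity

def risk_band(score: int) -> str:
    """Map a score (1-25) to a qualitative band; cutoffs are illustrative."""
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

# The article's framing: likelihood of misuse is fairly high (say 4),
# and the consequence is potentially very damaging (say 5).
score = risk_score(4, 5)
print(score, risk_band(score))  # 20 high
```

The point of the multiplication is simply that a high-likelihood, high-severity combination (like unmanaged technology adoption) lands in the band that demands mitigation before deployment, not after.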
All of these fallacies of decision making and execution fall under a broad umbrella I like to call the Donald Rumsfeld Conjecture. Former Secretary of Defense Donald Rumsfeld made the following extraordinarily worded, almost logically impenetrable statement about how we tend to think about our own thinking—something I would categorize as metacognition:
“…as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don’t know we don’t know.”
For the purposes of our arguments here, the synopsis is that, with clinical technologies, the danger is in the failures you don’t see coming, or don’t recognize as apparent or even possible.
*Author’s note: This should also make you consider that extended marketing campaigns and hype aren’t always the best way to ‘do’ marketing. I’ve advised many organizations in a variety of industries on ‘more-minimalist marketing measures’ (M4), and sometimes the very presence of novelty, unexpected hype, and word of mouth (WOM) beats (very) expensive ad campaigns (by a lot).
Ben Locwin, PhD, MBA, MS, is an entrepreneur and healthcare executive, a member of several advisory boards, and works across the boundaries of tech and health (HealthTech) to push the envelope, in an evidence-based way, for the improved future of humanity. Scott Endicott has worked in biopharma and pharma for over two decades, and most recently on the clinical-facing side in support of more logical, rational, and relevant frameworks and approaches to clinical supply chain. He’s a trusted sounding board, a deep-thinker, and more importantly, a good friend.