The Dressler Blog

I have opinions. Lots of opinions.


Quantum Computing. For real, this time.

Moore’s law postulates that the number of transistors on a microchip should double every two years. For a long, long time (in technology years) that has proven true. But transistors have gotten really, really, really, really small. The problem with things that are really (to the fourth power) small is that they start behaving in odd ways. Up until now, good old Newtonian physics could explain the structure of microchips. But the size reductions have pushed us inexorably to a point where quantum physics applies. Quantum objects behave in ways that are not readily relatable to our everyday experience. The same object can be in multiple places at once. One quantum object can mysteriously affect another, even at a distance. So one would think that transistors at the quantum level would be all kinds of busted. But computer scientists have theorized for years that a true quantum computer should be able to do things that our current computers are incapable of doing. Bits in current computers can be set to either 1 or 0. The limitation of those two values has been a hard ceiling on the performance of computers since they were first built. But a quantum computer’s basic unit, the qubit, can be 0, 1, or some combination of the two. The challenge of quantum computing is that the minute you measure the state of a qubit, you fix it as either 0 or 1. So using a quantum computer requires a tolerance for ambiguity and probabilities. This was all just an abstraction until two companies, D-Wave and IBM, introduced real, live quantum computers. In practice, these quantum computers are really (to the nth power) primitive. The IBM version, which I’ve spent the week playing with through a web interface, features just five qubits. To put that in perspective, the most sophisticated commercially available standard chip features 7.2 billion transistors.

Application to Marketing: Imagine if you could go back in time and hang out with Woz and Steve, or Bill Gates and Paul Allen. Imagine if you could have known at that time how big computing would be, how it would reshape our world. I really believe that quantum computing is that big. What is the biggest limitation of computing today? Is it a shortage of data? No, we’re drowning in data. The biggest limitation is that the more data we produce, the more difficult it becomes to discern patterns in the data. Quantum computers should be capable of processing vast amounts of data much more quickly than standard computers. That offers the opportunity to recognize patterns, or the absence of patterns, in close to real time. Marketing is a discipline dedicated to the creation and manipulation of patterns in the marketplace. So this is especially useful for us.

Next Steps: Quantum computing (like quantum physics) is difficult to understand. Take the time to learn more. (Below is a 2014 article on what quantum computers would be good for.)

Read More
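
To make “0, 1, or some combination of the two” a little more concrete, here is a toy Python sketch of a single qubit. It is not IBM’s interface, just a simulation of the one idea that matters here: a superposition only becomes a definite 0 or 1 at the moment you measure it.

```python
import math
import random

# Toy model of one qubit: two amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. |alpha|^2 is the probability of reading 0,
# |beta|^2 is the probability of reading 1.

def measure(alpha, beta):
    """Measurement collapses the superposition to a definite 0 or 1."""
    p_zero = abs(alpha) ** 2
    return 0 if random.random() < p_zero else 1

# Equal superposition: the qubit is "both" 0 and 1 until measured.
alpha = beta = 1 / math.sqrt(2)

results = [measure(alpha, beta) for _ in range(1000)]
print("fraction of 1s:", sum(results) / len(results))  # roughly 0.5
```

Each individual measurement comes back as a hard 0 or 1; the “combination” only shows up as probabilities across many runs. That is the tolerance for ambiguity I mentioned above.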

Racist Algorithms

Math is not magic. And yet, the word “algorithm” gets thrown around in technology circles as a catchall incantation to dispel every doubt and question. Algorithms are like soufflés: you need to know what you’re doing to put one together and, even then, it’s only as good as the ingredients you put in. If you pick your data badly, algorithms can be used to “prove” all kinds of spurious nonsense. This isn’t necessarily malicious. We all have natural biases that will affect the data sets we are exposed to. So it is entirely possible for a racist person, or a person in a racist society, to produce an algorithm that seems to objectively confirm a racist thesis. It’s all in how you pick and weight the data.

Application to Marketing: In the adtech world, algorithms are treated like secret sauce. You can be sure that there is an algorithm, but you may never know what goes into it and why. That is wrong. If a company is offering to measure or automate some aspect of your marketing “objectively,” you have every right to know exactly what that entails. Math can be internally consistent and still be a false predictor of reality. Do not accept “algorithm” as an explanation.

Next Steps: Next time someone says “algorithm” in a meeting, ask them to define the word. (Hint: a process or set of rules to be followed in calculations or other problem-solving operations.)

Read More
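
As a purely hypothetical illustration of “it’s all in how you pick and weight the data,” here is a toy Python sketch with invented numbers and an invented lending scenario. The scoring rule is internally consistent; the distortion is baked into how the historical records were collected.

```python
import random

random.seed(0)

# Hypothetical setup: two neighborhoods with identical true repayment rates.
TRUE_REPAYMENT_RATE = {"A": 0.9, "B": 0.9}

def simulate_history(n_per_neighborhood):
    """Collect past loan outcomes. The bias lives here: loans in B were
    historically issued on worse terms, so observed repayment looks lower
    than the true rate."""
    records = []
    for _ in range(n_per_neighborhood):
        records.append(("A", random.random() < TRUE_REPAYMENT_RATE["A"]))
        records.append(("B", random.random() < TRUE_REPAYMENT_RATE["B"] - 0.3))
    return records

def score(records, neighborhood):
    """The 'algorithm': the observed repayment rate for a neighborhood."""
    outcomes = [repaid for hood, repaid in records if hood == neighborhood]
    return sum(outcomes) / len(outcomes)

history = simulate_history(1000)
print("A score:", round(score(history, "A"), 2))  # about 0.9
print("B score:", round(score(history, "B"), 2))  # about 0.6, despite equal true rates
```

Nothing in score() is wrong as math, which is exactly the point: the answer was decided by the data collection, the part that never gets shown when someone answers a question with the word “algorithm.”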

The Internet of Things is hard

The Internet of Things is too good an idea not to succeed. Yet, for all its apparent inevitability, very little in the way of IoT has actually been accomplished. In an article in Recode (link below), the author suggests that the issue may be that IoT doesn’t really scale that well. While it certainly makes sense to tie processes and objects into a digital infrastructure, particularly for any type of manufacturing process, IoT is ultimately a way of thinking, not just a product. The open standards and shared architecture of the Internet of Things make loads of sense, but there are already technologies in many industries that offer some of the same benefits, albeit using closed, limited, dated technology. Particularly in manufacturing, people are not conditioned to jettison the technology they know, because they think of their equipment life cycles in terms of decades, not years.

Application to Marketing: Clients have normal human motivations. We can encourage them to use new technology and prepare for a future that seems so clear to us. But they are much more likely to stick to a place of comfort. Their job is to make sure things “just work,” which creates a bias towards things that have just worked in the past. They will only be willing to innovate when the pain points created by the old technology outweigh the comforts of familiarity. With IoT, the pain from not adapting will come in three to five years. That’s an eternity.

Next Steps: Don’t propose an IoT solution unless you’ve been asked to propose an IoT solution. Trust me. I learned this the hard way.

Read More

Hacking Healthcare

“Move fast and break things.” It’s a scary mindset when applied to the healthcare industry. As the tech world inevitably turns its attention to the big margins and curious anachronisms of healthcare, a cultural conflict is taking center stage. Dr. James Madara, the CEO of the American Medical Association, has accused the tech world of pushing apps that “impede care, confuse patients and waste our time.” But many technologists just hear the standard objections of the soon-to-be-disintermediated. Having dabbled on the periphery of this massive market, I have often been shocked by the dated, unlovable and inefficient technologies that hospitals and health insurance providers refuse to reconsider in the light of better options. But how innovative can you afford to be when your first duty is the health of your patients? If your app fails to detect a fatal arrhythmia, that isn’t a bug. That’s someone’s dad.

Application to Marketing: Technology offers distinct advantages to pharmaceutical companies and health insurance providers. Customer care and service can be improved, problems can be detected early, and customer loyalty guaranteed. But the costs of failed technology are massive. Vic Gundotra of AliveCor suggests technology companies embrace the restrictions and inbred conservatism of regulators and government agencies. But experience tells me that regulators will happily kill innovation on any flimsy pretext. Instead, I would recommend that marketers and technologists focus on getting easy wins, improving care in low-risk areas and improving user experience across the board. When technology has been proven not to kill people (and perhaps even save a few lives), there will be more of an appetite for innovation. While it might seem like technology that actually saves lives is a superior entry point, in practice these technologies draw far more scrutiny.

Next Steps: UX in healthcare is godawful. Really. This is the low-hanging fruit that can be addressed right away.

Read More
