The Dressler Blog

I have opinions. Lots of opinions.


Mossberg’s Last

Walt Mossberg has always written about tech. He wrote his first column on the wall of a cave at Lascaux using crushed berries and ochre – an insightful think-piece on the possible enterprise applications of fire. This week he wrote his final column for Recode, a technology website he co-founded with Kara Swisher in 2013. This column is partly dedicated to the last 40 years of technology and partly to the immediate future. Mossberg, like many technology luminaries, believes we are on the verge of overcoming the greatest barrier to technology adoption – friction. Friction is a new term in technology, but it refers to an old problem: technology is just too damn hard to use. Mossberg believes that voice input, beacons, and various skunk-works projects at Facebook and Apple will usher in an era of “ambient computing.” Ambient computing will be a frictionless way to interact with technology. It will sense our needs at the point of intention, or perhaps pre-intention. We will live in smart environments that meet our needs with total, invisible efficiency. Computing will not be so much “at our fingertips” as “on the tips of our tongues.” Mossberg senses the good and the bad in these potential developments. The computer has always been an intrusion: in Mossberg’s memorable phrasing, “a hulking object that demands space and skill.” But ambient computing will also force us to confront our society’s ad hoc approach to privacy and surveillance.

Why does this matter?

I wish I had Mossberg’s faith that new technology will offer users a frictionless experience. Anyone who has dealt with Alexa or Siri knows that voice input can have its own maddening frictions. I agree with Mossberg that the technology needed for frictionless computing is in development, but I don’t think we have given enough thought to user experience to make these technologies truly frictionless. The iPhone was not a technology breakthrough. The vast majority of the tech involved had previous commercial applications. The iPhone was a user experience breakthrough. Praising Steve Jobs as “a visionary” is a way to avoid coming to terms with his legacy. His great ability was to place himself in the shoes of the average user and demand that the technology satisfy their needs. This wasn’t “vision” so much as ruthless practicality. If the oligopoly of tech giants continues to prioritize technological innovation over user experience, then the era of ambient computing will be one of omnipresent inconvenience and misunderstanding. We may even find ourselves fondly remembering today’s hulking objects.

In a nutshell: Technology doesn’t eliminate friction. Design does.

Misusing Psychology

MIT Technology Review is a wonderful publication. I appreciate their thoughtful analysis and attention to the cutting edge of technology. However, when it comes to machine learning, they misuse scientific terms from psychology and neurology with appalling frequency. Take, for example, a recent article: “Curiosity may be Vital for a Truly Smart AI.” Almost every word in that title has been misused. The article starts well, explaining how “reinforcement learning” allows machine learning technology to perform at a higher level. Reinforcement learning rewards a machine learning system when it gets a correct answer or performs a task quickly. Then the article points out that rewards don’t tend to exist in the real world. However, a group of researchers at the University of California, Berkeley has begun to address this issue by adding “curiosity” to their machine learning technology. And here’s where we go off the rails. “Curiosity” is added by establishing a reward for exploring all aspects of a system, regardless of any relationship to the outcome. Does that sound even vaguely like curiosity? I think not. The reason for this tortured definition is that the team used video games like Mario Bros. and VizDoom as a proxy for the real world. The machine learning system was given the goal of “solving levels” in these games. This is something that traditional machine learning systems are bad at. Once the new system was encouraged to explore as well as solve, it solved the levels faster.
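To see how small the tweak actually is, here is a minimal sketch in Python of the general pattern being described: an agent whose reward is the task’s own score plus a bonus for visiting unfamiliar states. The grid-world environment, the count-based bonus, and the Q-learning update are my own simplifications for illustration; the Berkeley team’s actual method is more elaborate, but the structural point is the same: “curiosity” here is just one more term added to a reward.

```python
# A toy illustration of an "exploration bonus": the agent is rewarded both for
# reaching the goal (extrinsic reward) and for visiting states it has rarely
# seen before (intrinsic reward). The environment is an invented grid world,
# not VizDoom or Mario, and the count-based bonus is a much simpler stand-in
# for the approach described in the article.
import random
from collections import defaultdict

GRID_SIZE = 8
GOAL = (GRID_SIZE - 1, GRID_SIZE - 1)
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # right, left, down, up

def step(state, action):
    """Move within the grid; reaching the goal yields the only extrinsic reward."""
    x = min(max(state[0] + action[0], 0), GRID_SIZE - 1)
    y = min(max(state[1] + action[1], 0), GRID_SIZE - 1)
    next_state = (x, y)
    extrinsic = 1.0 if next_state == GOAL else 0.0
    return next_state, extrinsic, next_state == GOAL

def train(episodes=500, alpha=0.1, gamma=0.95, epsilon=0.1, bonus_weight=0.5):
    q = defaultdict(float)           # Q-values for (state, action) pairs
    visit_counts = defaultdict(int)  # how often each state has been seen

    for _ in range(episodes):
        state, done, steps = (0, 0), False, 0
        while not done and steps < 200:
            # Epsilon-greedy action selection.
            if random.random() < epsilon:
                action = random.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q[(state, a)])

            next_state, extrinsic, done = step(state, action)

            # The "curiosity" tweak: an intrinsic bonus that shrinks as a
            # state becomes familiar, regardless of the task outcome.
            visit_counts[next_state] += 1
            intrinsic = bonus_weight / (visit_counts[next_state] ** 0.5)
            reward = extrinsic + intrinsic

            # Standard Q-learning update using the combined reward.
            best_next = max(q[(next_state, a)] for a in ACTIONS)
            q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])

            state, steps = next_state, steps + 1
    return q

if __name__ == "__main__":
    q_values = train()
    print(f"Learned values for {len(q_values)} state-action pairs.")
```

Remove the intrinsic term and the agent sees a reward of zero almost everywhere, which is exactly the sparse-reward problem the article describes; add it back and the agent wanders more widely, whether or not that wandering has anything to do with curiosity.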
One might suggest that VizDoom is not all that good a proxy for real life. After all, real life rarely involves levels to “solve.” But the larger issue is that a respectable, academically affiliated technology publication is breathlessly promoting this interesting tweak to machine learning technology as “curiosity.”

Why does this matter?

I suppose it doesn’t matter. Who cares if we call machine learning “artificial intelligence” or call a tweak to its programming “curiosity”? What does it matter if we misuse these terms? Words change their meanings all the time. The reason I disapprove of this particular kind of misappropriation is the Law of the Instrument. In the words of psychologist Abraham Maslow: “I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.” In other words, overreliance on a familiar tool can lead to cognitive bias. Technologists rely on their own mechanistic thought processes to solve problems. Computers force them to think rationally and sequentially. Over time, this creates a cognitive bias: a blindness to the activities of the human brain that are not rational or sequential. Simply put, computer scientists aren’t very imaginative or empathetic. (Sorry.) So when they are asked to generate an “artificial intelligence,” they build one that is rational and sequential rather than one that is intuitive or imaginative, because their daily interactions with computers have taught them to disregard and devalue those aspects of human intelligence. When they are asked to come up with a working definition of “curiosity,” they reduce a vast and complex psychological concept to the needs of the particular task they have set for themselves. The misappropriation matters because, by a quirk of our post-industrial economy, computer scientists are hugely influential in our culture. So their cognitive biases spread. And our culture will be impoverished if we come to believe that human curiosity is reducible to exploring all the passageways in VizDoom.

In a nutshell: It’s not intelligence. It’s not smart. It’s not curiosity.

Learning from Museums

The museums of my childhood were low-tech affairs. I remember entering the Royal Ontario Museum on rainy, late-winter mornings, dust swirling in the marble vestibule as the Haida crest poles towered overhead. There were enough fossils and swords to satisfy an eight-year-old’s imagination without a touchscreen in sight. When Dressler started working with museums a couple of years ago, I wanted our work to add to the wonder of the museum experience, not distract from it or, worse yet, explain it away. The place of technology in museums is a hotly debated topic.
On the one hand, technology can turn the monologue of a traditional museum experience into a dialogue between the institution and the visitor. RFID and NFC sensors can help the museum understand how visitors interact with a space. Personalization can allow visitors to follow a storyline through the entire museum, grounding the cold facts of history in the experience of a single participant. Virtual reality can give people who would never have the chance to visit a museum, because of distance or disability, the opportunity to experience its exhibitions remotely. On the other hand, technology can be applied thoughtlessly out of a sense of institutional competition. Virtual reality can create the false impression that a museum visit is unnecessary, depriving people of the sense of awe that comes from standing in the presence of history. Augmented reality goggles could become a barrier to engaging with other museum visitors. The shared experience of looking and learning becomes a solipsistic exercise in “self-improvement.” Museums adopt technology slowly, and the glacial development of new exhibitions is out of step with the “fail fast” mindset of technology. To work with a museum, you have to be willing to downshift to a speed that the museum will still find “intolerably hasty.” Curators distrust technology for good reason. Anachronism is their job.

Why does this matter?

Museums are slower to adopt technology than businesses, but they tend to do it more thoughtfully and with a greater sense of purpose. Companies like Local Projects have built educational experiences at museums that would be the envy of any Fortune 500 company. At the Cooper Hewitt Design Museum in New York, visitors are handed an electronic pen that allows them to interact with multiple exhibits and save information from those exhibits for later. The thoughtful application of geolocation, personalization, mixed reality, and digitally enabled dialogue means that museums have something to teach technology companies about how to make technology work for people, rather than just how to make it work. The Holocaust Museum in DC issues ID cards to visitors so they can follow the story of a single individual through the museum; exhibits change to reflect that person’s experience, reducing a subject of unimaginable tragedy and complexity to a single relatable story. Museums are laboratories of user experience, and a smart technology or marketing executive would learn more from studying what they do well (and badly) than from chasing the latest marketing stunt.

In a nutshell: Museums are the proverbial tortoise in the application of technology. But the tortoise wins.
