Rather, computer science programs produce people with blind spots: people who excel at writing code, but who aren’t equipped to assess how that code might intersect with human behavior, privacy, safety, vulnerability, equality, and many other factors. As the entrepreneur Anil Dash recently wrote, “Tech is often built with surprising ignorance about its users.” Too few people follow the advice of legal scholar Salome Viljoen to honor all forms of expertise when writing code and building technology.
This blind spot plays a major role in the tech controversies we see every day. It’s one of the reasons we end up with recommendation algorithms that can predict our television preferences to a T, but that also spread anti-vaccine rhetoric at gigabit speeds; and with targeted pregnancy advertisements that spoil the surprise or, far worse, continue to appear after a stillbirth. The computer scientists who built Facebook’s news feed algorithm, or Google’s search ranking algorithm, didn’t fully anticipate how or why bad actors could hijack their technology, an oversight that a more complete understanding of sociology might have helped mitigate.