Algorithmic technologies play an increasing role in decision-making and the governance of human activity. In this academic work, political geographer Louise Amoore challenges the notion that algorithmic bias is a fixable glitch in the system. Drawing on philosophical frameworks, Amoore explores how the self-generating value judgments that emerge from ongoing algorithm-human interaction form a locus for ethicopolitics.
A geographically located understanding of the cloud does not solve the problem of oversight.
In 1927, physicist Charles Thomas Rees Wilson lectured on his innovative “cloud chamber experiments,” through which he developed an in-depth knowledge of condensation physics. Wilson’s chamber made visible the trajectories of ionized particles that are otherwise imperceptible to the human eye. Cloud computing carries, within its very name, a similar implication: the notion that people can render the inscrutable complexity of digital data analyzable.
By the 2000s, cloud computing was generally understood as a three-part entity: infrastructure, software, and the applications layer, where data analytics and algorithms reside. This understanding gives the cloud a clear spatial boundary: the physical data centers housing the machines that run the software and applications. The limitations of this geographically located understanding are profound, however, as the United Kingdom Parliament’s 2013 examination of possible user privacy breaches by the NSA made evident. Does the user’s physical location when sending a message matter for privacy protections...