DIS Magazine used my sculpture "Inside Cyber Command" to illustrate Benjamin Bratton's discussion of machine vision. The piece was a plexiglass construction housing computer parts and blue fluorescent lighting, mirroring the architecture of NSA headquarters. Behind it, a colorized black-and-red print rounded out the patriotic color palette, an aesthetic choice that felt at once reverent toward and critical of surveillance infrastructure.
The pairing made sense. Bratton's text challenges how we think about computational seeing, arguing that machines don't see with human-like aesthetics. They see differently, and, more importantly, they see us as objects, without emotional connection or affective response. My sculpture literalized that relationship: transparent infrastructure, visible components, the cold glow of surveillance systems made architectural.
Bratton introduces what he calls the "reverse uncanny valley." Instead of being disturbed by machines that appear almost human, we are unsettled when we recognize how non-human we appear through their eyes. This represents "the clearing away of a closely guarded illusion" about human centrality in visual systems.
The scale shift is staggering. More images are now created by and for machines than by humans. Machine vision has become "arguably the ascendant ocular user subject." Bratton argues that the human visual subject should be "situated adjacent to machinic user subjects, instead of above them." This isn't about machines replacing human vision; it's about recognizing that computational seeing operates on fundamentally different terms.
What machines produce, Bratton argues, are encounters with "our own estranged reflections." The machines "do not and cannot have a sense of 'aesthetics'" yet can still "recognize us and know us regardless." They parse us into data points, facial landmarks, behavioral patterns. The discomfort comes from seeing ourselves as objects of classification rather than subjects of perception.
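To make "being parsed into data points" concrete, here is a minimal sketch of a generic detection pipeline using OpenCV's stock Haar-cascade face detector. The image path is a hypothetical placeholder, and this illustrates a standard off-the-shelf pipeline, not any specific system Bratton discusses.

```python
# Sketch: a standard face-detection pipeline reduces a person to geometry.
# Assumes OpenCV is installed (pip install opencv-python).
import cv2

# OpenCV ships with pretrained Haar cascades for frontal faces.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("portrait.jpg")  # hypothetical input file
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# The output is nothing but rectangles: (x, y, width, height).
# No subject, no expression, no meaning -- just coordinates.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    print(f"object at ({x}, {y}), size {w}x{h} px")
```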
Bratton's essay also critiques post-internet art's relationship to surveillance systems. He suggests that much of this work risks fetishizing computational infrastructure rather than challenging its epistemological foundations, and he advocates serious engagement with the political geography of these systems: not just their surface aesthetics but their operational logic.
Bratton's essay was written in 2015, before many of today's computer vision systems became ubiquitous, and his analysis feels more relevant now than when it was published. Every facial recognition system, every recommendation algorithm, every automated content moderation tool operates on the principle that machines can know us without understanding us. They recognize patterns we create without comprehending their meaning.
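As a toy illustration of recognition without comprehension: modern face recognition typically reduces each face to an embedding vector and declares a "match" when two vectors are close enough. The vectors and threshold below are invented for the sketch; in a real system they would come from a trained network.

```python
# Toy sketch: "recognition" as distance math over embedding vectors.
# The vectors here are random stand-ins for a neural network's output.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(seed=0)
enrolled = rng.normal(size=128)                      # stored identity vector
probe = enrolled + rng.normal(scale=0.1, size=128)   # a new sighting, slightly noisy

# "Knowing" someone amounts to a similarity score clearing a threshold.
# The system produces the match without comprehending the person.
score = cosine_similarity(enrolled, probe)
print(f"similarity {score:.3f}:", "match" if score > 0.8 else "no match")
```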
The question isn't whether machines can see like humans. It's whether we understand what it means to be seen by machines, and what kinds of knowledge that seeing produces.