Historically, photographs have been viewed as authentic records of the past. After experimenting with AI colorization and image analysis, however, I now believe it is dangerous to interpret photos that way. Tools like DeOldify and Distant Viewing Explorer exemplify how technology can reshape how we remember history. Although these tools offer real benefits, I think they pose ethical problems: they rewrite memory and apply modern biases to historical moments.
I experimented with AI image manipulation using DeOldify, a program that colorizes black-and-white images using machine learning and pattern recognition. I chose a black-and-white photograph of Skinner Chapel to run through it.


The tool colors the chapel's structure fairly accurately. The people, however, all look alike: each man wears a shade of red and has dark hair. DeOldify learns from large datasets of color images, so the palette it assigns to the individuals in this photo likely reflects the biases of the dataset its model was trained on.
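To make the bias concrete, here is a minimal sketch of the general idea, not DeOldify's actual model: a toy "colorizer" that, for each grayscale value, predicts whatever color appeared most often for that value in its training data. The training pairs below are entirely hypothetical.

```python
from collections import Counter

# Hypothetical training pairs: (grayscale value, true color).
# Red jackets dominate this invented dataset.
training_pairs = [
    (40, "dark brown"), (40, "dark brown"), (40, "black"),
    (120, "red"), (120, "red"), (120, "blue"),
]

def fit(pairs):
    """Learn the most common color for each grayscale value."""
    by_gray = {}
    for gray, color in pairs:
        by_gray.setdefault(gray, Counter())[color] += 1
    # Keep only the single most frequent color per grayscale value.
    return {g: counts.most_common(1)[0][0] for g, counts in by_gray.items()}

model = fit(training_pairs)
print(model[120])  # prints "red": the majority color wins every time
```

Even though a third of the training jackets were blue, every mid-gray pixel comes out red. A real network generalizes far more subtly, but the principle is the same: the output palette is an echo of the training data, not of the original scene.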
“Adding color does not show things as they were but recreates what is already a recreation – a photograph – in our own image, now with computer science’s seal of approval.”
Drimmer, “How AI is hijacking art history” (UMass Amherst, 2021).
Professor Sonja Drimmer captures the ethical questions that AI colorization models raise: each output is a recreation, despite being presented as reality.
Next, I explored Distant Viewing Explorer, a tool that applies computational techniques, such as object detection, to analyze images. I ran my original chapel photograph through its object detection tool.

The object detection tool produces an irregular and incomplete analysis of the image. This is why I believe AI tools like Distant Viewing Explorer cannot replace human interpretation of original images: relying solely on AI, you would lose many of the people and features in this photograph.
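The incompleteness has a simple mechanical cause in many detection pipelines. The sketch below uses invented detections and an assumed confidence threshold, not Distant Viewing Explorer's actual output, to show how a standard confidence cutoff silently drops the people a model is unsure about.

```python
# Hypothetical detector output: each box carries a label and a
# confidence score. A typical pipeline keeps only boxes above a
# threshold, so uncertain detections simply vanish from the analysis.
detections = [
    {"label": "person", "confidence": 0.92},
    {"label": "person", "confidence": 0.55},  # partially occluded
    {"label": "bench",  "confidence": 0.81},
    {"label": "person", "confidence": 0.31},  # back row, low light
]

THRESHOLD = 0.6  # assumed cutoff; real tools vary

kept = [d for d in detections if d["confidence"] >= THRESHOLD]
people_found = sum(d["label"] == "person" for d in kept)
print(people_found)  # prints 1: only one of the three people survives
```

The two discarded people are exactly the ones a human viewer would still notice: someone half-hidden behind a pew, someone in a dim back row. The tool's count is not wrong by accident; it is wrong by design of the cutoff.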
“If you’re looking for an exact sequence of bits, you won’t find it; all you will ever get is an approximation.”
Chiang, “ChatGPT Is a Blurry JPEG of the Web” (The New Yorker, 2023).
Ted Chiang argues that AI tools never provide the full picture. Tools like Distant Viewing Explorer produce estimates based strictly on the patterns learned from their training data. The DeOldify images, paired with the Distant Viewing Explorer results, show why blindly relying on AI risks skewing the truth and producing false approximations of history.