MARCH 20, WEDNESDAY

PHOTO QUALITY

In digital photography, at least in the early years, the quality of a photograph depended on a number of things. It depended on the resolution of the camera: how many pixels a photograph contained and how densely they were packed. The denser the pixels, each recording a gradation of color through a mix of red, green, and blue, or a gradation of black and white, the sharper the photograph. At first the sensitivity to light of each pixel on the sensor was very important, and digital photographs required very good light. From the beginning, and still today, sudden movement of the camera could cause blurring, so image stabilization was very important. And of course the quality of the lens made a great difference to sharpness. So to get a good photograph you needed a sharp lens, high resolution, good light without too much contrast, and image stabilization. And of course the closer you got to the subject, the sharper the photo would be. It was pretty simple: if your camera had those qualities, and you had an interesting subject, a good background, and good composition, and took enough shots that you could choose the best one, you ended up with a good photograph.

This was before artificial intelligence processing inside the camera became important. Artificial intelligence progressed slowly, but it became very noticeable to me in cell phone cameras, and since I used an Apple iPhone I saw it in the yearly improvements to the iPhone without understanding it. How could a tiny sensor and a tiny (and inexpensive) lens produce such sharp and vivid photos, even in low light, while at the same time offering wide-angle and zoom lenses?

But when I look at my iPhone photos or videos on the huge virtual screens of the Vision Pro, the problems I had in the early days of digital photography reveal themselves again. The more you enlarge a photograph, the more you enlarge its faults. The same photo that looks great on the four-inch iPhone screen can look blurry or fuzzy on a six-foot-high screen. A typical twelve-megapixel photo is about four thousand pixels wide; spread those pixels across six feet and only about fifty-six of them fill each inch, a small fraction of the density on the phone’s little screen. The pixels are pulled apart and the sharpness dissipates.

This is not the fault of the Vision Pro. During the Vision Pro demonstration I said I wanted to look at my own spatial videos, not the Apple-produced ones, because my videos, made on the tiny iPhone, good as they looked there, weren’t going to look as good on the Vision Pro screen as the Apple-produced videos made with very expensive cameras that shoot extremely sharp footage. The demonstrator wouldn’t let me do that.

I was right: my videos of the wedding in India, recorded at the lower resolution the iPhone uses for spatial video, weren’t as sharp as the 4K or 8K videos Apple made with much more powerful cameras. But I learned two things.

One was that the spatial videos made by the iPhone, which looked blurry on other devices, were very sharp with vivid color when seen on the Vision Pro. The Vision Pro made my videos look as good as they could look. I didn’t need a $5,000 camera; the iPhone 15 produces very sharp videos.

And the second was that the Vision Pro could handle any photograph or video I was likely to make in the next several years. It wasn’t going to be instantly outmoded. As the iPhone’s cameras improved, my videos would look better and better on the Vision Pro. I hadn’t made a mistake in buying the Vision Pro.
