A Free Education is Worth Exactly What You Paid For It

There’s a certain couple on YouTube who talk a lot about photography, harp on the value of education, and push their “free” photography education courses online. You may even have begun to believe some of their rhetoric, or heard some of their misinformation floating around.

Mind you, much of the technical information they love to claim proficiency in is shit. They’ll contradict themselves within the span of five minutes. One example: they believe that “professional” lenses on a smaller sensor, like APS-C, will perform worse in sharpness because the lens isn’t tuned for the sensor size, yet they’ll then claim that some lenses on 35mm are so sharp that you can crop the image in closer than with other lenses. Realize that these two things are a contradiction: a sharper lens will never give a sensor of fixed resolution higher resolution, perceived or otherwise. How do I know this? Simple physics and geometry. Let me explain:

Imagine pixels as a grid of squares on a plane. Now try to draw a diagonal line. If you’re old enough to have owned an LCD monitor from 15 years ago, or a consumer-level CRT monitor, you’ll be familiar with this: any curved or diagonal line looks “stepped” the more you magnify the image, as these square pixels, addressed by discrete coordinates, attempt to recreate curves and diagonals. The grid itself is a bitmap; the stepping is called aliasing.
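You don’t even need an old monitor to see it. Here’s a quick sketch of my own (not anything from the couple’s material) that rasterizes the same line onto a coarse grid and a grid with twice the pixel density:

```python
def rasterize_line(width, height, slope):
    """Mark every pixel whose row is the rounded value of slope * column.

    A crude rasterizer: each column gets exactly one lit pixel, so the
    line's "steps" are exactly as coarse as the grid itself.
    """
    grid = [["." for _ in range(width)] for _ in range(height)]
    for x in range(width):
        y = round(slope * x)
        if 0 <= y < height:
            grid[height - 1 - y][x] = "#"  # flip rows so y grows upward
    return "\n".join("".join(row) for row in grid)

# Coarse grid: the diagonal breaks into big, obvious steps.
print(rasterize_line(8, 4, 0.4))
print()
# Same line, twice the pixel density over the same area: smaller steps.
print(rasterize_line(16, 8, 0.4))
```

The line being drawn never changes; only the grid trying to capture it does.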

Actually, here’s a highly magnified sensor right here. We’ll call this one “24MP 35mm, aka ‘full frame'”:

A magnified, 24MP, 35mm “full frame” sensor

Now, let’s see a 24MP APS-C sensor:

A similar view of a 24MP APS-C sensor

Because these sensors have the same total megapixel count spread over different surface areas, their pixel size, or pitch, differs. The “full frame” sensor’s larger surface area makes the pixels themselves larger, so a diagonal line is less defined because of the larger “steps,” aka increased aliasing. That same property gives the sensor other benefits, but we’re ignoring those for this discussion as they have no effect on the lesson at hand.

Instead, let’s increase the pixel density on the “full frame” sensor to the point where each individual pixel is the same size as those on the APS-C sensor. The result will obviously increase the total number of pixels we can fit onto the sensor. Smaller pixels = higher density = higher total count. Since an APS-C sensor (a 1.5× crop in each linear dimension) has only about 44% of the surface area of a “full frame” sensor, matching its pixel pitch means the “full frame” sensor now carries roughly 2.25× the pixels:

A 35mm “full frame” sensor with the same pixel size as the APS-C sensor

As you can see, the diagonal line’s “steps” are just as detailed since the per-pixel size is the same. Only the total number of pixels increases, because the total surface area increases, but detail is now equivalent between the two sensors at an unmagnified view.
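To put numbers on it, here’s a quick pitch calculation of my own. The sensor dimensions are typical published values (they vary slightly by manufacturer), and a 3:2 aspect ratio is assumed:

```python
import math

# Typical sensor dimensions in mm; exact sizes vary slightly by maker.
FULL_FRAME = (36.0, 24.0)
APS_C = (23.5, 15.6)  # roughly a 1.53x crop factor

def pixel_pitch_um(sensor_mm, megapixels, aspect=(3, 2)):
    """Approximate pixel pitch in microns for a sensor size and pixel count."""
    width_mm, _height_mm = sensor_mm
    # For a 3:2 grid, width_px = sqrt(total_px * 3/2)
    pixels_wide = math.sqrt(megapixels * 1e6 * aspect[0] / aspect[1])
    return width_mm * 1000 / pixels_wide

print(pixel_pitch_um(FULL_FRAME, 24))  # ~6.0 µm
print(pixel_pitch_um(APS_C, 24))       # ~3.9 µm
```

At the same 24MP, the “full frame” pixels come out about 1.5× wider, which is exactly the crop factor.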

On a “per pixel” basis, where both sensors have the same total pixel count, zooming and cropping a “full frame” image down to the APS-C view will obviously increase visible aliasing. However, the confusion lies in the misconception that an APS-C sensor takes a “full frame” image, literally crops it, and magnifies the result by the 1.5× crop factor, increasing visible aliasing along the way. This is not true.

I repeat: this is not true. This is simply NOT how sensor crop factor works at any level.

This is the argument this couple keeps repeating, and it’s not correct. The diagonal line projected by the lens will be captured as faithfully as the pixels allow for the given image sensor and its pixel pitch. In fact, if pixel density is equal between two dissimilarly sized sensors, the larger sensor’s image, cropped to match the view of the smaller sensor, will look no sharper than the smaller sensor’s image. Why? Because the aliasing is not a product of the actual scene projected through the lens and onto the sensor; it’s a product of the sensor’s ability to capture that scene.

I am obviously ignoring factors that can influence image fidelity, like pixel sensitivity, bias, interpolation, amplification and noise. They all play a role, but none as large a role as actual resolution plays in their logical fallacy.

All other factors being equal, a smaller pixel size will capture finer details in an image (that’s simple physics, folks) but a larger overall sensor will capture a larger view of the scene. This larger view will make objects look smaller as compared to the result from a smaller sensor, giving the sense of sharpness when viewed unmagnified. If what this couple says were true, then a sharper lens would look better on a sensor of any size, given a fixed pixel pitch, and make pixel density irrelevant to a certain extent. Most of all, it negates their argument of a high quality lens producing worse results on a smaller sensor with higher pixel density/smaller pixels.

Let’s attack their logical fallacy from a different perspective. Take one “professional grade” lens and put it in front of two different sensors: one that’s “full frame” with roughly 6µ pixels and 24MP total, the other APS-C sized with roughly 3.9µ pixels and 24MP total. The latter image will look more zoomed in compared to the “full frame” image, but it will be sharper and more detailed. It has to be, because its finer pixels capture finer details from the lens’ projected image. However, if you were to compare that same APS-C sensor with a “full frame” sensor that also has 3.9µ pixels (and thus roughly 54MP), you’d get images with the same level of sharpness and detail, at the same magnification and distance, with the larger sensor simply capturing a wider view of the scene. Only if the pixels in the smaller sensor are larger than those in the larger sensor will the smaller sensor’s image be less sharp and detailed.
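The pitch-matching arithmetic is easy to check yourself. A back-of-envelope sketch, assuming a 1.5× crop factor (the exact figure varies slightly by manufacturer):

```python
# Pixel count scales with sensor area, and area scales with the square of
# the crop factor, so matching a 24MP APS-C sensor's pixel pitch on a
# "full frame" sensor takes 1.5**2 = 2.25x the pixels.
CROP_FACTOR = 1.5
aps_c_megapixels = 24
full_frame_megapixels = aps_c_megapixels * CROP_FACTOR ** 2
print(full_frame_megapixels)  # 54.0
```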

Further evidence of their confusion is one of their most popular quotes, that “APS-C covers a smaller part of a ‘full frame’ lens’ image circle, leading to a lower quality image.” The truth is that the center is the sharpest part of any lens’ image circle and therefore always yields the highest fidelity; whether that fidelity survives simply depends on whether the sensor used has the resolution to capture it.

Bottom line: When it comes to comparing image sensors, surface area has nothing to do with image fidelity.

ƒ-Stops and Apertures

Now, when it comes to total light capture versus light intensity per unit of surface area, as estimated by ƒ, they’re also wrong. They claim that the crop factor a sensor needs to achieve a similar field of view must also be applied to the ƒ number because “a smaller sensor gathers less light than a larger sensor.” Sure, a larger sensor area is exposed to more total light, but that is irrelevant to the intensity of light that guides exposure.

A larger sensor collects no more light per square unit than a smaller sensor.

ƒ, or “f-stop,” is simply a measure of the aperture in a given lens: the lens’ focal length, in mm, divided by the diameter of the entrance pupil, the iris as visible when looking down the front element. At its most basic, it’s a ratio of focal length to aperture diameter, used as a practical (if imprecise) measure for controlling the amount of light in an exposure. The actual stop numbers mean nothing without the lens’ focal length.

Mathematically, the ƒ-stop functions as the denominator and the focal length as the numerator in this ratio, and the stop number has no direct relation to the diameter of the aperture outside of this formula. What this means is that the diameter of the opening at ƒ/2.8 on a 50mm lens is completely unrelated, and unequal, to that of an 85mm lens. However, ƒ/2.8 on either lens lets in a similar intensity of light, creating a similar exposure value.
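The ratio described above is simple enough to compute. A minimal sketch:

```python
def aperture_diameter_mm(focal_length_mm, f_number):
    """The f-number is focal_length / pupil_diameter, so the physical
    opening is the focal length divided by the stop number."""
    return focal_length_mm / f_number

# Same f-stop, very different physical openings:
print(aperture_diameter_mm(50, 2.8))  # ~17.9 mm on a 50mm lens
print(aperture_diameter_mm(85, 2.8))  # ~30.4 mm on an 85mm lens
```

Despite the different physical diameters, both deliver a similar light intensity per unit area at the focal plane, which is why ƒ/2.8 meters the same on both lenses.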

Now, the problem is that these two YouTubers argue that the crop factor used to calculate equivalency between an APS-C and a “full frame” sensor must also be applied to the ƒ-stop ratio. This is patently false.

Again: This is patently false.

Sure, you can apply it to estimate an equivalent depth of field, or “bokeh,” but it has nothing to do with your exposure or exposure value. As I explained above, the ƒ-stop is designed to estimate the intensity of light for an exposure and is tied directly to the focal length of a given lens. As such, it does not relate to the size of the sensor in any way. At. All.

In summary

The problem with perpetuating these myths, especially by a couple who have spent years cultivating a reputation for being knowledgeable, is that they sow confusion in an attempt to sell their books and get you to upgrade your hardware often, and often needlessly. Both of these myths were designed to perpetuate the idea that cameras equipped with smaller sensors, especially APS-C, were inferior to the larger, 35mm “full frame” sensors that began to show up in professional-level cameras a decade ago. Manufacturers, and their paid disciples, preyed upon consumer ignorance of basic mathematics and principles of light using these logical fallacies, scaring consumers into trading up to more camera than they need.

These falsehoods have been perpetuated by hundreds, if not thousands, of people all over. Some did it in pursuit of profit, while most others are simply regurgitating fallacies they’ve learned over time. The worst of them all, as judged by these myths and their associated explanations, are people like these two YouTubers, who are either unable to grasp the basic fundamentals of mathematics and physics themselves or are knowingly trying to mislead. They like to remind you that their information is shared freely, unlike other YouTubers who either charge for similar information or are paid discreetly to perpetuate it, as if their word were proven both valuable and trustworthy by comparison to the underhanded dealings of others. It’s a reputation built on “whataboutism,” and it clearly demonstrates that, even when free, you always get what you pay for.

If you actually made it this far and are unsure of what, if anything, you should take away from this post, it’s this:

First: A camera sensor’s pixel density is what affects image quality; sensor size alone plays zero role in it. The sensor captures whatever image is projected by the lens, whether that projection is larger, smaller or ideally sized for the sensor used.

Second: Reputation is everything because free information is only worth what you paid for it. Reputation is the only reason why these myths persist and is proof that someone can have a positive reputation with zero credibility. Seek credibility instead of popularity to find worthwhile sources to learn from.

Remember, nothing is ever free. You can read more about it in my latest e-book, available on the Amazon storefront. Just kidding.

