12-31-2015, 02:32 PM
As there are some doubts in this thread about AF reliability, as well as about PDAF (phase-detection AF) vs. CDAF (contrast-detection AF), I thought it best to open a new thread, run some tests and find out what's going on in the AF business.
First, the concerns are reasonable - PDAF is a bit of spray and pray, sometimes a happy hit and sometimes a crappy sh.. . But that's only really an issue with fast lenses or long focal lengths at short distances. Most lenses at f/2.8 or slower and under 100 mm simply have enough depth of field to make the shot look good.
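To put a rough number on that depth-of-field argument, here is a minimal sketch using the standard hyperfocal formulas, assuming a full-frame circle of confusion of 0.030 mm (my assumption, nothing from FoCal). At 3 m, an 85 mm lens has roughly 20 cm of depth of field at f/2.8 but only about 10 cm at f/1.4, which is why slower lenses hide small PDAF errors so well.

```python
# Minimal depth-of-field sketch (standard hyperfocal-distance formulas,
# assumed full-frame circle of confusion of 0.030 mm).
def dof_mm(focal_mm, f_number, distance_mm, coc_mm=0.030):
    """Return (near limit, far limit, total depth of field) in mm."""
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm      # hyperfocal distance
    near = distance_mm * (h - focal_mm) / (h + distance_mm - 2 * focal_mm)
    far = (distance_mm * (h - focal_mm) / (h - distance_mm)
           if distance_mm < h else float("inf"))
    return near, far, far - near

# 85 mm focused at 3 m: ~20 cm of DoF at f/2.8, only ~10 cm at f/1.4.
print(dof_mm(85, 2.8, 3000))
print(dof_mm(85, 1.4, 3000))
```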
There was also a debate in this thread about whether Sigma lenses focus less reliably than genuine Nikon lenses (I can't speak for Canon; others might run a test there, too).
The setup was a D750 on a sturdy tripod pointing at a target I printed out from Reikan's website. For the lenses between 35 and 105 mm the distance was 3 m; the 300 mm lens was, at 4 m, actually too close (but I only wanted to check focus reliability, not AF adjustment); and the 20 and 24 mm lenses were used at 1.7 m, just to make the target easy to spot. All lenses had been AF fine-tuned beforehand. Disclaimer: This is not a test of general quality, Nikon vs. Sigma - these are only my lenses. To draw a wider conclusion I would need a lot more lenses, and since I'm convinced that PDAF will never be as precise as CDAF in Live View, I leave that statistically relevant test to others.
The software automatically took 10 shots and plotted a graph of the different sharpness results; at the end I got a single number for those 10 shots. The next ten shots would give another number, just saying... It's not about probability; it's more about how often the shots were very close to each other and, if not, how large the difference was. I don't know the exact formula behind this. In the end I had 14 PDFs of 18 pages each for my lenses, and that single number is only somewhat abstract, a snapshot of a moment. Also, this test is only for static subjects. Focus reliability with AF-C (continuous focusing until shutter release) is impossible to check, or at least very difficult to compare.
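Since I don't know FoCal's formula, here is just a rough sketch of how such a consistency percentage could be derived from ten per-shot sharpness values - purely my guess at the idea, not Reikan's actual method, and the sample values are made up:

```python
# Rough sketch of one possible "consistency" score from ten sharpness
# values: average sharpness relative to the best shot of the run.
# NOT FoCal's actual formula; the numbers below are made up.
from statistics import mean

def consistency_percent(sharpness_values):
    best = max(sharpness_values)
    return 100 * mean(v / best for v in sharpness_values)

run = [812, 805, 799, 810, 808, 790, 811, 806, 802, 809]
print(f"{consistency_percent(run):.1f} %")
```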
In short, here are my results:
Sigma
20/1.4 98.3 %
24/1.4 99.7 %
35/1.4* 96.8 %
50/1.4 98.8 %
(another ten shots came up with 99.3 %)
24-105/4 @ 90 mm 97 %
24-105/4 @ 105 mm 99.6 %
* I suspect the 35/1.4 needs a service; there's a kind of awkward sound in the lens. If it has to be repaired, I'll repeat the test.
Nikon
14-24/2.8G @ 24 mm 99.6 %
85/1.4G 96.3 %
300/4 G 98.6 %
70-200/4 @ 105 mm 99.3 %
105/2.8 Micro 99.5 %
Here are three charts from those test runs:
70-200/4
85/1.4G
24/1.4 Art
Now, I don't see a real big advantage for Nikon over Sigma. The 35 mm appears to be the outlier and more of a candidate for guesswork, but as I wrote, I need to have it checked.
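Just as a quick sanity check on that impression, averaging the consistency numbers listed above per brand (my own back-of-the-envelope arithmetic, nothing from FoCal) lands at roughly the same value for both:

```python
# Quick averaging of the consistency figures listed above
# (back-of-the-envelope check only, not anything FoCal reports).
from statistics import mean

sigma = [98.3, 99.7, 96.8, 98.8, 97.0, 99.6]
nikon = [99.6, 96.3, 98.6, 99.3, 99.5]

print(f"Sigma mean: {mean(sigma):.1f} %")   # ~98.4 %
print(f"Nikon mean: {mean(nikon):.1f} %")   # ~98.7 %
```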
I also tried a run with Fuji X-E2 files - but these JPEGs are not supported by FoCal.