Sony's 'Real-time tracking' is a big leap forward for autofocus

One of the biggest frustrations when taking pictures is discovering that your photos are out of focus. Over the past few years, camera autofocus systems from every manufacturer have become much more sophisticated, but they've also become more complex. If you want to utilize them to their full potential, you're often required to change settings for different scenarios.

The autofocus system introduced in Sony's a6400, and brought to the a9 via a firmware update, aims to change that, making autofocus simple for everyone from casual users to pro photographers. And while all manufacturers are aiming to make autofocus more intelligent and easier to use, our first impressions are that, in practice, Sony's new 'Real-time tracking' AF system really does remove much of the complexity and headache of autofocus, so that you can focus on the action, the moment, and your composition. Spoiler: if you'd just like to jump to our real-world demonstration video below that shows just how versatile this system can be, click here.

When I initiated focus on this skater, he was far away and tiny in the frame, so the a9 used general subject tracking to lock on to him at first. It then tracked him fully through his run, switching automatically to Face Detect as he approached. This seamless tracking, combined with a 20fps burst, allowed me to focus on my composition and get the lighting just right, without having to constrain myself by keeping an AF point over his face. For fast-paced erratic motion, good subject tracking can make or break your shot.

So what is ‘Real-time tracking’? Simply called ‘Tracking’ in the menus, it’s Sony’s new subject tracking mode. Subject tracking allows you to indicate to your camera what your subject is, and then trust the camera to track it. Simply place your AF point over the subject, half-press the shutter to focus, and the camera will keep track of it no matter where it moves in the frame, automatically shifting the AF points as necessary. The best implementation we'd seen until recently was Nikon's 3D Tracking on its DSLRs. Sony's new system takes some giant leaps forward, replacing the 'Lock-on AF' mode that was often unreliable, sometimes jumping to unrelated subjects far away, or tracking an entire human body and missing focus on the face and eyes. The new system is rock-solid, meaning you can just trust it to track and focus your subject while you concentrate on composing your photos.

You can trust it to track and focus your subject while you concentrate on composing your photos

What makes the new system better? Real-time tracking now uses additional information to track your subject - so much information, in fact, that it feels as if the autofocus system really understands who or what your subject is, making it arguably the 'stickiest' system we've seen to date.

Subject tracking isn't just for action; I used it even for this shot. Good subject tracking, like Sony's 'Real-time tracking', keeps track of your subject for you, freeing you up to try many different poses and framings quickly. Most of these 20 shots were captured in under 19 seconds, without ever letting off the AF-ON button. The camera never lost our model, not even when her face went behind highly reflective glass. The seamless transitioning between Eye AF and general subject tracking is what makes the AF system so robust. Not having to think about focus allows you to work faster and try more poses and compositions, so you arrive at a shot you're happy with sooner. Click here or on any thumbnail above to launch a gallery and scroll through all 20 images.

Pattern recognition is now used to identify your subject, while color, brightness, and distance information are now used more intelligently for tracking so that, for example, the camera won’t jump from a near subject to a very far one. What's most clever though is the use of machine-learning trained face and eye detection to help the camera truly understand a human subject.
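Sony hasn't published its algorithm, but the idea of combining several cues while refusing implausible depth jumps can be sketched in a few lines of Python. Everything here is invented for illustration: the function names, the cue weights, and the `max_distance_jump` threshold are assumptions, not anything Sony has disclosed.

```python
# Hypothetical sketch of cue-based subject tracking. The weights and
# threshold below are illustrative only; Sony's real algorithm is not public.

def score_candidate(candidate, subject):
    """Lower score = closer match to the currently tracked subject."""
    score = 0.0
    score += 0.4 * abs(candidate["pattern"] - subject["pattern"])
    score += 0.3 * abs(candidate["color"] - subject["color"])
    score += 0.2 * abs(candidate["brightness"] - subject["brightness"])
    score += 0.1 * abs(candidate["distance"] - subject["distance"])
    return score

def pick_next_target(candidates, subject, max_distance_jump=2.0):
    # Reject candidates whose distance differs wildly from the current
    # subject, so the tracker won't jump from a near subject to a far one.
    plausible = [c for c in candidates
                 if abs(c["distance"] - subject["distance"]) <= max_distance_jump]
    if not plausible:
        return None  # keep the last known subject rather than jump away
    return min(plausible, key=lambda c: score_candidate(c, subject))
```

The key design point is the hard gate on distance before any scoring happens: a far-away object can match the subject's color and brightness perfectly and still be excluded, which is exactly the "won't jump from a near subject to a very far one" behavior described above.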

What do we mean when we say ‘machine-learning'? More and more camera - and smartphone - manufacturers are using machine learning to improve everything from image quality to autofocus. Here, Sony has essentially trained a model to detect human subjects, faces, and eyes by feeding it hundreds, thousands, perhaps millions of images of humans. These images of faces and eyes of different people, kids, adults, even animals, in different positions have been previously tagged (presumably with human input) to identify the eyes and faces – this allows Sony's AF system to 'learn' and build up a model for detecting human and animal eyes in a very robust manner.
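To make the recipe concrete, here is a toy version of that supervised-learning loop: fit a model to feature vectors that humans have already tagged, then use it to classify new ones. The trivial nearest-centroid "model" below stands in for Sony's real network, whose architecture and training data are not public; the feature vectors and labels are made up for illustration.

```python
# Toy supervised learning: train on human-tagged examples, then predict.
# A nearest-centroid classifier is a stand-in for a real neural network.
from statistics import mean

def train(examples):
    """examples: list of (feature_vector, label) pairs, e.g. labels
    'eye' / 'not_eye' assigned by human taggers."""
    centroids = {}
    for label in {lbl for _, lbl in examples}:
        vecs = [v for v, lbl in examples if lbl == label]
        # Average each feature dimension across all examples of this label
        centroids[label] = [mean(dim) for dim in zip(*vecs)]
    return centroids

def predict(centroids, vec):
    """Assign the label whose centroid is nearest to this feature vector."""
    def sq_dist(c):
        return sum((a - b) ** 2 for a, b in zip(vec, c))
    return min(centroids, key=lambda lbl: sq_dist(centroids[lbl]))
```

The point isn't the classifier itself, it's the workflow: the "learning" is entirely in the tagged examples, which is why feeding the system images of eyes in many positions, ages, and species makes detection robust to cases the programmers never explicitly coded for.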

Machine learning... allows Sony's AF system to detect human and animal eyes in a very robust manner

This model is then used in real-time by the camera's AF system to detect eyes and understand your subject in the camera’s new ‘real-time tracking’ mode. While companies like Olympus and Panasonic are using similar machine-learning approaches to detect bodies, trains, motorcyclists and more, Sony's system is the most versatile in our initial testing.

Real-time tracking's ability to seamlessly transition from Eye AF to general subject tracking means that even when there was an eye to track up until this perfect candid moment, your subject will still remain in focus when the eye disappears - so you don't miss short-lived moments such as this one. Note: this image is illustrative and was not shot using Sony's 'Tracking' mode.

What does all of this mean for the photographer? Most importantly, it means you have an autofocus system that works reliably in almost any situation. Frame your composition, place your AF point over your subject, half-press the shutter, and Real-time tracking will collect pattern, color, brightness, distance, face, and eye information about your subject, using all of it to keep track of your subject in real time. This means you can focus on the composition and the moment. There is no longer a need to focus (pun intended) on keeping your AF point over your subject, which for years has constrained composition and made it difficult to maintain focus on erratic subjects.

There is no need to focus on keeping your AF point over your subject, which for years has constrained composition and made it difficult to focus on erratic subjects

The best part of this system is that it just works, seamlessly transitioning between Eye AF and Face Detect and ‘general’ subject tracking. If you’re tracking a human, the camera will always prioritize the eye. If it can’t find the eye, it’ll prioritize its face. Even if your subject turns away so that you can't see their face, or is momentarily occluded, real-time tracking will continue to track your subject, instantly switching back to the face or eye when they're once again visible. This means your subject is almost always already focused, ready for you to snap the exact moment you wish to capture.
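The priority cascade described above (eye, then face, then general subject, never dropping the subject) is easy to express as a fallback chain. This is a hedged sketch, not Sony's actual code; the dictionary shape and the `last_known` behavior are assumptions made for illustration.

```python
# Hypothetical eye -> face -> subject fallback, as described in the article.

def choose_focus_target(detections):
    """detections: what the AF system currently sees for the tracked
    subject, e.g. {"eye": (x, y), "face": (x, y), "subject": (x, y)}.
    A missing or None entry means that feature is not currently visible."""
    for feature in ("eye", "face", "subject"):
        if detections.get(feature) is not None:
            return feature, detections[feature]
    # Subject momentarily occluded: hold the last known position rather
    # than jumping to a different person in the frame.
    return "last_known", None
```

Because the fallback never returns "no subject", the tracker stays primed: the instant an eye reappears in `detections`, the very next call snaps back to eye priority with no re-acquisition step.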

The tracking mode lets you specify a subject and it'll prioritize their eye, switching to face detection if it loses the eye and treating them as a generic subject to track if they, for instance, turn their head away from the camera. Click on the images and follow the entire sequence to see how the camera focuses on my subject no matter where she walks to in the frame.

One of the best things about this behavior is how it handles scenes with multiple people, a common occurrence at weddings, events, or even in your household. Although Eye AF was incredibly sticky and tracked the eyes of the subject you initiated AF upon, sometimes it would wander to another subject, particularly if your subject looked away from the camera long enough (as toddlers often do). Real-time tracking simply transitions from Eye AF to general subject tracking if the subject looks away, meaning as soon as they look back, the camera's ready to focus on the eye and take the shot with minimal lag or fuss. The camera won't jump to another person simply because your subject looked away; instead, it'll stick to your subject for as long as you tell it to by keeping the shutter button half-depressed.

Performance-wise it's the stickiest tracking we've ever seen...

And performance-wise it's the stickiest tracking we've ever seen, doggedly tracking your subject even if it looks different to the camera as it moves, or as you change your position and composition. Have a look at our real-world testing with an erratic toddler, with multiple people in the scene, below. This is HDMI output from an a6400 with the 24mm F1.4 GM lens, and you can see that focus is achieved and maintained throughout most of the video, indicated by the filled-in green circle at the bottom left of the frame.

Real-time tracking isn't only useful for human subjects. Rather, it simply prioritizes whatever subject you place under the autofocus point, be it people or pets, food, a distant mountain, or a nearby flower. It's that versatile.

In a nutshell, this means that you rarely have to worry about changing autofocus modes on your camera, no matter what sort of photography you’re doing. What’s really exciting is that we’ll surely see this system implemented, and evolved, in future cameras. And while nearly all manufacturers are working toward this sort of simple subject tracking, and incorporating some elements of machine learning, our initial testing suggests Sony’s new system means you don’t have to think about how it works; you can just trust it to stick to your subject better than any system we’ve tested to date.

Addendum: do I need a dedicated Eye AF button anymore?

There’s actually not much need to assign a custom button to Eye AF anymore, since real-time tracking already uses Eye AF on your intended subject. In fact, using real-time tracking is more reliable, since if your subject looks away, it won’t jump to another face in the scene as Eye AF tends to do. If you’ve ever tried to photograph a kids' birthday party or a wedding, you know how frustrating it can be when Eye AF jumps off to someone other than your intended subject just because he or she looked away for long enough. Real-time tracking ensures the camera stays locked on your subject for as long as your shutter button remains half-depressed, so your subject is already in focus when he or she looks back at the camera or makes that perfect expression. This allows you to nail that decisive, candid moment.

Comments

Some folks will say "your Canon cameras still work" blah blah all they want, but... if even half of this capability was available on Canon bodies, we would be dancing in the streets. (Well, and better sensors... sheesh)

And of course, Canon's latest version of this tech, on the RP, just happens to be on a body with a garbage sensor.

Because it's Canon, of course. AND no new bodies this year, how nice. Just bury the 7D series already; the 80D is already dead, Rebels are out of here, the 6D is finished.

I'll always love my Canons, but I'm adding either one of these or a Nikon to my kit. Not waiting on these slow, lazy, cheap bums any more.

I was on the fence between the A7III and the A9 - this firmware update made it easier for me to go for the A9. I'm impressed that the A9's lack of contrast-detection focus points is being rectified. Well done Sony for making such a big upgrade to a two-year-old camera!

I propose a series of features for the new compact A6xxx mirrorless cameras:

- Multi-articulated rear screen like the FUJIFILM X-T100.
- Headphone audio over Bluetooth.
- Close the shutter when the lens is removed (lock-button).
- Improved FHD 1080p quality.
- Compact, non-retractable lenses like the power-zoom Lumix 45-175.

Will this work on adapted lenses as well? I have some EF-S lenses I'd like to keep even if I bought an A6400 - both because they're good lenses and because it would bring the cost of changing brands down.

the 3rd-gen ff sonys can utilize all of the focus modes with adapted glass, depending on the adapter that's used... mc-11 with compatible sigma lenses is the best, because it looks like native e-mount glass to the camera body.

it's reasonable to assume that the a6400 will inherit that level of functionality, but you may want to rent first, with a couple of adapters, and test it with your specific lenses.

Not that this is a fantastic use of AI in a camera, and not that this would be a deal breaker, but I was curious how far have they gone with AI. Great family, and love the work you do to bring the latest info to us, to make a better decision.

Can you imagine? A few years ago, when I started using a DSLR, I heard that Nikon had the best professional tracking, and this is what I expected, only to be disappointed when I actually tried to use one. I guess I was thinking way ahead of the times.

@Rishi Sanyal: You say real-time tracking uses other information besides face and eye information to track without losing the subject. My experience with the A9, with the latest update and face priority for AF-C, is that if a face is partly and temporarily obscured by, say, the peak of a person's hat, the focus will immediately jump to another nearby face that is not obscured. This is not the same as shown in your video. What were all your camera settings? I would like to learn from you. But thanks for all your hard work; I have learnt a lot from your articles.

the a9 has a stacked sensor, which allows for super-fast readout of the sensor data... sony has claimed that the a9 can make up to 60 af/ae measurements a second, and you'll be using the electronic shutter for just about everything that doesn't require flash.

people have speculated that the fastest a7 series cameras can take up to 20 af/ae measurements a second, but I don't recall ever seeing documentation for that... just know that stacked sensors are far superior for af.

I get the idea of the new tracking feature, and it sounds promising. Not sure, though, if or when it will make me switch from the A99II to E-mount. But from my experience shooting sports events, I'd like to know whether the new system was also tested in situations where the camera is moved to follow the subject across a longer path - a pretty common shooting situation in trail running, mountain bike trail riding, and surfing, for instance. Does the new tracking help keep the focus on the subject during the sweep? If it hasn't been tested yet, I imagine more photographers would like an assessment of this. Thanks.

I've shot several pro surfing sessions using the a9, it already rocks for that, due in part to the af improvements from the last 4 firmware updates that it already got.

for that type of shooting, some aspects of these two new updates may not be fully realized, because when you are shooting loose framing of full-body targets, eye-af isn't usable all of the time... the eyes are simply too small in the frame.

people don't realize how much sony already improved the a9 af, with those 4 firmware updates... these latest updates will probably be just icing on the cake, for applications like shooting surfing... however, for weddings, portrait shooting, etc, it could have a much bigger impact.

I just bought a Sony A7iii since I shoot a lot of sports stuff. I figured Sony would be adding functionality via firmware like some of the other companies. However, I really don't appreciate games like this to differentiate particular models, some cheaper and some not. If you want to put it on your top model, fine, but don't stick it on a low level model just to goose sales. A new series of cameras like the 7iii should include firmware features like this. If this were a recently introduced Samsung phone, you'd get the improved functionality. I may return it and get something else non-Sony.

I don't think your assessment of goosing sales is accurate here. The a6400 is a new model, and it's the first to have the new focusing mode.

Many companies would only put a high-performance AF technology in their more expensive cameras, to boost THOSE sales. I can't imagine a lot of the camera companies would sell a $900 camera with focus performance they'd put in a $3000+ camera.

It's in the a6400 because it's the first camera to come out since they made this AF technology, not because it's some plan to artificially generate interest.

Only Real-time Eye AF is coming to the a7 III and a7R III. Meaning the camera will automatically switch between Lock-on and Face Detect and Eye AF. But the 'general' subject tracking mode will be 'Lock-on AF', not the new Real-time tracking mode, so it won't use the new pattern detection and updated color/brightness/distance information algorithms to track your subject as robustly as what you see here.

I think features such as this are good, however... I've been into photography for over 50 years. I've shot weddings, events, wildlife, indoor & outdoor concerts, portraits, street, and you know what? I can't remember ever missing an important shot. Sure, there were misses, but seriously... What am I doing wrong?

You know, with all the technical advances in photography equipment, notwithstanding current features, websites such as this one and most others are still saturated with mediocre work, some so poor as to be painful to look at. These features are good, but they do not make a photo and probably never will. For the 98%, they sell dreams and exude a captivating sense of photographic excellence, but no matter the gear, the results remain the same.

I'm no pro. Probably never will be. But I'm getting more shots in focus and having more fun with my A7iii than I ever did with Nikon. And I haven't even begun to use the A7iii to its full potential. A person takes the photo, but the camera makes the image. And a better camera helps in that respect.

So go back to using a manual camera! Just leave the newer technology to those of us who appreciate the accuracy, precision, freedom, and effectiveness it provides us. No one is forcing you to buy newer cameras.

I first took up photography in 1989. So I've been shooting for 30 years. And I've gotten plenty of "important shots", but there are also plenty that I've missed. And there are plenty of shots I would have never even attempted with older, more primitive camera technology. Or you do workarounds like focus-lock-recompose, etc. These days, newer AF technology offers much more freedom, efficiency, reliability, and seamlessness. I have no desire to go back to older cameras, other than for the sake of nostalgia.

What are you doing wrong? Nothing, really. "Wrong" is a strong word. But I would surely classify your "I'm so awesome I don't need autofocus at all!" mentality as annoying to those who actually like technological progress. But that's okay. Some other old people such as yourself don't even like cell phones, because "Back in my day, rotary phones got the job done! You young'uns with your SnapTube make me sick!" By the way, I'm a pro. (A younger one...one who is more of a Lightroom pro than a Darkroom pro.) And I actually like technological progress, thanks. Only the first part of your comment's first sentence is truly important, namely, "I think features such as this are good..." Maybe leave it at that and stop the "I'm so awesome" antics that are so stale on sites like these.

If you're going to use images to make a point, then don't use images that aren't related. Even though you admit it, you're still lying when using the image of the couple smacking each other in the face to illustrate a point. And, folks wonder why the media is losing credibility...

"Real-time tracking's ability to seamlessly transition from Eye AF to general subject tracking means that even when there was an eye to track up until this perfect candid moment, your subject will still remain in focus when the eye disappears - so you don't miss short-lived moments such as this one. Note: this image is illustrative and was not shot using Sony's 'Tracking' mode."

I appreciate that your Pavlovian "Sony good" reaction is very hard to overcome, but had you read DustyBins's comment and link, you would have seen that they had images from the same months-old samples with very different conclusions under each. Yay Sony!!!!

James Stirling wrote: "That they had images from the same months old samples with very different conclusion under each. Yay Sony!!!!"

It's not the same picture. It's the same couple, but it's a different image.

Even if it was the same image it wouldn't matter because you're using it for illustrative purposes (in other words the picture only has to go with the words).

The picture of the couple sniffing each other's hands could be used to illustrate an article on weddings, brides' dresses, cultural traditions, love, event catering, family planning, pensions, mortgages - you name it.

It's the principle stock libraries operate under. One image can be used to sell a wide range of products/services/ideas. Yay stock libraries!

Get a grip, OP. We haven't had enough time with the a6400 to shoot the multitude of weddings I've shot in the past. I was really lucky in nailing that particular shot, responding fast enough to move the AF point and nail focus while the couple held the pose for a couple of seconds.

Today, with Sony's 'Real-time tracking' AF, I could have easily snapped the shot without thinking, at the exact moment it happened.

That's why the image is illustrative. If you really want, I could remove it and replace it with a block of text explaining the concept that no one would read, but how would that help our audience?

Your response actually made me laugh out loud. With almost 30 years’ experience in the PR industry, I understand how real journalism works better than most people. I’m not questioning the technology, but the integrity of a journalist who includes an ‘illustrative’ image that wasn’t even taken with the same camera. He devalues the fact-based information in the article by using something non-factual to make a point... that’s what advertisers do.

Nice way to treat your readers.... “Get a grip”!! So, your writing isn’t compelling enough for people to read? Are we not smart enough to understand a concept without images? Do you not understand that you diminish the advice you’re giving by proving that similar results can be achieved without the tracking technology? And, given your point about using flash, then you wouldn’t have to ask the couple to hold the pose... just have enough DOF and use flash to freeze the moment.

There are many photographers who still don't understand the point of subject tracking. Illustrations help. Lots of words, well, sometimes just confuse people. Video demo would've been even better.

Not sure why you're bringing flash into this. My point is that the subjects will be pre-focused so you can nail the decisive moment, whereas a different focus technique might cost you the shot (because it'd be slower). Flash freezing the moment has nothing to do with this. I could freeze the moment with a fast shutter speed - how is that related to this article?

Stop down for enough DOF? Sure, but if you're cool with that, then there's no need for you to even have any interest in this technology or article.

I suppose you haven't seen the blind test by Tony Northrup. Search for "Tony Northrup color science" on YouTube and see which manufacturer was preferred by the 1,500 photographers in blind tests. You will be surprised ;)

"Today, Sony has announced that they have overtaken Canon and held the number one overall position in the U.S. full-frame interchangeable lens camera market for the duration of the first half of 2018, in both dollars and units."

From Imaging Resource: "...With considerably less processing power and a slower sensor read speed, the A6400 struggled to offer the same experience as the A9 in practice, despite the expectations set by the public Sony marketing presentation. Don't get us wrong, it's still quite good for an entry-level camera (if you consider $1000 entry-level), but it's obvious there are certain hardware requirements necessary for truly excellent implementation that are lacking in the A6400. "

Sony colors, from the A7 III generation onward, are not a problem anymore. People coming from Canon or Nikon have to learn how to develop Sony Raw files in order to get what they want. It's not realistic to expect the same colors from a different manufacturer straight out of camera.

Yesterday I put down an order for 2 A7III's as I am finally making the jump from Canon. Do I understand it correctly that the A7III does not and will not have this feature? If that is the case, does it mean the A9 is suddenly a better choice for AF capabilities?

On contrast-detection points: the A7 III has 425 contrast-detection points, and the A9 only has 25, which I guess gives the A7III an edge in low light. But with Sony leaving A7III users behind on this crucial update, I'm really confused about which camera to go for. AF accuracy is the number one reason I'm switching from Canon, so I want the most capable Sony camera...

There are two things that make the A9 better at AF than the A7III: the stacked sensor and the BIONZ X processor. You need both to get the A9's AF performance - the sensor provides more data per second, and the better processor is needed to process it. If you want the absolute best AF, you have to go with the A9.

@DeathArrow: To get A9-level performance, I think you need both. As I understand it, the A6400 has the same BIONZ X processor as the A9 but doesn't have a stacked sensor. So it seems you need the BIONZ X processor to get any real-time tracking, and the stacked sensor to get the best real-time tracking.

The rumored A7000 is being called a "mini-A9" so I'm guessing it will have an APS-C stacked sensor and the BIONZ X processor.

At minimum, you need the same BIONZ X processor as the A9 to do the calculations. The A9's stacked sensor feeds the processor with more data, which brings frame rate and AF accuracy up to the A9's level. The A7iii and A6400 will be no match for the raw AF power the A9 will have with the upcoming firmware v5 and v6. And there is probably more potential in the A9, given the combination of tech inside, even once the lower-level bodies max out on performance. Do you really think any electronics company would simply let its low end kill its top end? The A7iii can match the A9 only in relative terms, within what most people use. The A9 will outperform all of Sony's lower-end bodies at what it was designed for.

As a sports shooter, I'm very interested in seeing what Canon does with the 1DXMKIII. Sony has always been innovative. A platform supported is what professionals need to put food on the table, kids through college and a cabin in the woods. From a fashion and beauty perspective, I love the 645z and it's sealed for dust and moisture and the AWB is spot on. Right now, today... I wish there was a 1DXMKIII that had all the features of the MKII and 645z combined with cross-points across the full OVF. If like the SONY we could get 20 fps on a REAL 50mp MF sensor, I would buy that camera and all the glass to go along with it. It must be built like a tank so when a linebacker or kitesurfer hits you, even if you die.. Your camera will survive and your big whites won't snap off. #KissMyKite

Can that central area differentiate between a whole object, a face, and an eye? And as others have mentioned, keeping a subject in the center all the time is a big negative. For example, that's a big issue if you are shooting people, unless you always want the person's face smack dab in the middle of the image all the time.

Other companies do the same, except that they make you wait 5 years (like Canon and Nikon do) between new models to get any new features! Sony, on the other hand, moves very quickly with new product iterations, so buyers can access these features much sooner if they so choose.

"That said, once you try and take control and specify your subject, you may be disappointed in the camera's ability to stick to it reliably: both Eye AF and the Lock-on AF area modes can jump to other subjects. Initial subject recognition in Lock-on AF also isn't as quick as the best DSLRs, or even the a6500's own 'Wide' mode where the camera automatically chooses a subject."

That text included a link (which didn't copy over) to a video showing the issues.

**Finally**. This is what every soccer mom wants: simple but reliable AF that is capable of capturing a child on the move. It is SAD that it took engineers SO LONG to make this a reality. Most consumers JUST WANT a reliable AF camera that can be used to take pictures of their children on the move. Sadly, most walk away disappointed by their DSLRs and mirrorless cameras alike.

Engineers were busy building AI learning on ARM architecture, waiting for the silicon to get small enough to fit within the thermal headroom and still give you meaningful battery life. In the meantime, engineers also had to wait for manufacturers and marketers to push mirrorless cameras to the masses. Not the engineers' or programmers' fault. Lolz.

Simple to use manifestly does not mean simple to develop and engineer. I find the condescending attitude to the engineers who have developed what is, by any standards, a massive breakthrough in camera intelligence difficult to understand.

It's a truism of artificial intelligence that what people find easy to do, computers do not, and vice versa. What a person does naturally - following an individual and understanding the context - is an extremely difficult thing to design into a camera. It's akin to some of the things required for a self-driving car, but squeezed into a tiny camera body and with only one sensor. In time, we can expect much more to be done; this is just a step, but a very large one.

The discrepancy in experiences with the A6400 is quite puzzling. Everyone seems to agree that this tech works superbly on the A9 but Matt Granger struggled to get it to focus past a twig and Imaging Resource's assessment of the A6400's tracking reads like a dpreview review of Pentax.

IR is still trying to determine its hit rate, and admitted that what was troubling was the speed with which the AF point followed the subject during bursts. We experienced the same when shooting fast bursts, but many of our shots are still in focus.

As I mentioned earlier, we'll publish full bursts soon.

I will say that the a9 tracked even better than the video you see here, if that were possible - e.g. when a subject passed in front / near your subject, it rarely switched subjects, while the a6400 sometimes did.

This is due to the a9's faster update rate of 60Hz (60 times per second). This article is more about the potential of this new AF algorithm; its performance will vary from body to body, as expected of any brand. Nikon's 3D Tracking, for example, is very different on a D850 than it is on a D5. It's complex, and depends on how fast the camera can read out the sensor to update its AF calculations. Currently the a9 is the fastest.

AF-A cannot be used for Real Time Tracking because the AF system needs to be continually evaluating the scene in order to be tracking in real time. AF-S obviously can't be used because the camera stops tracking when it locks onto the subject by definition.

AF-A cannot be used because it is simply a system where the camera switches between AF-C and AF-S depending on subject movement. Whenever the camera determines that the subject is still, AF-A acts as AF-S.

A bird in tree branches isn't a situation where you'd usually use the real-time tracking modes - the earlier cameras in this series already did a fine job of focusing through heavy clutter to a small, partially obscured subject. The A6000, A6300, and A6500 can all do that quite easily using AF-S focus mode and the smallest Flexible Spot AF point. I suppose once the animal-eye-AF mode debuts, you may be able to have it detect the bird and follow it even as it hops along behind branches, assuming of course the bird is facing you so the eye or eyes are visible... though I suspect a good number of birders would still stick to the flexible spot and AF-S method, which already works with high accuracy.

Impressive. Glad to see that Sony is not resting on its mirrorless laurels and is continuing to innovate. Ultimately, other manufacturers will answer, which is good news for everyone. Keep up the good work, Sony!

That would be programming, NOT artificial intelligence!! Artificial intelligence would be telling the camera to focus on the bird WITHOUT having programmed how to focus on the bird or what a bird is! 🤦‍♂️🙄 It's learning like a child learns. You don't program children; they learn.

Never been a big fan of wide-angle astrophotography in general; close-ups of celestial objects are more interesting. Also, to shoot really good wide-angle astro-landscapes, you need very dark skies lacking in water vapour, since it diffuses light. Desert shooters are in the best position to shoot astro-landscape shots.

The A6000, A6300, and A6500 all outperformed Sony's full-frame mirrorless cameras at the time they were launched. It's only the A9 that's leaps and bounds ahead; even the A7x III series is only slightly ahead. The A6400 fixed that quickly enough, though. ;)

You are clueless and not worth any time responding to. Sony has a fantastic 70-200, a 100-400 which is great for daytime sports, and a 400mm F2.8 which is expensive but also fantastic. Go take your hate elsewhere... wait until EF and F mounts are abandoned by Canon and Nikon. It WILL happen. It's only a matter of time. It may be 10 years from now, but that's just because they are that far behind on next-gen technology...

During handheld video shooting using the EVF, when the touchscreen is not available, the joystick allows you to deselect what you have selected (by pressing the joystick) and select something else (by relocating the focus point). The use case is small, but still present.

Dedecos - great question. I no longer ever use the joystick. Even for static subjects, I just place my center point over the subject, half-press, and recompose.

It's far faster than using the joystick, with an AF system as reliable as this one.

Now, there are occasions where it's useful - e.g. when you can't initiate AF on the subject until it enters the frame (a motorcyclist appearing from behind a hill and making a jump, for example). Then I may wish to place the AF point at least close to where I expect him/her to enter the frame, for the composition I have in mind.

So I wouldn't say you never need it, but it's less and less necessary once you start truly using and relying on subject tracking.

And other cameras' C-AF is not doing real-time tracking? It's just the speed of the AF processing, aided by certain algorithms. It's never truly 'real time' as such, because there is processing lag, even if it's only nanoseconds.

It's really real-time AF mode switching - from single to continuous AF, to continuous face AF, to continuous eye AF, etc. The camera automatically switches seamlessly between all these focus modes "in real time", without the need for human intervention. Just look how quickly the system switches from mode to mode, without skipping a beat.

As Horseshack says in an earlier post further down, Sony should have named Real-time Tracking something else, because it's confusing - cameras have always done 'real-time' focus/tracking. As T3 explains, it's basically a continuous, transitional AF system that keeps up with the subject. I thought the same as you, because the name is a bit off.

You can talk about lag if you want, and argue that it can't be real time if there are even nanoseconds of lag. But the fact is, based on the video here, the lag is the smallest compared to any other system.

And did you know it takes about 10 nanoseconds for the light from a subject 10 feet away to reach your camera? There is lag there too.
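That figure checks out, to a first approximation: light covers roughly one foot per nanosecond, so 10 feet takes on the order of 10 ns. A quick sanity check:

```python
# Back-of-the-envelope check of the light-travel-time claim above.
SPEED_OF_LIGHT = 299_792_458.0   # meters per second
FEET_TO_METERS = 0.3048

distance_m = 10 * FEET_TO_METERS              # 10 feet in meters
time_ns = distance_m / SPEED_OF_LIGHT * 1e9   # travel time in nanoseconds

print(f"{time_ns:.2f} ns")  # ~10.17 ns
```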

Fact is, today, this tracking is as real time as practically necessary...

