Thanks to stay-home orders, work-from-home initiatives, and social distancing, we’re FaceTiming more than ever before. As of March 2020, nearly 50 percent of adults in the United States were using FaceTime as their video conferencing app of choice to talk to colleagues, family, and friends.
FaceTiming, like any other method of communication, is something you can be good or bad at. One key way to improve your FaceTime calls is by maintaining better eye contact. And luckily, there’s some nifty, next-gen Apple tech that helps you do just that.
Why is eye contact so important?
Good eye contact can improve relationships, build trust, help you bond with others, and generally make sure you and the other participants in a FaceTime call get the most out of the experience.
Maintaining eye contact with the people you’re talking to shows that you’re actively listening and paying attention, and it can even help everyone retain more of what’s said during a video call.
A University of Wolverhampton study found that the better the eye contact, the more people remembered of a call’s content: participants recalled significantly more of what was said when the speaker made direct eye contact just 30 percent of the time.
Am I not making good eye contact on FaceTime calls now?
Probably not, through no fault of your own. When you try to hold eye contact with whoever you’re FaceTiming, you naturally look at your iPhone’s screen to see that person’s face, not at the camera above it. From the other person’s perspective, you appear to be looking slightly below eye level, as if you were avoiding eye contact.
Apple’s new “Eye Contact” setting in iOS 14 aims to solve that issue.
What is Apple’s Eye Contact functionality in FaceTime?
FaceTime’s “Eye Contact” feature is built on augmented reality software that Apple beta-tested in iOS 13 as “FaceTime Attention Correction” and officially released as “Eye Contact” with the launch of iOS 14.
Apple stated in the release notes for iOS 14 that this new functionality “helps make video calling more natural by helping you establish eye contact even when you’re looking at the screen instead of the camera.”
How does Apple’s Eye Contact FaceTime tool work?
The Eye Contact tool uses Apple’s ARKit software framework. ARKit combines device motion tracking, camera scene capture, and advanced scene processing to make building augmented reality experiences easier. In FaceTime, Eye Contact uses this real-time AR processing to subtly adjust the image of your eyes so that they appear to be looking directly into the front-facing camera of your iOS device, even while you’re actually looking at the screen. It works with multiple faces, too, if more than one person is in front of the camera.
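For developers curious about the framework behind features like this, here is a minimal sketch of the kind of per-eye gaze data ARKit’s face tracking exposes on TrueDepth-equipped iPhones. Apple has not published how Eye Contact itself is implemented, so this is only an illustration of ARKit’s public API, not Apple’s actual correction code; the `GazeTracker` class name is made up for the example.

```swift
import ARKit

// Hypothetical illustration: reads ARKit's per-frame gaze estimates.
// This is NOT Apple's Eye Contact implementation, which is private.
class GazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth front camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // lookAtPoint estimates where the user's gaze converges,
            // in the face anchor's coordinate space; the eye transforms
            // give the position and orientation of each eye.
            let gaze = faceAnchor.lookAtPoint
            let leftEye = faceAnchor.leftEyeTransform
            let rightEye = faceAnchor.rightEyeTransform
            print(gaze, leftEye, rightEye)
        }
    }
}
```

With gaze data like this, software can tell the difference between a user looking at the screen and looking at the camera, which is the information a feature like Eye Contact needs before it can adjust the rendered eyes.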
Do I have the Eye Contact feature on my iPhone?
Only the most recent models of iPhone can support this clever Apple technology. You will find it on iPhone 11, iPhone 11 Pro, iPhone 11 Pro Max, iPhone XS, iPhone XS Max, and iPhone XR. The phone also needs to be running iOS 14.
How can I enable the Eye Contact feature on my iPhone?
Apple’s FaceTime Eye Contact functionality is actually enabled by default, so if you have one of the devices listed above and you’re running iOS 14, chances are you’re already using it.
It’s simple to check that Eye Contact is enabled, or to turn it off if you’re not a fan of your real eyes being replaced by AR versions. Open your iPhone’s Settings app, scroll down to FaceTime, tap it, and find the Eye Contact setting. If the toggle is green, the feature is on; tap it to switch it on or off.