FaceTime Attention Correction essentially fakes eye contact: even when you're looking at the screen instead of the camera, the person on the other end sees you looking straight at them. Apple pulls this off by leveraging ARKit and computational image manipulation.
According to Rundle and Sigmon, FaceTime Attention Correction currently works on the iPhone XS and XS Max (and possibly the iPhone XR) running iOS 13 beta 3, while the iPhone X doesn't seem to support it. We can't say for sure how much processing power the effect requires, but given that the iPhone X also lacks real-time previews for effects like HDR in the Camera app, FaceTime Attention Correction may be reserved for newer models.
Dave Schukin explains that eye contact correction uses ARKit to "grab a depth map of your face" and then employs computational imaging to readjust your eyes in real time. It sounds complex, but you can think of it as a Snapchat filter, just a much more meaningful application of the technology.
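Apple hasn't published how the effect is implemented, but ARKit does expose the raw ingredients Schukin describes: a per-frame 3D face mesh and the direction the eyes are pointing. A minimal sketch of reading that data (the `GazeTracker` class name and the final correction step are illustrative assumptions, not Apple's code):

```swift
import ARKit

// Illustrative sketch only: track the user's face and read the
// gaze-related data ARKit exposes. Apple's actual FaceTime correction
// is not public; this just shows where the inputs could come from.
class GazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (iPhone X and later)
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // The per-frame face mesh -- the "depth map of your face"
            let vertices = faceAnchor.geometry.vertices
            // Estimated point the eyes are looking at, in face-anchor space
            let gaze = faceAnchor.lookAtPoint
            // A hypothetical correction pass would use these to warp the
            // eye regions of the video frame so the gaze appears to be
            // aimed at the camera rather than the screen.
            _ = (vertices.count, gaze)
        }
    }
}
```

The warp itself, adjusting just the eye regions each frame, is the "computational imaging" half of the trick, and is the part Apple has kept to itself.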