Teleoperation brings the advantages of remote control and manipulation to distant, harsh, or constrained environments. The system allows operators to send commands from a remote console, traditionally called a master device, to a robot, traditionally called a slave device, and synchronizes their movements. This allows the remote user to operate as if on-site, making teleoperational systems an ideal and often the only solution for a wide range of applications such as underwater exploration, space robotics, mobile robots, and telesurgery. The main technical challenge in realizing remote telesurgery (and, similarly, all remote teleoperation) is the latency introduced by the communication distance between the master and slave. This delay causes overshoot and oscillations in the commanded positions, which are observable and statistically significant with as little as 50 msec of round-trip communication delay.

Predictive displays are virtual reality renderings, generally designed for space operations, that show a prediction of events a short time into the future. They can be used to overcome the negative effects of delay by giving the operator immediate feedback from a predicted environment. Furthermore, they do not suffer the stability issues that arise with delayed haptic feedback. Early predictive displays included manipulation of the Engineering Test Satellite 7 from ground control, where the round-trip delay can be up to 7 sec, and Augmented Reality (AR) renderings in which the prediction is overlaid on raw image data. These strategies can be applied to telesurgery, but require overcoming the unique challenges of computing and tracking the 3D environment for a full environment prediction, which includes non-rigid materials such as tissue.
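To make the core idea concrete, the following is a minimal sketch of a predictive display loop in Python. The class name, the first-order kinematic model, and the discrete-time delay buffer are all illustrative assumptions, not the method of any particular system: the operator's console updates a local prediction of the slave immediately, while the true slave state only responds after the communication delay.

```python
from collections import deque

class PredictiveDisplay:
    """Minimal predictive-display sketch (hypothetical): the operator is
    shown an immediate local prediction of the slave pose instead of the
    delayed remote feedback."""

    def __init__(self, round_trip_delay_steps):
        self.delay = round_trip_delay_steps
        self.in_flight = deque()   # commands still in transit to the slave
        self.slave_pos = 0.0       # true remote state, hidden by the delay
        self.predicted_pos = 0.0   # local model rendered to the operator

    def step(self, command, gain=0.5):
        # Local model applies the command immediately (the prediction).
        self.predicted_pos += gain * (command - self.predicted_pos)
        # The real slave only sees the command after the delay; the full
        # round trip is modeled here as a simple FIFO buffer of commands.
        self.in_flight.append(command)
        if len(self.in_flight) > self.delay:
            delayed_cmd = self.in_flight.popleft()
            self.slave_pos += gain * (delayed_cmd - self.slave_pos)
        return self.predicted_pos, self.slave_pos
```

Running the loop shows the intended effect: the predicted pose converges toward the commanded target ahead of the true slave pose, so the operator gets immediate visual feedback despite the delay. A real system would replace the scalar first-order model with a full kinematic and environment model, including the non-rigid tissue tracking discussed above.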
Furthermore, prior work in the surgical robotics community highlights the need for active tracking, rather than reliance on kinematic calibration alone, to localize the slave, owing to the millimeter scale of surgical operations and the cable-driven actuation these robots often use.