Posted April 25, 2021 (edited)

For the technical details of camera_track, see the spoiler at the end of this post.

Library: hlm.lua
Sample (shown below): Vision Test.lua

This library provides SAPP script writers with the ability to accurately determine a player's camera position and aiming directions (forward, left, up) using only SAPP memory functions and Lua, up to and excluding camera interpolation, third-person camera hacks, and Halo script/console commands that control the camera. Practical applications include more accurate visibility and intersection testing from the camera, and a boost command that works properly from vehicles (the latter is left as an exercise for the reader). Many of the functions provided by the library (hlm.lua) are reverse-engineered from the client, with the intent of preserving accuracy to the client. (There is a bit of a hiccup in the video at 0:38; my apologies for this.)

A sample script from my repository is attached (Vision Test.lua). If you intend to use this library, please read the instructions on setting it up. The sample script should get you up to speed if you plan on using the library as pitched, although there are a number of related functions you may want to look at, such as get_object_markers and get_node_transform.

Note that in the video above, the camera position as calculated is not clipped against the terrain nor certain collideables. This behavior is faithful to how Halo calculates the position of the camera for firing purposes.

Technical details on the camera_track tag:

Spoiler

First, given a unit, its aiming pitch ϕ is determined by taking the arcsine of the k̂ component of the unit's aiming vector (the vector is at offset 0x23C from the unit's object memory; the scalar of relevance is at offset 0x244). The pitch is then transformed into a normalized interpolation parameter t in the range [0, 1] by t = (ϕ + π/2)/π.
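The pitch-to-parameter mapping can be sketched as follows. This is an illustrative Python translation of the formula above, not the library's actual Lua code; the function name is my own, and the value passed in is assumed to be the k̂ component read from the unit's aiming vector.

```python
import math

def aiming_pitch_to_t(aim_z):
    """Map the k-hat component of the unit's aiming vector (the float at
    offset 0x244, i.e. the z component of the vector at 0x23C) to the
    normalized interpolation parameter t in [0, 1]."""
    phi = math.asin(aim_z)            # aiming pitch, in [-pi/2, pi/2]
    return (phi + math.pi / 2) / math.pi

# Looking straight down (aim_z = -1) gives t = 0,
# level aim (aim_z = 0) gives t = 0.5,
# looking straight up (aim_z = 1) gives t = 1.
```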
When the unit is looking all the way down, t will be near 0; when the unit is looking along the horizontal plane, t will be 0.5; when the unit is looking all the way up, t will be near 1.

This interpolation parameter is used to select exactly 4 equidistant and distinct control points from the camera_track tag. If there are n+1 control points labeled y_0 through y_n, then the spacing between control points is h = 1/n, and the interpolation interval is I = floor(t/h). Halo selects the control points with a preference for having two on each side of the interpolation parameter in the input domain. Thus the control points selected are (x_i, y_i), (x_{i+1}, y_{i+1}), (x_{i+2}, y_{i+2}), (x_{i+3}, y_{i+3}), where i = I if I == 0, and i = min(I − 1, n − 3) if I > 0, with x_j = j·h for j in [0 .. n].

Halo then evaluates the interpolating polynomial across these control points at the parameter t. Specifically, Halo uses the Newton polynomial form with forward differences, bringing out distributive factors and evaluating the most nested expression first. To see an implementation that respects evaluation order, see the generalized version plyinterp on line 242 of hlm.lua.

Once the track-relative position of the camera is determined, it is rotated about k̂ by the unit's aiming yaw, as determined by the forward aiming vector at offset 0x23C from the unit's object memory (see lines 1257 through 1269 for the specifics of this operation). The camera track origin is given by the unit's first-person camera position (the same camera position that camera_set_first_person uses), which is itself quite a lengthy computation to describe (see get_unit_fp_camera_position on line 1040).
This, at least in part, explains why the warthog model has a camera driver marker defined that is not referenced by the driver seat definition; if the camera marker name were set to camera driver, then the camera track origin would be that marker, resulting in a third-person camera that is offset slightly from the warthog's center.

Edited April 25, 2021 by PiRate: can't see the attachments, i am noob; also note on firing camera (SBB)
Posted April 25, 2021

This is really cool and useful! I don't understand much of the math behind this (I was happy when I figured out a way to find a point on the edge of a sphere given its origin), but my first thought was that this information could be used to make a wall-hack detection probability calculator of some kind. Possibly by using statistics for how often and how long someone has their crosshairs on a player before having actual line of sight, combined with whether they have seen them at all since their last spawn, how many times they have had line of sight up until killing a specific player, and other information. I don't know exactly how to create a wall-hack detection script, but from this set of information it seems like it would be possible. If there already is something like what I'm describing, I haven't been able to find a copy of it. In the end it would be an estimation, yet a useful estimation to take into account along with other variables when making a decision. The proposed script may be very difficult to create considering changes in FOV, but I really think it's possible in some form.