Avatars (by sorcerykid)
Avatars was built from the ground up to provide a complete OOP mob development framework for building more realistic NPCs, monsters, friendly animals, etc. in Minetest.
- The GenericAvatar class for example implements the underlying Action API and dispatcher (for scripted sequences of commands to be executed asynchronously), Travel API and dispatcher (for terrain-based pathfinding to random, directional, and designated waypoints), and Target API (for sensory analysis of environmental cues and propagation of events based on discrete awareness levels).
- The BasicAvatar class implements all of the decision making systems (via the sensory pipeline) and behavioral responses (retreat, follow, attack, search, patrol, and wander), most of which were inspired by Thief: The Dark Project. All of this functionality ties in with my new pathfinder, motion mechanics, and physics APIs to add a truly lifelike quality to mobs.
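As a rough illustration of the kind of asynchronous command dispatcher the Action API describes, here is a minimal, self-contained Lua sketch. The ActionQueue name and its methods are hypothetical, not the mod's actual API:

```lua
-- Hypothetical sketch of an asynchronous action dispatcher.
-- Commands are queued with a delay (relative to the time of queuing)
-- and executed in order as simulated time advances. Assumes actions
-- are pushed in chronological order.
local ActionQueue = {}
ActionQueue.__index = ActionQueue

function ActionQueue.new()
    return setmetatable({ actions = {}, clock = 0 }, ActionQueue)
end

function ActionQueue:push(delay, func)
    table.insert(self.actions, { at = self.clock + delay, func = func })
end

-- called every server step with the elapsed time
function ActionQueue:on_step(dtime)
    self.clock = self.clock + dtime
    while self.actions[1] and self.actions[1].at <= self.clock do
        local action = table.remove(self.actions, 1)
        action.func()
    end
end
```

A scripted sequence then reduces to pushing a few closures and letting the server's step callback drain the queue.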

Testing Awareness & Alertness States [Avatars Mod]
https://vimeo.com/324076654
A quick demonstration of awareness and alertness states. Much of the design philosophy was influenced by the Thief AI's sensory system. Of course, the sensory input parameters can be tuned for different styles of gameplay.

Testing Waypoints & Pathfinder [Avatars Mod]
https://vimeo.com/324076654
Another demonstration of my upcoming Avatars Mod, now with interactive waypoint visualization and seamless pathfinder integration. The trolpoint and fleepoint design concepts were heavily inspired by Thief and the DromEd engine. For an interesting historical perspective on that project and the technical hurdles, check out this interview with the original creative director: https://youtu.be/qzD9ldLoc3c

Demonstration of Extended Motion Mechanics in Minetest S3
https://vimeo.com/316098852
I also overhauled the API for LuaEntitySAOs so that mobs can simulate true "player-like" motion mechanics without the need for excessive and repetitious trigonometry calculations. In addition, the globalstep callback has been extended to provide continuous feedback about collisions and movement state. This is something I've long wanted to do!
Overview
Thanks to the new API functions for LuaEntitySAOs, calls to set_yaw(), get_velocity(), set_velocity(), etc. are no longer necessary at every server step.
The new set_speed() and add_speed() methods lock the velocity in the xz-plane, permitting continuous movement in the direction the entity is facing (an offset yaw can be specified to ensure the model is correctly oriented). The add_speed() method makes it possible to introduce lateral movement in addition to forward or backward movement, giving the illusion of strafing or side-stepping. Successive calls to set_yaw() will automatically recalculate the entity's trajectory.
There is also a smooth rotation function that works in conjunction with speed, so that monsters and NPCs can walk in circles, figure eights, or just about any imaginable shape with only a few lines of code. A variety of other shorthand methods, like add_yaw(), set_velocity_horz(), and set_acceleration_vert(), help to eliminate a lot of redundant and extraneous Lua code.
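Here is a minimal usage sketch of those methods inside an entity definition. The entity name and boilerplate are illustrative; set_speed() and add_yaw() are the methods described above, which I'm assuming attach to the entity's object:

```lua
-- Illustrative sketch: a mob that walks in a circle using the
-- extended motion API described above (set_speed/add_yaw).
minetest.register_entity("avatars:walker", {
    physical = true,

    on_activate = function(self)
        -- locked forward speed of 2 nodes/s in the xz-plane;
        -- no per-step velocity math needed after this
        self.object:set_speed(2.0)
    end,

    on_step = function(self, dtime)
        -- a steady yaw change at constant speed traces a circle
        -- of radius speed/rate (here 2.0 / 1.5 ≈ 1.3 nodes)
        self.object:add_yaw(1.5 * dtime)
    end,
})
```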
And best of all, collision detection is possible in Lua without resorting to "hacks". The status of collisions is continuously reported via four new on_step() callback parameters: vel_old, vel_new, hit_y and hit_xy.
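A hedged sketch of how those callback parameters might be consumed — the post names only the four new parameters, so the exact on_step signature and parameter order here are my assumption:

```lua
-- Assumed signature: the four new collision/movement parameters
-- (vel_old, vel_new, hit_y, hit_xy) appended to on_step.
local function on_step(self, dtime, vel_old, vel_new, hit_y, hit_xy)
    if hit_xy then
        -- collided with something in the horizontal plane: turn away
        self.object:add_yaw(math.pi / 2)
    elseif hit_y and vel_old.y < -0.1 then
        -- vertical collision while moving downward: we just landed
        self.object:set_speed(0)
    end
end
```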

I reached another development milestone in my Avatars mod! Highly sophisticated sensory processing has been added, with both visual and audio input cues, so that the AI can respond to a wide variety of environmental stimuli -- yes, even sounds!
No more "immediately attack upon sighting enemy". Now the AI are fully fledged state machines capable of acknowledging intruders, ramping up awareness states either over a set delay period or immediately, depending on multiple alertness factors (including distance, illumination, and movement applied against a logarithmic curve of sensitivity within a given survey area), and then slowly cooling down over a set decay period. The screenshot below shows all of the different parameters that can be set.
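The ramp-up/cool-down behaviour can be sketched in a few lines of plain Lua. The state names, rates, and stimulus scale here are illustrative, not the mod's actual parameters; in the mod, the stimulus would come from the sensory pipeline's logarithmic sensitivity curve:

```lua
-- Illustrative sketch of ramping awareness states with decay.
local STATES = { "idle", "suspicious", "alert", "hostile" }

local Awareness = {}
Awareness.__index = Awareness

function Awareness.new(ramp_rate, decay_rate)
    return setmetatable(
        { level = 0, ramp_rate = ramp_rate, decay_rate = decay_rate },
        Awareness)
end

-- stimulus in [0, 1], e.g. derived from distance, illumination,
-- and movement of the target
function Awareness:on_step(dtime, stimulus)
    if stimulus > 0 then
        -- ramp up toward the highest state while stimulated
        self.level = math.min(
            self.level + stimulus * self.ramp_rate * dtime, #STATES - 1)
    else
        -- slowly cool down once the stimulus is gone
        self.level = math.max(self.level - self.decay_rate * dtime, 0)
    end
    return STATES[math.floor(self.level) + 1]
end
```

Tuning ramp_rate and decay_rate per mob is what allows the same machinery to produce anything from jumpy guards to oblivious cattle.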
When it comes to code optimization, I tend to be rather obsessed with details anyway lol. And the Avatars mod is no exception. I designed every component to be as scalable and robust as possible, and to operate within desired budgets. This includes rigorous unit testing and profiling to identify and mitigate potential bottlenecks, even prior to integration testing.
Here are the raw benchmarking results for the latest alpha build of the Avatars mod in singleplayer on my laptop and in multiplayer on my dedicated server.
Windows 10 (Laptop)

CentOS 6 (Dedicated Server)

All tests were conducted over the course of 60 seconds. The important column is the total, since that reflects the cumulative execution time for each corresponding test series. Note that these test series are cascaded:
Code:
    on_step (pre)
    on_step {
        on_travel
        on_float
        on_motion {
            on_motion_stand {
                is_paranoid
                decision making (untested)
                pathfinding (disabled)
                movement handling (untested)
            }
            on_motion_walk {
                is_paranoid
                decision making (untested)
                pathfinding (disabled)
                movement handling (untested)
            }
            debugging visualization (untested)
        }
    }
I didn't bother testing the decision-making or movement-handling systems, since those consistently contributed less than 10% of the overall cost of the on_motion functions, which is negligible. In contrast, the is_paranoid function, which analyzes and filters all sensory inputs, consumed two thirds of the total execution time. While that might be evidence of a potential bottleneck (albeit small), you'll notice that in the test series on my laptop, the maximum recorded time was 7.211 ms whereas the average recorded time was only 0.024 ms. This is due to an anomaly that occurs immediately after the entity emerges (I haven't yet isolated the cause). If that one divergent value were eliminated, it would give a far more accurate and favourable picture of overall performance.
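To illustrate how a single divergent sample skews the mean, here is a toy calculation with hypothetical sample values chosen only to match the reported figures:

```lua
-- Toy illustration: one 7.211 ms spike in an otherwise uniform
-- series of samples inflates the average noticeably.
local function mean(t)
    local sum = 0
    for _, v in ipairs(t) do sum = sum + v end
    return sum / #t
end

-- hypothetical series: 999 "normal" samples of 0.017 ms each,
-- plus the single recorded 7.211 ms spike
local samples = {}
for i = 1, 999 do samples[i] = 0.017 end
samples[1000] = 7.211

local with_spike = mean(samples)     -- ~0.024 ms, the reported average
table.remove(samples, 1000)
local without_spike = mean(samples)  -- 0.017 ms once the spike is dropped
```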
Anyway, as a final stress test, I decided to throw 15 avatars into a singleplayer world on my Laptop and let them run around like crazy for nearly 2 minutes.
Despite 10,000 processing cycles, the on_step function still clocked in at an astonishing 0.2 seconds. Not too shabby :)


There's still much more testing to do, in addition to further refactoring and optimizing. But for the most part, the API is nearing completion after 7 long weeks of development (I think I'm approaching about 2500 lines of code). With any luck, my goal is to have at least an alpha version of Avatars and the CPP patches ready by mid-March :)