I have watched quite a lot of the research meetings and Monty overviews, but I am still unable to find any reference to a functional component of Monty that resembles the hippocampal-entorhinal complex. The HC-EC plays vital roles in short-term memory storage, acts as a kind of positioning system (via the grid cell mechanism in the entorhinal cortex), and can weight the importance of specific temporal sequences over others, among other functions. I believe this functionality is key to vision (object detection), speech interpretation, touch interpretation, and even to loading object reference frames and global reference frames alike. Is this being implemented in Monty? Or did the team decide to incorporate this functionality within the Learning Modules themselves?
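To make the grid cell point concrete, here is a toy sketch (my own illustration, not Monty code) of how a set of grid-cell modules at different spatial scales can act as a positioning system via path integration. The `GridModule` class, the chosen scales, and the simplification to square (rather than hexagonal) periodicity are all hypothetical:

```python
import numpy as np

class GridModule:
    """Toy grid-cell module: a 2D phase that wraps at the module's period."""

    def __init__(self, scale):
        self.scale = scale          # spatial period of this module
        self.phase = np.zeros(2)    # current 2D phase in [0, scale)

    def integrate(self, velocity, dt=1.0):
        # Path integration: self-motion updates the phase, wrapping at the
        # module's period (hexagonal geometry is ignored for brevity).
        self.phase = (self.phase + velocity * dt) % self.scale


# Several modules with different scales together give a (nearly) unique
# position code, much like a residue number system.
modules = [GridModule(s) for s in (0.3, 0.5, 0.7)]  # arbitrary scales

for _ in range(10):
    for m in modules:
        m.integrate(np.array([0.1, 0.05]))

# The concatenated phases across modules form the position code.
position_code = np.concatenate([m.phase for m in modules])
print(position_code)
```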
I'm curious to know the team's plans on this too. I do recall @vclay mentioning that they were actively working towards replicating hippocampal function within the TBP framework.
Given that one of their goals is improving model efficiency, I would imagine they'd be looking at some sort of dedicated structure, a kind of Tolman-Eichenbaum Machine, as opposed to some sort of 'self-attention' type approach, given the computational costs associated with the latter.
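For a rough sense of why those costs differ, here is a back-of-envelope sketch (my own numbers and function names, not anything from the TBP codebase): self-attention over a sequence of length n scales quadratically with n per layer, whereas a TEM-style associative lookup against a fixed-size memory of m slots does not grow with sequence length:

```python
def attention_flops(n, d):
    # QK^T (n x n x d multiply-adds) plus attention-weighted values (n x n x d)
    return 2 * n * n * d

def fixed_memory_flops(m, d):
    # one similarity pass over m memory slots of dimension d
    return m * d

d = 64  # assumed embedding dimension
for n in (256, 1024, 4096):
    print(f"n={n}: attention ~{attention_flops(n, d):,} FLOPs, "
          f"fixed memory (m=512) ~{fixed_memory_flops(512, d):,} FLOPs")
```

The exact constants don't matter; the point is that a dedicated, fixed-size structure keeps per-step cost flat while attention's cost balloons with context length.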