Your Robot Expertise is Requested!

Thank you so much @RDaneelOlivaw, that’s a helpful list of things to consider. I’ve passed it on to the team! <3

Yes, thanks very much @RDaneelOlivaw, that’s super helpful!

@rolly also thank you for sharing the robots you have, the 3D scanner is quite similar to the “Palm Pilot” we are considering as one of our robot designs, so it could be interesting to see if we can eventually adapt the implementation to also work with your hardware. Also thank you for highlighting the Maker Plate, that’s useful to be aware of.

Re. the requirements you mention (listed below), are you able to expand a bit on what you mean by these, and how they are not currently present in Monty?

1. Having sensory/motor parts run separately from the main Monty
2. A distributed form of the CMP (Cortical Messaging Protocol)
3. A way of sharing Monty learnings amongst the centre and many hubs
1. Monty components currently only run on a single host. Your robot project needs this to no longer be the case.
2. Distributed CMP: forgive me if I’m mistaken, but the current Monty implementation has a State class (expressing the CMP) that is only passed between Monty components on the same host. A Python object has to be serialized before it can cross host boundaries, hence a distributed CMP (see the sketch after this list). This has been mentioned on this forum before.
3. When Monty exists at multiple sites (perhaps a terrifying thought to you) and one Monty has learned something (initially just object recognition), it would be very useful to share that learning with the others. Unlike humans, a Monty network only needs to learn something once.
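To make 2 concrete, here is a minimal sketch of one way a State could cross hosts. The fields (`location`, `morphological_features`, `sender_id`) are made-up stand-ins rather than Monty’s actual State definition, and length-prefixed JSON over TCP is just one possible transport:

```python
# Minimal sketch of a "distributed CMP": serializing a State-like object so it
# can leave its host. The State fields below are hypothetical stand-ins.
import json
import socket
from dataclasses import dataclass, asdict

@dataclass
class State:
    location: list                # e.g. [x, y, z] in a common reference frame
    morphological_features: dict  # e.g. surface normal, curvature
    sender_id: str                # which Monty component produced this

def send_state(state: State, host: str, port: int) -> None:
    """Serialize a State to JSON and push it to another host over TCP."""
    payload = json.dumps(asdict(state)).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(len(payload).to_bytes(4, "big"))  # length-prefix framing
        sock.sendall(payload)
```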

One thing I’m excited to have happen in the Monty code base (also a necessary condition for your robot project) is for its sensor and motor classes to interact with real devices, not just simulators. Once the code has broken free of the simulator, I can start to introduce Monty to my devices.
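For illustration, a sensor class that talks to real hardware instead of a simulator might look like the sketch below. The class name, the `step()` method, and the serial protocol are my assumptions, not Monty’s actual sensor-module interface:

```python
# Hypothetical sketch of a sensor class backed by real hardware instead of a
# simulator. The class, method, and serial protocol are all assumed.
import serial  # pyserial

class UltrasoundSensorModule:
    """Reads range data from a real ultrasound sensor over a serial port."""

    def __init__(self, port: str = "/dev/ttyUSB0", baud: int = 115200):
        self._dev = serial.Serial(port, baud, timeout=1.0)

    def step(self) -> dict:
        """Return one observation, in place of a simulator-provided one."""
        line = self._dev.readline().decode("ascii").strip()
        return {"range_m": float(line)} if line else {}
```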


Thanks @rolly for clarifying the terminology. Yes, I absolutely agree those things are not present and would be something we need to add, particularly 1 and 2, as well as the real-sensor interface. 3 will be really nice for swarm settings and multi-modality, but is less of a strong requirement.

To give you a sense of what we’re thinking: an early robot will likely implement the sensor modules and motor systems locally, deferring the LM (learning-module) processing to a wirelessly connected server. This will require converting real sensor data into CMP messages on the robot itself, and then sending those messages to a different machine.
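As a rough sketch of that split (not a committed design), something like the following could run on the robot. Here `read_rgbd_patch`, `to_cmp_message`, the message fields, and the server address are all placeholder assumptions:

```python
# Hypothetical robot-side loop: sense locally, convert to CMP, ship to the LM
# server. Names and message fields are placeholders, not Monty's actual API.
import json
import socket

LM_SERVER = ("192.168.1.50", 9000)  # assumed address of the learning-module host

def to_cmp_message(patch: dict) -> bytes:
    """Pack one raw sensor patch into a CMP-style message (fields assumed)."""
    return json.dumps({"location": patch["pose"],
                       "features": patch["features"]}).encode("utf-8")

def robot_loop(read_rgbd_patch):
    """Run sensing and CMP conversion on the robot; defer LM work to the server."""
    with socket.create_connection(LM_SERVER) as sock:
        while True:
            msg = to_cmp_message(read_rgbd_patch())    # local sensing + packing
            sock.sendall(len(msg).to_bytes(4, "big"))  # length-prefix framing
            sock.sendall(msg)
```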


I have the following robot kit from SunFounder. It’s a pretty simple build process that should only take a couple of hours (and can be done in advance of the hackathon). In addition to the cool rocker-bogie chassis, it features an Arduino Uno R3 with on-board wireless connectivity, an ESP32 camera module mounted on a servo (tilt up/down), an ultrasound range finder, two tunable infrared distance switches, and a solar panel.

It comes with software that allows tele-operation (including FPV from the camera module) over a Wi-Fi connection. It also comes with some simple autonomy programs (e.g. obstacle avoidance and object following). They’ve published the source code along with some tutorials, so it should be fairly straightforward to interface with Monty using the tele-operation commands.

A potential project could be to enable the system to do autonomous navigation (exploration, mapping, etc.) while charge is high, and then seek a sunny spot to recharge when the battery starts to get low.
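A hedged sketch of that charge-aware behavior, where `send_teleop`, `read_battery_voltage`, `find_sunny_heading`, and the voltage thresholds are all placeholders for whatever the kit’s tele-operation API actually exposes:

```python
# Sketch of the suggested project: explore while charged, seek sun when low.
# All callables and thresholds are hypothetical placeholders.
import time

LOW_BATTERY_V = 6.8  # assumed "go recharge" cutoff
CHARGED_V = 8.2      # assumed "resume exploring" level

def control_loop(send_teleop, read_battery_voltage, find_sunny_heading):
    mode = "explore"
    while True:
        v = read_battery_voltage()
        if mode == "explore" and v < LOW_BATTERY_V:
            mode = "seek_sun"                  # battery low: go find sunlight
        elif mode == "seek_sun" and v > CHARGED_V:
            mode = "explore"                   # recharged: back to exploring

        if mode == "explore":
            send_teleop("forward")             # placeholder exploration policy
        else:
            send_teleop(find_sunny_heading())  # steer toward brightest reading
        time.sleep(0.1)
```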


Somewhat late to the discussion, but I see one platform has been overlooked for convenient streaming of real-time sensor data: https://www.grisp.org/
It’s a bare-metal Erlang/Elixir platform that can send sensor data through whatever channel one chooses, and @Rich_Morin has suggested several options within the Erlang/Elixir ecosystem.

Why would one use this board? Pmods: it has ready-made drivers for them that don’t need any fragile C/C++ compilation, plus one gets quick turnaround times through hot code reloading. I have one board with a hygro Pmod and a 9-axis Pmod. Here’s a video of a hot code reload of a simple “overload” meter: mp4.

Lots of “senses” are available.

I probably won’t be able to participate in a hackathon but I could try out smaller things on my board in advance, e.g. streaming the sensor data to a piece of Python code on a laptop.

As for receiving streamed data in a fixed loop: I’ve had good experience with ZeroMQ. The actual receiving of messages happens on an I/O thread, so at the time of an API “receive” call one can fetch all the messages already available without waiting, then aggregate them, ignore them, or just keep the latest one.
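For example, with pyzmq the “keep only the latest” pattern looks roughly like this (the endpoint address is an assumption):

```python
# Sketch of the described ZeroMQ pattern: inside a fixed control loop, drain
# everything already queued by the I/O thread and keep only the newest sample.
import zmq

ctx = zmq.Context()
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://grisp-board.local:5556")  # assumed sensor stream address
sub.setsockopt_string(zmq.SUBSCRIBE, "")     # subscribe to everything

def latest_sample(sock):
    """Non-blocking drain: return the most recent message, or None."""
    msg = None
    while True:
        try:
            msg = sock.recv(flags=zmq.NOBLOCK)
        except zmq.Again:   # queue is empty: hand back what we have
            return msg
```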


Quick update for this thread.

First of all - many thanks for all the great responses and things to consider. This community is so fantastic already! :heart_exclamation:

The internal hackathon date is now set for the week of the 18th of May. The TBP team will be flying into the San Francisco area to build for a week with much caffeine and much less sleep.

There are three teams:

  • The Drone Team
  • The Ultrasound Team
  • The Palm Pilot Team

Out of this internal hackathon should come the following things (if all goes well):

  • Videos for each of the three teams’ projects (which we’ll publish on YouTube)
  • Tutorials on the process of building robots and integrating them with Monty
  • At some point later, a date for a public hackathon!



FYI - Embodied AI Hackathon - Hackster.io


Can’t wait to see everything that comes out of this!


For the hackathon, will y’all be testing Monty on non-x86_64/amd64 devices such as Raspberry Pis or Jetsons?

No, macOS will be the brain for the hackathon projects. There is work in progress to decouple the parts of Monty so that it can run on other platforms.

Follow these GitHub issues to get a notification:


Thanks for the info @brainwaves! Good to know.
