Your Robot Expertise is Requested!

Dear TBP Community,

We are excited to announce our plans for a robot hackathon sometime this year (probably). :tada: :robot:

To ensure its success, we intend to conduct an internal hackathon first, aiming to identify and address any software quirks and hardware challenges, and to generate quickstart tutorials, before extending the invitation to the broader community.

We would greatly appreciate your insights, especially if you have experience in robotics. Please consider sharing:

  • Lessons Learned: What do you wish you’d known before starting your robotics projects?
  • Common Pitfalls: Are there any challenges we should be mindful of?
  • Recommended Starter Kits: Which kits are beginner-friendly and require minimal preliminary learning? We particularly need sensors that can estimate their location in space (like a depth camera or lidar sensor) and that can be moved around.
  • Essential Tools: What tools have been indispensable in your robot-building endeavors?
  • Useful Software: Do you have software you would recommend using? As a first step, we plan to stream the sensor data to a laptop, have Monty process it there, and then send motor commands back.
  • Any Other Wisdom? There is lots we don’t know, so anything you can think of would be gratefully appreciated.

Giving Feedback on our Thoughts Thus Far

@nleadholm has compiled a comprehensive document on various robot kits and approaches. We have enabled comments on this document, and we warmly invite you to share your thoughts there or on this topic.

Our Assumptions for the Internal Hackathon

  • Monty Software Deployment: We will run the Monty software on laptops, as it currently doesn’t operate on Raspberry Pi. We aim to resolve this before the public event.
  • Data Transmission: For that reason, we assume that sensor data and movement commands will be exchanged over WiFi, Bluetooth, or other wireless methods, though some approaches could use a direct wired connection between the laptop and robot (a minimal sketch of this loop follows the list).
  • Robot Capabilities: Our focus will be on movement and sensing, such as using a touch sensor or camera to learn or identify objects. Initially, we will keep tasks simple and not attempt complex actions like moving objects or changing the state of the world.
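To make the data-transmission assumption above a bit more concrete, here is a minimal sketch of the sensors-to-laptop-to-motors loop using plain Python sockets and newline-delimited JSON. Everything here is illustrative: the field names and the process_observation() hook are placeholders rather than any part of Monty's actual API, and a real setup might use a different transport entirely.

```python
# Minimal sketch of the "sensors -> laptop -> motor commands" loop over WiFi.
# Field names and process_observation() are hypothetical placeholders.
import json
import socket

HOST, PORT = "0.0.0.0", 9000  # laptop listens; the robot connects to the laptop's IP

def process_observation(obs: dict) -> dict:
    """Stand-in for handing the observation to Monty and getting an action back."""
    return {"left_motor": 0.2, "right_motor": 0.2}  # placeholder command

def laptop_server() -> None:
    with socket.create_server((HOST, PORT)) as server:
        conn, _addr = server.accept()
        with conn, conn.makefile("rwb") as stream:
            for line in stream:                       # one JSON message per line
                obs = json.loads(line)
                cmd = process_observation(obs)
                stream.write((json.dumps(cmd) + "\n").encode())
                stream.flush()

def robot_client(laptop_ip: str) -> None:
    with socket.create_connection((laptop_ip, PORT)) as sock, sock.makefile("rwb") as stream:
        obs = {"depth": [0.5, 0.6, 0.7], "pose": [0.0, 0.0, 0.0]}  # fake reading
        stream.write((json.dumps(obs) + "\n").encode())
        stream.flush()
        cmd = json.loads(stream.readline())
        print("motor command:", cmd)
```

Newline-delimited JSON keeps the framing trivial; something with lower latency (UDP, ZeroMQ, ROS topics) could be swapped in later without changing the overall shape.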

As our software evolves, we will refine the scope of the public event. In the meantime, your advice and suggestions will be invaluable in helping us create a successful and engaging hackathon for everyone.

Thank you in advance!

8 Likes

I’m a big fan of Seeed Studio’s Grove offerings for robotics and related prototyping. Basically, they allow assorted computers (e.g., Arduino, RasPi) to connect with and use a variety of sensors and effectors.

Because Grove uses polarized, multi-pin connectors, getting the wiring wrong is a non-problem and, of course, no soldering is required. The pieces are all pretty inexpensive, making it a good starting point for a prototyping setup.

On the down side, the Grove connector only supports four leads, with two of these being used for power and ground. So, I’ve noticed that some of the capabilities of the sensors and effectors aren’t fully supported.

4 Likes

Thanks @Rich_Morin we’ll take a look!

Some 8 years back I purchased a BrickPi, which is essentially a case, a battery, and a Raspberry Pi HAT onto which you can build a Lego robot and plug in 4 Lego motors and 4 Lego sensors. They are still around (but somewhat reduced).
Here’s my SLAMbot (it has a lidar on the roof).

4 Likes

The Moon is getting crowded with rovers. Perhaps someone with some pull with various space agencies could work out a time-share deal. I’m a big FORTH proponent. 50 years ago, the world came within a hair’s breadth of adopting AI instead of sifting eye candy as the predominant popular computing paradigm. It had a well-thought-out and simple method for tethering remote systems to lab computers, with only a handful of bytes needing to be embedded in the remote. Monty on the Moon - think about it.

Oh, and there are some FORTH types in surprising places, like the ESA, because FORTH was invented at the NRAO for controlling radio telescopes.

2 Likes

I played with FORTH long ago on my Commodore 64 but felt it didn’t stack up (or down, or …)

1 Like

That’s great! Do you have a video of it?

FWIW, FORTH-flavored Erlang (ffe) is a version of FORTH that runs on the BEAM. I don’t know much about it, but any reasonable implementation should be able to benefit from Erlang’s message-passing and process-supervision models, etc.

1 Like

I’ve mentioned Nerves and Nx before on this forum, but I’d like to discuss them again, in the context of the proposed robot hackathon. So, here is a quick intro to both, followed by some discussion (e.g., points that might argue against their use in a Monty-controlled robot). YMMV, etc.

The Nx (Numerical Elixir) library supports math-friendly data structures such as multi-dimensional arrays (aka tensors), code generation for GPUs, etc. These could all be useful for Monty’s computational tasks.

Nerves is a production-grade, Elixir-based platform for building embedded Linux systems, including distributed sets of them. It supports:

  • many sensor and effector interfaces
  • many commercial and hobbyist CPUs
  • remote control and code distribution
  • read-only and read/write file systems
  • the usual BEAM-based reliability
  • a supervision-tree based init system
  • dual system file trees, with failover
  • …

Discussion

The mainstream Monty code is written in Python, so it can’t run directly under the BEAM. However, the Python interpreter could be executed as an OS process, using a BEAM “port”. This would require a bit of plumbing (e.g., mapping CMP messages in/out of an intermediate format such as JSON), but it seems quite doable (“a simple matter of software” :-).

Even better, there’s an Elixir library named Piton which addresses this issue directly. Piton sets up a pool of Python interpreters (as ports), letting Python code sidestep the dreaded global interpreter lock (GIL).
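To make the port idea a bit more concrete, here is a rough sketch of the Python half of such a bridge, assuming simple newline-delimited JSON framing over stdin/stdout purely for illustration (erlport and Piton have their own wire formats, so treat this as a stand-in rather than their actual protocol). The message shape ("type"/"payload") is invented, not CMP.

```python
# Python half of a BEAM "port" bridge: the BEAM side spawns this script and
# exchanges newline-delimited JSON over stdin/stdout. The message shape is a
# made-up stand-in for CMP, not a real format.
import json
import sys

def handle(message: dict) -> dict:
    """Placeholder for routing a decoded message into Monty and collecting a reply."""
    return {"type": "ack", "payload": message.get("payload")}

def main() -> None:
    for line in sys.stdin:                  # one JSON message per line from the port
        if not line.strip():
            continue
        reply = handle(json.loads(line))
        sys.stdout.write(json.dumps(reply) + "\n")
        sys.stdout.flush()                  # the port reads eagerly; flush every reply

if __name__ == "__main__":
    main()
```

On the BEAM side, something like Elixir’s Port.open({:spawn, ...}) (or Piton’s pool) would start this script and exchange lines with it.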

1 Like

Thanks all for highlighting these, this is really helpful! Grove and BrickPi are exactly the kinds of things that we are on the lookout for.

Also love the suggestion of Monty on the Moon, maybe one day!

Re. Multi-Purpose Body Kits
Does anyone know of any kits that can be bought specifically for building the body and actuators of a robot, without being too constrained to a single design? We would like at least one of the projects to be something where many different robots could be built (a robot that drives around a room, a claw/hand type robot, etc.). LEGO is one approach to this, but it is quite expensive and makes it harder to branch out to the interesting sensors available on a platform like Grove. Grove looks like a great option on the computer/sensor side, but I couldn’t see anything about components for the robot’s body/actuators.

I should add that, as this is for short hackathons, we’d like to avoid going too deep into things like 3D printing, which, while amazing, can add significant delays to a multi-day project. Rather, the hope is that a single kit could be sufficient for building several very different robots, without having to research online and buy many individual parts.

Re. Aerial Drones
Also interested in what experience or tips anyone has with these, including options like the Crazyflie?

1 Like

Any sort of “body modeling” effort (using however many limbs) will spend a lot of energy on issues like balance, locomotion, etc. If this is the goal, that’s fine, but something like a robotic tractor base could avoid most of these issues and provide a good foundation for mounting a computer and assorted add-ons. That said, this approach would not be able to deal with stairs, etc. (So, YMMV :-).

1 Like

Monty on the Moon

So, interesting little fun fact: I was brainstorming some coordinate system stuff the other day. It turns out your truncated octahedron approach would be more computationally efficient for systems navigating in 3D environments (i.e., space). It made me wonder: would there be an advantage to having separate Monty system implementations, one designed for 3D navigation and another meant to navigate on a ground plane?
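For anyone who wants to poke at the geometry: truncated octahedra tile 3D space as the Voronoi cells of a body-centred cubic (BCC) lattice, so “which cell contains this point?” reduces to finding the nearest BCC lattice site. Below is a tiny illustrative sketch of that lookup; it is generic geometry, not a claim about how Monty represents coordinates.

```python
# Illustrative only: nearest-site lookup on a BCC lattice, whose Voronoi cells
# are truncated octahedra. Not Monty's actual coordinate implementation.
import numpy as np

def nearest_bcc_site(point: np.ndarray) -> np.ndarray:
    """Return the closest BCC lattice site (unit cell size 1) to a 3D point."""
    corner = np.round(point)        # candidate on the integer ("corner") sublattice
    center = np.floor(point) + 0.5  # candidate on the half-integer ("body-centre") sublattice
    if np.sum((point - corner) ** 2) <= np.sum((point - center) ** 2):
        return corner
    return center

print(nearest_bcc_site(np.array([0.4, 0.4, 0.4])))  # -> [0.5 0.5 0.5]
```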

1 Like

I have a partly-baked idea for a Monty-based robotics project. Comments welcome…

The Maslow 4 is a large-format CNC router. Basically, it’s a sled that is pulled across a 2D surface by the actions of four sets of precision belts, gears, and motors. Once the sled is in the desired X/Y position, the router can be used to carve away material.

In a vanilla Maslow 4, an Arduino controls the sled’s behavior, based on an uploaded (G-code) control file. However, this isn’t the only possibility.

For example, a pair of Raspberry Pi Camera Modules could watch the bit and the nearby surface. These could be connected to a RasPi 5 and (via the Arduino) used to drive the router and sled, based on the appearance of the surface, before and after contact.

This may not be the sort of robotics project the TBP had imagined, but it might be one that could show off Monty’s capabilities. And, given that I have a Maslow 4.1 that is supposed to arrive fairly shortly, I might even give it a try. In any event, I’ve made a related post to the Maslow forum…

1 Like

This actually kind of reminds me of the approach used by Hadrian Manufacturing (they use full automation for things like milling and whatnot). I think the idea is really cool. Even if it’s not what most people think of when they hear the word “robotics,” it’s still probably one of the more immediately useful Monty applications. I say go for it.

I see Raspberry Pi has an inexpensive HAT (& Python library) that can support up to 4 recent-vintage Lego motors or sensors. They also have a power supply that can power the RPi & Lego devices (it would be nice to have a battery). Lego also has a Maker Plate that mechanically connects the RPi & HAT to Lego parts. There’s a recipe to 3D print this plate on Thingiverse.
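For reference, here is roughly what driving that HAT looks like from Python. The buildhat library does ship with Motor and DistanceSensor classes, but the exact method names, signatures, and units below are from memory, so please verify them against the official Build HAT documentation before relying on this sketch.

```python
# Hedged sketch of driving Lego motors from a Raspberry Pi Build HAT.
# Method names/units below are recalled, not verified -- check the docs.
from buildhat import DistanceSensor, Motor

left = Motor('A')            # motor plugged into port A
right = Motor('B')           # motor plugged into port B
eyes = DistanceSensor('C')   # Lego distance sensor on port C

# Drive forward until something is closer than ~10 cm, then stop.
left.start(30)
right.start(30)
while eyes.get_distance() > 100:   # distance assumed to be in millimetres
    pass
left.stop()
right.stop()
```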

1 Like

Any recommendations for 3D printing shops? I have one just down the road (https://pro3dcomposites.com/), but I suspect they are aiming a bit higher than garage DIY types. So, I’d like to know about alternatives.

Ideally, there would be a convenient, economical, high quality shop that would be willing to maintain a catalog of open source 3D designs. And a pony…

Maybe sensorimotor is only 2/3 of the way. Tool making goes all the way back to Homo habilis. It’s a major positive feedback loop. Paths often lead to the Raspberry Pi.

I humbly offer the following:

My menagerie* of mostly Raspberry Pis looks forward to having “play to the tune of Monty” added to its limited repertoire.
Things I think necessary for this to occur:

  • Sensory/motor parts that deal with the real world
  • Having sensory/motor parts run apart from the main Monty
  • A distributed form of CMP
  • A way of sharing Monty Learnings amongst the Centre, and many hubs

I was disappointed to not see some of these things in the Monty Project Overview sheet (I realize you have your hands full with more-to-the-purpose stuff), but see them as necessary parts of your in-house robot Hackathon.

*From right to left: robot arm (6 DOF), 3D scanner (Raspberry Pi, stereo lasers, camera, & turntable), autonomous vehicle (Raspberry Pi, Lego parts & motors, lidar), 3D printer (picture Monty dreaming up a design it passes to a 3D printer to realize). Not shown: RGBD camera, Raspberry Pi cluster, Jetson Nano.

2 Likes

Although I don’t have nearly as many robotic devices as @rolly, I’m in violent agreement with this point. So, here’s a modest proposal…

Centralize and extend CMP I/O

Basically, make sure that all CMP I/O in the current (Python) Monty implementation goes through centralized, distributable APIs. This should allow selectable channels (with a default of :local). If desired, this could also provide traffic monitoring, filtering, etc.

For :remote channels (e.g., cross-node, cross-platform), use clean, open, and well understood messaging approaches such as Elixir’s send and pattern-matching facilities.

Implement and test a proof of concept version that can operate within normal Python constraints. Work with other language communities to provide and track multi-node test suites.
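To show the kind of shape I have in mind, here is a purely hypothetical sketch; nothing in it is Monty’s real CMP interface, and the names CMPBus, Channel, and publish are invented for illustration.

```python
# Hypothetical sketch only: CMPBus, Channel, and publish() are invented names.
import json
import queue
import socket
from enum import Enum
from typing import Callable, Optional, Tuple

class Channel(Enum):
    LOCAL = "local"    # in-process delivery (the default)
    REMOTE = "remote"  # e.g. a socket to another node or a BEAM bridge

class CMPBus:
    def __init__(self, channel: Channel = Channel.LOCAL,
                 remote_addr: Optional[Tuple[str, int]] = None):
        self.channel = channel
        self._remote_addr = remote_addr
        self._local: "queue.Queue[dict]" = queue.Queue()
        self._taps: list = []              # optional traffic monitors/filters

    def add_tap(self, fn: Callable[[dict], None]) -> None:
        self._taps.append(fn)

    def publish(self, message: dict) -> None:
        for tap in self._taps:             # monitoring/filtering hook
            tap(message)
        if self.channel is Channel.LOCAL:
            self._local.put(message)
        else:                              # one connection per message keeps the sketch short
            with socket.create_connection(self._remote_addr) as sock:
                sock.sendall((json.dumps(message) + "\n").encode())

    def receive(self) -> dict:
        return self._local.get()           # local channel only, for brevity

bus = CMPBus()                             # defaults to the local channel
bus.publish({"type": "observation", "data": [1, 2, 3]})
print(bus.receive())
```

The :remote branch could just as easily hand messages to a BEAM port, an MQTT client, or anything else; the point is that callers never need to care which.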

Resources

The most relevant options that ChatGPT suggests are:

There are a few ways to interface Python with the BEAM (Erlang Virtual Machine), depending on the level of integration and performance needs:

1. Pyrlang

2. Term (erlang-term)

  • Description: A Python library for encoding and decoding Erlang External Term Format.
  • Use Case: If you need to serialize/deserialize Erlang terms for communication.
  • GitHub: https://github.com/hdima/python-erlang-term

3. erlport

It also mentions GRiSP, Luerl, NIFs & Ports, gRPC, ZeroMQ, and Message Bus. It suggests this approach:

For your POC multi-node message fabric, you’ll want a solution that allows seamless communication between Python (on Mac) and BEAM-based nodes (Elixir, Phoenix, Nerves, Linux). Here are some possible approaches:

1. Erlang Distribution with Pyrlang

  • Pros: Allows Python to act as a full Erlang node, speaking the native BEAM distribution protocol.
  • Cons: Requires Pyrlang setup, which may have performance limitations compared to native BEAM nodes.
  • Best Use Case: If deep integration with Erlang/Elixir’s native messaging is needed.

2. Erlport for Lightweight Communication

  • Pros: Simple way to call Python from Elixir (or vice versa) using Erlang ports.
  • Cons: Communication is via standard I/O, so it’s slower than direct message passing.
  • Best Use Case: If the Python nodes mainly need to process occasional requests from BEAM nodes.

3. Distributed Messaging with MQTT or ZeroMQ

  • Pros: Protocol-agnostic, widely supported across Python, Elixir, and Linux.
  • Cons: Lacks the tight integration of BEAM’s native messaging.
  • Best Use Case: If a generic, scalable message bus is preferred.

4. gRPC for Cross-Language Communication

  • Pros: Well-supported in Python and Elixir, fast binary serialization (Protocol Buffers).
  • Cons: Requires defining a schema, not as seamless as BEAM-native messaging.
  • Best Use Case: If structured API-style communication is preferred.

5. Message Bus (RabbitMQ, NATS, Kafka, etc.)

  • Pros: Scales well, robust messaging, already widely used in robotics applications.
  • Cons: More infrastructure overhead than direct BEAM messaging.
  • Best Use Case: If event-driven or pub/sub architecture is needed.

Recommendation for the Hackathon

For quick multi-language interoperability, I’d suggest using MQTT or ZeroMQ as a lightweight message fabric. If you need BEAM-native messaging, Pyrlang is a good choice for Python nodes. …
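To make that recommendation concrete, here is a minimal pyzmq PUB/SUB sketch of such a fabric. The topic and payload fields are invented for illustration; a paho-mqtt version would look very similar.

```python
# Minimal pyzmq PUB/SUB sketch of a lightweight message fabric.
# Requires `pip install pyzmq`; topic and payload names are invented.
import json
import zmq

def publisher() -> None:
    ctx = zmq.Context.instance()
    pub = ctx.socket(zmq.PUB)
    pub.bind("tcp://*:5556")
    # Each message is [topic, payload]; subscribers filter by topic prefix.
    # (In practice, give subscribers a moment to connect before the first send.)
    pub.send_multipart([b"sensor.depth", json.dumps({"range_m": 0.42}).encode()])

def subscriber() -> None:
    ctx = zmq.Context.instance()
    sub = ctx.socket(zmq.SUB)
    sub.connect("tcp://localhost:5556")
    sub.setsockopt(zmq.SUBSCRIBE, b"sensor.")   # prefix-match subscription
    topic, payload = sub.recv_multipart()
    print(topic.decode(), json.loads(payload))
```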

And a pony…

1 Like
  • Lessons Learned: Robotics is really challenging, especially if you want something that works reliably. It’s invaluable to have a simulated environment to test everything first. That way you can work on the software before moving to the physical robot, which comes with its own problems that have nothing to do with the software itself. If you don’t do that, it’s extremely difficult to get good results.

  • Common Pitfalls: I would say define the objective of the hackathon very clearly. Limiting the scope and the task to accomplish is key, because in robotics things can get very complicated very fast, and you end up spending more time fixing hardware issues than developing something useful. A small, focused objective is the way to go. Also, hobby robots that are not plug-and-play will drain most of your time, so recommending some easy-to-use robots is advisable.
    Trying to reinvent the wheel is also very common in robotics, and you don’t need to. There are loads of ROS 2 packages and similar that give you complex functionality out of the box.

  • Recommended Starter Kits: It all depends on the project, but I would recommend a simple robot that already has all the basic hardware and systems ready. Asking people to assemble a robot, get it working, and only then start developing an application normally doesn’t turn out very well. At the company I work for, TheConstruct, we have developed an integrated solution for that: a simple mobile robot that’s all you really need to get started fast in robotics, with ROS 2 integrated, laser, camera, odometry, gripper, and remote connection. I’ll leave the info here: INFO DOC DRIVE PDF.

Again, I highly advise that whatever kit we use is as plug-and-play as possible and has a simulated version.
If what we need is more of a mobile manipulator, for learning by touching the cup using RGBD depth sensors, I would encourage doing it in simulation first, because there really aren’t any affordable manipulators that are ready to move the end effector to a commanded pose with force feedback or touch detection. If people want to do some tinkering and hardware integration, then whatever robot arm is already on the list provided is fine. Elephant Robotics arms are OK for that; there is existing work, although, as with all robot arms, the integration of the gripper (or, in our case, a contact sensor) will have to be developed.

  • Essential Tools: The easy one is ROS/ROS 2. If you are into robotics, knowing ROS is fundamental for perception, navigation, frame transformations, and manipulation; loads of capabilities are already working there. It also gives an easy way to access sensor data once you know the basics of ROS. Using robot APIs is also important, because not all robots make their ROS 2 systems available to users; an API is the simplest way to access all the systems without prior knowledge, but you have to know a bit about API access. RViz inside ROS is one of the most powerful tools for robotics and helps a lot with perception.

  • Useful Software: Again, ROS systems are meant for that. Once the robot and laptop are on the same network, your sensor can publish its data to a topic, and the laptop processes it and then sends commands back through another topic (see the minimal sketch after this list). Remote robot connections even allow you to work with robots that are not in the same country. That’s how we do it at TheConstruct: students do their exams and final projects using robots that are on the other side of the globe and too expensive for a regular Joe to own.

  • Any Other Wisdom? I think the best advice is to give participants a simulated environment so they can iterate quickly on their systems and Monty integration; then, because the simulation uses ROS or an API, integrating with the real robot will be much easier. Robotics is a discipline where hardware strongly affects the success of a project. Having reliable hardware with transparent API/ROS systems that users can treat just like the simulation makes a night-and-day difference. If you don’t do that, people will spend 90% of their time fixing issues unrelated to what you actually want, which is for people to develop cool apps with Monty and push its boundaries.
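Since the “publish sensor data to a topic, send commands back through another topic” pattern came up above, here is a minimal rclpy sketch of that loop. The topic names (/scan, /cmd_vel) are common conventions rather than guarantees for any particular robot, and the decision logic is just a placeholder where Monty would slot in.

```python
# Minimal ROS 2 (rclpy) sketch: subscribe to a laser topic, publish velocity commands.
# Topic names are common defaults, not robot-specific guarantees.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

class MontyBridge(Node):
    def __init__(self):
        super().__init__("monty_bridge")
        self.cmd_pub = self.create_publisher(Twist, "/cmd_vel", 10)
        self.create_subscription(LaserScan, "/scan", self.on_scan, 10)

    def on_scan(self, scan: LaserScan) -> None:
        cmd = Twist()
        # Placeholder policy: creep forward unless something is within 0.3 m.
        if min(scan.ranges) > 0.3:
            cmd.linear.x = 0.1
        self.cmd_pub.publish(cmd)

def main():
    rclpy.init()
    rclpy.spin(MontyBridge())
    rclpy.shutdown()

if __name__ == "__main__":
    main()
```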

Hope this helps. I have many other observations, but I would say these are the most important points :wink:

8 Likes