I’ve floated this use case, informally, in a couple of forum topics:
- Design of Final-Year Undergraduate Project with the Inclusion of the TBP! - #20 by Rich_Morin
- Learning Categories of Objects - #5 by Rich_Morin
However, I haven’t gone into much detail. So, here is a quick stab at an informal sales pitch and problem description. (Note: I’m a complete newb on this topic, so please cut me some slack. :-)
Sales Pitch
Lots of folks would like to have “nice” gardens, but (a) they may not know much about plants and (b) their opinions, situations, and tastes are likely to vary markedly. Meanwhile, there are assorted lawn-mowing robots, water controllers, and other automated assistants on the market. More to the point, I’m certain that a variety of AI-enabled (whatever that means) robots will soon be available.
In short, I think more or less capable gardening robots will be showing up Real Soon Now and that Monty might be able to play a valuable role. To be clear, I don’t think a near-term version of Monty could handle this use case on its own, but I think it could if supported by some LLM magic, etc. Then again, I doubt that LLM-based AI could perform all of the desired tasks particularly well.
Desired Capabilities
In order to properly encourage (e.g., fertilize, water) or control (e.g., trim, weed) a set of plants, a gardening robot will need to identify each plant, to varying degrees of specificity:
- common names (e.g., tree, fruit tree, apple tree, Fuji apple tree)
- botanical names (e.g., species, subspecies, variety, cultivar)
In some cases, the needed information on the plant and associated actions will be quite minimal. For example: broadleaf weed or thistle (discourage), ground cover, lawn, or tree (trim, water occasionally). In others, there may be very specific actions, such as reporting and/or removing pests.
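To make the “varying degrees of specificity” idea concrete, here is a minimal sketch (all class names, fields, and action strings are my own invention, not anything from Monty) of how a plant’s identity and its implied actions might be recorded:

```python
# Hypothetical sketch: a plant's identity at varying specificity,
# plus the gardening actions that identity implies.
from dataclasses import dataclass, field


@dataclass
class PlantID:
    common_names: list[str]  # coarse to fine, e.g. tree -> Fuji apple tree
    botanical: dict[str, str] = field(default_factory=dict)  # rank -> name
    actions: list[str] = field(default_factory=list)  # what the robot should do


fuji = PlantID(
    common_names=["tree", "fruit tree", "apple tree", "Fuji apple tree"],
    botanical={"species": "Malus domestica", "cultivar": "Fuji"},
    actions=["trim", "water occasionally"],
)

thistle = PlantID(
    common_names=["broadleaf weed", "thistle"],
    actions=["discourage"],
)


def plan(plant: PlantID) -> list[str]:
    # The coarsest identification that still picks an action may be enough;
    # otherwise, punt to the human gardener.
    return plant.actions or ["observe and ask the gardener"]
```

The point of the sketch is that a full botanical identification is often unnecessary: “thistle, discourage” is actionable all by itself.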
Using its own collected information and (say) some MCP-based resources, Monty should be able to look up lots of descriptive information, best practices, caveats, etc. Note that the needed information will depend a lot on the task(s) the robot is expected to perform: weeding a lawn or a raised bed is very different from trimming a hedge.
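As a toy illustration of that task dependence (the table and its entries are invented for illustration; a real robot might fetch such guidance via MCP-based resources):

```python
# Hypothetical sketch: the same garden feature maps to different guidance
# depending on what the robot is asked to do.
CARE_NOTES = {
    ("lawn", "weeding"): "remove broadleaf weeds; leave grass alone",
    ("hedge", "trimming"): "shape lightly; avoid active bird nests",
}


def look_up(subject: str, task: str) -> str:
    # Fall back to the human gardener when no guidance is on file.
    return CARE_NOTES.get((subject, task), "no guidance; consult the gardener")
```

The lookup key is (subject, task), not just the plant: that is the sense in which the needed information depends on the task at hand.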
Monty should also be able to “learn” the shape of the garden, how to navigate and traverse it, etc., as well as the capabilities and limitations of its physical platform. There are all sorts of platforms that could wander (e.g., crawl, roll, slither) around in a garden, so the same high-level goals might turn into very different low-level actions.
I certainly don’t want to have to explicitly show a gardening robot much, let alone install barriers, physical tags, etc. That is, the robot should handle most issues on its own, only consulting me about judgement calls, unexpected discoveries, etc.
Expected Challenges
There will be plenty of expected (and unexpected!) challenges:
- Pests and plants may look quite different, depending on the:
  - camera type(s) (e.g., IR, monochrome, color, UV)
  - current weather (e.g., bright and sunny, foggy, overcast)
  - frequency distribution (e.g., intensity at each color)
  - frequency spectrum (e.g., infra-red, visible, ultraviolet)
  - illumination type (e.g., diffuse, direct, indirect)
  - time of day (e.g., dawn, morning, noon, afternoon, dusk)
  - …
- Plants may respond differently to seasons, weather patterns, etc.
- Different instances of “the same type of plant” may have variations, even if they have the same species, subspecies, variety, cultivar, etc.
- Plants may look similar but have completely different genetics (convergent evolution FTW!).
- Categories may have very important exceptions:
  - “nasty thistles” include artichokes
  - “nasty, thorny plants” include blackberries and roses
  - …
- Users will have very different preferences. For example, our garden is optimized to provide food for bees, birds, bunnies, and humans, so we grow acorn squash, artichokes, asparagus, blackberries, grapes, tomatoes, sunflowers, wildflowers, etc. None of our neighbors have any of these. Then again, our azaleas, dogwoods, and rhododendrons are pretty typical for the area.
- …
However, figuring out this sort of thing “on the fly” seems to be exactly the sort of thing that Monty should excel at.
Scale invariance, Structure, etc.
AIUI, botanists (and especially botanical taxonomists) categorize plants using large numbers of observable characters – things like number of petals, seed characteristics, vein structure, etc. The actual size of a full-grown plant, not so much (YMMV :-).
Meanwhile, the Burpee catalog pays very little attention to this sort of thing, concentrating instead on appearance, habit, hardiness, etc. I suspect that a gardening robot would need to live in both worlds, but mostly pay attention to the latter.