Leaphy Robot
| Leaphy robot with subsumption architecture | |
|---|---|
| Participants | |
| Skills | Software, hardware, electronics, biology |
| Status | Active |
| Niche | robotics |
| Purpose | Fun |
| Tool | No |
| Location | Space, home |
| Cost | Very small given what is already present and what is available in the space (everything!!) |
| Tool category | Electronics |
Leaphy robot with subsumption architecture
Leaphy back side view.jpg
Electronic insect-like creature
Physiological, Behavioural and environmental overview
peterr, 2022-2025
Document history
- Currently this is a living document and this wiki contains the master version
- September 2025 subsumption architecture design & transfer to space wiki
- August 2025 Documentation for scheduler
- March 2023 Draft version 0.2 - terminology cleanup and scope
- February 2023 Draft version 0.1
- December 2022 Partial documentation/brainstorming
- November 2022 First document sketches & ideas
Project
The insect-like creature (based on the original Leaphy robot) has a set of functionalities that together allow it to effectively find, and feed on, targets that provide electricity as a reward. Learning strategies to improve the success rate can be implemented. Sexual signalling, event prediction, mating behaviour, genetic exchange and evolution are in principle possible but are currently still (far) out of scope. At its highest level the system can be seen as a feedback loop: sensory input => behavioural selection => motor output => new sensory input. The original design (2022) was meant to live in a wind tunnel and find odour sources to feed on, but the emphasis has since shifted from odour detection to more general sensory processing using a subsumption architecture (see below).
Sensory inputs
- Mechanoreceptor - sensing of obstacles
- Micro-switch-based tactile hairs.
- Development status: Not yet implemented
- Ultrasound - Distance sensing of obstacles
- Ultrasonic pulses detect obstacles and trigger reflex avoidance behaviour
- Development status: Operational
- Better directional responses would be possible with more than one US sensor.
- This would need a dedicated embedded processor.
- Development status: Not yet implemented
- Sensing light intensity
- Two LDR photoreceptors measure front-left light level and front-right light level
- Total light intensity (sum) is available as an analog voltage
- The difference between the two inputs is also available and indicates which side to turn to in order to approach or avoid the light (a sketch of this is given after this list).
- Development status: Operational
- Sensing sound (audio)
- A subunit with its own processor, containing an electret microphone to measure ambient sound level, frequency and sound patterns.
- Can be used in classical (Pavlovian) and operant conditioning experiments. Will detect absolute sound levels (for startle or avoidance responses) and can contain feature detectors (e.g. chirp intervals for leafhopper-like communication and courting behaviour between two Leaphys).
- In addition, an Audeme MOVI speech recognition shield (see https://www.audeme.com/movi.html) could be used to process voice commands.
- Development status: Not yet implemented, but Movi shield "proof of concept" done
- PIR movement sensor
- In addition to light intensity, an infrared motion sensor can be used to detect movement (humans, or the cat for that matter), see for instance https://www.adafruit.com/product/4871
- Development status: Not yet implemented, but a PIR sensor (similar to the Adafruit module above) is available.
- Sensing wind direction
- Semiconductor-based tactile hairs? Something like https://www.instructables.com/Low-Cost-Low-Speed-Wind-Sensor-1/ with shielding, or sensitive temperature measurements?
- Note
- Wind direction sensing in the wind tunnel is currently replaced by sensing light gradients parallel to the wind direction because the wind sensor is not available yet.
- Development status: To be designed. High priority for the wind tunnel version with odour processing as a main goal, low priority for the free-walking space/Nurdspace version described here.
- Volatiles (odours)
- Semiconductor-based perception of volatiles like alcohol (fruit-fly-like behaviours), carbon dioxide (mosquito-like behaviour), or VOCs (most plant-eating insects). The sensor produces a voltage level proportional to the intensity of the stimulus. Applying a threshold can generate a presence/absence signal.
- The interval between odour pulses is a proxy for the distance to the source. Note that an odour "gradient" does not exist!
- The current implementation uses an SGP30 board. The time constant of the sensor is in the order of 1 second, but "whiffs" of odour could be detected faster by using a feature detector that looks for sudden steps in the signal (a sketch of this is given after this list). This will also act as a sensory adaptation mechanism: in the absence of change the output goes to zero.
- Development status: Sensor worked, but is currently broken.
- Sensing ground color
- This sensor measures ground colour, and the output can be used to test for a particular condition (e.g. being at a target position). It can also be used to simulate trail pheromones, implemented as colours next to the trail being followed. This sense can also take the role of a short-distance "present at target" stimulus.
- Development status: Operational
- Sensing ground reflection level
- Two 'line follower' sensors that each provide a binary present/absent signal of ground reflection.
- Can be used to simulate trail-following behaviour on a network of lines (combined with (LED) ground colours).
- Development status: testing phase
- Future extensions: A combination of voltage levels and pulse frequency coding in the source could be used to simulate a spectrum of "chemical" inputs.
- Feature detectors could extract particular frequencies and/or patterns to recognise for instance "toxic" substances.
- Proprioception (internal "feeling")
- Checking, for instance, for body movement when the motors run, or for force feedback when extending the electrical honey-sucking mouthparts.
- Development status: To be developed, as yet low priority.
- Sensing internal physiological states
- To place electric reward information in perspective, some internal status information is needed. This is supplied by reading internal variables like energy levels (battery voltage, load current) or physiological state (internal variables like "ready to feed").
- More advanced implementations could also look up previous experiences, the rate of change in physiological status, brain hormone levels and excitation states (see "Sensing hormone levels" below).
- Development status: To be designed.
- Sensing hormone levels
- Certain activities, or combinations of activities, can stimulate the production of hormones. This could be implemented as slowly changing values in memory (a sketch of this is given after this list). Hormone levels might modulate the effect of sensory input or behavioural output, and act as a slow form of memory.
- Sensing rewards and deterrents
- Electronic organisms live on electricity, and are rewarded by sensing the presence of a voltage and current. The sense organ simply measures an analog voltage; it simulates the perception of food such as nectar. The most logical position is at the tip of an appendage.
- Development status: Implemented and removed again (needs multiplexing). Mechanical movement (servo) to make electrical contact: testing phase.
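The left/right light sense described above boils down to a sum and a difference of two analog readings. Below is a minimal Arduino-style sketch of that computation; the pin numbers, thresholds and serial output are illustrative assumptions, not the actual Leaphy wiring or motor interface.

```cpp
// Minimal sketch of the two-LDR light sense: total intensity (sum) and
// turn direction (difference). Pins and thresholds are assumptions.
const int LDR_LEFT  = A0;      // front-left photoreceptor (voltage divider)
const int LDR_RIGHT = A1;      // front-right photoreceptor (voltage divider)
const int DIFF_DEADBAND = 30;  // ignore small left/right differences

void setup() {
  Serial.begin(9600);
}

void loop() {
  int left  = analogRead(LDR_LEFT);   // 0..1023
  int right = analogRead(LDR_RIGHT);  // 0..1023

  int total = left + right;           // overall light intensity
  int diff  = left - right;           // > 0: more light on the left side

  Serial.print("total=");
  Serial.print(total);

  // Positive phototaxis: turn towards the brighter side.
  // Negate diff to get avoidance instead of approach.
  if (diff > DIFF_DEADBAND) {
    Serial.println(" turn left");
  } else if (diff < -DIFF_DEADBAND) {
    Serial.println(" turn right");
  } else {
    Serial.println(" go straight");
  }

  delay(50);  // roughly the 50 msec brain tick used elsewhere on this page
}
```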
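The whiff/step detector mentioned under the odour sense could look like the sketch below: a slowly adapting baseline tracks the raw signal and only sudden increases above it count as a whiff, so a constant odour level adapts back to zero. readOdourLevel() is a hypothetical stand-in for the real SGP30 driver call, and the constants are illustrative.

```cpp
// Sketch of a "whiff" (step) detector with sensory adaptation.
// readOdourLevel() and all constants are placeholders, not the real driver.
float baseline = 0.0;                // slowly adapting background estimate
const float ADAPT_RATE = 0.02;       // how fast the baseline follows the signal
const float WHIFF_THRESHOLD = 50.0;  // minimum step above baseline

float readOdourLevel() {
  // Hypothetical placeholder: read an analog pin so the sketch compiles
  // stand-alone; the real version would query the SGP30 board instead.
  return (float)analogRead(A2);
}

bool whiffDetected() {
  float raw  = readOdourLevel();
  float step = raw - baseline;                 // positive when the signal jumps
  baseline  += ADAPT_RATE * (raw - baseline);  // adaptation: creep towards raw
  return step > WHIFF_THRESHOLD;               // constant odour adapts to "no whiff"
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (whiffDetected()) {
    Serial.println("whiff!");  // intervals between whiffs hint at source distance
  }
  delay(100);                  // the sensor itself is slow; ~10 Hz is plenty
}
```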
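The hormone idea (slowly changing values in memory that modulate behaviour) could be sketched as a leaky integrator: events add to a level that decays a little every brain tick and shifts a behavioural threshold. All names and constants here are made up for illustration.

```cpp
// Sketch of a "hormone" as a slowly decaying value in memory.
// Names and constants are illustrative only.
float feedingHormone = 0.0;          // rises after successful feeding events
const float DECAY_PER_TICK = 0.999;  // slow exponential decay per 50 msec tick

void onFeedingReward() {             // call when the reward sense fires
  feedingHormone += 1.0;
}

// Example of modulation: a high hormone level makes the light-approach
// behaviour less eager, as if the animal is satiated.
bool wantsToApproachLight(int lightTotal) {
  int threshold = 400 + (int)(feedingHormone * 20.0);  // illustrative numbers
  return lightTotal > threshold;
}

void setup() {}

void loop() {
  feedingHormone *= DECAY_PER_TICK;  // decay a little every brain tick
  delay(50);
}
```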
Brain
- Subsumption architecture
- The electronic organism has a software brain that receives sensory input both from the environment and from its internal status. The brain reads all inputs roughly every 50 msec and produces commands for actuators, which produce behaviour. These behaviours can be as simple as a colour change of an LED (signalling sensory status) or movements of the body and appendages. The aim is to make the lower levels as stateless as possible, i.e. the brain uses its real-time view of the world as a model instead of some abstract internal representation. The processing principle roughly follows the subsumption architecture (Brooks 1986).
- Wikipedia writes about this technique:
- "[Behaviours] are organized into a hierarchy of layers. Each layer implements a particular level of behavioral competence, and higher levels are able to subsume lower levels (= integrate/combine lower levels to a more comprehensive whole) in order to create viable behavior. For example, a robot's lowest layer could be "avoid an object". The second layer would be "wander around", which runs beneath the third layer "explore the world".[...]The subsumption architecture creates a system in which the higher layers utilize the lower-level competencies. The layers, which all receive sensor-information, work in parallel and generate outputs. These outputs can be commands to actuators, or signals that suppress or inhibit other layers."
- This description does not make fully clear that the reverse is also true, i.e. active lower levels will always subsume higher levels. The behavioural layers all compete for control of the robot. Each behavioural layer evaluates its sensory input and determines whether it wants to control the outputs, in order to change its input according to its (implied) goal. For example, the mechanoreceptors (microswitches) will only request control if a switch is closed and this request is not explicitly inhibited by a higher-level behaviour. That inhibition could come from a behavioural layer such as "approach and bump into a feeding spot". As a rule, the lowest layer that requests control will run the robot for the next 50 msec (a sketch of this arbitration is given below).
- See Anderson (2007) for a clear explanation and actual implementations.
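A minimal sketch of this arbitration in Arduino-style C++: each layer inspects the current sensor state, says whether it wants control and proposes a command, and the lowest requesting layer that is not inhibited drives the robot for the next 50 msec tick. It also illustrates the sensory input => behavioural selection => motor output loop from the Project section. Layer names, sensor variables and the inhibition flag are illustrative, not the actual firmware.

```cpp
// Minimal subsumption-style arbitration: one brain tick every 50 msec.
// Sensor variables, layers and commands are illustrative only.

enum Command { STOP, FORWARD, TURN_LEFT, TURN_RIGHT, BACK_UP };

struct LayerOutput {
  bool wantsControl;  // does this layer request the actuators this tick?
  Command command;    // what it would do if it wins
};

// --- fake sensor state, assumed to be refreshed elsewhere every tick ---
bool bumperClosed  = false;  // mechanoreceptor microswitch
int  obstacleCm    = 100;    // ultrasound distance reading
bool atFeedingSpot = false;  // ground colour sense says "target reached"
bool inhibitAvoid  = false;  // set by a higher layer to suppress the reflex

// Layer 0 (lowest): reflex obstacle avoidance.
LayerOutput avoidLayer() {
  bool trigger = (bumperClosed || obstacleCm < 10) && !inhibitAvoid;
  return { trigger, BACK_UP };
}

// Layer 1: approach and bump into a feeding spot.
LayerOutput feedLayer() {
  return { atFeedingSpot, FORWARD };
}

// Layer 2 (highest): wander around; always willing to run.
LayerOutput wanderLayer() {
  return { true, FORWARD };
}

Command arbitrate() {
  // A higher layer may first inhibit a lower one; here "bump into a
  // feeding spot" suppresses the avoidance reflex so contact is allowed.
  inhibitAvoid = atFeedingSpot;

  // The lowest layer that requests control runs the robot this tick.
  LayerOutput layers[] = { avoidLayer(), feedLayer(), wanderLayer() };
  for (const LayerOutput &l : layers) {
    if (l.wantsControl) return l.command;
  }
  return STOP;
}

void setup() {}

void loop() {
  Command c = arbitrate();
  // driveMotors(c);  // actuator interface intentionally left out here
  (void)c;
  delay(50);          // the ~50 msec brain tick
}
```

In this sketch the avoidance reflex wins whenever the bumper is pressed or an obstacle is close, unless the feeding layer has inhibited it; otherwise the feeding layer wins when a target is detected, and the wander layer runs by default.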
More soon, this page is work in progress
References
- Anderson, D.P. (2007) https://www.dprg.org/articles/2007-03a/
- Bettosini et al. (2022) Torocó: A Subsumption Architecture Implementation, in: 2022 8th International Conference on Automation, Robotics and Applications (ICARA), IEEE, Prague, Czech Republic, pp. 27–32. https://doi.org/10.1109/ICARA55094.2022.9738521
- Brooks, R.A. (1986) A Robust Layered Control System for a Mobile Robot, IEEE Journal of Robotics and Automation RA-2, 14-23
- Rickenbacher (1975) Lernen und Motivation als relevanzgesteuerte Datenverarbeitung: Ein Computer-Simulationsmodell elementarer kognitiv-affektiver Prozesse. Birkhäuser Verlag, Basel.