Leaphy Robot

Leaphy robot with subsumption architecture
[Image: Leaphy back side view.jpg]
Participants:
Skills: software, hardware, electronics, biology
Status: Active
Niche: robotics
Purpose: Fun
Tool: No
Location: Space, home
Cost: Very small given what is already present and what is available in the space (everything!!)
Tool category: Electronics


Electronic insect-like creature

Physiological, Behavioural and environmental overview
peterr, 2022-2025 (with very long pauses)

Document history

  • Currently this is a living document and this wiki contains the latest version
  • September 2025 subsumption architecture design & transfer to nurdspace wiki
  • August 2025 Documentation for scheduler
  • March 2023 Draft version 0.2 - terminology cleanup and scope
  • November 2022 First document sketches & ideas






Project

The insect-like creature, based on the original Leaphy robot, has a set of functionalities that together allow it to effectively find, and feed on, targets that provide electricity as a reward. Learning strategies to improve the success rate can be implemented. Sexual signaling, event prediction, mating behaviour, genetic exchange and evolution are in principle possible but are currently still (far) out of scope. At its highest level the system can be seen as a feedback loop: sensory input => behavioural selection => motor output => new sensory input. The original design (2022) was meant to live in a wind tunnel and find odour sources to feed on, but the emphasis has since shifted from odour detection to more general sensory processing using a subsumption architecture, see below.

Sensory inputs

Mechanoreceptor - sensing of obstacles
Microswitch-based tactile hairs.
Development status: Not yet implemented
Ultrasound - Distance sensing of obstacles
Ultrasonic pulses detect obstacles and trigger reflex avoidance behaviour
Development status: Operational
Better directional responses would be possible with more than one US sensor.
This would need a dedicated embedded processor.
Development status: Not yet implemented
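As an illustration, a minimal Arduino sketch of this distance reflex input could look like the one below. The HC-SR04 style sensor, the pin numbers and the 20 cm threshold are assumptions for the sketch, not the documented hardware.

  // Ultrasonic obstacle sense; sensor type, pins and threshold are example values.
  const int TRIG_PIN = 7;
  const int ECHO_PIN = 8;
  const long OBSTACLE_CM = 20;              // below this distance the avoidance reflex requests control

  void setup() {
    pinMode(TRIG_PIN, OUTPUT);
    pinMode(ECHO_PIN, INPUT);
    Serial.begin(9600);
  }

  long readDistanceCm() {
    digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
    digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);   // 10 us trigger pulse
    digitalWrite(TRIG_PIN, LOW);
    long us = pulseIn(ECHO_PIN, HIGH, 30000UL);            // give up after ~30 ms
    if (us == 0) return 999;                               // no echo: treat as "no obstacle"
    return us / 58;                                        // round-trip time to centimetres
  }

  void loop() {
    bool obstacle = (readDistanceCm() < OBSTACLE_CM);      // request flag for the avoidance layer
    Serial.println(obstacle ? "obstacle" : "clear");
    delay(50);                                             // one brain tick
  }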
Sensing light intensity
Two LDR photoreceptors measure front-left light level and front-right light level
Total light intensity (sum) is available as an analog voltage
The difference between the two inputs is also available, and signals which side to turn to in order to approach or avoid the light.
Development status: Operational
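A sketch of how the two photoreceptors could be read; the analog pin assignment is an assumption.

  // Two-LDR light sense; pin mapping is an example, not the documented wiring.
  const int LDR_LEFT  = A0;                 // front-left photoreceptor
  const int LDR_RIGHT = A1;                 // front-right photoreceptor

  void setup() { Serial.begin(9600); }

  void loop() {
    int left  = analogRead(LDR_LEFT);       // 0..1023
    int right = analogRead(LDR_RIGHT);
    int intensity  = left + right;          // total light level, usable for kinesis
    int difference = left - right;          // sign tells which side is brighter, usable for taxis
    Serial.print(intensity); Serial.print(" "); Serial.println(difference);
    delay(50);                              // one brain tick
  }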
Sensing sound (audio)
A subunit with its own processor, containing an electret microphone to measure ambient sound level, frequency and sound patterns.
Can be used in classical (Pavlovian) and operant conditioning experiments. Will detect absolute sound levels (for startle or avoidance responses) and can also contain feature detectors (e.g. chirp intervals for leafhopper-like communication and courting behaviour between two Leaphys).
In addition, an Audeme Movi speech recognition shield (see https://www.audeme.com/movi.html) could be used to process voice commands
Development status: Not yet implemented, but Movi shield "proof of concept" done
PIR movement sensor
In addition to light intensity, an infrared motion sensor can be used to detect movement (humans, or the cat for that matter), see for instance https://www.adafruit.com/product/4871
Development status: Not yet implemented but PIR (similar to the adafruit thingy) available.
Sensing wind direction
Semiconductor-based tactile hairs? Something like https://www.instructables.com/Low-Cost-Low-Speed-Wind-Sensor-1/ with shielding? Or sensitive temperature measurements?
Note
Wind direction sensing in the wind tunnel is currently replaced by sensing light gradients parallel to the wind direction because the wind sensor is not available yet.
Development status: To be designed; high priority for the wind tunnel version with odour processing as a main goal, low priority for the free-walking space/nurdspace version described here.
Volatiles (odours)
Semiconductor-based perception of volatiles like alcohol (fruit-fly like behaviours), carbon dioxide (mosquito-like behaviour), or VOCs (most plant-eating insects). The sensor produces a voltage level proportional to the intensity of the stimulus. Applying a threshold can generate a presence/absence signal.
The interval between odour pulses is a proxy for the distance to the source. Note that an odour "gradient" does not exist!
The current implementation uses an SGP30 board. The time constant of the sensor is in the order of 1 second, but "whiffs" of odour could be detected faster by using a feature detector that looks for sudden steps in the signal. This also acts as a sensory adaptation mechanism: in the absence of change the output goes to zero.
Development status: Sensor worked, currently broken
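The step/"whiff" detector could be sketched as a slowly adapting baseline. In the sketch below readOdour() is only a placeholder for the actual SGP30 (I2C) reading, and the adaptation rate and threshold are example values.

  // "Whiff" feature detector with slow sensory adaptation (illustrative values).
  float readOdour() { return 0.0; }        // placeholder for the SGP30 reading (e.g. TVOC)

  float baseline = 0.0;                    // slowly adapting baseline
  const float ADAPT = 0.02;                // adaptation per 50 msec tick (example value)
  const float WHIFF_THRESHOLD = 30.0;      // step size that counts as a whiff (example value)

  bool detectWhiff() {
    float reading = readOdour();
    float step = reading - baseline;             // deviation from the adapted baseline
    baseline += ADAPT * (reading - baseline);    // adapt: with a constant input the output decays to zero
    return step > WHIFF_THRESHOLD;               // true on a sudden increase ("whiff")
  }

Called once per brain tick, the interval between detected whiffs can then serve as the distance proxy mentioned above.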
Sensing ground color
This sensor measures ground colour and the output can be used to test for a particular condition (e.g. being at a target position). It can also be used to simulate trail pheromones, implemented as colours next to the trail followed. This sense can also take the role of a short-distance "present at target" stimulus.
Development status: Operational
Sensing ground reflection level
Two 'line follower' sensors that each provide a binary present/absent signal of ground reflection.
Can be used to simulate trail following behaviour on a network of lines (combined with (LED) ground colours).
Development status: testing phase
Future extensions: A combination of voltage levels and pulse frequency coding in the source could be used to simulate a spectrum of "chemical" inputs.
Feature detectors could extract particular frequencies and/or patterns to recognise for instance "toxic" substances.
Proprioception (internal "feeling")
Checking, for instance, for body movement when the motors run, or for force feedback when extending the electrical honey-sucking mouthparts.
Development status: To be developed, as yet low priority.
Sensing internal physiological states
To place electric reward information in perspective, some internal status information is needed. This is supplied by reading internal variables like energy levels (battery voltage, load current) or physiological state (internal variables like "ready to feed").
More advanced implementations could look up previous experiences and the rate of change in physiological status, as well as brain hormone levels and excitation states (see below under "Excitation state and hormones").
Development status: To be designed.
Sensing hormone levels
Certain activities or combinations of activities can stimulate the production of hormones. This could be implemented as slowly changing values in memory. Hormone levels might modulate the effect of sensory input or behavioural output, and act as a slow form of memory.
Sensing rewards and deterrents
Electronic organisms live on electricity and are rewarded by sensing the presence of a voltage and current. This sensory organ simply senses an analog voltage at its input; it simulates the perception of food such as nectar. The most logical position is at the tip of an appendage.
Development status: Implemented and removed (needs multiplexing). Mechanical movement (servo) to make electrical contact: testing phase.

Brain

Subsumption architecture
The electronic organism has a software brain that receives sensory input both from the environment and from its internal status. The brain reads all inputs about every 50 msec and produces commands for actuators that produce behaviour. These behaviours can be as simple as a colour change of an LED (signaling sensory status) or movements of the body and appendages. The aim is to keep the lower levels as stateless as possible, i.e. the brain uses its real-time view of the world as its model instead of some abstract internal representation. The processing principle roughly follows the subsumption architecture (Brooks 1986).
Wikipedia writes about this technique:
"[Behaviours] are organized into a hierarchy of layers. Each layer implements a particular level of behavioral competence, and higher levels are able to subsume lower levels (= integrate/combine lower levels to a more comprehensive whole) in order to create viable behavior. For example, a robot's lowest layer could be "avoid an object". The second layer would be "wander around", which runs beneath the third layer "explore the world".[...]The subsumption architecture creates a system in which the higher layers utilize the lower-level competencies. The layers, which all receive sensor-information, work in parallel and generate outputs. These outputs can be commands to actuators, or signals that suppress or inhibit other layers."
The behavioural levels all compete for control of the robot. Each behavioural layer evaluates its sensory input and determines whether it wants to control the outputs (to change its input according to its (implied) goal); e.g. the mechanoreceptor layer (microswitches) will only request control if a switch is closed. After all the behavioural levels have been evaluated and their request flags set or reset, an arbitrate routine selects the highest active layer, which then drives the robot's behavioural output for the next 50 msec.
See Anderson (2007) for a clear explanation and actual implementation examples.
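To make the request-flag and arbitration idea concrete, a minimal sketch in Arduino C++ could look like this; the layer names, their priorities and the empty act() bodies are illustrative placeholders, not the actual implementation.

  // Layer/arbitration sketch: each layer sets a request flag from its sensory
  // input; arbitrate() lets the winning layer drive the robot for one 50 msec tick.
  struct Layer {
    bool requesting;              // set by the layer's sense step
    void (*act)();                // behaviour to run if this layer wins arbitration
  };

  void avoidAct()  { /* back up and rotate */ }
  void taxisAct()  { /* steer towards light */ }
  void wanderAct() { /* random walk */ }

  // Layers earlier in the array win arbitration first (reflexes before wandering);
  // this ordering is an example choice.
  Layer layers[] = { {false, avoidAct}, {false, taxisAct}, {false, wanderAct} };
  const int N_LAYERS = sizeof(layers) / sizeof(layers[0]);

  void senseAll() {
    layers[0].requesting = /* obstacle closer than threshold? */ false;
    layers[1].requesting = /* usable light gradient? */ false;
    layers[2].requesting = true;      // wander is always willing to run
  }

  void arbitrate() {
    for (int i = 0; i < N_LAYERS; i++) {
      if (layers[i].requesting) { layers[i].act(); return; }   // first requesting layer wins
    }
  }

  void setup() {}

  void loop() {
    senseAll();
    arbitrate();
    delay(50);                        // one behavioural tick
  }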
Brain subsystems
More complicated or slow tasks (like sound processing or US echo detection) can be executed on separate processor boards that report their status to the main controller. Conceptually this is no different from subroutines for tasks.
Development status: To be developed
Nervous system
The currently (sort of) running implementation of the brain uses a real-time loop in a very simple cooperative multitasking system, implemented on an Arduino Uno.
Development status: operational, but under active development.
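The scheduler itself is documented elsewhere; as a rough sketch, the cooperative real-time loop could be structured like this. The 50 msec tick and the three task names follow the description above, the rest is an assumption.

  // Simple cooperative loop: tasks are plain functions called once per tick
  // and must return quickly (no blocking delays inside them).
  const unsigned long TICK_MS = 50;          // one behavioural tick
  unsigned long lastTick = 0;

  void readSensors()    { /* poll all sensory inputs */ }
  void runBehaviours()  { /* evaluate the layers and arbitrate */ }
  void driveActuators() { /* write motor PWM, servo and LED outputs */ }

  void setup() { lastTick = millis(); }

  void loop() {
    unsigned long now = millis();
    if (now - lastTick >= TICK_MS) {         // overflow-safe interval check
      lastTick = now;
      readSensors();
      runBehaviours();
      driveActuators();
    }
    // Other non-blocking housekeeping (e.g. serial I/O) can run between ticks.
  }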
Associative memory
Associative memory, in its current simplest form, will be implemented as memory locations for particular sensory input combinations. Inputs that coincide increase a counter for that combination (learning those combinations). This allows both Pavlovian (classical) and operant conditioning.
Development status: not yet implemented, proof of concept done ~50 years ago :).
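A possible minimal form, matching the description above, is one counter per combination of binary sensory inputs; the choice of four inputs and the bit packing below are only illustrative.

  // Simplest associative memory: a counter per input combination.
  const int N_INPUTS = 4;                       // e.g. sound, light, ground colour, reward
  unsigned int assoc[1 << N_INPUTS] = {0};      // one counter per combination

  int inputKey(const bool in[N_INPUTS]) {       // pack the binary inputs into an array index
    int key = 0;
    for (int i = 0; i < N_INPUTS; i++) if (in[i]) key |= (1 << i);
    return key;
  }

  void learn(const bool in[N_INPUTS]) {         // called once per tick with the current inputs
    int key = inputKey(in);
    if (key != 0 && assoc[key] < 65535) assoc[key]++;   // coinciding inputs strengthen that entry
  }

A behavioural layer can later compare assoc[key] against a threshold to decide whether a previously neutral stimulus should now trigger, for example, feeding behaviour.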
Excitation state and hormones
Hormones are internal variables with levels that act as slow changing memory for previous states.
A default breakdown speed can be set that slowly decreases hormone levels. Particular sensory inputs could increase the levels again. Reaching high levels of certain hormones can be a goal in itself, and subject to learning behaviour.
Development status: To be developed
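A sketch of how such a hormone could be represented; the decay factor and increment are example values.

  // A hormone as a slowly decaying internal variable, updated once per brain tick.
  float excitation = 0.0;                  // example hormone level, 0..100
  const float BREAKDOWN = 0.99;            // default breakdown factor per tick (example value)

  void hormoneTick(bool stimulated) {
    excitation *= BREAKDOWN;               // slow decay towards zero
    if (stimulated) excitation += 5.0;     // particular sensory input raises the level again
    if (excitation > 100.0) excitation = 100.0;
  }

Layers can read excitation to modulate their behaviour, giving a slow memory of recent stimulation.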
Satisfaction
Signals to end feeding behaviour can, in addition to the electrical nectar running out, also come from internal signals (satiation, internal hormonal signals, (too) long time periods).
Future prediction
This is currently way out of scope, but neural-net like implementations that store event sequences, predict likely outcomes of current inputs and act accordingly are fully feasible.
See Rickenbacher 1975 for an example that is probably a bit more traceable than -for instance- modern neural network code like Tensorflow.
Development status: To be developed

Effectors and Movement patterns

Main motors
The main movements of the organism are implemented by two wheels driven by two DC motors with built-in gearboxes. Motor speed is controlled by pulse width modulation (PWM) provided by the processor. Steering is done by differential motor speed; a third rear "wheel" is a simple short passive wooden stand. This works well on reasonably flat surfaces.
Motor development status: Operational
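The motor driver and wiring are not documented here; a hedged differential-drive sketch, assuming a common H-bridge style driver with one PWM pin and one direction pin per motor, could look like this.

  // Differential drive sketch; pin mapping and driver type are assumptions.
  const int LEFT_PWM = 5,  LEFT_DIR = 4;
  const int RIGHT_PWM = 6, RIGHT_DIR = 12;

  void setup() {
    pinMode(LEFT_DIR, OUTPUT);
    pinMode(RIGHT_DIR, OUTPUT);
  }

  void setMotor(int pwmPin, int dirPin, int speed) {   // speed: -255..255, negative = backwards
    digitalWrite(dirPin, speed >= 0 ? HIGH : LOW);
    analogWrite(pwmPin, constrain(abs(speed), 0, 255));
  }

  void drive(int left, int right) {                    // steering by differential wheel speed
    setMotor(LEFT_PWM, LEFT_DIR, left);
    setMotor(RIGHT_PWM, RIGHT_DIR, right);
  }

  void loop() {
    drive(150, 80);                                    // example: forward with a gentle right turn
  }

The drive(left, right) helper is reused in the movement pattern sketches below.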
Movable mouthparts
Appendages like extendable mouthparts controlled by servos can be used for feeding on electrical food. Currently one servo controlling a proboscis (mouthpart for feeding) is present.
Development status: Operational in principle, untested in practice
Movement patterns
Note that these descriptions are high level, and in many cases the behaviour will be emergent instead of hardcoded!
Backup movement (obstacle avoidance, escape and evade)
Backup movement is a reflexive behaviour (a behaviour that can only be interrupted by another reflexive behaviour) that subsumes other movements. It consists of a short reflexive backward movement of fixed duration followed by a short rotation.
Development status: operational (based on Ultrasonics)
Random walk
In the absence of obstacles (and given enough motivation to walk) the organism will perform a random walk consisting of running forward at slow speed with a random turning angle.
Development status: testing phase
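A sketch of one random-walk step, using the drive(left, right) helper sketched under "Main motors"; the speed and turn range are example values.

  // Random walk: forward at slow speed with a new random turning bias each tick.
  const int WALK_SPEED = 90;                // slow forward speed (example value)

  void randomWalkStep() {                   // called once per tick when the wander layer wins
    int turn = random(-40, 41);             // random turning bias between -40 and 40
    drive(WALK_SPEED + turn, WALK_SPEED - turn);
  }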
Counter turn
The counter turn is a reflexive behaviour that subsumes other movement behaviour. It consists of a 180 degree turn around the organism's vertical axis. Counter turning, causing a zig-zag movement, is effective during odour plume tracking behaviour, where it can bring the organism back into the plume.
Development status: operational for ultrasonic sensory input, untested with odour.
Kinesis
For kineses (undirected, sensory-modulated responses), turning rate and speed in the random walk are modulated by the intensity or frequency of a particular sensory input. Far from the source (low/infrequent sensory input) speed is high and the path is almost straight, with only small changes in turning angle. Close to the source (high and/or frequent sensory input) speed is low and turns are sharp, leading to a local search.
Development status: testing phase
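A sketch of how stimulus intensity could modulate the random walk; the mapping ranges are example values and drive() is the helper sketched under "Main motors".

  // Kinesis: stimulus intensity changes only the statistics of movement,
  // not its direction, producing a local search near the source.
  void kinesisStep(int stimulus) {                      // stimulus: 0 (far) .. 1023 (at source)
    int speed   = map(stimulus, 0, 1023, 180, 60);      // strong stimulus -> slow down
    int maxTurn = map(stimulus, 0, 1023, 10, 120);      // strong stimulus -> sharper turns
    int turn = random(-maxTurn, maxTurn + 1);
    drive(speed + turn, speed - turn);
  }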
Taxis
In taxis behaviour, movement directions are not random but defined by sensory input. An example is moving toward a light source by steering left when the right light sensor gets less light than the left one, and steering right when the left light sensor gets less light than the right one.
Development status: Operational
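A sketch of positive phototaxis using the two LDRs described under "Sensing light intensity"; the pins, gain and speeds are example values, and drive() is again the helper sketched under "Main motors".

  // Phototaxis: steer towards the brighter side using the LDR difference.
  void taxisStep() {                                     // called once per tick when this layer wins
    int difference = analogRead(A0) - analogRead(A1);    // left minus right light level
    int turn = constrain(difference / 4, -80, 80);       // steering gain and clamp (example values)
    // More light on the left gives a positive difference: slow the left wheel, turn left.
    drive(120 - turn, 120 + turn);
  }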

Behavioural levels

Stay tuned, more coming soon

Implementation details

Stay tuned, more coming soon

References

Anderson, D.P. (2007) https://www.dprg.org/articles/2007-03a/
Bettosini et al. (2022) Torocó: A Subsumption Architecture Implementation. In: 2022 8th International Conference on Automation, Robotics and Applications (ICARA), IEEE, Prague, Czech Republic, pp. 27–32. https://doi.org/10.1109/ICARA55094.2022.9738521
Brooks, R.A. (1986) A Robust Layered Control System for a Mobile Robot. IEEE Journal of Robotics and Automation RA-2, 14–23.
Rickenbacher (1975) Lernen und Motivation als relevanzgesteuerte Datenverarbeitung: Ein Computer-Simulationsmodell elementarer kognitiv-affektiver Prozesse. Birkhäuser Verlag, Basel.