Leaphy Robot

[[File:Leaphy back side view.jpg|thumb|Leaphy robot with subsumption architecture]]
* Skills: software, hardware, electronics, biology
* Status: Active
* Niche: robotics
* Purpose: Fun
* Tool: No
* Location: Space, home
* Cost: Very small given what is already present and what is available in the space (everything!!)
* Tool category: Electronics
== Electronic insect-like creature ==
'''Physiological, Behavioural and Environmental overview''' <br>
peterr 2022–2025 (with ''very'' long pauses)


====Document history====
* Currently this is a living document and this wiki contains the latest version
* September 2025 subsumption architecture design & transfer to nurdspace wiki
* August    2025 Documentation for scheduler
* March     2023 Draft version 0.2 - terminology cleanup and scope
* November  2022 First document sketches & ideas


== Project ==
The insect-like creature, based on the original [https://leaphy.store/product/leaphy-original-los/ Leaphy robot], has a set of functionalities that together allow it to effectively find and feed on targets that provide electricity as a reward.
Learning strategies to improve success rate can be implemented.
Sexual signaling, event prediction, mating behaviour, genetic exchange and evolution are in principle possible but are currently still (far) out of scope. The system at its highest level can be seen as a feedback loop: sensory input => behavioural selection => motor output => new sensory input. The original design (2022) was meant to live in a wind tunnel and find odour sources to feed on, but currently the emphasis has shifted from odour detection to more general sensory processing using a subsumption architecture, [https://nurdspace.nl/Leaphy_Robot#Brain see below].


== Sensory inputs ==

:: '''Mechanoreceptor - sensing of obstacles'''
::: Micro switch based tactile hairs.
::: ''Development status: Not yet implemented''


:: '''Ultrasound - Distance sensing of obstacles'''
::: Ultrasonic pulses detect obstacles and trigger reflex avoidance behaviour
::: ''Development status: Operational''


:::: Better directional responses would be possible with more than one US sensor.
:::: This would need a dedicated embedded processor.
:::: ''Development status: Not yet implemented''


:: '''Sensing light intensity'''
::: Two LDR photoreceptors measure front-left light level and front-right light level
::: Total light intensity (sum) is available as an analog voltage
::: The difference between the inputs is also available, and signals which side to turn to in order to approach or avoid the light.
::: ''Development status: working well in principle, but currently not functioning due to mounting problems.''
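::: A minimal sketch of how the sum and difference could be read (Arduino-style C++; the pin numbers and dead zone value are assumptions, not the actual wiring):
<syntaxhighlight lang="cpp">
// Sketch only: A0/A1 and DEAD_ZONE are assumed values, not the real wiring.
const int LDR_LEFT  = A0;  // front-left photoreceptor
const int LDR_RIGHT = A1;  // front-right photoreceptor
const int DEAD_ZONE = 30;  // ignore small left/right differences

void setup() {
  Serial.begin(9600);
}

void loop() {
  int left  = analogRead(LDR_LEFT);   // 0..1023
  int right = analogRead(LDR_RIGHT);
  int total = left + right;           // overall light intensity (sum)
  int diff  = left - right;           // sign tells which side is brighter

  Serial.print(total);
  Serial.print(" ");
  Serial.println(diff);

  if (diff > DEAD_ZONE) {
    // brighter front-left: turn left to approach the light, right to avoid it
  } else if (diff < -DEAD_ZONE) {
    // brighter front-right: turn right to approach the light, left to avoid it
  }
  delay(50);                          // matches the ~50 msec brain tick
}
</syntaxhighlight>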


:: '''Sensing sound (audio)'''
::: A subunit with its own processor, containing an [https://en.wikipedia.org/wiki/Electret_microphone Electret microphone] to measure ambient sound level, frequency and sound patterns.
::: Can be used in classical (Pavlovian) and operant conditioning experiments. Will detect both absolute sound levels (for startle or avoidance responses) and can contain feature detectors (e.g. chirp intervals for leafhopper-like communication and courting behaviour of two Leaphys).
::: In addition, an Audeme Movi speech recognition shield (see https://www.audeme.com/movi.html) could be used to process voice commands
::: ''Development status: Not yet implemented, but Movi shield "proof of concept" done''


:: '''PIR movement sensor'''
::: In addition to light intensity, an infrared motion sensor can be used to detect movements (humans, or the cat for that matter)
::: ''Development status: Not yet implemented but a PIR (similar to this [https://www.adafruit.com/product/189 adafruit thingy]) is available''.


:: ''' Sensing wind direction '''
::: Semiconductor based tactile hairs? Something like [https://www.instructables.com/Low-Cost-Low-Speed-Wind-Sensor-1/ this], with shielding? Or sensitive temperature measurements?
::: Note: Wind direction sensing in the wind tunnel is currently replaced by sensing light gradients parallel to the wind direction because the wind sensor is not available yet.
::: ''Development status: To be designed; high priority for the windtunnel version with odour source finding as a main goal, low priority for the free walking space/nurdspace version described here.''
 
:: ''' Volatiles (odours) '''
::: Semiconductor based perception of volatiles like alcohol (fruit-fly like behaviours), carbon dioxide (mosquito-like behaviour), or VOCs (most plant eating insects). The sensor produces a voltage level proportional to the intensity of the stimulus. Applying a threshold generates a presence/absence signal.
::: The interval between odour pulses is a proxy for the distance to the source. Note that an odour "gradient" does not exist!

::: The current implementation uses an SGP30 board. The time constant of the sensor is in the order of 1 second, but "whiffs" of odour could be detected faster by using a feature detector that looks for sudden steps in the signal. This also acts as a sensory adaptation mechanism: in the absence of change the output goes to zero.
::: ''Development status: sensor worked, currently broken''
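::: A possible step/whiff detector with built-in sensory adaptation (a sketch; readOdourLevel() stands in for the actual SGP30 driver call and all constants are assumptions):
<syntaxhighlight lang="cpp">
// Sketch: feed this the raw odour level each tick; constants are illustrative.
float baseline = 0.0;                // slow moving average = adaptation state
const float ADAPT_RATE = 0.02;       // how fast the baseline follows the signal
const float WHIFF_THRESHOLD = 15.0;  // minimum sudden rise that counts as a whiff

bool detectWhiff(float level) {
  float step = level - baseline;     // deviation from the adapted baseline
  baseline += ADAPT_RATE * step;     // adapt: without change the output decays to zero
  return step > WHIFF_THRESHOLD;     // sudden rise above baseline = whiff
}
</syntaxhighlight>
::: Timing the intervals between successive whiffs then gives the distance proxy mentioned above.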
 
:: '''Sensing ground colour'''
::: This sensor measures ground colour and the output can be used to test for a particular condition (e.g. being at a target position). It can also be used to simulate trail pheromones implemented as colours next to the trail followed. This sense can also take the role of a short distance "present at target" stimulus.
::: '' Development status: Operational ''
 
:: '''Sensing ground reflection level'''
::: Two 'line follower' sensors that each provide a binary present/absent signal of ground reflection.
::: Can be used to simulate trail following behaviour on a network of lines, combined with (led) ground colours.
::: ''Development status: testing phase''
 
:::: Future extensions: A combination of voltage levels and pulse frequency coding in the source could be used to simulate a spectrum of "chemical" inputs. 
:::: Feature detectors could extract particular frequencies and/or patterns to recognise for instance "''toxic''" substances.
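::: A bang-bang trail follower over the two binary signals might look like this (a sketch; the pins, the sensor-to-steering mapping and the drive() helper are assumptions):
<syntaxhighlight lang="cpp">
// Sketch: pins are illustrative; the correct mapping depends on line width
// and sensor spacing on the actual robot.
const int LINE_LEFT = 2, LINE_RIGHT = 3;   // digital line-follower outputs

void drive(int left, int right);           // assumed motor helper (see Effectors)

void followTrail() {
  bool left  = digitalRead(LINE_LEFT);     // true = reflection present
  bool right = digitalRead(LINE_RIGHT);

  if (left && right)       drive(150, 150);  // line under both sensors: straight on
  else if (left && !right) drive(80, 150);   // drifting right: steer back left
  else if (!left && right) drive(150, 80);   // drifting left: steer back right
  else                     drive(80, -80);   // line lost: rotate to search
}
</syntaxhighlight>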
 
:: '''Proprioception (internal "feeling")'''
::: Checking for instance for body movement when motors move, or for force feedback when extending the electrical honey sucking mouthparts.
::: ''Development status: To be developed, as yet low priority.''
:: '''Sensing internal physiological states '''
:::To place electric reward information in perspective, some internal status information is needed. This is supplied by reading internal variables like energy levels (battery voltage, load current) or physiological state (internal variables like "ready to feed").
::: More advanced implementations could look up previous experiences and rates of change in physiological status, as well as brain hormone levels and excitation states (see below under "Excitation state and hormones").
:::''Development status: To be designed.''
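::: Battery voltage could for instance be read through a resistive divider into an analog pin (a sketch; the pin, divider ratio and hunger threshold are assumptions):
<syntaxhighlight lang="cpp">
// Sketch: A2 and the 2:1 divider are assumed, not the actual circuit.
const int   VBAT_PIN = A2;
const float DIVIDER  = 2.0;            // e.g. two equal resistors halving Vbat
const float VREF     = 5.0;            // Uno analog reference

float batteryVoltage() {
  return analogRead(VBAT_PIN) * (VREF / 1023.0) * DIVIDER;
}

bool readyToFeed() {                   // crude physiological state flag
  return batteryVoltage() < 6.5;      // "hungry" below this (assumed) level
}
</syntaxhighlight>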
 
:: '''Sensing hormone levels'''
::: Certain activities or combinations of activities can stimulate the production of hormones. This could be implemented as slow changing values in memory. Hormone levels might modulate the effect of sensory input or behavioural output, and act as a slow form of memory.
 
:: '''Sensing rewards and deterrents'''
::: Electronic organisms live on electricity, and are rewarded by sensing the presence of a voltage and current. This sensory organ simply senses an analog voltage at its contact point. It simulates the perception of food such as nectar. The most logical position is at the tip of an appendage.
::: ''Development status: implemented and removed (needs multiplexing). Mechanical movement (servo) to make electrical contact: testing phase.''
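::: Sensing the reward can be as simple as thresholding the analog voltage at the appendage tip (a sketch; the pin and threshold are assumptions):
<syntaxhighlight lang="cpp">
// Sketch: contact pin and threshold are illustrative values.
const int FOOD_PIN = A3;              // contact at the tip of the proboscis
const int FOOD_THRESHOLD = 300;       // raw ADC value that counts as "food present"

bool foodPresent() {
  return analogRead(FOOD_PIN) > FOOD_THRESHOLD;   // voltage found = reward
}
</syntaxhighlight>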
 
== Brain ==
::'''Subsumption architecture'''
:::The electronic organism has a software brain that receives sensory input both from the environment and from internal status. The brain reads all inputs about every 50 msec, and produces commands for actuators that produce behaviour. These behaviours can be as simple as a colour change of a led (signaling sensory status) or movements of body and appendages. The aim is to make the lower levels as stateless as possible, i.e. the brain uses its real time vision of the world as a model instead of some abstract internal representation. The processing principle roughly follows the subsumption architecture (Brooks 1986).
 
:::Wikipedia writes about this technique:
::::"''[Behaviours] are organized into a hierarchy of layers. Each layer implements a particular level of behavioral competence, and higher levels are able to subsume lower levels (= integrate/combine lower levels to a more comprehensive whole) in order to create viable behavior. For example, a robot's lowest layer could be "avoid an object". The second layer would be "wander around", which runs beneath the third layer "explore the world".[...]The subsumption architecture creates a system in which the higher layers utilize the lower-level competencies. The layers, which all receive sensor-information, work in parallel and generate outputs. These outputs can be commands to actuators, or signals that suppress or inhibit other layers."''
 
:::My current implementation uses the opposite order. The high priority levels conduct the simplest behavioural tasks, and when these high priority but simple behavioural levels are happy, more complex tasks can get control. This approach is still experimental, but in any case easy enough to reverse again.
 
:::The behavioural levels ''all'' compete for control of the robot. Each behavioural layer evaluates its sensory input and determines if it wants to control the outputs (to change its input according to its implied goal); e.g. mechanoreceptors (microswitches) will only request control if the switch is closed. After all the behavioural levels are evaluated and their request flags set or reset, an arbitrate routine selects the highest active layer, which will run the robot's behavioural output for the next 50 msec.
:::See Anderson (2007) for a clear explanation and actual implementation examples.
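:::As a minimal sketch of this sense => request => arbitrate => act cycle (Arduino-style C++; the helper functions and flag names are illustrative, not the actual source):
<syntaxhighlight lang="cpp">
// Sketch of the arbitration cycle; level indices follow the list further below.
const int N_LEVELS = 22;              // 0 = HIBERNATE ... 21 = WAIT
bool requests[N_LEVELS];              // one request flag per behavioural level

void evaluateLevels() {
  // Each level inspects its own sensory input and sets its flag, e.g.
  // requests[ESCAPE] = bumperClosed();
  requests[N_LEVELS - 1] = true;      // WAIT always requests control
}

int arbitrate() {
  for (int level = 0; level < N_LEVELS; level++)  // lowest index wins
    if (requests[level]) return level;
  return N_LEVELS - 1;
}

void runLevel(int level) { /* drive the actuators for this level */ }

void setup() {}

void loop() {
  evaluateLevels();                   // set or reset all request flags
  runLevel(arbitrate());              // the winning level controls the robot...
  delay(50);                          // ...for the next 50 msec
}
</syntaxhighlight>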
 
:: '''Brain subsystems'''
:::More complicated or slow tasks (like sound processing or US echo detection) can be executed in separate processor boards that report their status  to the main  controller. Conceptually this is not different from subroutines for tasks.
:::''Development status: To be developed''
 
:: '''Nervous system'''
::: The currently (sort of) running implementation of the brain uses a real time loop in a very simple cooperative multitasking system ([https://yiweimao.github.io/blog/async_microcontroller/ YiweiMao scheduler]) and is implemented on an Arduino Uno.
::: ''Development status: operational, but under active development.''
 
::'''Associative memory'''
::: Associative memory in its current, simplest form will be implemented as memory locations for particular sensory input combinations. Inputs that coincide increase a counter for that combination (learning those combinations). This allows both Pavlovian and operant conditioning.
:::''Development status: not yet implemented, proof of concept done ~50 years ago :).''
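::: A minimal sketch of such counters (the sizes, threshold and update rule are assumptions):
<syntaxhighlight lang="cpp">
// Sketch: one counter per pair of binary sensory channels.
const int N_INPUTS = 8;
byte assoc[N_INPUTS][N_INPUTS];       // co-occurrence counters, start at 0

void learn(const bool input[N_INPUTS]) {
  for (int a = 0; a < N_INPUTS; a++)
    for (int b = a + 1; b < N_INPUTS; b++)
      if (input[a] && input[b] && assoc[a][b] < 255)
        assoc[a][b]++;                // coinciding inputs strengthen the pair
}

bool associated(int a, int b) {       // stimulus a has come to predict b
  return assoc[min(a, b)][max(a, b)] > 20;
}
</syntaxhighlight>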
 
:: '''Excitation state and hormones'''
::: Hormones are internal variables with levels that act as slow changing memory for previous (combinations of) states.
::: A default breakdown speed can be set that slowly decreases hormone levels. Particular sensory input could increase the levels again. Reaching high levels of certain hormones can be a goal in itself, and subject to learning behaviour.
::: ''Development status: To be developed''
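::: A sketch of one such hormone variable (the decay factor and increments are assumed values):
<syntaxhighlight lang="cpp">
// Sketch: a hormone as a slowly decaying internal variable.
float excitation = 0.0;
const float BREAKDOWN = 0.995;        // per-tick decay: the slow "forgetting"

void hormoneTick(bool stressor) {
  excitation *= BREAKDOWN;            // default breakdown toward zero
  if (stressor) excitation += 10.0;   // particular sensory input raises the level again
  if (excitation > 100.0) excitation = 100.0;   // saturate
}
</syntaxhighlight>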
 
:: '''Satisfaction'''
:::Signals to end feeding behaviour can, in addition to electrical food running out, also come from internal signals (satiation, internal hormonal signals, (too) long time periods with little change of events).
 
:: '''Future prediction'''
:::This is currently way out of scope, but neural-net like implementations that store event sequences, predict likely outcomes of current inputs and act accordingly are fully feasible.
::: See Rickenbacher (1975) for an example that is probably a bit more traceable than, for instance, modern neural network code like TensorFlow.
:::''Development status: To be developed''
 
== Effectors ==
 
:: '''Main motors'''
:::The main movements of the organism are implemented by two wheels driven by two [https://leaphy.store/product/tt-motor-10cm-jumper-female/ DC motors] (with built-in gearboxes) and a [https://leaphy.store/product/motor-shield/ motor shield]. Motor speed is controlled by pulse width modulation (PWM) provided by the processor. Steering is done by differential motor speed; a third backside "wheel" is a simple short passive wooden stand. This works well on reasonably flat surfaces, but needs improvement in the next robot (Terzo).
:::''Motor development status: Operational at a basic level; PID control and extra power at startup are needed for a smooth start at low speed.''
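:::A differential drive sketch (the pin numbers and the shield's direction logic are assumptions):
<syntaxhighlight lang="cpp">
// Sketch: pins and polarity depend on the actual motor shield.
const int PWM_LEFT = 5, PWM_RIGHT = 6;   // PWM-capable Uno pins
const int DIR_LEFT = 4, DIR_RIGHT = 7;   // direction pins

void setup() {
  pinMode(PWM_LEFT, OUTPUT);  pinMode(PWM_RIGHT, OUTPUT);
  pinMode(DIR_LEFT, OUTPUT);  pinMode(DIR_RIGHT, OUTPUT);
}

void drive(int left, int right) {        // speeds in -255..255
  digitalWrite(DIR_LEFT,  left  >= 0 ? HIGH : LOW);
  digitalWrite(DIR_RIGHT, right >= 0 ? HIGH : LOW);
  analogWrite(PWM_LEFT,  abs(left));     // PWM duty cycle sets the speed
  analogWrite(PWM_RIGHT, abs(right));
}

void loop() {
  drive(200, 120);                       // unequal speeds = curve to the right
}
</syntaxhighlight>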
 
:: '''Movable mouthparts'''
:::Appendages like extendable mouthparts controlled by [https://isking-modellbahn.de/modelcraft-es-030-Servo-top-line-steuerelektronik servos] can be used for feeding on electrical food. Currently one servo controlling a proboscis (mouthpart for feeding) is present.
:::''Development status: Operational in principle, untested in practice, needs 3D printing of parts''
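:::With the standard Arduino Servo library, extending the proboscis could look like this (the pin and angles are assumptions, depending on the mechanics):
<syntaxhighlight lang="cpp">
// Sketch: pin 9 and the two angles are assumed, not measured values.
#include <Servo.h>

Servo proboscis;
const int RETRACTED = 10, EXTENDED = 170;  // servo angles in degrees

void setup() {
  proboscis.attach(9);                     // signal pin for the servo
  proboscis.write(RETRACTED);
}

void loop() {
  proboscis.write(EXTENDED);               // reach for the electrical food
  delay(2000);
  proboscis.write(RETRACTED);              // and pull back
  delay(2000);
}
</syntaxhighlight>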
 
:: '''[https://leaphy.store/product/rgb-led-single/ Three colour led]'''
::: Can be used for signaling the status of whatever state we want.
:::''Development status: Operational''
 
:: ''' Piezo beeper '''
::: Present on the [https://leaphy.store/product/motor-shield/ motor shield]; can be used to create a variety of sounds, for instance to signal happiness, distress or alarm.
:::''Development status: Operational, but specific sounds need to be designed''
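:::Specific sounds could be built from the standard tone() call (a sketch; the beeper pin and the chirp patterns are assumptions):
<syntaxhighlight lang="cpp">
// Sketch: pin 8 and the frequencies are placeholders for real sound designs.
const int BEEPER_PIN = 8;

void distressCall() {
  for (int i = 0; i < 3; i++) {
    tone(BEEPER_PIN, 2000, 100);   // three short 2 kHz beeps
    delay(200);
  }
}

void happyChirp() {
  tone(BEEPER_PIN, 800, 80);       // rising two-note chirp
  delay(100);
  tone(BEEPER_PIN, 1600, 80);
}
</syntaxhighlight>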
 
== Behavioural levels ==
 
 
 
:: Behaviour in subsumption architecture is organised in levels that compete for control; the highest priority behaviour (lowest rank, top of the list) will control the robot for 50 msec.
 
::'''Overview of current behavioural levels and their most important inputs for controlling requests'''.
:::Layers are only a naming convention, conceptually grouping behaviour with somewhat similar aims. Note that the ultimate goal of the system (here level SLEEP, see below) is determined by the order of the levels. Changing the order evidently will change the behavioural outcomes. Currently the order is hard coded; a sketch of this ordering follows the level list below.
 
:::High priority behavioural levels take care of the most basic and critical tasks. When their goals are reached (their requests for control stop), higher behavioural levels (representing more advanced, lower priority tasks) will be able to run. Note however that higher levels can explicitly inhibit or even modify lower level behaviours under some circumstances for some time.
 
 
::See the actual calls for details.
 
:::'''Layer "distress"'''
:::If unable to meet the absolute basic requirements for battery voltage or movement, CALL for help. In STUCK, try reflexes to get loose, pausing regularly. If nothing works, eventually HIBERNATE.
 
::::Goal:  deal with the really difficult situations
::::0  HIBERNATE // hope for better times...
::::1  STUCK // try to solve non-movement when motors run...
::::2  CALL  // call for help
 
:::'''layer "stay free"'''
:::The primary sensory input activating this level are Bumper sensors and Ultrasonics, but also hormones for high emotional levels reached by an overload of sensory input or a high number of failing tries to reach some goal. FREEZE will stop movement for an serious amount of time to reduce overload FIGHT will move quickly forward and get very close to an object possibly even disregarding US and bumpers and hitting the object. ESCAPE will try to free the bumpers EVADE will try to keep a safe distance from objects. FLIGHT will try to run straight for a relative long time (using a countdown timer) but will subsumed by lower levels.
 
::::Goal: avoid stressors.
::::3  FREEZE  // simulated death
::::4  FIGHT  // try to intimidate
::::5  ESCAPE  // escape from touch
::::6  EVADE  // keep distance
::::7  FLIGHT  // run away 
 
:::'''layer "external control"'''
:::This level is only requesting control if external commands are present in sensordata. The behaviour is self-reinforcing so once activated it keeps controlling the creature as long as new commands arrive within reasonable time. Note that Escape and Evade and Flight have priority, manual control has limits, the creature is autonomous!
 
::::Goals: Stay free and prevent distress, but otherwise obey external commands.
::::8 MANUAL // How boring...
 
:::'''layer "scan and move"'''
:::This level is the heart of the brain. Reaching the first level of this layer (INTEGRATE) is rewarding. :::Everything is (locally) under control and a target goal can been set. This will also increase positive emotional level. INTEGRATE will inspect internal status, like battery level, and emotional states like exhaustion, excitatory state and modify the settings for ORIENT and MOVE accordingly. Options for ORIENTs setting are for instance random turn angles and low speed  for a drunkard's walk, phototaxis (pos. or neg.), odour pulse count or interval for kinesis behaviour, and PIR input (i. e. human or robot) presence for more interactive behaviour. After this setting, INTEGRATE will inhibit the setting of the the INTEGRATE request flag for some time. Input from rotary encoders and/or motor commands) will re-trigger this inhibition of INTEGRATE so rotate and Orient can work. ROTATE will rotate around the body center until some sensory dead zone for the current target (just picked by INTEGRATE is reached. Move will drive forward and (together with ORIENT) allow the creature to move in the direction of the goal implied by ORIENT. ACCEPT checks sensory input for an acceptable stop position that might have food or a high quality resource. ACCEPT will increase positive emotional levels. FEED will become active when food is indeed available available, for instance, via bottom contacts and feeding will increase positive emotional levels even more.
 
::::Goal: find out where and when and how to move and set a goal
::::9 INTEGRATE // combine inputs and decide what to do at this moment.
::::10 ORIENT // orient towards the current goal
::::11 MOVE // move forward towards the current goal
::::12 ACCEPT // accept the current position and stop
::::13 FEED // electrical food is the best!
 
:::'''layer "appendage use"'''
:::Not all accepted position will directly allow access to resources like a voltage source, or an object to collect. EXTEND will try to reach the resource by (surprise) extending an appendage. PROBE will conduct a local search for the resource, and USE will get or enjoy the resource.
 
::::Goal: reach a high quality resource by extending an appendage
::::14  EXTEND // reach for a resource
::::15 PROBE // probe to make contact
::::16 USE // this is sooo good!
 
:::'''layer "interaction"'''
:::EXPLORE will search for fun, however that is defined at that moment, it might mean going back to some other location to drop a collected object. For more social tasks, it might be finding something or somebody to make contact with. When EXPLORE is successful and switches its request flag off, INTERACT will control the interactive behaviour performed. Both these levels may modify the inputs for the SCAN layer to keep focus. Interact will however also increase exhaustion that eventually will prevent allow SEEKREST to become active.
 
::::Goal: if all previous goals were reached this is the final goal setting layer
::::17 EXPLORE // find the fun
::::18 INTERACT // and love it!
 
:::'''layer "rest"'''
:::All fun comes to and end, when exhaustion strikes many higher priority levels that monitor exhaustion levels will become inactive and SEEKREST will modify the the scan layer and the the goal to finding a suitable location to rest. If a suitable location is found SLEEP will get active and stay active until sensory input changes again. At that point it is rinse and repeat.
 
::::Goal: reduce exhaustion, find a location to rest and sleep there.
::::19 SEEKREST // after hard work...
::::20 SLEEP // ... it is time to sleep!
 
:::'''layer "default"'''
:::If non of the other levels asks for control WAIT (that always requests control) switches off all motors, reduces power use as much as possible, and waits for input. This is ''never'' expected to happen!
 
::::Goal: wait for input
::::21 WAIT // default, always requesting but never getting it...
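
::::The hard coded ordering above could be expressed as a single enum, where a lower value means a higher priority (a sketch; the identifiers in the actual source may differ):
<syntaxhighlight lang="cpp">
// Sketch: the 22 levels in their current hard coded priority order.
enum BehaviourLevel {
  HIBERNATE, STUCK, CALL,                 // layer "distress"         (0..2)
  FREEZE, FIGHT, ESCAPE, EVADE, FLIGHT,   // layer "stay free"        (3..7)
  MANUAL,                                 // layer "external control" (8)
  INTEGRATE, ORIENT, MOVE, ACCEPT, FEED,  // layer "scan and move"    (9..13)
  EXTEND, PROBE, USE,                     // layer "appendage use"    (14..16)
  EXPLORE, INTERACT,                      // layer "interaction"      (17..18)
  SEEKREST, SLEEP,                        // layer "rest"             (19..20)
  WAIT                                    // layer "default"          (21)
};
</syntaxhighlight>
::::Reordering the enum (or the table driving arbitration) would change the creature's ultimate goals.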
 
 
:: ''Stay tuned, more coming soon''
 
== Implementation details ==
: ''Stay tuned, more coming soon''
 
== References ==
 
:: '''Anderson, D.P. (2007)''' https://www.dprg.org/articles/2007-03a/
 
::'''Bettosini et al. (2022)''' Torocó: A Subsumption Architecture Implementation, in: 2022 8th International Conference on Automation, Robotics and Applications (ICARA), IEEE, Prague, Czech Republic, pp. 27–32. https://doi.org/10.1109/ICARA55094.2022.9738521
 
::'''Brooks, R.A. (1986)''' A Robust Layered Control System for a Mobile Robot, IEEE Journal of Robotics and Automation RA-2, 14-23


::'''Rickenbacher (1975)''' Lernen und Motivation als relevanzgesteuerte Datenverarbeitung: Ein Computer-Simulationsmodell elementarer kognitiv-affektiver Prozesse. Birkhäuser Verlag, Basel.
