Leaphy Robot

Leaphy robot with subsumption architecture
Leaphy back side view.jpg
Participants
Skills: software, hardware, electronics, biology
Status: Active
Niche: robotics
Purpose: Fun
Tool: No
Location: Space, home
Cost: Very small given what is already present and what is available in the space (everything!!)
Tool category: Electronics


Electronic insect-like creature

Physiological, Behavioural & environmental overview
peterr 2022-2025

Document history

  • September 2025 subsumption architecture design
  • August 2025 Documentation for scheduler
  • March 2023 Draft version 0.2 - terminology cleanup and scope
  • February 2023 Draft version 0.1
  • December 2022 Partial documentation/brainstorming
  • November 2022 First document sketches & ideas

Project

The insect-like creature (based on the Leaphy robot) has a set of functionalities that together allow it to effectively find, and feed on, targets that provide electricity as a reward. Learning strategies to improve the success rate can be implemented. Sexual signaling, event prediction, mating behaviour, genetic exchange and evolution are in principle possible but are currently still (far) out of scope.
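
As a rough illustration of the intended subsumption architecture, the Arduino-style C++ sketch below arbitrates between the behaviours described in the next section by priority: obstacle avoidance suppresses light seeking, which in turn suppresses default wandering. All helper names, thresholds and motor values are assumptions for illustration, not the actual Leaphy firmware.

  // Minimal subsumption-style arbiter (illustrative sketch only).
  // Higher-priority layers claim control and suppress the layers below them.

  // Stand-ins for the real sensor and motor code described on this page.
  long readDistanceCm() { return 100; }  // ultrasonic distance in cm (stub)
  int  readLightDiff()  { return 0; }    // front-left minus front-right LDR reading (stub)
  void drive(int left, int right) { /* motor driver call would go here */ }

  // Layer 3 (highest priority): reflex obstacle avoidance.
  bool avoidObstacle() {
    if (readDistanceCm() < 15) {   // threshold is an assumption
      drive(-150, 150);            // back off / turn away
      return true;                 // claim control
    }
    return false;
  }

  // Layer 2: steer towards the brighter side ("feeding" target).
  bool seekLight() {
    int diff = readLightDiff();
    if (abs(diff) > 50) {          // dead band is an assumption
      // Slow down the motor on the brighter side so the robot turns towards the light.
      drive(diff > 0 ? 100 : 180, diff > 0 ? 180 : 100);
      return true;
    }
    return false;
  }

  // Layer 1 (lowest priority): default wandering.
  void wander() {
    drive(150, 150);
  }

  void setup() {}

  void loop() {
    if (avoidObstacle()) return;   // highest active layer wins this cycle
    if (seekLight())     return;
    wander();
  }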

Sensory inputs

Mechanoreceptor - sensing of obstacles
Micro-switch-based tactile hairs.
Development status: Not yet implemented
Ultrasound - Distance sensing of obstacles
Ultrasonic pulses detect obstacles and can trigger reflex avoidance behaviour
Development status: Operational
Better directional responses would be possible with more than one US sensor.
This would need a dedicated embedded processor.
Status: Not yet implemented
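
A minimal Arduino-style read-out sketch for a single HC-SR04-type ultrasonic sensor, assuming hypothetical trigger/echo pins and an assumed 15 cm reflex threshold; the actual Leaphy wiring and firmware may differ.

  // Distance measurement with an HC-SR04-style ultrasonic sensor (sketch; pin numbers are assumptions).
  const int TRIG_PIN = 7;
  const int ECHO_PIN = 8;

  long readDistanceCm() {
    digitalWrite(TRIG_PIN, LOW);
    delayMicroseconds(2);
    digitalWrite(TRIG_PIN, HIGH);                            // 10 us trigger pulse
    delayMicroseconds(10);
    digitalWrite(TRIG_PIN, LOW);
    unsigned long echo = pulseIn(ECHO_PIN, HIGH, 30000UL);   // give up after ~30 ms
    if (echo == 0) return 999;                               // no echo: treat as nothing in range
    return echo / 58;                                        // ~58 us per cm of distance (out and back)
  }

  void setup() {
    pinMode(TRIG_PIN, OUTPUT);
    pinMode(ECHO_PIN, INPUT);
    Serial.begin(9600);
  }

  void loop() {
    if (readDistanceCm() < 15) {                             // reflex threshold, an assumption
      Serial.println("obstacle - trigger avoidance reflex");
    }
    delay(60);
  }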
Sensing light intensity
Two LDR photoreceptors measure the front-left and front-right light levels.
The total ambient light intensity is available as an analog voltage.
The difference between the two inputs is also available and signals which side to turn to in order to approach or avoid the light.
Development status: Operational
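
A minimal sketch of the light-gradient read-out, assuming the two LDRs sit on hypothetical analog pins A0 and A1; the sum approximates the total ambient light and the sign of the difference gives the turn direction, as described above.

  // Two LDR photoreceptors on analog inputs (sketch; pin choice is an assumption).
  const int LDR_LEFT  = A0;
  const int LDR_RIGHT = A1;

  void setup() {
    Serial.begin(9600);
  }

  void loop() {
    int left  = analogRead(LDR_LEFT);   // 0..1023
    int right = analogRead(LDR_RIGHT);
    int total = left + right;           // overall ambient light level
    int diff  = left - right;           // > 0 means brighter on the front-left

    // The sign of diff tells the robot which way to turn to approach
    // (or, with the sign inverted, to avoid) the light source.
    Serial.print(total);
    Serial.print('\t');
    Serial.println(diff);
    delay(100);
  }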
Sensing sound (audio)
A subunit with its own processor and an electret microphone measures ambient sound level, frequency and sound patterns.
It can also be used in classical and operant learning experiments, and can detect both the absolute sound level (for startle or avoidance responses)
and sound features (e.g. chirp intervals for leafhopper-like communication and courting behaviour between two Leaphys).
In addition, an Audeme Movi speech recognition shield (see https://www.audeme.com/movi.html) could be used to process voice commands.
Development status: Not yet implemented, but Movi shield "proof of concept" done
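
Pending the dedicated subunit, a rough sketch of what the electret read-out could look like: peak-to-peak amplitude over a short window as a loudness measure, plus the interval between loud events as a crude chirp-interval feature. Pin, window length and thresholds are assumptions; the Movi shield has its own library and is not shown here.

  // Sound level and chirp intervals from an electret microphone module (illustrative sketch).
  const int MIC_PIN = A2;

  unsigned long lastChirpMs = 0;

  void setup() {
    Serial.begin(9600);
  }

  void loop() {
    // Peak-to-peak amplitude over a 20 ms window as a crude loudness measure.
    unsigned long start = millis();
    int lo = 1023, hi = 0;
    while (millis() - start < 20) {
      int s = analogRead(MIC_PIN);
      if (s < lo) lo = s;
      if (s > hi) hi = s;
    }
    int level = hi - lo;

    if (level > 200) {                          // loud event: startle or chirp candidate
      unsigned long now = millis();
      Serial.print("chirp, interval ms: ");
      Serial.println(now - lastChirpMs);        // interval pattern could drive courting behaviour
      lastChirpMs = now;
    }
    delay(10);
  }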
PIR movement sensor
In addition to light intensity, an infrared motion sensor can be used to detect movements (humans, or the cat for that matter); see for instance https://www.adafruit.com/product/4871
Development status: Not yet implemented
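
A minimal sketch of reading such a PIR module on a digital pin (pin number assumed); most of these modules simply pull their output high while motion is detected.

  // PIR motion sensor on a digital input (sketch; pin number is an assumption).
  const int PIR_PIN = 2;

  void setup() {
    pinMode(PIR_PIN, INPUT);
    Serial.begin(9600);
  }

  void loop() {
    if (digitalRead(PIR_PIN) == HIGH) {   // output is high while motion is detected
      Serial.println("movement detected");
    }
    delay(100);
  }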
Sensing wind direction
Semiconductor-based tactile hairs? Something like https://www.instructables.com/Low-Cost-Low-Speed-Wind-Sensor-1/ with shielding? Or sensitive temperature measurements?
Note: Wind direction sensing in the windtunnel is currently replaced by sensing light gradients parallel to the wind direction because the wind sensor is not available yet.
Development status: To be designed; high priority for the windtunnel version with odour processing as a main goal, low for the free walking space/nurdspace version described here.

More soon, this page is a work in progress