I work as a visual artist. My project is built around the goal of having a houseplant control motor actuators. More info about the project can be found here: [url removed, login to view]
Two differential signals (four electrodes) are collected from the plant's leaves. The signals go to a Raspberry Pi via an ADS1115 16-bit ADC. The goal is that the plant, as far as possible, controls the motor actuators (robot legs).
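To make the pipeline concrete, here is a minimal sketch of how the two differential channels might be segmented into fixed-length windows before any classification. The sampling rate, window length, and the sine/cosine stand-in data are illustrative assumptions, not details of the actual setup.

```python
import numpy as np

SAMPLE_RATE_HZ = 128   # assumption: effective ADS1115 sample rate
WINDOW_SECONDS = 2.0   # assumption: 2-second analysis windows

def window_signals(ch_a, ch_b, sample_rate=SAMPLE_RATE_HZ, window_s=WINDOW_SECONDS):
    """Split two synchronized differential channels into
    non-overlapping windows of shape (n_windows, 2, window_len)."""
    window_len = int(sample_rate * window_s)
    n = min(len(ch_a), len(ch_b)) // window_len
    a = np.asarray(ch_a)[: n * window_len].reshape(n, window_len)
    b = np.asarray(ch_b)[: n * window_len].reshape(n, window_len)
    return np.stack([a, b], axis=1)

# Synthetic data standing in for 10 seconds of plant signal:
t = np.arange(10 * SAMPLE_RATE_HZ) / SAMPLE_RATE_HZ
windows = window_signals(np.sin(t), np.cos(t))
print(windows.shape)  # (5, 2, 256)
```

Each window could then be turned into a feature vector (mean, variance, band power, wavelet coefficients, etc.) for the learning stage.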
I want a machine learning/ANN solution to interpret the signals and choose between the available actuator movements. The Python script at [url removed, login to view] defines the possible movements based on the physical construction.
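Since the task is to map signals onto a fixed set of movements, one conceptually simple route is to cluster feature vectors and assign each cluster to one movement. Below is a sketch using a small hand-rolled k-means in NumPy; the movement names are hypothetical placeholders, not the ones defined in the actual script.

```python
import numpy as np

# Hypothetical movement labels -- the real set comes from the project's script.
MOVEMENTS = ["step_forward", "step_back", "turn_left", "turn_right"]

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means: returns cluster centers and per-sample labels."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each sample to its nearest center.
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        # Move each center to the mean of its assigned samples.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

def choose_movement(feature_vec, centers):
    """Pick the movement whose cluster center is nearest the new feature vector."""
    j = int(np.argmin(((centers - feature_vec) ** 2).sum(-1)))
    return MOVEMENTS[j]
```

In use, each incoming signal window would be reduced to a feature vector, and the nearest cluster would trigger the corresponding leg movement. The cluster-to-movement assignment is arbitrary here, which may actually suit the project's autonomous, unlabeled character.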
Regarding the machine learning code, various papers dealing with a similar problem suggest different possible solutions: in prosthetics, EMG signals are classified into motor movements.
I believe unsupervised classification is desired, since there is no way to label a plant's electrophysiology with respect to this project's goal (moving robot legs in some autonomous way). I have seen autoencoder configurations used to reduce/abstract signals in an unsupervised way. A convolutional network in conjunction with k-means and an autoencoder ([url removed, login to view]) and an extreme learning machine based autoencoder ([url removed, login to view]) are two configurations I have seen. Another solution I have noticed is network-based wavelet transform classification ([url removed, login to view]).
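To illustrate the autoencoder idea in its simplest possible form, here is a sketch of a tiny linear autoencoder trained with plain gradient descent in NumPy. The dimensions, learning rate, and epoch count are illustrative assumptions; a real system would likely use one of the convolutional or ELM-based architectures from the cited papers.

```python
import numpy as np

def train_autoencoder(X, code_dim=2, lr=0.01, epochs=500, seed=0):
    """Tiny linear autoencoder: compress each row of X to `code_dim`
    numbers and reconstruct it, trained by gradient descent on the
    mean squared reconstruction error."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W_enc = rng.normal(scale=0.1, size=(d, code_dim))
    W_dec = rng.normal(scale=0.1, size=(code_dim, d))
    for _ in range(epochs):
        Z = X @ W_enc          # encode: (n, code_dim)
        X_hat = Z @ W_dec      # decode: (n, d)
        err = X_hat - X
        grad_dec = Z.T @ err / n
        grad_enc = X.T @ (err @ W_dec.T) / n
        W_dec -= lr * grad_dec
        W_enc -= lr * grad_enc
    return W_enc, W_dec
```

The learned code `Z` is the reduced/abstracted representation; it could then be fed to a clustering step (as in the convolutional network plus k-means configuration) to select among the movements.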
It should be noted that I do not have any particular knowledge of machine learning, signal processing, and so forth. The goal is mainly a conceptually sound treatment of the signals from the houseplant by the motor actuators.