Hey Everyone.
Love to see this board being active! I'm looking to get clarity on all the repositories for the Muse headset, in terms of developing your own independent scripts.
The only project I really got working was using the muse-js library to play the Chrome dino game:
https://urish.medium.com/your-brain-the ... 75aad0fa8e
My headset is collecting dust now.
I'm thinking of using the Muse headset to develop games in an engine such as Godot, or simply to use it within a Python script, along these lines:
if blink_detected:  # based on filtering the raw EEG data, with a predetermined idea of what a blink looks like
    print("hello world")
...
I know there is the python-osc that Mr. James developed (thank you very much!).
Can it be applied to this?
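To make it concrete, here is a rough sketch of what I'm imagining with python-osc, assuming Mind Monitor streams a blink message on /muse/elements/blink (I haven't double-checked that address, so treat it and the port as placeholders):

# Rough sketch: react to blinks streamed from Mind Monitor via OSC.
# Assumptions: Mind Monitor is streaming to this machine, a blink event
# arrives on /muse/elements/blink, and the target port is 5000.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_blink(address, *args):
    # Called whenever a blink message arrives over OSC.
    print("hello world")

dispatcher = Dispatcher()
dispatcher.map("/muse/elements/blink", on_blink)

# Use the IP and port you entered in Mind Monitor's OSC streaming settings.
server = BlockingOSCUDPServer(("0.0.0.0", 5000), dispatcher)
print("Listening for Mind Monitor OSC data...")
server.serve_forever()

The same pattern should carry over to Godot by forwarding the event from a script like this, or by using an OSC addon inside the engine.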
thank you!
Using live data from mind-monitor to run a program in python
Re: Using live data from mind-monitor to run a program in python
Hi there,
I'm developing an art piece and was wondering if you could advise me on using the different brain waves in Mind Monitor.
I think beta and alpha will be the most effective, but I'd like to pick just one.
The task is that the audience member moves forward through these lands and focuses on a circle in the center.
The visuals will be stimulating, so I wasn't sure whether to use beta or alpha (for when the audience is not very good at getting into an alpha state).
Your thoughts and advice, please.
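In case it helps to see the direction I'm considering, here is a rough sketch in Python, assuming Mind Monitor's absolute band power addresses /muse/elements/alpha_absolute and /muse/elements/beta_absolute (the port and details are placeholders, not a finished implementation):

# Rough sketch: follow alpha and beta band power from Mind Monitor's OSC stream
# and use a simple beta/alpha ratio as a "focus" value to drive the visuals.
# Depending on Mind Monitor's settings, each message carries one averaged value
# or one value per sensor; this just averages whatever arrives.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

latest = {"alpha": 0.0, "beta": 0.0}

def on_band(address, *args):
    band = "alpha" if "alpha" in address else "beta"
    values = [v for v in args if isinstance(v, (int, float))]
    if values:
        latest[band] = sum(values) / len(values)
    if latest["alpha"] > 0:
        print(f"beta/alpha ratio: {latest['beta'] / latest['alpha']:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/muse/elements/alpha_absolute", on_band)
dispatcher.map("/muse/elements/beta_absolute", on_band)

server = BlockingOSCUDPServer(("0.0.0.0", 5000), dispatcher)
server.serve_forever()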
Peter Gamma
Re: Using live data from mind-monitor to run a program in python
Since Interaxon no longer offers the Muse SDK, the community of developers working with the Muse headband has become small, as far as we can see. When the Muse is connected to another platform over live streaming, for instance Home Assistant:
viewtopic.php?t=1739
this could change. Starting from an InfluxDB with all the Muse data in it, maybe we would become Muse developers, too.
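As a very rough idea of what that could look like, here is a sketch that reads recent Muse samples back out of InfluxDB with the influxdb-client Python library (the bucket, measurement and connection details are made up for illustration, not from an actual setup):

# Rough idea only: read the last minute of Muse EEG data from an InfluxDB 2.x bucket.
# The bucket name "muse" and the measurement "eeg" are placeholders.
from influxdb_client import InfluxDBClient

client = InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org")
query = '''
from(bucket: "muse")
  |> range(start: -1m)
  |> filter(fn: (r) => r._measurement == "eeg")
'''
for table in client.query_api().query(query):
    for record in table.records:
        print(record.get_time(), record.get_field(), record.get_value())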