Testing Brain-Computer Interfaces

  • Published: 13 Dec 2024

Comments • 331

  • @jamesbruton
    @jamesbruton  4 years ago +33

    You can support me on Patreon or through YouTube channel membership - Patrons and Members get access to all the videos up to a week early! www.patreon.com/XRobots

    • @GeeveGeorge
      @GeeveGeorge 4 years ago

      Would be cool if you could build a user interface like Emotiv's, where they record spikes for a few seconds while asking the user to make specific movements, then use pattern recognition to classify the spikes in real time.

    • @JClemente1980
      @JClemente1980 3 years ago

      The Emotiv EPOC also has an SDK for everyone to use, and from the start it has included a small FPV demo that lets you control movement with your own mind. They have been around for at least 7-8 years. The difference from your set and approach is that they calibrate the movement from your own thought, which means you can still use your limbs and have an extension - for example, to control a third arm, or even a tail :P . I do not like their wet electrodes; I had already been testing with dry electrodes, a design very similar to the ones you're using. I thought about using that in my own PhD - I wanted to feed my own EEG to a few neurons growing on a petri dish with electrodes, just to see what happened... What I was actually testing was the coating for electrodes to be implanted, but it could be a more realistic situation if they were being stimulated with my own waves...
      P.S. No, I did not want to download my consciousness into a set of neurons!!!!

  • @Graham_Wideman
    @Graham_Wideman 4 years ago +198

    James: There is a huge literature on EEG signal detection and interpretation that you could be drawing on. You are confronting at least two problems here. One is using just a few electrodes on the scalp to attempt to localize signals from a specific part of the brain. This is known as the "inverse" problem, and it's non-trivial, to say the least. Search for papers on that topic. The second problem is trying to pick up signals that convincingly correlate to some pattern of thought. Those are quite small signals relative to the coarse waves seen at an electrode, which represent the summed activity of thousands or millions of neurons near that electrode. To detect such a signal, experimenters use a paradigm like: present the experimental subject with a succession of trials, repeating the same stimulus over and over again (intermingled with control different-stimulus trials), and then average the signals from like trials together, aligned to the time of stimulus onset. Then subtract the average control signal from that of the stimulus trials to produce a result signal. Hoping to discern a thought with a single trial, no control trial, and no sync to stimulus is pretty optimistic.
    As a side note: 18:02 -- "the electrode is pointing in to the other side of my brain". No, that's not a thing. Those electrodes are simply making contact with the skin on your head, and the only way they receive a signal is plain old conduction from all the firing neurons, through the tissue of the brain, the outer membrane that surrounds the brain, the cerebrospinal fluid (CSF), the skull, and the skin, with all the inhomogeneities that involves.
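
    A minimal Python sketch of the trial-averaging paradigm described in this comment. All names here (eeg, stim_onsets, ctrl_onsets, fs) are hypothetical stand-ins for one recorded channel and known event times; this is an illustration of the idea, not anything from the video or the OpenBCI API.

    import numpy as np

    def epoch_average(eeg, onsets, fs, pre=0.2, post=0.8):
        """Cut a fixed window around each event onset and average the epochs."""
        n_pre, n_post = int(pre * fs), int(post * fs)
        epochs = np.stack([eeg[i - n_pre : i + n_post] for i in onsets
                           if i - n_pre >= 0 and i + n_post <= len(eeg)])
        # Baseline-correct each trial against its pre-stimulus interval
        epochs -= epochs[:, :n_pre].mean(axis=1, keepdims=True)
        return epochs.mean(axis=0)  # the evoked (averaged) response

    # Result signal = average over stimulus trials minus average over control trials:
    # evoked = epoch_average(eeg, stim_onsets, fs) - epoch_average(eeg, ctrl_onsets, fs)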

    • @MollyWi
      @MollyWi 4 years ago +20

      Yeah, I agree, the problem is sorting through all the large-scale noise to find something localized. Even then, the electronics would need to be very accurate differential amplifiers that can detect signals below the noise floor. Electromagnetic interference, from say a television in another room, is far more likely to appear on that graph than an inner brain signal. Getting any sort of muscle signal (EMG) is a good starting point though.

    • @jamesbruton
      @jamesbruton  4 years ago +46

      Super interesting, thanks!

    • @minibigs5259
      @minibigs5259 4 years ago +1

      Excited to see James develop some self-trials - baseline, cat, cat, cat, dog, etc. - along with the deep learning for filtering!!

    • @noaht5654
      @noaht5654 4 years ago +7

      You always have to worry about what you are actually measuring. EEG picks up electrical potentials on the skin. Anything that can cause those potentials can produce a signal, like your 50Hz line noise. Around 7:20, you may be producing a muscle signal ever so slightly on your scalp. Try clenching your jaw next time you wear the headset. You'll clearly see the muscle signal.
      19:00 If you are thinking about controlling a robot limb, it may still be beneficial to have sensors over the somatic and (pre)motor cortices. Think of it like your brain "simulating" its own limb movements.
      These are great resources to learn more: www.mikexcohen.com/#books. Some people like Cohen, some people don't. I think he at least tries to fully explain competing schools of thought on EEG signals, so he may be a good reference.
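
      A quick sketch of removing the 50 Hz line noise mentioned above with a notch filter, using standard SciPy calls. The raw array and the 250 Hz sample rate are assumed placeholders (250 Hz is the usual OpenBCI Cyton rate).

      import numpy as np
      from scipy.signal import iirnotch, filtfilt

      fs = 250.0                               # assumed sample rate
      b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)  # notch at 50 Hz (60 Hz in the US)
      raw = np.random.randn(int(10 * fs))      # placeholder for one EEG channel
      clean = filtfilt(b, a, raw)              # zero-phase filtering, no lag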

    • @donaldviszneki8251
      @donaldviszneki8251 4 years ago +3

      The weirdest thing he suggested was that the electrodes were "pointing" somewhere inside his brain. That's not how it works.

  • @zabridges
    @zabridges 4 years ago +118

    Man, there is a DIY kit for everything! Incredible!

    • @thesacredsword7230
      @thesacredsword7230 4 years ago +8

      A DIY replacement kidney is the next step in DIY kits

    • @donaldviszneki8251
      @donaldviszneki8251 4 years ago +3

      The 8 channel base kit is $500. That's affordable, but not affordable enough IMO.

    • @GusCraft460
      @GusCraft460 3 years ago +1

      @@donaldviszneki8251 compared to what it would normally cost, that is insanely cheap. Of course most people don’t just have $500 lying around to spend on something like this, but it’s still not completely out of most people’s price range for something like a birthday gift.

    • @OMAR-fq4qi
      @OMAR-fq4qi 3 years ago +1

      Link to buy all electronic parts please

    • @RIPtechnoblade1
      @RIPtechnoblade1 2 years ago

      A DIY kit for getting $1.5k is what I need

  • @alexisandersen1392
    @alexisandersen1392 4 years ago +56

    Because this is open source, I imagine that the data on display is rather immediate and raw from the sensors themselves (which is good and bad)... The problem with EEG sensors is that blood pressure will have an effect; you can see that when you tense your muscles and it lights up your whole brain. That's not brain activity, though - it's blood going to your head rather than to your constricted extremities. A bio-sensor for core blood pressure, typically positioned on the lower neck or chest, can be used as a point of reference to filter blood pressure out of the sensor data. While cerebral blood flow is decoupled from body blood pressure, you still have the skin and muscle over the skull, with bodily blood pressure interfering with the probes.
    Typical commercial products already have means to filter out these discontinuities to some degree, so they're easy to work with out of the box, but it also means the data is less faithful to the sensors' measurements. The sensor data needs treatment and filtering to be useful, but treating the data taints it; with an open source setup, how much and what kind of treatment is up to you, but if you want to use the sensor data, it needs to be heavily treated so that you can get faithfully reproducible interaction. There are effectively too many variables, and they're all interfering with your intent; you can't really look at one sensor's data and say much of anything.
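
    As a sketch of the kind of "treatment" this comment describes, a common first pass is a simple band-pass to strip slow drift and high-frequency noise. This uses only standard SciPy calls; the raw array and sample rate are assumed placeholders.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 250.0                                        # assumed sample rate
    b, a = butter(4, [1.0, 45.0], btype="bandpass", fs=fs)
    raw = np.random.randn(int(10 * fs))               # placeholder channel data
    treated = filtfilt(b, a, raw)                     # zero-phase 1-45 Hz band-pass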

    • @izhamazman209
      @izhamazman209 4 years ago

      Sorry, I didn't fully understand your comment, since I'm only just starting to get into these kinds of things and am not really a person with biology knowledge. But I've got to ask: you said the data doesn't fully indicate brain activity, since there are too many variables at play. Is there any possible way to gather only brain signal data, aside from something invasive such as Neuralink? Would a fully paralyzed person with only a functioning brain provide only brain signals, since muscle is out of the equation? How different is the data between being awake and asleep? Is the brain less active during sleep? Would the data be similar between an awake, fully paralyzed person and a person in deep sleep?
      Hope you can answer even a few of those questions. Thanks in advance

    • @alexisandersen1392
      @alexisandersen1392 4 years ago +3

      There are far too many factors at play when it comes to human brains in general, much less accounting for abnormal brain conditions. That said, someone being paralyzed doesn't necessarily indicate a problem with their brain, as in the case of someone with spinal cord damage. If the brain is typical and undamaged, it should behave similarly to other typical, undamaged brains. As for sleep vs. awake, it really depends on what stage of sleep the brain is in. Sleep is a whole other can of worms.
      Regardless, since neural signals are so finely granular, and because there are so many connections, each with its own signal, it's infeasible to obtain perfect data from the brain. There will always be interference, if not from peripheral systems of the brain, then from the concert of neurons all "talking at once" into your extremely limited number of sensors.
      It's like trying to reconstruct a stadium full of sound over space and time with a single microphone. You will experience a loss of signal fidelity; it's just the nature of the problem. However, we can use the limited information that we can gather, plus intelligent systems, to infer signal data, and this is what is typically done even with neural implants: the signals from the sensors must be fed into a system that can find known patterns associated with intended brain signals.
      One approach is to use an artificial neural network that can be fed the sensor data and trained to infer the implied signals, but the topology of an artificial neural network capable of inferring brain-implied signals from sensor signals is its own field of study as well.

    • @donaldviszneki8251
      @donaldviszneki8251 4 years ago +1

      Hey why do you say cerebral blood flow is decoupled from body blood pressure?

    • @alexisandersen1392
      @alexisandersen1392 4 years ago +3

      @@donaldviszneki8251 Cerebral blood pressure is regulated separately from the rest of the body, as too much blood pressure could lead to bruising of the brain matter.

    • @monad_tcp
      @monad_tcp 3 years ago

      @@alexisandersen1392 Brains are so complex. Can you imagine the advances we'll be able to make when we start having hundreds of thousands of probes in a human brain, so we can actually see what's happening between small groups of neurons?

  • @mathieusan
    @mathieusan 4 years ago +20

    You're one head hit away from discovering the flux capacitor there

  • @matthewcollier3482
    @matthewcollier3482 4 years ago +35

    "Mooom can we get Neuralinks please?"
    "No we have Neuralinks at home!"
    Neuralinks at home:

    • @mrmartinwatson1
      @mrmartinwatson1 4 years ago +2

      probably be the only person to successfully log out of SAO

  • @TechMarine
    @TechMarine 4 years ago +7

    If I may suggest, you should use "tree" support structures - you would save so much plastic on that kind of construction

  • @sunboy4224
    @sunboy4224 4 years ago +1

    I have actually worked in labs that did BCIs, and just graduated with a PhD in biomedical engineering with a focus on neuroengineering. I'm sorry to say, but this project is dead in the water. It's been about 5 years since I've looked at an EEG signal, so my memory is admittedly a bit fuzzy on the details, but my lab was attempting to decode leg kinematics from EEG. We were using decently complex filtering (unscented Kalman filters, H-infinity filters, all kinds of frequency-domain filters, as well as PCA reconstruction [for offline analysis]), and in the year that I was there we weren't able to decode a signal that "looked good" (we were getting some metrics suggesting it was kind of working, but DEFINITELY nothing that looked correct to the eye). We believe this is because leg kinematics are actually encoded in the spine rather than the brain, but even if that weren't the case (as with hand movements), the EEG signal is just WAY too noisy to do anything with.
    It's been a while since I've looked at EEG, but I'm pretty confident in saying that most or all of the signal you were seeing on your headset was EMG artifact. If you WERE to see some kind of movement signal, it probably wouldn't look like what you think it would. It's not going to be an enormous amplitude spike, 5x larger than the noise floor. It would probably be some small increase in spectral power that correlates loosely with the limb moving. You just don't have enough electrodes in the area to get a useful signal, and you're probably not going to without opening up your skull and sticking something in.
    As for putting electrodes on your frontal cortex, well... you might actually have a little better luck there. Again, you're not going to be able to think "cup" and have the computer understand. But MAYBE you can put the probes on your visual cortex and get the computer to recognize that when you look at a very large red piece of paper you mean cup, and a very large blue piece of paper means something else (and even then, you have to be very careful that the computer isn't just seeing the difference between "look left" and "look right", i.e. EMG from your eyes). A great example of a functioning EEG device that you might base your design on is the "P300 speller". The P300 wave is a repeatable brain signal that occurs when you observe a specific stimulus among a background of similar stimuli (imagine a room full of people who are sporadically clapping, and you focus on a specific person and count every time they clap). Using a specially built system, a computer is able to determine which letter a person is focusing on in a displayed keyboard, and then type the chosen letter, allowing someone to type without using their hands. THAT would be a cool project to get going... either spell the object you want ("cup", "remote", etc.), or just display pictures of each object and have a "P300 speller"-like system pick out what you want.
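
    A hedged sketch of the "small increase in spectral power" idea from this comment: compare band power during a movement window vs. a rest window for one channel. The rest/move arrays and sample rate are assumptions, not data from the video.

    import numpy as np
    from scipy.signal import welch

    def band_power(x, fs, lo, hi):
        """Integrate the Welch PSD of x over the [lo, hi] Hz band."""
        f, pxx = welch(x, fs=fs, nperseg=int(2 * fs))
        mask = (f >= lo) & (f <= hi)
        return np.trapz(pxx[mask], f[mask])

    # A ratio above 1 would suggest more beta-band (13-30 Hz) power during movement:
    # ratio = band_power(move, fs, 13, 30) / band_power(rest, fs, 13, 30)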

  • @World_Theory
    @World_Theory 4 years ago +11

    I didn't know there were brain interface kits like this! That's pretty darn cool.
    I assume that you'll eventually use some machine learning with those readings, to interpret what's going on. But I'm not 100% confident in this assumption.
    This has sparked an idea, though. I've been diving into the subject of VR lately, so that's the track my mind tends to be on now. And I've seen some really interesting tracking technology, and ways to drive VR avatars. But I've never seen someone use a brain-computer interface (other than in fiction) to help control an avatar.
    With a brainwave-reading kit, though, I think you could control the body language, at the very least, of non-human body parts on an avatar. How much you're concentrating, for example, could be used to modify the idle animation of a tail, wings, or ears, taking inspiration from real-life animals when they're concentrating on something.
    I think that combining brainwave tracking with full-body tracking and facial expression tracking would give a neural network (AI) interpreter a lot of cues to guess at the state of your mind. Perhaps just enough cues to fully animate a non-human avatar, despite the lack, in some cases, of physical body parts to track with traditional means. (With "traditional" being relative, considering that VR and body tracking are still relatively new fields, even counting the movie industry's use of CGI motion capture.)
    Another thing that I think could be useful is using VR as a prototyping tool for control and tracking methods for robots and other things. Instead of having to spend money on physical materials to build a prototype model just to work out the bugs in your control interface, you could use a 3D avatar instead. Even if the avatar is hideous and super basic, the movement of the bones should still be useful, providing that the skeleton measurements are accurate.

    • @vgaggia
      @vgaggia 4 years ago

      I wonder, if you had that set up for movement and you applied some sort of anesthetic to the person, whether they'd be able to move like they're actually in VR. Obviously it'd take some machine learning voodoo and insanely accurate sensors, but it would still be interesting.
      edit:
      It could be useful for people with medical issues who can't move anything but whose brains are still functioning perfectly

    • @feda9562
      @feda9562 4 years ago

      @@vgaggia That's exactly what BCIs were first developed for in the '70s

  • @ApanLoon
    @ApanLoon 4 years ago +31

    “Do you know what this means? It means that this damn thing doesn’t work at all!”

    • @StevenIngram
      @StevenIngram 4 years ago +4

      I thought of the same quote. LOL

    • @jonmayer
      @jonmayer 4 years ago

      Listen, Doc.

  • @benGman69
    @benGman69 4 years ago +11

    *James tenses his arms and suddenly drops to the floor with spit around his mouth*

  • @clancywiggum3198
    @clancywiggum3198 3 years ago

    For what it's worth, and I admit I'm a bit late here: the primary motor cortex is effectively just a bank of output pins for the brain. It contains the brain end of the neurons that run down the spine and, through a couple of more or less direct subsequent connections, connect to parts of muscles, so if you are reading it directly you can only sense actual physical movement. They're also perfectly mirrored - all neurons for one side of the body go to the other side of the brain. BCIs get more interesting if you target the motor planning areas, because you can plan how to move part of you without actually having to move it, so there's scope for pure mind control of computers or hardware through that technique, although of course the resolution is low. I would presume that's where you've accidentally targeted, because that's a larger area, and the suggestions of mixed right- and left-hemisphere involvement in any given limb are probably from mixed processing in the motor planning areas, as the true primary motor cortex does little if any actual processing.

  • @masterninjaworrior
    @masterninjaworrior 4 years ago +2

    Can't wait to see what you do with this - I am very interested in BCI!

  • @richardstock1
    @richardstock1 4 years ago +1

    Great video, looking forward to the next one. Topics involving the brain really intrigue me.

  • @1kreature
    @1kreature 4 years ago

    You said it yourself... These are just single-conductor electrodes that contact the skin. So I don't know why you would think they "point" in any direction and thus have directional pickup capability.
    As for moving the arms: try not waving the arms close to the headset - less noise. Holding them sideways out from the body, or simply making a flat hand/fist with arms resting on your lap, works better.

  • @wbretherton
    @wbretherton 4 years ago +10

    'We're getting closer to Elon Musk's pig' is my favourite quote of the day

  • @Fury9er
    @Fury9er 4 years ago +2

    This was very interesting. I would like to try this out one day, so hopefully the open source stuff will become more common and a little cheaper in the future.

  • @mossm717
    @mossm717 4 years ago +2

    Glad to see you're doing this. I always wanted to try one of these EEG setups, but they're so expensive

  • @farazsayed5730
    @farazsayed5730 4 years ago +1

    What would be super awesome is a headset with loads and loads of pins connected to something like an FPGA, and some software to configure the FPGA to pick and choose regions of interest. You could even get away with using spring-loaded test probes, which probably won't hurt your head if they're sufficiently densely packed

  • @georgemathieson6097
    @georgemathieson6097 4 years ago +44

    Favourite quote: "We're getting close to Elon Musk's pig"

  • @Rooey129
    @Rooey129 4 years ago +13

    That helmet is a giant antenna, especially for low frequencies - you should shield it and ground it out.

    • @jamesbruton
      @jamesbruton  4 years ago +6

      I might put my whole head in a metal box

    • @Rooey129
      @Rooey129 4 years ago +1

      @@jamesbruton That's a good idea. If it works, you could even just run the sensors off shielded Cat6, making sure to ground the shield and the aluminum foil around the helmet.

    • @quattrocity9620
      @quattrocity9620 4 years ago

      I've seen bigger...

    • @noahluppe
      @noahluppe 4 years ago

      @@jamesbruton The Man in the Iron Mask 2: Electroencephalographic Boogaloo

  • @ollie-d
    @ollie-d 3 years ago

    I have been doing EEG-based BCI research for just under 6 years now, and there are a lot of errors in this video, but it's a fairly good start.
    It wouldn't be productive to nitpick everything in here, but know that you're unlikely to get a good motor imagery-based system with more than two classes using the OpenBCI. You need to collect more controlled data. If you want to be making inferences on imagined movement, make sure you're not moving your limbs but rather imagining movements (rotating a doorknob works well). You should see a decrease in the mu rhythms over the sensorimotor cortex during imagined movement. You'll typically see a decrease in mu power at both C3 and C4, but the mu power over the electrode contralateral to the imagined hand should be lower.
    If you wanted to use the activity from the actual muscle movements, then you'd have a much easier time using EMG electrodes on the arms, since the signals are orders of magnitude more powerful and classification is trivial
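
    A minimal sketch of the mu-rhythm check this comment describes, assuming c3 and c4 are single-channel arrays recorded over the motor cortex at fs Hz (all hypothetical names). A negative lateralization index would suggest stronger desynchronization at C3, i.e. contralateral to imagined right-hand movement.

    import numpy as np
    from scipy.signal import welch

    def mu_power(x, fs):
        """Power in the mu band (8-12 Hz) from the Welch PSD."""
        f, pxx = welch(x, fs=fs, nperseg=int(2 * fs))
        band = (f >= 8) & (f <= 12)
        return np.trapz(pxx[band], f[band])

    # li = (mu_power(c3, fs) - mu_power(c4, fs)) / (mu_power(c3, fs) + mu_power(c4, fs))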

  • @ted5610
    @ted5610 3 years ago

    Just being able to see the different outputs from the electrodes is wild :o

  • @deepakjoshi6242
    @deepakjoshi6242 4 years ago +5

    This guy always brings interesting stuff in a fun-to-watch way..

  • @jesseshakarji9241
    @jesseshakarji9241 4 years ago +1

    It may be interesting to pair this with an artificial neural network (ANN) that you could train on your brain sensor data. That way, instead of looking for specific jumps of activity on certain channels, you could use an ANN to classify the data for you, so it would know if you're, say, moving a leg vs. an arm or other muscle groups. Seems like the sensors are pretty sensitive, so who knows how well this would work.

  • @LucasMcDonald
    @LucasMcDonald 4 years ago

    James, I used to do "biofeedback" as a training method for ADD. It used basically the same interfaces as you're using here. The mantra they taught me to "brain train" with, I found out years later, is just mindful meditation. I'm sure looking into this will help you with your results.

  • @Genubath1
    @Genubath1 3 years ago

    Something that might affect the signals is a phenomenon called irradiation. Even if you are using one main muscle for a movement, there are tons of other muscles that support it, and muscles that support those, and so on. When you lift your legs, you are also flexing your core and moving your arms to balance. When you make a fist, you flex muscles all the way up your arm and into your core.

  • @tiagotiagot
    @tiagotiagot 3 years ago

    It would be interesting to see a visualization that colors each region based on the frequencies picked up by the respective sensor: map the lowest frequency to red, the highest to blue, interpolate in between, and blend the various hues weighted by their respective intensities, producing white where all frequencies are detected at maximum strength together, and black where none are detected at all.
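
    A sketch of this color-mapping idea, assuming chan is one sensor's samples at fs Hz (both hypothetical): weight red/green/blue by power in low/mid/high bands and normalize, so a flat spectrum trends toward white and a silent channel toward black.

    import numpy as np
    from scipy.signal import welch

    def channel_color(chan, fs):
        f, pxx = welch(chan, fs=fs, nperseg=int(2 * fs))
        bands = [(1, 8), (8, 16), (16, 45)]  # low -> red, mid -> green, high -> blue
        rgb = np.array([pxx[(f >= lo) & (f < hi)].sum() for lo, hi in bands])
        return rgb / rgb.max() if rgb.max() > 0 else rgb  # RGB in [0, 1]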

  • @wesmatchett615
    @wesmatchett615 4 years ago +2

    This is amazingly similar to Dr. Emmett Brown's thought helmet from Back to the Future

  • @tiagotiagot
    @tiagotiagot 3 years ago +2

    Maybe you could train some sort of neural net on videos of your face plus the EEG readings, to identify and subtract the patterns produced by blinking, eye movements, etc. when those patterns are present, while leaving the rest of the signal intact?

  • @BLBlackDragon
    @BLBlackDragon 4 years ago

    Even a rudimentary system like OpenBCI can have a number of applications. Motor control, bio-feedback training, etc. This could be fun to play with.

  • @zeekjones1
    @zeekjones1 4 years ago +1

    I'd point towards the upper forehead, as there is more there to work with.
    As you said, to record a limb it's better to probe at said limb; thought, however, is only in the head, so targeting logic and reasoning could approximate a new digital 'limb'.
    With tandem AI and brain training, eventually you could move a cursor, then a game controller, maybe even some custom macros on the PC.

  • @RupertBruce
    @RupertBruce 4 years ago

    A 12-node band across the top, two on either side at the back for visual, two on the temples for easy signaling/button pressing. The top band is 3 side-by-side on each side. I wouldn't expect much from the lower limbs, but I'd love to build a neural net with the sensors as input and Nvidia's gesture model from a camera pointed at you, so that it can build a body language model.

  • @HKallioGoblin
    @HKallioGoblin 3 years ago

    Secret services created a very advanced form of technological mind control in 2008 that can understand all those thoughts. All readings can be animated to screen, so we know what those readings mean.

  • @reggiep75
    @reggiep75 4 years ago

    Reminds me of the numerous EEG tests I had for epilepsy, but this kit is just more enjoyable and fun.

  • @ViniciusMiguel1988
    @ViniciusMiguel1988 4 years ago +4

    Oh James come on do it properly! Drill your head and stick the electrodes into the brain! 😁

  • @pawzubr
    @pawzubr 4 years ago +1

    Read about the cocktail party problem - you have to use some kind of source-separation method, e.g. Independent Component Analysis
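
    A minimal sketch of Independent Component Analysis on multichannel EEG, one common answer to the cocktail-party problem, using scikit-learn's FastICA; the 8-channel array here is random placeholder data, not a recording.

    import numpy as np
    from sklearn.decomposition import FastICA

    X = np.random.randn(5000, 8)           # placeholder: (n_samples, n_channels)
    ica = FastICA(n_components=8, random_state=0)
    sources = ica.fit_transform(X)         # unmixed components, (n_samples, 8)
    # Typical cleanup: zero out a component identified as blink/EMG, then rebuild:
    # sources[:, bad_idx] = 0; cleaned = ica.inverse_transform(sources)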

  • @toniwalter2911
    @toniwalter2911 3 years ago

    At 15:28, what does the diagram mean by "trunk"? Was it made for elephants, or am I not getting it right?

  • @BenKDesigns
    @BenKDesigns 3 years ago

    James, you're like John Oliver, if John Oliver were an incredibly cool nerd. And, of course, I mean that in the best way possible. Love your channel.

  • @marc-antoinebelanger2841
    @marc-antoinebelanger2841 3 years ago

    @James Bruton: My father-in-law has just been diagnosed with ALS. I would like to build a dataset of his speech and EEG signals. Before buying the kit, I would like to know if it is possible to sacrifice an EEG input for a microphone. Or is it possible to expand the board to add a microphone?

  • @benGman69
    @benGman69 4 years ago +3

    Therapist: Scary mannequin heads with makeup on can't hurt you.
    Scary mannequin with makeup on: 10:03

  • @oliverer3
    @oliverer3 4 years ago +7

    The fact that you specified that it wasn't your actual brain made me laugh.
    Also, side note: can we consider this a thinking hat?

    • @pvic6959
      @pvic6959 4 years ago

      lol I didn't need to be told. I would assume his real brain is 10x that size lolol

  • @andy-in-indy
    @andy-in-indy 4 years ago

    You don't need to reposition the probes if you use the interaction of several probes to "triangulate" the point of activity. That will be mathematically complex, since the signal is not a point source and it is surrounded by other signal sources.
    An interesting possibility is to train a neural net to read your brain patterns instead of trying to calculate all the equations simultaneously. I expect the training would involve something like a camera detecting the joint positions and angles of your movement, compared against the neural net's output predicting what the joint positions and rotations should be. Eventually, you should get a close correlation between the actual and predicted. That information could then be fed into a robot arm or something like the performance robots you've built.
    Once there is a correlation between actual movement and the output, you would need to begin training the neural net to detect visualization instead of motor cortex activity. I expect the training would be to visualize a movement routine and train the neural net to output something that matches that routine. The routine would have to be changed up frequently to prevent the neural net from learning to just output the routine instead of trying to match your visualization.
    Anyway, if I had more time and money, that's how I would approach it. That may be a bit too long and boring a process for YouTube videos.
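
    A sketch of the camera-supervised training idea above: regress camera-derived joint angles from windowed EEG features. Every shape and name here is a hypothetical placeholder, and scikit-learn's MLPRegressor stands in for "a neural net".

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    n_windows, n_features, n_joints = 1000, 64, 6
    X = np.random.randn(n_windows, n_features)  # e.g. band powers per channel/window
    y = np.random.randn(n_windows, n_joints)    # joint angles from the camera tracker
    model = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500)
    model.fit(X, y)
    # At run time: predicted_angles = model.predict(features_of_new_window)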

    • @sunboy4224
      @sunboy4224 4 years ago

      The problem with this is that EEG doesn't really have hand kinematics information in it (or if it does, it's INCREDIBLY hard to get to). Chances are, if a scheme like this works, it will be because the neural network is recognizing EMG artifacts as features.

  • @excitedbox5705
    @excitedbox5705 4 years ago

    You could try a tight-fitting bathing cap or swim cap and mount probes all over it. You really just need wires connected to a metal tack; if you coat the tack in conductive gel you will get better readings.

  • @MuhammadDaudkhanTV100
    @MuhammadDaudkhanTV100 4 years ago

    Great ideas and good work

  • @kicktangerines8528
    @kicktangerines8528 4 years ago +1

    I was just doing a paper on this. SO AWESOME!

  • @barrettdent405
    @barrettdent405 4 years ago

    Reminded of Doc Brown's rig in Back to the Future.

  • @bgg4865
    @bgg4865 4 years ago

    My first thought was that experimenting on yourself is not a good idea: you have to think about what you want to do, and you can't help but think about what you're seeing on the charts as you do the movement. I'd say get someone to give you commands, and don't look at the plot as you obey them. Get the assistant to write out the moves in a different order too, so you don't know what's coming. Record as you go and try to interpret it later.

  • @graealex
    @graealex 4 years ago +5

    "That's my brain"
    WTF, put it back!

  • @umbigbry
    @umbigbry 4 years ago +1

    From what I can read off the diagram, you like Pepsi 7:52

  • @seeigecannon
    @seeigecannon 4 years ago +1

    I bought an Emotiv a while back for a project that I never got around to. Unfortunately, I would not recommend them. The dongle is paired to the headset in such a way that if you lose the receiving dongle, the headset is effectively bricked. The lowest version also hides all of the outputs and does not have any kind of API for doing something with the data. There is a Python program somebody made to decrypt the stream, but they sent out a mass email about how the Python library is taking money from their pockets and they must raise the price of all headsets because hobbyists want to actually do stuff with the headsets without paying $500.
    Note: I last interacted with the headset around 7 or so years ago, so the review may be very out of date.
    Also, with the full helmet, could you try different sensations like pain, hot, and cold, and maybe something like sweet/sour/spicy, to see if anything interesting falls out?

    • @jamesbruton
      @jamesbruton  4 years ago

      Thanks for that, I'm intending to stick with this one for now.

  • @LokiLeDev
    @LokiLeDev 4 years ago

    Cool stuff! It is really hard to extract meaningful information from non-invasive probes like that. I saw a conference talk where they said we can actually read about 1 bit/s from the brain! And they tried machine learning to decode the signals, but it was actually easier to train the brain to produce clearer signals!

  • @tedhuntington7692
    @tedhuntington7692 4 years ago

    Fairly soon we may see a kind of menograph, a device that records thought-audio, as imagined by Hugo Gernsback and mentioned in his Ralph 124C41+ in 1911 CE.

  • @negative258
    @negative258 4 years ago +1

    Maybe you can look into the MyoWare EMG sensor to activate the claws. EEG-activated tasks will be pretty interesting though

  • @wolf1066
    @wolf1066 2 years ago

    This was very interesting, since I was wondering about OpenBCI and how well it would work to control a robot arm. It looks like a lot of fiddling about would be required before I could get it to read brainwaves to the degree of precision I would need.

  • @nasim3269
    @nasim3269 4 years ago

    I noticed that the electrodes that spiked for no reason had a lower amplitude than the electrodes that made sense. I think if you do further filtering you'll get much better resolution on the motor command signals.

  • @EngineeringSpareTime
    @EngineeringSpareTime 4 years ago +1

    Very interesting! Did you think about EMI on the sensor cables? They are very sensitive... Bundling them might look better, but it might not be better in terms of data quality (analog output?)

  • @ryansummer1589
    @ryansummer1589 4 years ago +1

    This is pretty awesome!

  • @jonathanballoch
    @jonathanballoch 2 years ago

    Awesome stuff!! Did you ever build the new adjustable headset?

  • @badWithComputer
    @badWithComputer 4 years ago +4

    I'd love to be able to show 1989 me this video

  • @TheArashhak
    @TheArashhak 3 years ago

    Can you share the CAD files (STL?) for the custom headset, please? I'm interested in targeting just the hand movement for a custom assistive device.

  • @mhnoni
    @mhnoni 3 years ago

    Not trying to be religious here, but someone made that brain @ 15:04 - what a piece of engineering. I really have no idea how some people think the human/universe is just a matter of randomness.
    I wonder if the signals that go from our brain to a target like the fingers are coded or analog. I mean, does each finger have a single nerve connected to our brain like a copper wire, or do multiple fingers get their signals from a single nerve? If there is a nerve that controls more than one system, then the human brain is more complex than I thought.

  • @AmaroqStarwind
    @AmaroqStarwind 3 years ago

    Brain-Computer Interfaces would probably be really effective when used in conjunction with conventional user inputs. Imagine a neural interface helmet for an LMP1 race car.

  • @Shinika01
    @Shinika01 4 years ago

    Sooooo EPIC!!!!! Please go further with this toy!!!! Show us what could be done!

  • @Saraseeksthompson0211
    @Saraseeksthompson0211 3 years ago

    You are an absolute genius

  • @at0mic282
    @at0mic282 3 years ago

    Can you test responses to sudden shocks or surprises? Maybe protective hardware would then be possible, to shield the user when they are suddenly put in a precarious situation

  • @bloodypommelstudios7144
    @bloodypommelstudios7144 3 years ago

    The other thing about the pig walking is that it's a pretty predictable movement; if you know where in the walk cycle it is, you should be able to predict fairly accurately what each limb is doing.

  • @Quasar_QSO
    @Quasar_QSO 3 years ago

    I sure hope this mind reading tech gets so much better soon. I want an omnidirectional wheelchair that I can control with my thoughts.

  • @johncaccioppo1142
    @johncaccioppo1142 3 years ago

    Have you figured out how to convert this into a synth controller yet? I think the Spitfire Audio orchestra would overlay perfectly.

  • @JMRCREATIONS
    @JMRCREATIONS 4 years ago +8

    Nearly 1 million 👏👏🎶🎶🎇

  • @thorley1983
    @thorley1983 4 years ago

    Could you calibrate the input with a hand movement followed by a jaw movement, and use signal strength so that the position of the electrodes isn't as critical?

  • @CyberSyntek
    @CyberSyntek 4 years ago +1

    As much as I love the fact that you have this up and running, and believe that if anyone can pump out great DIY results on this project it's you, James... my god man, you are jumping from project to project on a weekly basis! XD
    Perhaps work on one thing at a time and really make solid gains on it for a little longer. I understand it is fun to experiment and play with new concepts and designs, but... you are James Bruton! You have the ability to really push these projects to the next level.
    Either way, keep doing you - love what you are doing. I get that you need to keep the fresh content coming to bring in more supporters. We need a James clone at some point to allow for project focus time. XD

  • @antonwinter630
    @antonwinter630 4 years ago

    what a time to be alive

  • @stefanguiton
    @stefanguiton 4 years ago +1

    This would be amazing on your Exo suit!

  • @isbestlizard
    @isbestlizard 3 years ago

    I wonder if this could recreate those famous experiments where people show detectable intentions to act slightly before they consciously attribute the decision point. That'd be cool to see!

  • @catalinalb1722
    @catalinalb1722 3 years ago

    Hello, where can I order the kit with sensors and boards?

  • @cosmicrider5898
    @cosmicrider5898 4 years ago +6

    When you realize he's been building these robots to upload into and become an android with gray hair.

  • @commanderbrickbreaker45
    @commanderbrickbreaker45 4 years ago

    Theoretically, if someone were to use this and maybe change a few things up (or build an original version of something like this), do you reckon it'd be possible to map the brain as it's dreaming? If anything, it'd be interesting to see how the brain changes when awake versus asleep. (Though the dream mapping itself may just be what I'm curious about - a dream.)

  • @lasersbee
    @lasersbee 4 years ago

    11:33... If the signals from the sensors are so low, why are the wires from the sensors not shielded?

    • @jamesbruton
      @jamesbruton  4 years ago

      I'm not really sure, might be worth putting my whole head inside a metal box

  • @inamdarswapneel11
    @inamdarswapneel11 4 years ago

    James: can you please share the 3D printing files for the custom headset that you designed?

  • @РоманБессонов-ш7с
    @РоманБессонов-ш7с 4 years ago

    Can you tell us what skills and knowledge are needed to work with a brain-computer interface?

  • @motbus3
    @motbus3 4 years ago

    Hey James,
    I think the signal is just too complex and has too low an amplitude to be visually curated.
    You'll probably have to record some length of the movements and build some models to understand them.
    I know your thing is robotics, but you may find some pretrained deep learning models to validate the idea.
    I think some Transformer models could be built to predict the labels of the movements, but it would require too much data to get working from scratch.
    If you think this idea is cool, just give a shout-out in the next video and I would be really happy :)

  • @4.0.4
    @4.0.4 4 years ago

    I wonder if you could train a neural network to take those signals and convert them into an estimated intention.

  • @outsider7654
    @outsider7654 3 years ago

    Combine this product with itrack 5 and an eye tracker camera: use xpadder to assign the signals to the keyboard, and the eye tracker and itrack 5 to the mouse. Then you can play without moving or using your hands. And don't even think about what could happen if you add a VR set.

  • @thebagelbomb
    @thebagelbomb 4 years ago +2

    Imagine an Iron Man helmet opening and closing when you think it.

  • @vincentpaniccia109
    @vincentpaniccia109 4 years ago +4

    2030: Here’s the upload your consciousness DIY kit.

    • @wesleymays1931
      @wesleymays1931 3 years ago

      I sure f**king hope so, getting tired of this stupid human body

  • @chrism9976
    @chrism9976 2 years ago

    I'd like to see a BCI connected to a keyboard. I'd like to know if one could communicate with others in the event of a paralyzing stroke.

  • @ahmedkamel821
    @ahmedkamel821 4 years ago

    The video is amazing as usual; I am just surprised by how bad the 3D printing quality is from such a professional maker!

    • @jamesbruton
      @jamesbruton  4 years ago +3

      It could have been better, but each half took 12 hours as it was, so it wasn't on the highest quality settings.

  • @electronetiq
    @electronetiq 3 years ago

    Can I translate brain waves to text with this sensor? Please answer, thank you.

  • @kestergascoyne6924
    @kestergascoyne6924 4 years ago

    Mindblowing stuff!

  • @ebybbob
    @ebybbob 3 years ago

    Hey man, looking down like that in a superhero fight is dangerous... even if it lets you charge up your war-face....
    Stay safe out there

  • @flloyd86
    @flloyd86 4 years ago

    I'm sure you've already thought of this, but are you aiming to apply machine learning? I see you've spoken about it while working on the Jetson. It would be great to translate the input from the cranial sensors into, dare I say, ROS messages. That would be cool, and I might be able to find a repository to help. :)

  • @AJB2K3
    @AJB2K3 4 years ago

    I wonder how those spring-loaded jumper pins would work, as they have tiny tips.

  • @rustycobalt5072
    @rustycobalt5072 4 years ago

    Build a Faraday cage around ya to remove all the noise, and have the Bluetooth receiver inside as well

  • @CNGboyevil
    @CNGboyevil 4 years ago

    I'm certainly no expert, but a couple of thoughts: maybe some brass mesh to shield from external signals, and maybe chain up a bunch of grounds and place them in the comfort pegs

  • @bbogdanmircea
    @bbogdanmircea 4 years ago

    I really am curious about the number of hours those TAZ 3D printers have under their belts! A dedicated video would be nice!

  • @emmanueldejesusvelasquezma8662
    @emmanueldejesusvelasquezma8662 2 days ago

    Where can I get the EEG sensor?

  • @rcninjastudio
    @rcninjastudio 4 years ago

    Just one step closer to James becoming a cyborg 😂. Seriously though, I find things like this fascinating; we've been to the moon and sent probes into deep space, but we still don't fully know how the brain works

  • @DamianReloaded
    @DamianReloaded 4 years ago +15

    "We are getting closer to Elon Musk's pig" XD

    • @martin_mue
      @martin_mue 4 years ago +2

      Even though everyone knows that you never want to get close to Space Karen's pig.