Testing Brain-Computer Interfaces

  • Published: 15 Nov 2020
  • Go test your skills at - go.tech/btcx
    I recently made Wolverine Claws which are triggered by a deep learning model that uses machine vision to look at my facial expression. Now it's time to test some brain-computer interfaces from OpenBCI to see how easy it would be to control robotics and prosthetics. I've managed to pinpoint specific places in my motor cortex, but I think we'll need a slightly different electrode mount on a headset to use free thought to control anything.
    Links from this video:
    • FULL REVEAL! Elon Musk...
    www.sciencealert.com/we-may-h...
    • Amputee Makes History ...
    You can support me on Patreon or buy my Merchandise:
    ***************************
    Patreon: / xrobots
    Merchandise: teespring.com/stores/james-br...
    ***************************
    Affiliate links - I will get some money if you use them to sign up or buy something:
    ***************************
    Matterhackers 3D printing supplies: www.matterhackers.com?aff=7500
    Music for your YouTube videos: share.epidemicsound.com/xrobots
    ***************************
    Other socials:
    ***************************
    Instagram: / xrobotsuk
    Facebook: / xrobotsuk
    Twitter: / xrobotsuk
    ***************************
    CAD and Code for my projects: github.com/XRobots
    Huge thanks to my Patrons, without whom my standard of living would drastically decline. Like, inside out-Farm Foods bag decline. Plus a very special shoutout to Lulzbot, Inc who keep me in LulzBot 3D printers and support me via Patreon.
    HARDWARE/SOFTWARE
    Below you can also find a lot of the typical tools, equipment and supplies used in my projects:
    Filament from: www.3dfuel.com/
    Lulzbot 3D Printers: bit.ly/2Sj6nil
    Lincoln Electric Welder: bit.ly/2Rqhqos
    CNC Router: bit.ly/2QdsNjt
    Ryobi Tools: bit.ly/2RhArcD
    Axminster Micro Lathe: bit.ly/2Sj6eeN
    3D Printer Filament: bit.ly/2PdcdUu
    Soldering Iron: bit.ly/2DrNWDR
    Vectric CNC Software: bit.ly/2zxpZqv
    Why not join my community, who are mostly made up of actual geniuses. There’s a Facebook group and everything: / 287089964833488
    XROBOTS
    Former toy designer, current YouTube maker and general robotics, electrical and mechanical engineer, I’m a fan of doing it yourself and innovation by trial and error. My channel is where I share some of my useful and not-so-useful inventions, designs and maker advice. Iron Man is my go-to cosplay, and 3D printing can solve most issues - broken bolts, missing parts, world hunger, you name it.
    XRobots is the community around my content where you can get in touch, share tips and advice, and more. Build FAQs, schematics and designs are also available.
  • Science

Comments • 329

  • @jamesbruton
    @jamesbruton  3 years ago +33

    You can support me on Patreon or through YouTube channel membership - Patrons and Members get access to all the videos up to a week early! www.patreon.com/XRobots

    • @GeeveGeorge
      @GeeveGeorge 3 years ago

      Would be cool if you could build a user interface like Emotiv's, where they record spikes for a few seconds and ask the user to make specific movements. They then use pattern recognition to classify the spikes accordingly in real time.

    • @JClemente1980
      @JClemente1980 3 years ago

      The Emotiv EPOC also has an SDK for everyone to use, and from the start it has included a small FPV demo where you control the movement with your own mind. They have been around for at least 7-8 years. The difference from your setup and approach is that they calibrate the movement from your own thought, which means you can still use your limbs and have an extension. For example, to control a 3rd arm, or even a tail :P . I do not like their wet electrodes; I had already been testing with dry electrodes, a design very similar to the ones you're using. I've thought about using that in my own PhD: I wanted to supply my own EEG to a few growing neurons on a petri dish with electrodes, just to see what happened... What I was actually testing was the coating for electrodes to be implanted, but it could be a more realistic situation if they were being stimulated with my own waves...
      P.S. No, I did not want to download my consciousness to a set of neurons!!!!

  • @Graham_Wideman
    @Graham_Wideman 3 years ago +197

    James: There is a huge literature in EEG signal detection and interpretation that you could be drawing on. You are confronting at least two problems here. One is using just a few electrodes on the scalp to attempt to localize signals from a specific part of the brain. This is known as the "inverse" problem, and it's non-trivial, to say the least. Search for papers on that topic. The second problem is trying to pick up signals that convincingly correlate to some pattern of thought. Those are quite small signals relative to the coarse waves seen at an electrode that represents the sum of activity of thousands or millions of neurons near that electrode. To detect such a signal, experimenters use a paradigm like: present the experimental subject with a succession of trials, repeating the same stimulus over and over again (intermingled with control different-stimulus trials), and then average the signals from like trials together, aligned to the time of the stimulus onset. Then subtract the average control signal from that of the stimulus trials to produce a result signal. Hoping to discern a thought with a single trial, no control trial, and no sync to stimulus is pretty optimistic.
    As a side note: 18:02 -- "the electrode is pointing in to the other side of my brain". No, that's not a thing. Those electrodes are simply making contact with the skin on your head, and the only way it receives a signal is plain old conduction of the signal from all the firing neurons, through the tissue of the brain, the outer membrane that surrounds the brain, the cerebrospinal fluid (CSF), the skull, and the skin, with all the inhomogeneities that involves.
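The trial-averaging paradigm described in this comment can be sketched in a few lines. Everything below is synthetic and illustrative (sample rate, trial count, signal amplitudes, and the evoked-response shape are invented for the sketch, not taken from the video):

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 250                  # sample rate in Hz (illustrative)
n_trials = 500            # repeated presentations of the same stimulus
t = np.arange(fs) / fs    # one second per trial, aligned to stimulus onset

# Hypothetical evoked response: a ~5 uV bump at 300 ms, buried in ~20 uV noise
evoked = 5.0 * np.exp(-((t - 0.3) ** 2) / 0.002)
stim_trials = evoked + rng.normal(0, 20, (n_trials, t.size))
ctrl_trials = rng.normal(0, 20, (n_trials, t.size))

# Average like trials together, then subtract the average control signal
result = stim_trials.mean(axis=0) - ctrl_trials.mean(axis=0)

# After averaging, the evoked bump is recoverable; in a single trial it is not
single_trial_corr = np.corrcoef(stim_trials[0], evoked)[0, 1]
averaged_corr = np.corrcoef(result, evoked)[0, 1]
```

The key point of the paradigm survives even in this toy version: the uncorrelated background shrinks as the square root of the trial count, while the stimulus-locked component does not.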

    • @stuartwilson4960
      @stuartwilson4960 3 years ago +20

      Yeah I agree, the problem is sorting through all the large-scale noise to find something localized. Even then, the electronics would need to be very accurate differential amplifiers that can detect signals below the noise floor. Electromagnetic interference, from say a television in another room, is far more likely to appear on that graph than an inner brain signal. Getting any sort of muscle (EMG) signal is a good starting point though.

    • @jamesbruton
      @jamesbruton  3 years ago +46

      Super interesting, thanks!

    • @minibigs5259
      @minibigs5259 3 years ago +1

      Excited to see James develop some self-trials: baseline, cat, cat, cat, dog, etc., along with the deep learning for filtering!!

    • @noaht5654
      @noaht5654 3 years ago +7

      You always have to worry about what you are actually measuring. EEG picks up electrical potentials on the skin, and anything that can produce those potentials can produce a signal, like your 50Hz line noise. Around 7:20, you may be producing a muscle signal ever so slightly on your scalp. Try clenching your jaw next time you wear the headset. You'll clearly see the muscle signal.
      19:00 If you are thinking about controlling a robot limb, it may still be beneficial to have sensors over the somatic and (pre)motor cortices. Think of it like your brain is "simulating" its own limb movements.
      These are great resources to learn more: www.mikexcohen.com/#books. Some people like Cohen, some people don't. I think he at least tries to fully explain competing schools of thought on EEG signals, so he may be a good reference.

    • @donaldviszneki8251
      @donaldviszneki8251 3 years ago +3

      The weirdest thing he suggested was that the electrodes were "pointing" somewhere inside his brain. That's not how it works.

  • @zabridges
    @zabridges 3 years ago +117

    Man, there is a DIY kit for everything! Incredible!

    • @thesacredsword7230
      @thesacredsword7230 3 years ago +8

      DIY replacement kidney: the next step in DIY kits

    • @donaldviszneki8251
      @donaldviszneki8251 3 years ago +3

      The 8 channel base kit is $500. That's affordable, but not affordable enough IMO.

    • @GusCraft460
      @GusCraft460 3 years ago +1

      @@donaldviszneki8251 compared to what it would normally cost, that is insanely cheap. Of course most people don’t just have $500 lying around to spend on something like this, but it’s still not completely out of most people’s price range for something like a birthday gift.

    • @OMAR-fq4qi
      @OMAR-fq4qi 3 years ago +1

      Link to buy all electronic parts please

    • @r.i.ptechnoblade9407
      @r.i.ptechnoblade9407 2 years ago

      A DIY kit for getting 1.5k is what I need

  • @mathieusan
    @mathieusan 3 years ago +20

    You're one head hit away from discovering the flux capacitor there

  • @alexisandersen1392
    @alexisandersen1392 3 years ago +56

    Because this is open source, I imagine that the data on display is rather immediate and raw from the sensors themselves (which is good and bad)... The problem with EEG sensors is that blood pressure will have an effect; you can see that when you tense your muscles and it lights up your whole brain. It's not brain activity though, it's blood going to your head rather than to your constricted extremities. A biosensor for core blood pressure, typically positioned on the lower neck or chest, can be used as a point of reference for filtering blood pressure out of the sensor data. While cerebral blood flow is decoupled from body blood pressure, you still have the skin and muscles over the skull, with bodily blood pressure interfering with the probes.
    Typical commercial products will already have means to filter out these discontinuities to some degree, so they're easy to work with out of the box, but it also means the data has less fidelity to the sensors' measurements. The sensor data needs treatment and filtering to be useful, but treating the data taints it; with an open source setup, how much and what kind of treatment is up to you. If you want to use the sensor data, it needs to be heavily treated so that you can get faithfully reproducible interaction. There are effectively too many variables and they're all interfering with your intent; you can't really look at one sensor's data and say much of anything.
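Slow physiological drifts like the ones described in this comment are commonly reduced with a high-pass filter before any further analysis. A minimal SciPy sketch on synthetic data (the cutoff, drift frequency, and amplitudes are invented for illustration):

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0
t = np.arange(0, 10, 1 / fs)

# Hypothetical channel: 10 Hz activity riding on a large, slow ~0.2 Hz drift
activity = np.sin(2 * np.pi * 10 * t)
drift = 50 * np.sin(2 * np.pi * 0.2 * t)
raw = activity + drift

# 4th-order Butterworth high-pass at 1 Hz, applied forward and backward
b, a = butter(4, 1.0, btype="highpass", fs=fs)
clean = filtfilt(b, a, raw)

residual = np.std(clean - activity)   # how much drift survives filtering
original = np.std(raw - activity)     # drift magnitude before filtering
```

The cutoff is a trade-off: set it too high and you start eating into slow cortical activity; too low and pulse and movement drifts leak through.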

    • @izhamazman209
      @izhamazman209 3 years ago

      Sorry, I didn't fully understand your comment since I'm only just starting to get into these kinds of things and I'm not really a person with biology knowledge. But I've got to ask: you said that the data doesn't fully indicate brain activity since there are too many variables at play. Is there any possible way to gather only brain signal data aside from being invasive, like Neuralink? Would a fully crippled person with only a functioning brain provide only brain signals, since muscle is out of the equation? How big is the difference in data between being awake and sleeping? Is the brain less active during sleep? Would the data be similar if compared between an awake fully crippled person and a person in deep sleep?
      Hope you can answer even a few of those questions. Thanks in advance

    • @alexisandersen1392
      @alexisandersen1392 3 years ago +3

      There are far too many factors at play when it comes to human brains in general, much less accounting for abnormal brain conditions. That said, someone being crippled doesn't necessarily indicate a problem with their brain, as is the case with spinal cord damage. If the brain is typical and undamaged, it should behave similarly to other typical, undamaged brains. As for sleep vs. awake, it really depends on what stage of sleep the brain is in. Sleep is a whole other can of worms.
      Regardless, since neural signals are so finely granular, and because there are so many connections, each with its own signal, it's infeasible to obtain perfect data from the brain. There will always be interference, if not from peripheral systems of the brain, then from the concert of neurons all "talking at once" into your extremely limited number of sensors.
      It's like trying to reconstruct a stadium full of sound over space and time with a single microphone. You will experience a loss of signal fidelity; it's just the nature of the problem. However, we can use the limited information that we can gather, plus intelligent systems, to infer signal data, and this is what is typically used even with neural implants: the signals from the sensors must be fed into a system that can find known patterns that are associated with brain-intended signals.
      One approach is to use an artificial neural network that can be fed the sensor data and trained to infer the implied signals, but the topology of an artificial neural network capable of inferring brain-implied signals from sensor signals is its own field of study as well.

    • @donaldviszneki8251
      @donaldviszneki8251 3 years ago +1

      Hey why do you say cerebral blood flow is decoupled from body blood pressure?

    • @alexisandersen1392
      @alexisandersen1392 3 years ago +3

      @@donaldviszneki8251 cerebral blood pressure is regulated separately from the rest of the body as too much blood pressure could lead to bruising in the brain matter.

    • @monad_tcp
      @monad_tcp 3 years ago

      @@alexisandersen1392 Brains are so complex. Can you imagine the advances we will be able to make when we start having 100,000 probes in a human brain, so we can actually see what's happening between small groups of neurons?

  • @matthewcollier3482
    @matthewcollier3482 3 years ago +35

    "Mooom can we get Neuralinks please?"
    "No we have Neuralinks at home!"
    Neuralinks at home:

    • @mrmartinwatson1
      @mrmartinwatson1 3 years ago +2

      Probably be the only person to successfully log out of SAO

  • @richardstock1
    @richardstock1 3 years ago +1

    Great video, looking forward to the next one. Subjects about the brain really intrigue me.

  • @masterninjaworrior
    @masterninjaworrior 3 years ago +2

    Can’t wait to see what you do with this, I am very interested in BCI!

  • @mossm717
    @mossm717 3 years ago +2

    Glad to see you're doing this. I always wanted to try one of these EEG setups, but they’re so expensive

  • @Fury9er
    @Fury9er 3 years ago +2

    This was very interesting, I would like to try this out one day so hopefully the open source stuff will become more common and a little cheaper in the future.

  • @ApanLoon
    @ApanLoon 3 years ago +31

    “Do you know what this means? It means that this damn thing doesn’t work at all!”

    • @StevenIngram
      @StevenIngram 3 years ago +4

      I thought of the same quote. LOL

    • @jonmayer
      @jonmayer 3 years ago

      Listen, Doc.

  • @ted5610
    @ted5610 3 years ago

    just being able to see the different outputs from the electrodes is wild :o

  • @LucasMcDonald
    @LucasMcDonald 3 years ago

    James, I used to do “bio feedback” as a training method for ADD. It used basically the same interfaces as you're using here. The mantra they taught me to “brain train” with, I found out years later, is just mindful meditation. I’m sure looking into this will help you with your results.

  • @benGman69
    @benGman69 3 years ago +11

    *James tenses his arms and suddenly drops to the floor with spit around his mouth

  • @TechMarine
    @TechMarine 3 years ago +7

    If I may suggest, you should use the "tree" support structure, you would save so much plastic for that kind of construction

  • @reggiep75
    @reggiep75 3 years ago

    Reminds me of the numerous EEG tests I had for epilepsy, but this kit is just more enjoyable and fun.

  • @World_Theory
    @World_Theory 3 years ago +11

    I didn't know there were brain interface kits like this! That's pretty darn cool.
    I assume that you'll eventually use some machine learning with those readings, to interpret what's going on. But I'm not 100% confident in this assumption.
    This has sparked an idea, though. I've been diving into the subject of VR lately, so that's the track my mind tends to be on now. And I've seen some really interesting tracking technology, and ways to drive VR avatars. But I've never seen someone use a brain-computer interface (other than in fiction) to help control an avatar.
    With a brainwave reading kit, though, I think it would allow you to control the body language at the very least, of non-human body parts in an avatar. How much you're concentrating, for example, could be used to modify the idle animation of a tail, wings, or ears, taking inspiration from real life animals when they're concentrating on something.
    I think that combining brainwave tracking with full body tracking, and facial expression tracking, would give a neural network (AI) interpreter a lot of cues to guess at the state of your mind. Perhaps just enough cues to actually fully animate a non-human avatar, despite the lack of physical body parts in some cases, to actually track with traditional means. (With "traditional" being relative. Considering that VR and body tracking are still relatively new fields, even counting the movie industry's use for CGI motion capture.)
    Another thing that I think could be useful, is actually using VR as a prototyping tool for controlling and tracking methods for robots and other things. So that instead of having to spend money on physical materials to build a prototype model, just to work out the bugs in your control interface, you could use a 3D avatar instead. Even if the avatar is hideous and super basic, the movement of the bones should still be useful. Providing that the skeleton measurements are accurate.

    • @vgaggia
      @vgaggia 3 years ago

      I wonder, if you set that up for movement and applied some sort of anesthetic to the person, whether they'd be able to move like they're actually in the VR. Obviously it'd take some machine learning voodoo and insanely accurate sensors, but it would still be interesting
      edit:
      Could be useful for people who have medical issues and can't move anything, but whose brains are still functioning perfectly

    • @feda9562
      @feda9562 3 years ago

      @@vgaggia That's exactly what BCIs were first developed for in the '70s

  • @EngineeringSpareTime
    @EngineeringSpareTime 3 years ago +1

    Very interesting! Did you think about EMI on the sensor cables? They are very sensitive. Bundling them might look better, but it might not be better in terms of data quality (analog output?)

  • @TheArashhak
    @TheArashhak 3 years ago

    Can you share the CAD files (STL?) for the custom headset, please? I'm interested in targeting just the hand movement for a custom assistive device.

  • @electronetiq
    @electronetiq 2 years ago

    Can I translate waves to text with this brain sensor? Please answer, thank you.

  • @clancywiggum3198
    @clancywiggum3198 3 years ago

    For what it's worth, and I admit I'm a bit late here: the primary motor cortex is effectively just a bank of output pins for the brain. It contains the brain end of the neurons that run down the spine and, through a couple of subsequent more or less direct connections, connect to parts of muscles, so if you are reading it directly you can only sense actual physical movement. They're also perfectly mirrored - all neurons for one side of the body go to the other side of the brain. BCIs get more interesting if you target the motor planning areas, because you can plan how to move part of you without actually having to move it, so there's scope for pure mind control of computers or hardware through that technique, although of course the resolution is low. I would presume that's where you've accidentally targeted, because that's a larger area, and the suggestions of mixed right and left hemisphere involvement in any given limb would probably come from mixed processing in the motor planning areas, as the true primary motor cortex does little if any actual processing.

  • @catalinalb1722
    @catalinalb1722 3 years ago

    Hello, where can I order the kit with sensors and boards?

  • @wbretherton
    @wbretherton 3 years ago +10

    'We're getting closer to Elon Musk's pig' is my favourite quote of the day

  • @marc-antoinebelanger2841
    @marc-antoinebelanger2841 3 years ago

    @James Bruton: My father-in-law has just been diagnosed with ALS. I would like to build a dataset of his speech and EEG signals. Before buying the kit I would like to know if it is possible to sacrifice an EEG input for a microphone. Or is it possible to expand the board to add a microphone?

  • @Genubath1
    @Genubath1 3 years ago

    Something that might affect the signals is a phenomenon called irradiation. Even if you are using one main muscle for a movement, there are tons of other muscles that support it, and muscles that support those, and so on. When you lift your legs, you are also flexing your core and moving your arms to balance. When you make a fist, you flex muscles all the way up your arm and into your core.

  • @toniwalter2911
    @toniwalter2911 3 years ago

    at 15:28 what does the diagram mean by trunk? was it made for elephants or am i not getting it right?

  • @jonathanballoch
    @jonathanballoch 2 years ago

    awesome stuff!! did you ever build the new adjustable headset?

  • @thorley1983
    @thorley1983 3 years ago

    Could you calibrate the input with a hand movement followed by a jaw movement? And use a signal-strength measure so that the position of the electrodes isn't as critical?

  • @nickgennady
    @nickgennady 3 years ago

    Can this be used around wrist to read finger data?

  • @ollie-d
    @ollie-d 3 years ago

    I have been doing EEG-based BCI research for just under 6 years now and there are a lot of errors in this video, but it’s a fairly good start.
    It wouldn’t be productive to nitpick everything in here, but know that you’re unlikely to get a good motor imagery-based system with more than two classes using the OpenBCI. You need to collect more controlled data. If you want to be making inferences on imagined movement, make sure you’re not moving your limbs but rather imagining movements (rotating a door knob works well). You should see a decrease in the mu rhythms over the sensory motor cortex during imagined movement. You’ll typically see a decrease in both C3 and C4 mu power but the mu power over the electrode contralateral to the imagined hand should be lower.
    If you wanted to use the activity from the actual muscle movements then you’ll have a much easier time using EMG electrodes on the arms since the signals are orders of magnitude more powerful and classification is trivial
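The mu-power comparison this commenter describes can be sketched with Welch's method. The signals below are synthetic stand-ins for a C3 channel at rest vs. during imagery (sample rate, amplitudes, and noise level are illustrative, not real data):

```python
import numpy as np
from scipy.signal import welch

fs = 250
rng = np.random.default_rng(1)
t = np.arange(0, 10, 1 / fs)

def mu_power(x, fs):
    """Band power in the mu band (8-12 Hz), estimated with Welch's method."""
    f, pxx = welch(x, fs=fs, nperseg=2 * fs)
    band = (f >= 8) & (f <= 12)
    return pxx[band].sum() * (f[1] - f[0])

# Hypothetical C3: strong mu rhythm at rest; during imagined right-hand
# movement the contralateral mu rhythm desynchronizes (amplitude drops)
c3_rest = 5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)
c3_imagery = 1 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)

rest_mu = mu_power(c3_rest, fs)
imagery_mu = mu_power(c3_imagery, fs)
```

In a real setup you would compute this per trial over C3 and C4 and compare the two hemispheres, since the drop should be strongest contralateral to the imagined hand.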

  • @BenKDesigns
    @BenKDesigns 3 years ago

    James, you're like John Oliver, if John Oliver were an incredibly cool nerd. And, of course, I mean that in the best way possible. Love your channel.

  • @wesmatchett615
    @wesmatchett615 3 years ago +2

    This is amazingly similar to Dr. Emmett Brown’s thought helmet from Back To The Future

  • @jesseshakarji9241
    @jesseshakarji9241 3 years ago +1

    It may be interesting to pair this with an artificial neural network (ANN) that you could train on your brain sensor data. This way, instead of looking for specific jumps of activity on certain channels, you could use an ANN to classify the data for you, so it would know if you're, say, moving a leg vs. an arm and other muscle groups. Seems like the sensors are pretty sensitive, so who knows how well this would work.
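A toy version of that idea, using made-up band-power features and a single-layer logistic classifier in place of a full ANN (the channel indices, feature values, and labels here are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

def make_trials(n, label):
    """Fake 8-channel band-power features: arm trials (label 0) light up
    channels 2-3, leg trials (label 1) light up channels 4-5."""
    x = rng.normal(1.0, 0.2, (n, 8))
    x[:, [2, 3] if label == 0 else [4, 5]] += 1.5
    return x

X = np.vstack([make_trials(100, 0), make_trials(100, 1)])
y = np.array([0] * 100 + [1] * 100)

# Logistic regression trained by plain gradient descent
w, b = np.zeros(8), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))        # predicted P(leg)
    w -= 0.5 * (X.T @ (p - y)) / len(y)       # gradient step on weights
    b -= 0.5 * (p - y).mean()                 # gradient step on bias

accuracy = ((1 / (1 + np.exp(-(X @ w + b))) > 0.5) == y).mean()
```

Real EEG features are far less separable than this toy data, which is exactly why deeper models and more careful feature extraction come up in the other comments.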

  • @sunboy4224
    @sunboy4224 3 years ago +1

    I actually have worked in labs that did BCIs, and just graduated with a PhD in biomedical engineering with a focus on neuroengineering. I'm sorry to say, but this project is dead in the water. It's been about 5 years since I've looked at an EEG signal, so my memory is admittedly a bit fuzzy on the details, but my lab was attempting to decode leg kinematics from EEG. We were using decently complex filtering (unscented Kalman filters, H-infinity filters, all kinds of frequency-domain filters, as well as PCA reconstruction [for offline analysis]), and in the year that I was there we weren't able to decode a signal that "looked good" (we were getting some metrics that it was kind of working, but DEFINITELY nothing that looked correct to the eye). We believe this is because leg kinematics are actually encoded in the spine rather than the brain, but even if that wasn't the case (as with hand movements), the EEG signal is just WAY too noisy to do anything with.
    It's been a while since I've looked at EEG, but I'm pretty confident in saying that most or all of the signal you were seeing on your headset was EMG artifact. If you WERE to see some kind of movement signal, it probably wouldn't look like what you think it would. It's not going to be that enormous amplitude spike, 5x larger than the noise floor. It would probably be some small increase in the spectral power that correlates loosely to the limb moving. You just don't have enough electrodes in the area to get a useful signal, and you're probably not going to without opening up your skull and sticking something in.
    As for putting electrodes on your frontal cortex, well...you might actually have a little bit better luck there. Again, you're not going to be able to think "cup" and have the computer understand. But, MAYBE you can put the probes on your visual cortex and get the computer to recognize when you look at a very large red piece of paper that you mean cup, and a very large blue piece of paper means something else (and even then, you have to be very careful that the computer isn't just seeing the difference between "look left" and "look right", i.e. EMG from your eyes). A great example of a functioning EEG device that you might base your design on is the "P300 speller". The P300 wave is a repeatable brain signal that occurs when you observe a specific stimulus among a background of similar stimuli (imagine a room full of people who are sporadically clapping, and you focus on a specific person and count every time they clap). Using a specially built system, a computer is able to determine which letter a person is focusing on in a displayed keyboard, and then type the chosen letter, allowing someone to type without using their hands. THAT would be a cool project to get going...either spell the object you want ("cup", "remote", etc), or just display pictures of each object and have a "P300 speller"-like system pick out what you want.
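The row/column selection logic of the P300 speller described above can be sketched on synthetic epochs. The grid size, epoch length, flash counts, and signal amplitudes below are invented; a real speller would classify recorded post-flash EEG epochs rather than simulated ones:

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 250
t = np.arange(fs // 2) / fs      # 500 ms epoch after each flash
target = (2, 4)                  # hypothetical attended cell in a 6x6 grid

def epoch(is_target):
    """One post-flash epoch: noise, plus a P300-like bump near 300 ms
    when the flashed row/column contains the attended letter."""
    x = rng.normal(0, 5, t.size)
    if is_target:
        x += 4 * np.exp(-((t - 0.3) ** 2) / 0.004)
    return x

n_reps = 30                      # each row/column flashes repeatedly
row_avg = [np.mean([epoch(r == target[0]) for _ in range(n_reps)], axis=0)
           for r in range(6)]
col_avg = [np.mean([epoch(c == target[1]) for _ in range(n_reps)], axis=0)
           for c in range(6)]

# Score by mean amplitude in the 250-400 ms window, pick the largest
win = slice(int(0.25 * fs), int(0.40 * fs))
chosen = (int(np.argmax([a[win].mean() for a in row_avg])),
          int(np.argmax([a[win].mean() for a in col_avg])))
```

The intersection of the highest-scoring row and column identifies the letter; averaging over many flashes is what makes the tiny P300 stand out, just as in the trial-averaging paradigm discussed earlier in the thread.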

  • @LokiLeDev
    @LokiLeDev 3 years ago

    Cool stuff! It is really hard to extract meaningful information from non-invasive probes like that. I've seen a conference talk where they said we can actually read about 1 bit/s from the brain! And they tried machine learning to decode the signals, but it was actually easier to train the brain to produce clearer signals!

  • @antonwinter630
    @antonwinter630 3 years ago

    what a time to be alive

  • @aamirbangash985
    @aamirbangash985 2 years ago

    Thanks a lot for the video
    How can I use this runtime data to control a wheelchair using a Raspberry Pi as the controller?
    In other words, is this board compatible with the Raspberry Pi?
    Regards

  • @ebybbob
    @ebybbob 3 years ago

    Hey man, looking down like that in a superhero fight is dangerous... even if it lets you charge up your war-face....
    Stay safe out there

  • @SyedShah-zx6qq
    @SyedShah-zx6qq 7 months ago

    I can't seem to find the project files on his GitHub, can someone please forward me the link

  • @at0mic282
    @at0mic282 3 years ago

    Can you test responses to sudden shocks or surprises? Then maybe protective hardware would be possible, to shield the user when they are suddenly put in a precarious situation

  • @vmiguel1988
    @vmiguel1988 3 years ago +4

    Oh James come on do it properly! Drill your head and stick the electrodes into the brain! 😁

  • @1kreature
    @1kreature 3 years ago

    You said it yourself... These are just single-conductor electrodes that contact the skin. So I don't know why you would think they "point" in any direction and thus have directional pickup capability.
    As for moving the arms: try not waving the arms close to the headset. Less noise. Sideways out from the body, or simply making a flat hand/fist with arms resting on your lap, works better.

  • @BLBlackDragon
    @BLBlackDragon 3 years ago

    Even a rudimentary system like OpenBCI can have a number of applications. Motor control, bio-feedback training, etc. This could be fun to play with.

  • @zeekjones1
    @zeekjones1 3 years ago +1

    I'd point towards the upper forehead, as there is more there to work with.
    As you said, to record a limb, it's better off probing from said limb, however thought is only in the head, therefore targeting logic and reasoning can approximate a new digital 'limb'.
    Tandem AI and brain training; eventually you could move a cursor, then a game controller, maybe even some custom macros on the PC.

  • @wolf1066
    @wolf1066 1 year ago

    This was very interesting, since I was wondering about OpenBCI and how well it would work to control a robot arm. It looks like a lot of fiddling about would be required before I could get it to read brainwaves to the degree of precision I would need.

  • @nasim3269
    @nasim3269 3 years ago

    I saw that the electrodes that spiked for no reason had a lower amplitude compared to the electrodes that made sense. I think if you do further filtering you'll get a much better resolution for the motor command signals.

  • @4.0.4
    @4.0.4 3 years ago

    I wonder if you could train a neural network to take those signals and convert them into an estimated intention.

  • @Rooey129
    @Rooey129 3 years ago +13

    That helmet is a giant antenna, especially at low frequencies; you should shield it and ground it out.

    • @jamesbruton
      @jamesbruton  3 years ago +6

      I might put my whole head in a metal box

    • @Rooey129
      @Rooey129 3 years ago +1

      @@jamesbruton That's a good idea. If it works, you could even just run the sensors off a shielded Cat6 cable, making sure to ground the shield, and put aluminum foil around the helmet.

    • @quattrocity9620
      @quattrocity9620 3 years ago

      I've seen bigger...

    • @noahluppe
      @noahluppe 3 years ago

      @@jamesbruton The Man in the Iron Mask 2: Electroencephalographic Boogaloo

  • @OMAR-fq4qi
    @OMAR-fq4qi 3 years ago

    Link to buy all electronic parts please

  • @TiagoTiagoT
    @TiagoTiagoT 3 years ago +2

    Maybe you could train some sort of neural net with videos of your face and the EEG readings, to identify and subtract the patterns that are produced by blinking, eye movements etc, when those patterns are present, while leaving the rest of the signal intact?
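A simpler, classical baseline for the same idea, assuming a dedicated electrode near the eye records the blink (EOG), is to estimate how much of that channel leaks into the EEG and subtract it. Everything below is synthetic (blink times, amplitudes, and the leakage coefficient are invented for the sketch):

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 250
t = np.arange(0, 4, 1 / fs)

brain = np.sin(2 * np.pi * 10 * t)                 # the signal we want
blink = sum(80 * np.exp(-((t - o) ** 2) / 0.005)   # three large blinks
            for o in (0.5, 1.8, 3.1))

eog = blink + rng.normal(0, 2, t.size)             # eye electrode
eeg = brain + 0.3 * blink + rng.normal(0, 0.5, t.size)  # contaminated EEG

# Least-squares estimate of the leakage coefficient, then subtract
beta = np.dot(eog, eeg) / np.dot(eog, eog)
cleaned = eeg - beta * eog

err_before = np.std(eeg - brain)
err_after = np.std(cleaned - brain)
```

The neural-network-plus-video approach proposed in the comment could learn nonlinear and time-varying leakage that this single-coefficient regression cannot, at the cost of needing labeled training data.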

  • @deepakjoshi6242
    @deepakjoshi6242 3 years ago +5

    This guy always brings interesting stuff in a fun-to-watch way.

  • @johncaccioppo1142
    @johncaccioppo1142 3 years ago

    Have you figured out how to convert this into a synth controller yet? I think the Spitfire audio orchestra would overlay perfectly.

  • @RupertBruce
    @RupertBruce 3 years ago

    A 12 node band across the top, two either side at back for visual, two on temples for easy signaling/button pressing. Top band is 3 side-by-side on each side. I wouldn't expect much from lower limbs but I'd love to build a neural net with sensors as input and Nvidia's gesture model from a camera pointed at you so that it can build a body language model.

  • @ryansummer1589
    @ryansummer1589 3 years ago +1

    This is pretty awesome!

  • @barrettdent405
    @barrettdent405 3 years ago

    Reminded of Doc Brown’s rig in Back to the Future.

  • @thanhvinhpham3586
    @thanhvinhpham3586 3 years ago

    Is there a way to prevent others from using BCI technology on your body?

  • @negative258
    @negative258 3 years ago +1

    Maybe you can look into the MyoWare EMG sensor to activate the claws. EEG-activated tasks will be pretty interesting though.

  • @jefersonmollerdossantos7735
    @jefersonmollerdossantos7735 1 year ago

    How can I buy this BCI?

  • @AJB2K3
    @AJB2K3 3 years ago

    I wonder how those spring jumper pins would work as they have tiny tips.

  • @MuhammadDaudkhanTV100
    @MuhammadDaudkhanTV100 3 years ago

    Great ideas and good work

  • @TiagoTiagoT
    @TiagoTiagoT 3 years ago

    It would be interesting to see a visualization that colors each region based on the frequencies picked up by the respective sensor: map the lowest frequency to red and the highest to blue, interpolate in between, and blend the various hues weighted by their intensities, producing white where all frequencies are detected at maximum strength together, and black where none are detected at all.
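
    A toy sketch of that mapping, assuming each sensor's spectrum has already been reduced to three band powers (low/mid/high — all values below are hypothetical) driving the red, green, and blue channels:

```python
def band_power_to_rgb(low, mid, high, max_power=1.0):
    """Map band powers onto colour channels: lowest band drives red,
    middle drives green, highest drives blue. All bands at full power
    blend to white; total silence is black."""
    def clamp(p):
        return max(0.0, min(p / max_power, 1.0))
    return (round(255 * clamp(low)),
            round(255 * clamp(mid)),
            round(255 * clamp(high)))

print(band_power_to_rgb(1.0, 1.0, 1.0))  # → (255, 255, 255)  white
print(band_power_to_rgb(0.0, 0.0, 0.0))  # → (0, 0, 0)        black
print(band_power_to_rgb(1.0, 0.2, 0.0))  # mostly red: low-frequency activity
```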

  • @Vionbringer
    @Vionbringer 3 years ago

    This is really awesome and there are plenty of possibilities, but the main question I have is why is it so bulky?

  • @BillyHardcase
    @BillyHardcase 3 years ago

    Well there is definitely some activity there ;) very interesting.

  • @georgemathieson6097
    @georgemathieson6097 3 years ago +44

    Favourite quote: "We're getting close to Elon Musk's pig"

  • @kicktangerines8528
    @kicktangerines8528 3 years ago +1

    I was just doing a paper on this. SO AWESOME!

  • @RIchardBH3
    @RIchardBH3 3 years ago +1

    awesome product

  • @Saraseeksthompson0211
    @Saraseeksthompson0211 2 years ago

    You are an absolute genius

  • @inamdarswapneel11
    @inamdarswapneel11 3 years ago

    James: can you please share the 3D printing files for the custom headset you designed?

  • @Shinika01
    @Shinika01 3 years ago

    Sooooo EPIC!!!!! Please go further with this toy!!!! Show us what could be done!

  • @kestergascoyne6924
    @kestergascoyne6924 3 years ago

    Mindblowing stuff!

  • @Quasar_QSO
    @Quasar_QSO 2 years ago

    I sure hope this mind reading tech gets so much better soon. I want an omnidirectional wheelchair that I can control with my thoughts.

  • @bbogdanmircea
    @bbogdanmircea 3 years ago

    I really am curious about the number of hours those TAZ 3D Printers have under their belts! A dedicated video would be nice!

  • @stefanguiton
    @stefanguiton 3 years ago +1

    This would be amazing on your Exo suit!

  • @chrism9976
    @chrism9976 2 years ago

    I'd like to see a BCI connected to a keyboard. I'd like to know if one could communicate with others in the event of a paralyzing stroke.

  • @isbestlizard
    @isbestlizard 3 years ago

    I wonder if this could recreate those famous experiments where people make detectable intentions to do things slightly before they consciously attribute the decision point. That'd be cool to see!

  • @jasondelong83
    @jasondelong83 3 years ago

    What is up with the solid 50Hz spike?
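
    A 50 Hz spike is almost certainly mains hum (the UK/EU line frequency), which is why EEG software such as the OpenBCI GUI offers 50/60 Hz notch filtering. A minimal biquad-notch sketch in pure Python using the standard RBJ audio-EQ cookbook coefficients; the sample rate and test signal are illustrative:

```python
import math

def notch_coefficients(f0, fs, q=30.0):
    """Second-order IIR notch centred on f0 (RBJ cookbook formula)."""
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b = [1.0, -2 * math.cos(w0), 1.0]
    a = [1 + alpha, -2 * math.cos(w0), 1 - alpha]
    return [x / a[0] for x in b], [x / a[0] for x in a]  # normalise a[0] to 1

def apply_biquad(b, a, samples):
    """Direct-form I filtering of a list of samples."""
    out, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for x in samples:
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1, y2, y1 = x1, x, y1, y
        out.append(y)
    return out

fs = 250                              # OpenBCI Cyton's default sample rate
b, a = notch_coefficients(50.0, fs)
hum = [math.sin(2 * math.pi * 50 * n / fs) for n in range(2 * fs)]
filtered = apply_biquad(b, a, hum)
# Once the filter settles, the 50 Hz tone is strongly attenuated:
print(max(abs(s) for s in filtered[fs:]))
```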

  • @excitedbox5705
    @excitedbox5705 3 years ago

    You could try a tight-fitting bathing cap or swimming cap and mount probes all over that. You really just need wires connected to a metal tack; if you coat the tack in conductive gel you'll get better readings.

  • @commanderbrickbreaker45
    @commanderbrickbreaker45 3 years ago

    Theoretically, if someone were to use this and maybe change a few things (or build an original version of something like this), do you reckon it'd be possible to map the brain as it's dreaming? If anything, it'd be interesting to see how the brain changes when awake versus asleep. (Though dream mapping itself may just be what I'm curious about: a dream.)

  • @farazsayed5730
    @farazsayed5730 3 years ago +1

    What would be super awesome is a headset with loads and loads of pins connected to something like an FPGA, plus some software to configure the FPGA to pick and choose regions of interest. You could even get away with using sprung test probes, which probably won't hurt your head if they're sufficiently densely packed.

    • @michaelarview
      @michaelarview 2 years ago

      Also with cones around the pins to increase range

  • @graealex
    @graealex 3 years ago +5

    "That's my brain"
    WTF, put it back!

  • @michaelblackstock6140
    @michaelblackstock6140 3 years ago

    This is very interesting, please do more research. Thanks

  • @thesral96
    @thesral96 3 years ago

    Really fascinating!

  • @amirhm64
    @amirhm64 3 years ago +2

    Awesome, James, you teach us new things every time. And I think you're on your way to becoming an evil scientist :)))

  • @marks47
    @marks47 3 years ago

    So are the pros using AI learning of the patterns associated with the limbs for the prosthetics to perform tasks semi-automatically? Or would they be better off controlling it "manually" in real time so minute adjustments can be made?

    • @jamesbruton
      @jamesbruton  3 years ago

      I'd like to use deep learning to process the results, but I need some consistent results first!

  • @user-oz6vh7en3d
    @user-oz6vh7en3d 3 years ago

    Can you tell us what skills and knowledge are needed to work with the brain-computer interface?

  • @rcninjastudio
    @rcninjastudio 3 years ago

    Just one step closer to James becoming a cyborg 😂. Seriously though, I find things like this fascinating: we've been to the moon and sent probes into deep space, but we still don't fully know how the brain works.

  • @user-ml4kk9re5s
    @user-ml4kk9re5s 3 years ago

    Where can I buy one, please?

  • @tedhuntington7692
    @tedhuntington7692 3 years ago

    Fairly soon we may see a kind of menograph, a device that records thought-audio, as imagined by Hugo Gernsback in his Ralph 124C 41+ in 1911 CE.

  • @HKallioGoblin
    @HKallioGoblin 2 years ago

    Secret services created a very advanced form of technological mindcontrol in 2008 that can understand all those thoughts. All readings can be animated to screen, so we know what those readings mean.

  • @sharedinventions
    @sharedinventions 3 years ago

    You should build a motorized probe positioning system, that is thought-controlled. :)

  • @userou-ig1ze
    @userou-ig1ze 3 years ago

    Brilliant — just last week I thought it would be great if you'd try the OpenBCI stuff. It's prohibitively expensive at around $1000 (the lowest you can go, AFAIK), but it's still better than the non-open-source alternatives, in that it's actually open source. Emotiv even encrypts their electrode readings: originally open source, then the urge for money kicked in, and now you pay for every recording *facepalm*, i.e. to decrypt their own electrode readings.

  • @ahmedkamel821
    @ahmedkamel821 3 years ago

    The video is amazing as usual, I am just surprised by how bad the 3d printing quality is from such a professional maker!

    • @jamesbruton
      @jamesbruton  3 years ago +3

      It could have been better, but each half took 12 hours as it was, so it wasn't on the highest quality settings.

  • @flloyd86
    @flloyd86 3 years ago

    I'm sure you've already thought of this, so: are you aiming to apply machine learning? I see you've spoken about it while working on the Jetson. It would be great to translate the input from cranial sensors into, dare I say, ROS messages? That would be cool, and I might be able to find a repository to help. :)

  • @rustycobalt5072
    @rustycobalt5072 3 years ago

    Build a Faraday cage around you to remove all the noise, and have the Bluetooth receiver inside as well.

  • @oliverer3
    @oliverer3 3 years ago +7

    The fact that you specified it wasn't your actual brain made me laugh.
    Also, side note: can we consider this a thinking hat?

    • @pvic6959
      @pvic6959 3 years ago

      Lol, I didn't need to be told; I would assume his real brain is 10x that size lolol

  • @philippanzbock4324
    @philippanzbock4324 3 years ago

    Cool video!

  • @Thuliolima2008
    @Thuliolima2008 3 years ago

    Very Good!

  • @CyberSyntek
    @CyberSyntek 3 years ago +1

    As much as I love the fact that you have this up and running, and believe that if anyone can pump out great DIY results on this project it's you, James... my god, man, you are jumping from project to project on a weekly basis! XD
    Perhaps work on one thing at a time and really make solid gains on it for a little longer. I understand it's fun to experiment and play with new concepts and designs, but... you are James Bruton! You have the ability to really push these projects to the next level.
    Either way, keep doing you; love what you're doing. I get that you need to keep fresh content coming to bring in more supporters. We need a James clone at some point to allow for project focus time. XD

  • @lasersbee
    @lasersbee 3 years ago

    11:33... If the signals from the sensors are so low, why are the wires from the sensors not shielded?

    • @jamesbruton
      @jamesbruton  3 years ago

      I'm not really sure, might be worth putting my whole head inside a metal box

  • @KSITREVS
    @KSITREVS 3 years ago

    Why the 20 Hz when close to the computer?!