February 26, 2021
The future of mind-controlled tech is already here, sort of - Video



Thanks for watching. I'm Bridget Carey.
And if you’re watching us on YouTube, be sure to like and subscribe and we want to hear what you’re thinking.
Hit us up on Twitter, tell us what you’re liking.
You can find me on Twitter at Bridget Carey.
Now, we've heard for a long time about the promise of tech that can read our minds.
But how close are we to this area of tech actually working well enough to be game-changing?
So to join me to talk more about that, I have CNET's editor at large Scott Stein, who knows a thing or two about mind tech, because he's been playing with something over the past couple of weeks.
Hi, Scott.
Thanks for joining us.
Hey, how are you doing?
Should I put it on?
I wasn't sure if I should be wearing this while we talk.
Listen, if you’re here, I wanna see you wearing tech on your head because you have the craziest stuff all the time.
Tell us.
[LAUGH]
So, I got sent a dev kit for something that arrived just a few weeks ago, and this type of tech has interested me since last year, and well before that. I wanna preface it by saying that we've heard about mind control and we've heard about neural input tech for a while.
This is the sort of stuff that promises that: it sounds like mind reading.
It's also the sort of stuff that will begin to read what your brain is doing and what your hands might be doing.
Facebook acquired CTRL-Labs, which is a company that makes tech like this for your arms.
Last year at CES I tried a wristband called the Mudra band, which I'll talk about too. What I've got on my head right now is made by NextMind. This goes on the back of my head and is an EEG that scans what my visual cortex is doing.
And so in the demos... >> Can you spin around a little bit and show us?
Yeah, so I can show you right now. It's very small, relatively speaking, and on the back of it are these rubber-footed contact points.
You put them on the back of your head, and you go through a tutorial that makes sure all of them are active.
And it's a dev kit.
So right now you can design different interfaces for it, but it also has a number of demos and mini games built in that you can run on a Mac or on a tablet.
There are even a few demos that work in VR with an Oculus Quest connected to a PC, which I tried.
So I connected this thing to a Quest 2 and put it right on the back of the band. >> Could you have any more headgear on?
Why don't you run some of those demos to really get the point across, because this is such a strange piece of tech. So you're controlling different things on the screen you're seeing, because it just knows that you're looking in a certain direction.
Yeah, so what NextMind does is pretty straightforward, in a sense: what you're doing is activating these markers that appear in their apps.
So you get these kind of large, weird blinking markers, and they're like giant buttons, basically.
And you’ll see a couple of them in their demos, in the VR game they look like flashing objects in the game.
And if you look at the object, and you're meant to relax and focus on it, eventually it activates and acts as a click.
Now, we've seen tech like this in the past; there's Muse and other bands that have let you focus so that games and other toys activate when you concentrate. That's the general type of thing.
This is doing it specifically when you’re focusing and looking at the object.
This doesn't have any eye-tracking tech in it.
It's not looking at my eyes, the way eye-tracking cameras for VR do.
This is doing it based on what it’s sensing from signals in the back of my head.
And what's crazy, though... So you're just keeping really focused on something? Sorry to interrupt you.
You're just really focused on something you're seeing on the screen.
But I think we have more examples of this.
Yeah, it kind of feels like the Jedi thing, where you're meant to kind of breathe. And what does focusing even mean? Sometimes I wondered how I could focus more to make it activate.
Should I relax?
Should I keep staring at it?
Sometimes I found if I just adjusted my glasses, maybe. But what was wild is, while I wasn't always sure when it would activate, it activated on the thing I was looking at.
And so in this demo here, you can see they set up four different TV-channel-type things, and there are pre-recorded video clips that it switches between.
There’s also a button on the bottom that brings up pause and play.
So I was able to switch to the channel I wanted or hit pause and play and there are multiple options here.
So it sounds like a magic trick but it’s actually happening.
There's another demo that has a bunch of music track loops, like a garage band, and you can click on the various big buttons.
So, Super simple, but super weird.
And there's also one that combines a control pad: there's the Mario game where I can control Mario and jump around, but then I can use these targeted sensors, focus on them, and lift blocks by staring at them.
I think we have some of that gameplay footage to show as a demo, too, while we're playing that.
Are there times you get it wrong? I mean, are you just really straining to focus on something? How fast is it to get it right?
There were times I got it wrong, but more often than not it was about waiting for it to activate.
So I found that when you go through the tutorial, it has you focus on this cursor a number of times.
That's its sort of success rate, saying how much you're in tune with it.
I found that my success varied.
But again, it was really about how fast I was able to click it, not which thing I was clicking.
So like most of the time, it was clicking on the thing that I wanted it to.
Did you feel like a Jedi like how weird was it?
[LAUGH] Yeah, I did kind of feel like a Jedi.
The thing that really blew my mind was when my kid walked in, and I realized how weird it is, because he saw me do it and went, what?
And he kind of did this reaction, you know?
Like he was amazed that I was even doing this, and it made me step back and appreciate it.
This is the VR demo.
You look around, and basically it was like the movie Scanners: I'm looking at the brains of these creatures until they explode, which was weird.
And then I can jump to teleport points, which are, you can see them here, these slightly blinking targets.
And once I look at one, I jump to the next spot.
So, super limited controls. But NextMind is a Paris-based company, and they're looking at this as a way to enhance and build upon other types of input, not to replace everything.
It's not that you'll be able to fully control VR or your computer with your mind.
But you might be able to build it into a whole landscape of other inputs, including voice or controllers, to provide additional nuance.
I think that's interesting when you talk about things like AR headsets, where you might wonder, when you're wearing glasses, what are you really focusing on?
Things like this might be able to do stuff like that.
Yeah, because if you're in an AR headset, you want to select something on the menu.
Right.
It’s all about does it know your intention?
So it is interesting.
This is still in the dev kit cycle right now, you said.
So how close are we to getting this into real-world gadgetry, to being useful?
I think this is still a while away.
Nobody's claiming it's going to show up in a real product anytime soon.
And how reliable and accurate it is, those are still open questions.
I think when it comes to accessibility gets really interesting.
These companies working on neural inputs are, most of the time, also looking at how they could help people who may not be able to move their eyes or their head, let's say, or who are paralyzed. You could potentially use this as an interface. I don't know the results on this yet; I don't know where things could go.
But I know eye tracking tech is often used for that type of input.
Potentially this could be as well.
And that also gets into the territory of armband neural inputs, which I'll get into in a second with the Mudra band. It's very much a different beast, because this is looking at your visual cortex, and a lot of the other neural input tech we see out there is looking at stuff from your arms.
And the two don't intertwine.
It's not like your armband is going to tell you what you're seeing, or vice versa.
So in a sense, they almost need to be categorized differently.
But let's get into that armband. Sorry, I was gonna say: with the arm, you were saying it's obviously not reading your mind, but it's reading your intentions, right?
Like your muscles, what you want to do.
Yeah, so let's go over to Mudra for a moment and talk about that.
I demoed Mudra at last CES, and then the pandemic happened; I never got to really write the story I wanted to write about this company. But they are releasing an Apple Watch band in a couple of weeks.
This is on Indiegogo I believe.
So this company has a contact-filled sensor band that you put on your wrist.
When I tried the early demo, which was a year ago, it could do a couple of things.
First, it has standard motion gestures.
So you can do things like with a lot of other controllers, but it can also tell your finger movements.
And it can also tell pressure, so how much you're pressing, as demonstrated here.
How much you're pressing is not something you'd be able to tell from a hand tracker, but it's telling it from reading the neural signals at your wrist.
What's crazy is that when this was demoed to me, not only were the finger motions shown, but there's a way to do similar types of input without moving your fingers.
Basically, the intent to move your finger can count as an input just as much as actually moving your finger, for things like the pressure or which finger is moving.
So it could potentially be an input for somebody who doesn't have use of their hand; it basically enters into prosthetic-device territory.
Now, the Apple Watch app is going to be adding other features that I haven’t seen yet.
But a lot of the demos I tried with the Mudra band are, again, pretty simple.
They’re about playing and pausing music, raising and lowering volume, that type of stuff.
I think about a company that I didn't get to demo, unfortunately, before they were acquired by Facebook: CTRL-Labs.
They also have an armband-type sensor, and I don't know how similar the two are, but I think they may have a lot in common.
What Facebook’s aiming to do with this type of tech is again down the road.
Facebook’s head of AR Andrew Bosworth recently told me it might be three to five years before we start really seeing that tech surface.
in anything. And I think you will see experimental-type things start popping up, like this Apple Watch band that's coming out, maybe to play around with, maybe to develop input for, kind of like the way you had the Microsoft Kinect and tracking cameras floating around for years and years and years.
It didn't necessarily mean you'd be using them for everything in your life, but you'll start seeing some experimental uses.
So I think this is equally... >> You have me touching my wrist, and I'm wondering, okay, wait.
So if I move my finger, it can sense it? And of course you feel a little bit of something in your wrist when you move your finger, when your tendons are moving.
So is it sensing just slight movements to know what you intended? [LAUGH] >> Yeah, no, it's sensing impulses.
And there's another company here at CES, which I've been trying to reach out to, but I think you're gonna see stuff like this too.
There's a company making a neural glove for gaming mice, where the idea is that, again, you wear this thing and it can sense your input a fraction faster than the click on the mouse.
And so the idea is you get that edge by just getting the input in a little faster.
When I talked to NextMind about this, they see similar types of uses for stuff like that with peripherals, where potentially you could end up gaining accuracy or getting faster input with these things.
Because a lot of the time, they're potentially sensing what you're doing not before you're doing it, but right before your body can fully register it. Does that make sense?
So with something like the Mudra band, the idea is that the impulse can get sensed extremely quickly.
So, it’s wild.
To me, it's stuff that I'm just beginning to bend my brain around.
And I'm sure as people watch this, they'll have thoughts, and they'll say, no, it's like this, or, you forgot about this. But both of these are products that I tried, and with NextMind I've been living with it at home and seeing what it's like.
I think what's interesting for me is that it proves it's really real.
But how practical is it yet?
You know, I think that we’ll hear a lot of companies throw this stuff out and talk about it.
And potentially you’ll hear mind control, mind sensing, they’re all going to get kind of blended together.
But I think, as we work with the algorithms that these things are using, it's not too far off in spirit from the stuff being done for eye tracking, voice recognition, all the things that are starting to blend and predict and notice what we're doing.
It sort of feels like it's creating this bizarre halo of trying to look at our intent.
It’s crazy.


