I got to hear about a company called IKIN that's developing a holographic platform for phones — one that will let you create holographic images, share them with friends all around you, and even engage in holographic chat.
This isn’t smart glasses.
This is some sort of thing that you attach to your phone to view volumetric 3D content on. That sounds pretty crazy.
And yet not unlike some stuff that I’ve already seen.
I really was curious to see how it works and what this is all about.
So I'm talking to Taylor Scott and Joe Ward from IKIN, who hopefully can explain how this functions.>> Yeah, so here's a little more detail as far as what we're going to be providing, and the game-changing nature of the device that is coming to market.
It's the combination of really unique hardware — a lot of intellectual property and patents go into creating a unique lens system that holds a denser light field and can actually be controlled volumetrically —
as well as some really unique artificial intelligence. We refer to it as neuroadaptive artificial intelligence: building a profile over time to figure out the absolute best possible experience that two retinas need to perceive — and then scaling that out to however many people need to perceive the perfect environment — gauging the computation power required as well as the cell signal that's available, and throttling all of those variances to create the best possible experience.
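A loose way to picture the kind of throttling being described — gauging compute, signal strength, and viewer count, then scaling render quality to match — is a simple budget function. Everything here (names, normalization, the formula itself) is an illustrative assumption, not IKIN's actual system:

```python
# Hypothetical sketch of adaptive quality throttling: the scarcest
# resource (compute or cell signal) caps quality, and more viewers
# means more viewing angles to serve, diluting per-view quality.

def choose_render_quality(compute_budget: float,
                          signal_strength: float,
                          viewer_count: int) -> float:
    """Return a render-quality factor in [0.1, 1.0].

    compute_budget and signal_strength are normalized to [0, 1].
    """
    # The scarcest resource bounds achievable quality.
    resource_cap = min(compute_budget, signal_strength)
    # Spreading rendering across viewpoints dilutes per-view quality.
    per_viewer = resource_cap / max(viewer_count, 1)
    # Clamp to a usable range rather than degrading to nothing.
    return max(0.1, min(1.0, per_viewer))
```

The clamped floor reflects the idea that the system degrades gracefully rather than dropping the experience entirely.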
So, do you have to wear any glasses for this, or is it glasses-free?
It's glasses-free. It has touch control, gesture control, as well as all the controls that currently come with your mobile phone — accelerometer systems, tracking systems, AR systems as well.
We actually embed our artificial intelligence as a foundational layer that can be drawn on and used for all of these control mechanisms.
So essentially, think of a portal window that you can attach to your phone that gives you a view into a whole new dimension, without goggles.>> So, what's really interesting — I mean, there are a lot of interesting things here. I've seen some light field types of displays: Looking Glass Factory, and Sony's display that they announced at the end of last year.
But reading about this, I wanted to ask: if you apply this to another device, is it a kind of lens that enables the existing screen to create a light field display?
Essentially, we’re teaching the phone a whole new way of projecting imagery and a whole new way of creating light.
And again, you have essentially a supercomputer in your pocket at all times with the mobile phone.
And there’s a lot of things that can be done with it that are not currently being done with it because people haven’t really evolved beyond that.
AR, again, is a great stepping stone for that evolution, which is why we fought really hard over the last five years at IKIN to be agnostic across different AR libraries, so everybody on the planet can get into this space and utilize that holographic system.
For us, the real passion behind the design is this — and it's a tale as old as time.
That term holographic is used to refer to so many things.
I mean, people even use it when it comes to digital signals — they'll say it's holographic.
And while it may indeed be holographic by the physics definition, when you say hologram, the average consumer thinks of Princess Leia: she's floating above a device, she's interacting with you, and she's volumetric.
So for us, the goal is to do both: be volumetric, but actually give the phone the ability to project light elsewhere — in an open field, so I can see both the hologram and everything else around me naturally.
That is the hologram that people want.
And that's what we can do with the device that you currently have and our accessory.
When I met Taylor, he had this concept, and the technology was really focused on projection systems.
And although that's really cool, my background is in technology, and I really leaned into that experience and said: I'd like to develop an open platform where developers could have access to the SDK, let the market create content for that platform, and derive — not only for content providers — distribution models and revenue streams from those various verticals.
So although gaming is going to be big for us, we've really shied away from being pigeonholed as a gaming-only company.
I think you can see all the different vertical markets that we can impact, so we think we're doing it the right way.
And it’s taken us about three and a half years to get to this point.
So it’s been a lot of hard work by a great team.
But we’re really excited about where we’re going in 2021.
I know you mentioned — or at least the original invite mentioned — possibly seeing a demo. Is there something here at all? If there is, I'd love to see it.
Yeah, so I'm going to let Taylor comment on the specifics, but I will say that we have just a couple of clips that we shared today, which we'll also share with you — and for obvious reasons, we're a few months away from really letting you in to see a lot more than that. But Taylor can walk you through what you're about to see.
So, this is an example of both interplay as well as what we referred to when we said 90% opacity — the ability to create really strong forms of light that can both overwhelm and integrate, based on what we want.
So we’ll share that now.
So, again, this is the last one — one interesting thing. We've referred to it as the wasp, of course. But if you look at it, we actually have a unique form of synthetic black.
The one thing that you can't really control optically is black, because again, black is the absence of light energy.
We have a unique algorithm that can take the ambience of the room and create whatever synthetic color needs to appear, so that you can always perceive blacks.
You'll notice he still has that vest, you can still see the antennae, and you can make out the fact that those are indeed black — even though it's impossible to project black.
Again, that's a unique part of our AI: we know what you need to see at all times.
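The "synthetic black" idea can be sketched minimally: since a light field can't emit black, you compensate by brightening the surround relative to the measured room ambience until the unlit region reads as black by contrast. The function name, normalization, and contrast ratio are all illustrative assumptions, not IKIN's actual algorithm:

```python
# Illustrative sketch: pick a surround luminance so an unlit (black)
# region is perceived as black against it, given the room's ambience.

def synthetic_surround(ambient_luminance: float,
                       contrast_ratio: float = 4.0) -> float:
    """Return the surround luminance to render, in [0, 1].

    ambient_luminance: measured room brightness, normalized to [0, 1].
    contrast_ratio: how much brighter the surround must be than the
    ambient level for the unlit area to register as black.
    """
    # Unlit pixels can never be darker than the room itself, so raise
    # the lit surround instead, clamped to the display's max output.
    return min(1.0, ambient_luminance * contrast_ratio)
```

In a dim room a modest surround suffices; in a bright room the surround saturates at the display's maximum, which is why projected "black" gets harder to sell as ambient light increases.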
And again, what we don't have in this specific video is the ability to read input on multiple sides of the device — the gesture control — so he can actually land on your hand and walk around it.
We have many tests where kids will put out bottles and cups, and he'll actually be able to go inside the cup and around the bottle, things like that, via our tracking.>> So it can also do some level of occlusion — it has built-in occlusion for objects?
Even more than that: we're utilizing the object tracking and occlusion that currently exist in AR environments, which are natural to program for because there's been a rapid evolution of that ability in the last three to four years.
So we've actually integrated with that tracking, and we utilize our AI to create that exact experience.
And think about it — it's also bigger than this, because rather than the usual AR environment where you have to move your phone, now it can be stationary and I can move around it in a massive field of view. That whole AR frame that everybody sees when you're in that zone no longer exists. I can place it on a table and move around my new depth-field portal view and get that same experience.
So in the interplay here, you see the 2D phone has that same depth field.
You'll notice here we use all the different saturated colors that you can utilize in these systems.
And even though his hand is very bright — it's catching a lot of light and it's white — we still retain it at 90 percent opacity.
At all times, you'll notice how bright the colors are, as well as the depth of resolution matching the phone itself.
So the angles are very sharp, very clean, and you're able to actually swipe both up and down.
One thing that he's not doing here, but that you can do: he can actually take his hand, touch that light field, manipulate the rollerball inside of the hologram, and then swipe it back down to the 2D phone, utilizing two different sets of touch control as well as gesture control.
So it really will be like an extended display — you could combine both into one experience, or move things from one side to the other?
Exactly. One of my favorite experiences that we've programmed here is something we're building into our SDK: the ability to split programming functions.
So you can actually have your streaming service playing on the 2D phone, and have all the chat features — your text messaging systems, emailing systems — all within the hologram.
And nobody's ever really been able to type in a holographic medium before.
It's actually incredibly cool and evocative.
You can actually respond to notifications, things like that — you can completely split the design of your application to give the user everything they would need from the system.
Again, a big issue for me when it comes to streaming services is watching real-time commenting and any secondary UI features — it gets very cramped.
We allow you to completely open that up.
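The "split programming functions" idea described above could look something like the sketch below: an app declares which UI surfaces belong on the flat 2D screen and which in the holographic layer. The class and method names are invented for illustration — this is not IKIN's SDK:

```python
# Illustrative sketch of a split-surface layout: components are routed
# either to the flat 2D phone display or to the volumetric layer.

from dataclasses import dataclass, field

@dataclass
class SplitLayout:
    screen: list = field(default_factory=list)     # 2D phone display
    hologram: list = field(default_factory=list)   # volumetric layer

    def place(self, component: str, volumetric: bool) -> None:
        """Route a named UI component to one of the two surfaces."""
        (self.hologram if volumetric else self.screen).append(component)

# The streaming example from the conversation: video stays flat,
# while chat and notifications move into the hologram.
layout = SplitLayout()
layout.place("video_player", volumetric=False)
layout.place("live_chat", volumetric=True)
layout.place("notifications", volumetric=True)
```

The point of the structure is simply that one application declares both surfaces in a single layout, rather than treating the hologram as a separate app.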
You turn any phone into effectively a two-screen device, so you could potentially get to some holographic version of, like, a Microsoft Duo or something.
Exactly right. I mean, there have been so many different attempts to increase screen size, and this is just an alternative way of doing that.
That's pretty amazing, because when I played with dual-screen phones, I often found I'd want to watch something and then want to chat or comment, and sometimes apps would not interface as well as I would have liked with that.
But I’m curious about the possibilities for that here.
We could throw that experience, whatever it is, on that other holographic display.
Exactly. For us — nobody's ever been able to send a text message with volume, depth, and the like.
You think emojis are fun now? Being able to actually send them with vibrant depth animations is massively different.
I mean, for me, I can honestly say every week I send about 15 messages with the fireworks animation on my iPhone.
Just for no reason.
I’m just saying hello and there’s fireworks blowing up everywhere.
To do that in a holographic depth field is a game changer, because it's genuinely integrated with the environment — because of the fact that we can read so much of the environment.
Firework sparks can actually fall onto the depth of the table behind you, and you can have integrated experiences, things like that.
So being able to separate those increases productivity and being able to change the way you design it increases the overall engagement.
And you’re confident it’ll fit a variety of phones?
I mean, I'm curious whether it's a case, or what that will feel like.
There's a long list of phones that it's compatible with.
Again, we were very, very bullish when we went through the design of the hardware system to, one, be as functional as possible — so there's never a time that it's annoying. The other is that we don't want to affect the power of your phone in any way — the actual management of energy, having to strip away power to, you know, create a large open-space projection system.
Can I ask a crazy far-future thing? Because it's CES, and we just want to do this: what do you think this is going to be in five years? What's happening here with holograms and AR, and where do you think it's all going to shake out?
I know Joe probably has an answer too, but I get excited at that question. [LAUGH] For me, that's why we're so passionate about 2021 being the year that we have services for both translating existing data and really easily making new volumetric data.
The library and the universe of content that we’re about to have in the next five years is going to make old 2d photos look so incredibly archaic.
You know, like when the iPhone came out — we looked back six months later and went, man, what were we thinking with those old flip phones? We had instant nostalgia. 2021 to 2023 are going to be game changers when it comes to how we view media: no longer in a flat, computerized world, but in an actual dimensional system. Over the next five years, that's going to completely change the way we engage with each other — even from a virtual context, being able to have a Zoom call in a pre-COVID or post-COVID environment is going to change the way we interact, and it's going to overall improve the emotional relationship that people have with these communication systems.
And I'll hold on phones for a moment — but how applicable is this tech to going anywhere else? Let's say connected to a computer, or even on your wrist, or something standalone. Do you see it moving into those spaces?
100%. We do have a number of developments in our product roadmap.
The next will be a clip that takes the actual mobile device and lets you clip it onto your laptop or tablet
And take advantage of that ecosystem.
And then ultimately an entirely different, much larger device that adapts to your tablet or laptop — leading up, ultimately, to what we call Helios, which is, you know, its own operating system, a biometric PC.
So from the business standpoint — which, by the way, is my job in all of this; Taylor's the real smart guy — we have a really robust product roadmap against those deliverables, which we're really excited about.
I get the idea, and I think it's fascinating. I haven't seen anything that's ambitiously gone toward trying to meet where phones are going with AR.
So I think that's a really interesting fusion — it kind of answers that, because there's the phone AR, and there's the headset stuff we see, [CROSSTALK] and this is that interesting middle zone.
And this is the foundation of a new visual technology that people haven't had before, and we'll be building over the course of the next five years to even more incredible devices — accessory systems, standalone systems. We mentioned the ability to integrate with a laptop environment beyond the phone and tablet as well. That's a passion for us specifically because, when you think about a COVID environment, very few people are using their mobile phone for education systems.
But we will be giving people the ability to bring that same system to a laptop or a tablet, where those education systems are indeed being utilized more frequently — as well as medical systems, which most often utilize tablets. Being able to integrate with those systems as well gives us the ability to cross-pollinate a litany of verticals by jumping onto devices that are used for those purposes.
I really appreciate this time.
It's a great CES story.
I'm really curious to see what comes next with it.
I wanna follow up and see part two of this.
That’d be cool.
Thank you for reaching out.