Podcasts about ultrahaptics

  • 15 PODCASTS
  • 17 EPISODES
  • 33m AVG DURATION
  • INFREQUENT EPISODES
  • LATEST: Jun 20, 2023

POPULARITY (2017–2024)


Best podcasts about ultrahaptics

Latest podcast episodes about ultrahaptics

Power Law with John Coogan
David Holz (Midjourney)

Power Law with John Coogan

Jun 20, 2023 · 52:11


In today's episode, I'm diving into the career of David Holz, a figure who echoes the spirit of the early Silicon Valley era. Holz first entered the tech scene as the founder and CEO of Leap Motion. This company, separate from Magic Leap, introduced a fresh user interface for computers, using hand gestures and cameras. Despite its innovative approach, Leap Motion found itself ahead of its time and was sold to Ultrahaptics in 2019. Post-Leap Motion, Holz didn't rest. He established a studio to seek out new possibilities. This exploration led him to the intersection of AI and art. Through self-funding, he developed Midjourney, a product that brings AI-generated art to life, and introduced it as a Discord bot earlier this summer. This episode aims to explore Holz's journey, his experiences with the boom and bust of Leap Motion, and his pivot towards AI-generated art with Midjourney. It's a story of resilience, creativity, and the timeless Silicon Valley spirit of turning setbacks into opportunities.

Haptics Club
#36 Making digital worlds feel more human, with Tom Carter, Ultraleap CEO

Haptics Club

Jan 20, 2023 · 59:16


We sit down in this episode with Tom Carter, co-founder, former CTO, and now CEO of Ultraleap. Ultraleap raised an $82 million Series D in 2021 for their hand tracking and mid-air haptic technology. We chat about the entrepreneurial and technological story of Ultraleap: from Ultrahaptics' beginnings, through the Leap Motion acquisition, to the merged company's plans for the future. One of the rare successful haptics ventures. A longer chat, but totally worth it!

Sixteen:Nine
Saurabh Gupta, Ultraleap

Sixteen:Nine

Nov 24, 2021 · 37:13


If you have been in the industry for a while, you'll maybe remember all the excitement around using gesture technology to control screens. That was followed by the letdown of how crappy and feeble those gesture-driven touchless interfaces turned out to be. Like just about everything, the technology and the ideas have gotten a lot better, and there is a lot of renewed discussion about how camera sensors, AI and related technologies can change up how consumers both interact ... and transact. Ultraleap is steadily developing a product that lets consumers interact with and experience digital displays using sensors and, when it makes sense, haptic feedback. The company was formed in 2019 when Ultrahaptics acquired Leap Motion, and the blended entity now operates out of both Silicon Valley and Bristol, England. Leap Motion was known for a little USB device and a lot of code that could interpret hand gestures in front of a screen as commands, while Ultrahaptics used ultrasound to project tactile sensations directly onto a user's hands, so you could feel a response and a control that isn't really there. Or something like that. It's complicated stuff. I had an interesting chat with Saurabh Gupta, who is charged with developing and driving a product aimed at the digital OOH ad market, one of many Ultraleap is chasing. We got into a bunch of things - from how the tech works, to why brands and venues would opt for touchless, when touchscreens are so commonplace, as is hand sanitizer.

TRANSCRIPT

David: Hey, Saurabh, thank you for joining me. Let's get this out of the way. What is an Ultraleap and how did it come about?

Saurabh Gupta: Hey, Dave, nice to be here. Thank you for having me. Ultraleap is a technology company and our mission is to deliver solutions that remove the boundaries between physical and digital worlds. We have two main technologies.
We have a computer vision-based hand tracking and gesture recognition technology that we acquired, and on the other side of the equation we have a mid-air haptic technology using ultrasound. The whole premise of how we came about was that we started out as a haptics company, and that's what our founder and CEO, Tom Carter, built when he was in college. It was a breakthrough idea for us to be able to deliver the sense of touch in mid-air using ultrasound, and that's how we started. To be able to project haptic sensations in mid-air, one of the key components is that you need to understand where the hands are in space, and for that we were using computer vision technology by Leap Motion to track and locate users' hands in space. Then we had an opportunity to make an acquisition, and some of your listeners may already know about Leap Motion. Leap Motion has been a pioneer in gesture-based hand tracking technology since 2010. They've got 10-plus years of pedigree in really refining gesture-based hand tracking models. So we had an opportunity to purchase them, and in 2019 we completed the acquisition and rebranded ourselves to Ultraleap. So that's how we started. As stated in our mission, it's all about focusing on user experience for the use cases of how users are interacting with their environment. That environment could be a 2D screen in certain applications, like the application that we'll probably talk about today, but also other aspects of augmented reality and virtual reality, which are on the horizon and are emerging technologies that are gaining more ground. So that's the central approach: how can we enhance the interactivity that users have with a physical environment through input and output technology offerings, with gesture as the input and haptics as the output?

David: The whole gesture thing through the years has been kind of an interesting journey, so to speak.
I can remember some of the early iterations of Microsoft Kinect gesture sensors, and display companies and solutions providers doing demos showing you can control a screen by waving your hand, lifting it up and down and this and that, and I thought: this is not going to go anywhere. It's just too complicated. There's too much of a learning curve and everything else. Now, the idea as it's evolved - and like all technology it got a lot better - is more intuitive, but it's still something of a challenge, right? There's still a bit of a curve because we're now conditioned to touching screens.

Saurabh Gupta: Yeah, you're right. One of the key aspects here is that gesture has been around. There's been research that goes back to the early 90s, if not the 80s, but computer vision technology in general has come a long way. The deep learning models that are powering our hand tracking technology today are a lot more sophisticated. They are more robust, they are more adaptable and they are able to train on a lot of real-world inputs. So what that really means is that since the computing power and the technology behind recognizing gestures has improved, a lot of that has manifested itself in a more approachable user experience. And I completely accept the fact that there is a gap: we've got 10-plus years of learned behavior of using a touchscreen. We use a touchscreen every day and carry it in our pockets. But you also have to understand that when touchscreens became prevalent, there was the physical keyboard before that. So the point that I'm making here is that we are pushing the envelope on new technologies and a new paradigm of interactivity. Yes, there is a learning curve, but those are the things that we are actively solving for: the gesture tracking technology should be so refined that it is inclusive and is able to perform in any environment, and I think we've made some really good steps towards that.
You may have heard of our recent announcement of our latest hand tracking offering, called Gemini. The fundamental thing with Gemini is that it's based on years and years of research and analysis on making the computer vision and deep learning models that power the platform as robust as possible, with low latency, high yield in terms of productivity, and really fast initialization. That means that as part of the user experience, when you walk up to an interface, you expect to use it right away. We know we can do that with touchscreens, but if you put this technology alongside an interface, what we are solving for at Ultraleap is: when somebody walks up to a screen and puts up their hand to start to interact, the computer vision technology should instantly recognize that there's a person who is looking to interact. That's number one, and I think with Gemini, with the deep model work that we've done, we've made some good progress there. Number two is, once the technology recognizes that a person wants to interact, can we make it more intuitive for the person to be as productive or more productive than she would be with a touchscreen interface? And that's where I think we've made more progress. I will say that we need to make more progress there, but here are some of the things that we've done, Dave. We have a feature called a 'call to interact', which is a video tutorial attraction loop that serves as an education piece. And I'll give you a stat. We ran a really large public pilot in the Pacific Northwest at an airport, and the use case there was immigration check-in: people coming off the plane, before they go talk to a border security agent, some of them need to fill out their information on a kiosk.
So we outfitted some kiosks with our gesture-based technology and the rest were the controls, which were all touchscreen-based, and over multiple weeks we ran this study with real consumers who had very little to no prior experience using gestures. We did this A/B test where we measured the gesture adoption rate on the kiosks before a call to interact and after a call to interact, and it increased the gesture adoption rate by 30%, which means it certainly is helping people understand how to use the interface. The second stat that came from it: at the end of the pilot, we were at almost a 65% gesture adoption rate, which means more than 6 out of 10 people who used that interface used gesture as the dominant input control. And the third piece of this was how long it took people to finish their session. We measured that with the gesture-based interaction, the time was slightly higher than for the control group using a touchscreen, but it wasn't much, only 10% higher. Now, one can look at that stat and say in a transactional setting where, you know, it's going to take you 30 seconds to order a burger, adding an extra second can be a problem, but at the same time, those stats are encouraging for us when we look at them as the baseline to improve from.

David: So if I'm listening to this and I'm trying to wrap my head around what's going on here, this is not a gesture where you're standing 3 feet away from a screen and doing the Tom Cruise Minority Report thing, where you're waving your arm and doing this and that. Can you describe it? Because you're basically doing touch-like interactions and the ultrasonic jets or blasts of air or whatever are giving you the feedback to guide you, right?

Saurabh Gupta: So we've got two avenues that we're coming at this from.
One is the self-service type offering, so think of check-in kiosks, ordering kiosks at restaurants, or even digital wayfinding and digital directories. We are solving for those primarily, at least in the first phase, with our gesture tracking technology. So gesture is the input modality, complementary to touch. What we do is build a touch-free application, which is a ready-to-use application, available today on Windows-based media players or systems, to convert existing touchscreen-based user interfaces to gesture. But what we've done is make the transition a lot more intuitive and easier, because we've done a lot of research on this and replicated interaction methods, or gestures you would call them. I hate to use 'gestures' as a word, because it gets tagged with weird hand poses and things like that, people pinching and all of that. For us, it's all about how we can replicate the same usage that a typical average consumer will have when she interacts with a touchscreen-based interface. So we came up with an interaction method that we call Airpush, which is basically, to explain it to your listeners, all about using your finger and moving towards an interactive element on screen. What happens is the button gets pressed even before you reach it, based on your forward motion. Now, the smart math behind all of this is that not only do we track motion, but we also track velocity, which means that for people who are aggressive in their button pressing, meaning they do short jabs, we can cater for those, and for people who are more careful in their approach as they move towards the screen, the system adapts to cater to all types of interaction. And we track all the fingers, so you can use multiple fingers, or different fingers as well. These are some of the things that we've included in our application. So that's one side.
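The velocity-aware press behaviour described here can be sketched roughly as follows. This is a hypothetical illustration only, not Ultraleap's actual implementation: the class name, thresholds, and units are all invented for the sake of the example.

```python
# Hypothetical sketch of a velocity-aware mid-air "push" detector,
# loosely inspired by the Airpush interaction described in the episode.
# Names and threshold values are invented, not Ultraleap's.

class PushDetector:
    """Fires a 'press' when a fingertip moves toward the screen fast
    enough while close to it, before it physically reaches the surface."""

    def __init__(self, trigger_distance=0.05, velocity_threshold=0.3):
        self.trigger_distance = trigger_distance      # metres from screen
        self.velocity_threshold = velocity_threshold  # metres per second
        self.prev_z = None
        self.prev_t = None

    def update(self, z, t):
        """z: fingertip distance from the screen (m); t: timestamp (s).
        Returns True when a press should be registered on this sample."""
        pressed = False
        if self.prev_z is not None and t > self.prev_t:
            # Positive velocity means the finger is moving toward the screen.
            velocity = (self.prev_z - z) / (t - self.prev_t)
            # A short, fast jab triggers as soon as the finger is near the
            # surface; a slow, careful approach never exceeds the velocity
            # threshold and so would need a separate (slower) trigger rule.
            if z < self.trigger_distance and velocity > self.velocity_threshold:
                pressed = True
        self.prev_z, self.prev_t = z, t
        return pressed

detector = PushDetector()
# Three samples of a fast jab: 20 cm -> 12 cm -> 4 cm over 0.1 s.
samples = [(0.20, 0.00), (0.12, 0.05), (0.04, 0.10)]
events = [detector.update(z, t) for z, t in samples]  # press fires on last sample
```

Tracking velocity as well as position is what lets a detector like this serve both the "short jab" and the "careful approach" styles the interview mentions: the thresholds become per-style tuning knobs rather than a single fixed touch plane.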
The second side is all about interactive advertising and immersion, and that's where we use our haptic technology more, to engage and involve the user in the interactive experience. So for self-service and more transactional use cases, we're primarily using our hand gesture technology, and for immersive experiential marketing, or the digital out-of-home advertising type of use cases, we are leading with our haptic-based technology.

David: And you're involved on the digital out-of-home side, right? That's part of your charge?

Saurabh Gupta: That's correct. I lead Ultraleap's out-of-home business, where we focus on both self-service retail and digital out-of-home advertising.

David: So how would that manifest itself? Say I am at a train station or I'm out somewhere and there's a digital out-of-home display and I go up and interact with it, and you're saying it's a more robust and rich experience than just boinking away at a touchscreen. What's going on? What would be a good example of that?

Saurabh Gupta: A good example of digital out-of-home activations is that we've partnered with CEN (Cinema Entertainment Network), where we've augmented some of their interactive in-cinema displays that are being sold programmatically. Now, the interactive piece is still being worked into the programmatic side of things, but that's one example of an interactive experience in a place-based setting. The other example is experiential marketing activations that we've done with Skoda in retail malls, and also an activation that we did with Lego for Westfield. So these are some of the experiences that we've launched with our haptics technology, and on the self-service side we've been working with a lot of providers in the space you may have heard of.
Our recent pilot concluded with PepsiCo, where we are trialing gestures on the ordering kiosks of their food and beverage partners. So these are some of the things that are going on on both sides of the business.

David: So for the Lego one or the Skoda one, what would a consumer experience?

Saurabh Gupta: These are all interactive experiences. For Lego, it was about building a Lego model together. So basically, using our haptic technology, which obviously includes gesture as the input, users were moving Lego blocks and making an object that was displayed on a really large LED screen at a retail outlet in London. A user would walk up, use their hands in front of our haptic device to control the pieces on the screen, join them together and make a Lego model, and while they're doing that, they're getting the tactile sensation of joining the pieces. That all adds up to a really immersive, engaging experience within a digital out-of-home setting.

David: So you get the sensation that you're snapping Lego pieces together?

Saurabh Gupta: Yeah, snapping pieces together and controlling them, so you get the agency of control, and it's one of those sensations that gives you a very high memorability factor. I don't know whether you track the news; this was in 2019. We did a really extensive activation with Warner Brothers in LA. At one of the cinemas down there, for three upcoming Warner Brothers movies - Shazam, The Curse of La Llorona, and Detective Pikachu - we added interactive movie posters using haptics in the cinema lobby, complementing the digital poster network that already existed at that location. Over the course of the activation, which was around six weeks long, we had almost 150,000 people go through the cinema, and in partnership with QBD we did a lot of analytics around what the
performance was of an interactive movie poster experience within a digital out-of-home setting, and got some really great stats. We measured the conversion rate of an interactive experience versus a static digital signage experience: the conversion rate was almost 2x, with a 33% increase in dwell time, meaning people were spending more time in front of an interactive sign than a static one. Attention span was significantly higher at 75%, and there was a 42% lift in brand favorability. So these are really interesting stats that gave us the confidence that haptic technology combined with a gesture-based interface has a lot of value in delivering memorable experiences. And that's the whole point with advertising, right? You want to present experiences that create a positive association between your branded message and your target consumer, and we feel that our technology allows that connection to be made.

David: One of the assumptions/expectations when the pandemic broke out was that this was the end of touchscreens: nobody's ever going to want to touch a screen again, interactivity was dead. I made a lot of those assumptions myself, and it turns out the opposite has happened. The touchscreen manufacturers have had a couple of pretty good years, and the idea is that with a touchscreen, you can wipe it down and clean your hands and do all that stuff. But you're at a far greater risk standing four feet away from somebody across a counter, ordering a burger or a ticket or whatever it may be. So when you're speaking with solutions providers, end user customers and so on, are you getting the question of, "Why do I need to be touchless?"

Saurabh Gupta: Yeah, it's a fair point, Dave, and let me clarify that. Look, from our perspective, we are focusing on building the right technology and the right solutions to elevate the user experience.
Hygiene surely is part of that equation, but I accept your point that there are far greater risks for germ transmission than shared surfaces, and yes, there is a TCO argument, the total cost of ownership argument, that has to be made here also. The point I will make is that, being a scale-up organization focused on new technology, we have to believe that we are pushing the technology envelope, and what we are focusing on is elevating the user experience beyond what the current model provides. So yes, there will be some use cases where we are not a good fit, but contactless as a category, or touchless as a category - maybe the pandemic catalyzed it, maybe it expedited things - that category in itself is growing significantly. A couple of stats here: for contactless payment as a category, 88% of all retail transactions in 2020 were contactless. That's a pretty big number, and assuming retail is a $25 trillion market, that's a huge chunk.

David: But that's about speed and convenience though, right?

Saurabh Gupta: Totally. But all I'm saying is that contactless as a category is preferable from a user perspective, and we fundamentally believe that gesture-based interactivity plays a part in the overall user journey. So let me give you an example. Some of the retailers that we are talking to are thinking about new and interesting ways to remove levels of friction from a user's in-store experience. There are multiple technologies being trialed at the moment. You may have heard of Amazon's Just Walk Out stores as an example - you don't even have to take out your wallet, and that is completely based on computer vision. But there are other retailers who are looking to use technology to better recognize who their loyal customers are.
So think of how we all used to have loyalty cards for Costco or any other retailer. They're removing that friction to say: when you walk through the door, you've done your shopping and you're at the payment counter, we can recognize who you are, and if we recognize who you are, we can give you an offer at the last mile. In that scenario, they are integrating gestures as part of a completely contactless flow. This is where I think we are gaining some traction. There is a product that we are a part of that hasn't been announced yet; I can't go into details on who it is and when it's going to be released, but we are part of a computer vision-based, fully automated checkout system that uses gesture as the last mile for confirmation and things of that nature. That's where we are gaining traction. The overall point here is that we are focusing on really showcasing and delivering value on how you can do certain things in a more natural and intuitive way. So think of digital wayfinding at malls. You have these giant screens that are traditionally touchscreens, right? When you think of that experience, it has a lot of friction in it, because first of all, you can't use touch as effectively on a large screen - you can't swipe from left to right to turn a map, as an example. We fundamentally believe that the product could be better with gesture. You can gesture to zoom in, zoom out, rotate a map, and find your direction to a store. That experience can be augmented by adding gesture capability, as opposed to using a touchscreen-based interface. So those are the high-value use cases that we are focusing on.

David: So it's not really a case where you're saying, you don't need a touchscreen overlay anymore for whatever you're doing, Mr. Client, just use this instead. It's tuned to a particular use case and an application scenario, as opposed to 'this is better than a touch overlay'?
Saurabh Gupta: I think that is a mission that we are driving towards. We know that there is potentially a usability gap between gesture, in terms of its evolution, and the touchscreen. We are looking to bridge that gap and get to a point where we can show more productivity using gesture. And the point is that with our technology, and this is something that you referenced a second ago, you can turn any screen into an interactive screen. So you don't need to start with a touchscreen and then convert it to gesture: you can convert any LCD screen into an interactive screen. So there is a cost argument there as well.

David: What's the kit? Like, what are you adding?

Saurabh Gupta: Just a camera, a USB cable, and some software.

David: And if you're using haptic feedback, how does that work?

Saurabh Gupta: Haptics is a commercial off-the-shelf product, so it's another accessory that gets added to the screen. However, it contains the camera in it, so you don't need an additional camera. That also connects to external power and a USB back to the media player.

David: So as long as you've got a USB port on the media player, you're good. And right now your platform is Windows-based. Do you have Android or Linux?

Saurabh Gupta: Good question, Dave. Right now we are Windows-based, but we know it's of strategic importance for us to enable support on additional platforms, so we are starting to do some work on that front. You'll hear some updates from us early next year on at least the hand tracking side of things being available on more platforms than just Windows.

David: How do the economics work? I suspect you get this question: "All right, if I add a touch overlay to a display, it's going to cost me X. If I use this instead, it's going to cost me Y." Is it at that kind of parity, or is one a lot more than the other?

Saurabh Gupta: It depends on screen size, Dave, to be honest. The larger the screen, the wider the gap.
I would say that for a 21 or 23 inch screen and up, the economics are in our favor for a comparable system.

David: And are you constrained by size? I think of all the LED video walls that are now going into retail and public spaces and so on, and those aren't touch-enabled. You really wouldn't want to do that, and in the great majority of cases with this, in theory, you could turn a potentially fragile, please-don't-touch surface like that into an interactive surface. But are you constrained to only doing things like a 55 inch canvas or something?

Saurabh Gupta: This will require a little bit of technical explanation. The Lego example that I talked about was targeted at, I would say, a large outdoor LED screen. The concept here is one-to-one interactivity. So what do I mean by one-to-one interactivity? Basically, in our interface, when the user approaches the screen, an onscreen cursor shows up, and that onscreen cursor is the control point for the user. For us to achieve one-to-one interactivity - where the cursor is at the same height and there's no parallax between where the finger is and where the cursor is - you have to be connected to or at the screen, and when you are connected to the screen, based on our current camera technology, we can control up to a 42 inch screen for one-to-one interactivity. But we've also been showing examples where, if you mount the sensor slightly in front of the display, you can cover a wider area, and we've been able to showcase our technology being used on up to a 75 inch LCD screen in portrait mode.

David: So then any larger than that, the scale gets a little wonky, right? Because you've got a person standing in front of a very large display and it just starts to get a little weird.

Saurabh Gupta: Yeah. It's like putting a large TV in a small living room.
You need to be slightly further away, because otherwise it gets too overwhelming, and for that we have worked with certain partners who've done some really interesting work. For example, a company called IDUM built a pedestal that encloses our tracking device and can be placed several feet from a large immersive canvas, like an LED wall, in a museum-type activation. People can walk by and control the whole screen from that pedestal, slightly further away from the screen.

David: So it's like a Crestron controller or something, except for a big LED display!

Saurabh Gupta: Exactly. It's like a trackpad in front of the screen, but slightly further away.

David: Gotcha. All right. Time flew by, man. We're already deep into this. You were telling me before we hit record that your company will be at NRF, and you may also have people wandering around IEC, but if people want to know more about your company, they go to ultraleap.com?

Saurabh Gupta: That's correct. Ultraleap.com, we have all the information there. And David, it was great to talk to you. Thank you for the opportunity.

INIT
Current Issues in Haptic Interaction Design

INIT

Dec 1, 2019 · 53:28


Dr. Marcello Giordano is a Research Scientist at Chatham Labs (https://chathamlabs.com/) with a background in human-computer interaction (HCI) and haptics. He received his PhD from McGill University in Canada, where his work was concerned with haptics in musical interaction and incorporating touch into digital music. Now, in both his research and his applied work at Chatham Labs, Marcello aims to bring touch to a wide variety of digital interactions. Previously, Marcello was an HCI researcher at Huawei Technologies, investigating haptics for mobile devices, and a haptics engineer at Ultrahaptics Ltd., where he worked on both research and products utilizing Ultrahaptics' mid-air haptic technology. Special Guest: Marcello Giordano.

Dream Employer
EP #03 - Brygida Dzidek, Haptology

Dream Employer

Oct 23, 2019 · 32:47


What tools will the employees of the future use? What are immersive applications, and are they difficult to build? What are the biggest challenges in deploying modern, futuristic touch technologies? Why is the democratization of science and technology important not only for medicine, education and the social sciences, but also for business and for each of us? I discuss all this with Brygida Dzidek of Haptology. Enjoy! Guest: Brygida Dzidek. Host: Paweł Zawadzki. Production: Większe Logo. Content partner: brief.pl

Reflect Rethink Reboot
Bristol - Startup City

Reflect Rethink Reboot

Jul 10, 2019 · 28:08


We’re now in the fourth industrial revolution, a fast-paced world where technology has the ability to create a better world for all of us, and so those behind it have a great responsibility! In this episode we take a look at our home town's vibrant tech startup scene, meeting some of the founders and startups who are helping Bristol and the south west of England compete on the world stage. What makes Bristol and the south west of England such a special place when it comes to tech startups, and what can the rest of the world learn from our little corner of the UK? Hosted by Sarah Keates, part of the team at tech hub Techspark, we’ll find out what gives this region the edge, whether there are certain types of startup that thrive here, and ask how we’ve been able to create so many game-changing tech startups. Sarah meets three businesses at different stages of their journey: from OKKO Health, an early-stage startup led by founder Stephanie Campbell, to Ultrahaptics, one of the country's most innovative tech businesses, which is now making (ultrasonic) waves around the world and started life as a University of Bristol final-year project. She also meets Zara Nanu, founder of Gapsquare, who is on a mission to create a world with fair pay. We also go along to the Techfusion Expo, where entrepreneurs and innovators share ideas and showcase new ways in which technology like Artificial Intelligence (AI) is revolutionising business and our everyday lives. Here we also meet Jason Hart, now one of the world's leading experts on cyber security, whose journey into business started out fixing and selling old TVs on the estate he grew up on in Bristol. This episode was made possible thanks to Blackstar Solutions - find out more at blackstarsolutions.co.uk. TEDxBristol 2019 is coming (Nov 17 2019) - find out more!

The Art Of Struggle.
VR/AR company Leap Motion sold to UltraHaptics for $30 million, after once being valued at an estimated $300 million

The Art Of Struggle.

Jun 11, 2019 · 4:30


Leap Motion, the AR and VR technology company behind the open-source North Star AR headset, has reportedly been purchased by the UK-based firm UltraHaptics for $30 million, as reported by the Wall Street Journal. Full article: https://www.wsj.com/articles/leap-motion-once-a-virtual-reality-high-flier-sells-itself-to-u-k-rival-11559210520 --- Support this podcast: https://anchor.fm/pixel-sultan/support

TechCrunch
Daily Crunch 5/31/19

TechCrunch

Play Episode Listen Later May 31, 2019 3:36


Welcome to TechCrunch Daily News, a roundup of the top tech news of the day. Brought to you by MD Anderson Cancer Center – where a team of nearly 21,000 strong are researching, innovating, and working to end cancer. Learn more about the leader in cancer care at MakingCancerHistory.com. -- UltraHaptics acquires Leap Motion -- Foursquare acquires Placed -- and Apple's big developer conference approaches. Here's your Daily Crunch for May 31, 2019.

Long Beach Business Podcast
Episode 10 - Pacific Visions Technology Sets Trends In Inclusion

Long Beach Business Podcast

Play Episode Listen Later May 28, 2019 11:36


Pacific Visions offers experiences that go beyond the traditional, featuring a theater equipped with rumbling seats and misters as the centerpiece. With the help of technology company Ultrahaptics, the Aquarium has created an experience for visitors with disabilities that has the potential to set trends in accessibility. The Business Journal met with Aquarium of the Pacific President and CEO Jerry Schubel and Ultrahaptics Director of Product Marketing Vince Fung to learn more.

SETsquared Downloaded
Episode 7: Scale-Up Success

SETsquared Downloaded

Play Episode Listen Later Aug 23, 2018 17:27


In the latest episode of SETsquared Downloaded, we take a look at the challenges businesses and entrepreneurs face when trying to scale up. A scale-up is defined as a company that has already validated its product within the marketplace and has proven that the unit economics are sustainable. We speak to two business owners who have successfully grown their businesses to get their top tips. We also chat to SETsquared Surrey’s Head of Incubation to find out how organisations like SETsquared can help businesses scale up successfully. In this episode of SETsquared Downloaded, we interview: Steve Cliff, CEO of Ultrahaptics, about the challenges scale-ups face and how he overcame them. Tom Carter, CTO and founder of Ultrahaptics, for his advice about the growing pains businesses might face. Caroline Fleming, Head of Incubation at SETsquared Surrey, about the needs of a scale-up and what support is available to businesses. Music: "Live the World" by Lee Rosevere, used under CC BY / Cut down from original.

Growth Mindset Podcast
27 - The Hottest Tech Innovation of the Year - Tom Carter - Ultrahaptics - [Repeat]

Growth Mindset Podcast

Play Episode Listen Later Jul 16, 2018 55:26


TOM CARTER — FOUNDER & CTO — ULTRAHAPTICS Tom Carter is the Founder and CTO at Ultrahaptics, a leading technology company making mind-blowing technology that lets users feel the sensation of touch in mid-air. The future is coming and they are at the pinnacle of it. They just won the "Tech Innovation of the Year" award, and it's an exciting time for the company as they go from strength to strength. Tom invented the technology behind Ultrahaptics alongside his professor in a research lab at the University of Bristol. They have since grown the idea into one of the UK's most innovative businesses, raising multiple rounds of funding with a rapidly growing team. You don’t go through events like this without learning a lot of lessons. On the podcast, he shares some amazing insights into what he has learned along the way. It's an incredible story direct from the front line, and I’m sure all listeners will want to follow along as their tech gets into the hands of consumers over the coming years. Links https://Ultrahaptics.com https://twitter.com/iamtomcarter https://twitter.com/samharristweets Show notes https://bit.ly/2Nj2Z3G Special Guest: Tom Carter.

The Tech Cat Show
Live from the Floor of Augmented World Expo 2018 part2!

The Tech Cat Show

Play Episode Listen Later Jun 13, 2018 73:48


Join the Tech Cat on the show floor at AWE 2018 (Augmented World Expo), the #1 AR+VR conference and expo. AWE is back for its 9th year in the USA and will illustrate why every organization, startup, and investor must get into XR (short for AR, VR, MR). We chat with startups and companies who are already using AR and VR. Today's show features interviews with Max Dawes of Zappar, an enterprise solution for creating mixed reality experiences for brands; Maarten Tobias of Dimenco, an international pioneer in display technologies, who discusses his partnership with Ultrahaptics on creating a new 'Simulated Reality' experience; Jay Iorio of IEEE, who talks about creating standards for immersive technology; Wolfgang Stelzle of Re'Flekt, which provides tools for creating Augmented and Mixed Reality powered manuals and instructions; and Ian Kelso of Impossible Things, who is building intuitive immersive experiences for museums and other categories.

Live From Augmented World Expo 2018 -USA
A Conversation with Maarten Tobias, Dimenco - Simulated Reality Live at AWE2018

Live From Augmented World Expo 2018 -USA

Play Episode Listen Later Jun 6, 2018 10:13


Maarten Tobias is the CEO of Dimenco. He's an experienced strategy and business development manager with extensive knowledge of, and interest in, the entrepreneurial environment. Dimenco and Ultrahaptics are coming together to offer a joint 'Simulated Reality' experience. Dimenco is a leading technology company bringing excitement and reality to the world, recognized as a global market leader in spatial visualization with high-quality, award-winning products. We Bring Reality. Ultrahaptics is the world’s leading mid-air haptics company. The company has developed a unique technology that enables users to receive tactile feedback in mid-air without the need for gloves or controllers. The technology uses ultrasound to project sensations through the air directly onto the user’s hands, enabling users to ‘feel’ virtual buttons, get tactile feedback for mid-air gestures, or interact with virtual objects.
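The description above can be illustrated with the core idea behind mid-air haptics: a grid of ultrasonic transducers is driven so that their waves arrive in phase at a chosen focal point, creating a pressure spot the hand can feel. The sketch below shows only that phased-array focusing principle; it is not Ultrahaptics' actual implementation, and the 40 kHz frequency, array geometry, and function names are assumptions for illustration.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C
FREQUENCY = 40_000.0     # Hz, a common choice for airborne ultrasound
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def phase_delays(transducers, focal_point):
    """Phase offset (radians) for each transducer so that all emitted
    waves arrive at focal_point in phase, forming a pressure focus."""
    fx, fy, fz = focal_point
    delays = []
    for tx, ty, tz in transducers:
        dist = math.sqrt((fx - tx) ** 2 + (fy - ty) ** 2 + (fz - tz) ** 2)
        # Negate the propagation phase so arrivals align at the focus,
        # then wrap into [0, 2*pi).
        delays.append((-2 * math.pi * dist / WAVELENGTH) % (2 * math.pi))
    return delays

# Hypothetical 4x4 grid of transducers at 10 mm pitch in the z=0 plane,
# focusing 15 cm above the centre of the array.
grid = [(x * 0.01, y * 0.01, 0.0) for x in range(4) for y in range(4)]
print(phase_delays(grid, (0.015, 0.015, 0.15))[:3])
```

In practice, systems like this also modulate the focal point at frequencies the skin is sensitive to (on the order of 100-200 Hz), since a constant ultrasonic focus is much harder to perceive than a vibrating one.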

re:publica 18 - Alle Sessions
Performersion|The function of Touch Feedbacks in VR Without Haptic Mechanics

re:publica 18 - Alle Sessions

Play Episode Listen Later May 3, 2018 59:47


Ramona Mosse, Gerko Egert, Melanie Jame Wolf. Haptic feedback for total immersion is central to many visions of virtual reality applications. The promise of being in contact, with all the senses, with another person in another place in the world fuels many research branches and start-ups. But existing solutions such as Teslasuit, Neurolace, Ultrahaptics and others rely on extremely complicated, resource-intensive and complex interfaces. On the other hand, we all know that artists on stage can touch people in the audience without physically touching them. That this stagecraft also works in VR applications has been impressively demonstrated by groups such as MAKROPOL, Bombina Bombast and Polymorf in their performances. What carries over here between stagecraft and technology development? The dance scholar Gerko Egert, who has written the current standard work on touch and touching in dance, Ramona Mosse, dramaturg and academic visionary, and the artist Melanie Jame Wolf, whose work focuses on ideas and questions of persona and staging, discuss utopian and practical models of intimacy and contact in VR. This format is part of Performersion, a cooperation between re:publica and the Performing Arts Programm Berlin.

Geek News Central Special Media Feed
You Can Feel It with Ultrahaptics at CES 2018

Geek News Central Special Media Feed

Play Episode Listen Later Jan 30, 2018 4:06


Playing games using bare hands as controllers goes back as far as the EyeToy for the Sony PlayStation 2, but it was really Microsoft’s Kinect that brought gesture recognition mainstream. Fantastic as these cameras were, players only received visual feedback, e.g. the in-game character moved in the desired direction. But what if there is … Continue reading You Can Feel It with Ultrahaptics at CES 2018 → The post You Can Feel It with Ultrahaptics at CES 2018 appeared first on Geek News Central.

Growth Mindset Podcast
04 - Putting the Touch in 'Touchless' Tech! - Tom Carter - Founder & CTO, Ultrahaptics

Growth Mindset Podcast

Play Episode Listen Later Aug 30, 2017 55:27


TOM CARTER — FOUNDER & CTO — ULTRAHAPTICS

Tom Carter is the Founder and CTO at Ultrahaptics, a leading technology company making mind-blowing technology that lets users feel the sensation of touch in mid-air. The future is coming and they are at the pinnacle of it. Tom invented the technology behind Ultrahaptics alongside his professor in a research lab at the University of Bristol. They have since grown the idea into one of the UK's most innovative businesses, raising multiple rounds of funding with a rapidly growing team. You don’t go through events like this without learning a lot of lessons. On the podcast, he shares some amazing insights into what he has learned along the way. It's an incredible story direct from the front line, and I’m sure all listeners will want to follow along as their tech gets into the hands of consumers over the coming years.

TOM’S TOP TIPS:

1. Culture: I love the way he thinks about building a good culture without trying to take over people’s lives to squeeze more work out of them. Just getting more hours out of people doesn’t necessarily mean you’ll get more work done. Allow people freedom from work: they work 9–5 and then go home, where they can enjoy the money they earn and live their own social lives. Make sure meal times are used as a good way for the team to bond and get some downtime, and consider this in the layout of your office.

2. Focus on your strengths: I’ve got a lot of respect for the way he freely accepted taking on an incoming CEO so early on in the business, giving the company the management experience it needed to scale rapidly and allowing Tom to focus on driving innovation. Know your own strengths and limitations and step back. Be willing to get better people to do the things you can’t do as well; this can help you achieve far more in the long run, and you might learn a thing or two from them in the process.

3. Hiring: He has a lot of insight from conducting so many interviews in a rapidly scaling company that has experienced barely any staff turnover. Approach interviews as open conversations rather than a structured process. Aim to get a good feel for what candidates would be like and what they could do for you; they might even suit a role in your company that you weren’t specifically interviewing for. Make a point of asking questions like: What are they proud of doing? What do they enjoy doing? What are they best at doing? When hiring for a top position, screen candidates until you have four people who can do the job and get them all in on the same day. Afterwards, if you can all agree that one of them can definitely do the job, don’t waste time overthinking it; just hire them.

Bonus — Just do it: Keep trying. Success isn’t easy, but if you have faith in the end goal, even if it is a long way off, it is worth the effort. Get out and try stuff whilst you’re still young! Don’t wait for life to happen to you, because it comes around fast.

TOM’S FAVOURITE BOOKS

THE HARD THING ABOUT HARD THINGS — BEN HOROWITZ: The tale of starting and growing a business to a billion dollars, losing it all, getting it back, nearly losing it again, and then starting another billion-dollar business. Basically, there is no recipe for success, and it is defined by what you do when things are going wrong. Such a good read. Sam’s review and summary (https://medium.com/@sam_harris/the-hard-thing-about-hard-things-ben-horowitz-summary-and-review-8013261e1b4c)

GETTING THINGS DONE: THE ART OF STRESS-FREE PRODUCTIVITY — DAVID ALLEN: A very useful book about being an efficient human being and making the most of your time. Find top tips on maximising your time to achieve your goals that you can start implementing from day 1. It might not be the most exciting book, but if you’re struggling to keep up with your task list or want to become better at achieving your goals, it should probably be at the top of your list before moving on to anything else. Get the book (https://www.amazon.co.uk/gp/product/B00SHL3V8M/ref=as_li_qf_sp_asin_il_tl?ie=UTF8&tag=samharris48-21&camp=1634&creative=6738&linkCode=as2&creativeASIN=B00SHL3V8M&linkId=e4c34a0be2636b47de811c98f2d8ff89)

READY PLAYER ONE — ERNEST CLINE: Since recording the podcast I have read this book, and it is amazing. I listened on Audible and read the whole thing in 2.5 days whilst travelling in Kazakhstan; I couldn’t stop listening. Such an engaging read. Like Tom says, it’s a really interesting perspective on the future that gives a scary insight into some of the possibilities facing us. It also appeals to the geek in you and has a lot of 70s and 80s references, so I’d describe it as The Matrix meets Guardians of the Galaxy, but set in a more plausible reality. Get the book (https://www.amazon.co.uk/gp/product/0099560437/ref=as_li_qf_sp_asin_il_tl?ie=UTF8&tag=samharris48-21&camp=1634&creative=6738&linkCode=as2&creativeASIN=0099560437&linkId=230a1dc15b28cbc2156fdfb42691e8c7)

Get any of the books free on Audible (https://www.amazon.co.uk/Audible-Free-Trial-Digital-Membership/dp/B00OPA2XFG?tag=samharris48%E2%80%9321)

TALK TO US

We’d love to know what you learned most from the episode, and feel free to leave a comment or start a conversation with us on Twitter. You can find out more about Sam or Tom and talk about any episode you’ve enjoyed or the future of the podcast.

Tom and Ultrahaptics: Instagram (https://www.instagram.com/iamtomcarter/) Twitter (https://twitter.com/iamtomcarter) Ultrahaptics (https://Ultrahaptics.com) University of Bristol (https://bristol.ac.uk/)

Sam: Instagram (https://www.instagram.com/samjamsnaps/) Quora (https://www.quora.com/profile/Sam-Harris-58) Twitter (https://twitter.com/samharristweets) LinkedIn (https://www.linkedin.com/in/sharris48/)

Subscribe! If you enjoyed the podcast please subscribe and rate it. And of course, share with your friends!

Special Guest: Tom Carter.