Inside Facebook Reality Labs: The next era of human-computer interaction

  • Tags:
  • Inside the Lab
  • Reality Labs
  • Road to AR Glasses

TL;DR: In today’s post — the first in a series exploring the future of human-computer interaction (HCI) — we’ll begin to unpack the 10-year vision of a contextually aware, AI-powered interface for augmented reality (AR) glasses that can use the information you choose to share to infer what you want to do, when you want to do it.

Next week, we’ll share some nearer-term research: wrist-based input combined with usable but limited contextualized AI, which dynamically adapts to you and your environment. And later in the year, we’ll pull back the curtain on some groundbreaking work in soft robotics to build comfortable, all-day wearable devices and share an update on our haptic glove research.

Recommended Reading

  • Announcing Project Aria: A research project on the future of wearable AR
  • Inside Facebook Reality Labs Research: The future of audio
  • Imagining a new interface: Hands-free communication without saying a word

Imagine a world where a lightweight, stylish pair of glasses could replace your need for a computer or smartphone. You’d have the ability to feel physically present with friends and family — no matter where in the world they happened to be — and contextually aware AI to help you navigate the world around you, as well as rich 3D virtual information within arm’s reach. Best of all, they’d let you look up and stay present in the world around you rather than pulling your attention away to the periphery in the palm of your hand. This is a device that wouldn’t force you to choose between the real world and the digital world.

It may sound like science fiction, but it’s a future that Facebook is building inside our labs. And today, we’ll share our vision for how people will interact with that future.

The AR interaction challenge

Facebook Reality Labs (FRL) Chief Scientist Michael Abrash has called AR interaction “one of the hardest and most interesting multi-disciplinary problems around,” because it’s a complete paradigm shift in how humans interact with computers. The last great shift began in the 1960s when Doug Engelbart’s team invented the mouse and helped pave the way for the graphical user interfaces (GUIs) that dominate our world today. The invention of the GUI fundamentally changed HCI for the better — and it’s a sea change that’s held for decades.

Doug Engelbart’s Mother of All Demos, 1968 (video courtesy of SRI International: www.sri.com)

But all-day wearable AR glasses require a new paradigm because they must be able to function in every situation you encounter in the course of a day. They need to be able to do what you want them to do and tell you what you want to know when you want to know it, in much the same way that your own mind works — seamlessly sharing information and taking action when you want it, and not getting in your way otherwise.

(Video) Facebook Reality Labs: Wrist-based Interaction

“In order for AR to become truly ubiquitous, you need low-friction, always-available technology that’s so intuitive to use that it becomes an extension of your body,” says Abrash. “That’s a far cry from where HCI is today. So, like Engelbart, we need to invent a completely new type of interface — one that places us at the center of the computing experience.”

This AR interface will need to be proactive rather than reactive. It will be an interface that turns intention into action seamlessly, giving us more agency in our own lives and allowing us to stay present with those around us.

Importantly, it will need to be socially acceptable in every respect — secure, private, unobtrusive, easy to learn, easy to use, comfortable and all-day wearable, effortless, and reliable.

As we build the next computing platform centered around people, we’re committed to driving this innovation forward in a responsible, privacy-centric way. That’s why we’ve crafted a set of principles for responsible innovation that guide all our work in the lab and help ensure we build products that are designed with privacy, safety, and security at the forefront.

In short, the AR interface will require a complete rethinking of how humans and computers interact, and it will transform our relationship with the digital world every bit as much as the GUI has.

The problem space, explored

Say you decide to walk to your local cafe to get some work done. You’re wearing a pair of AR glasses and a soft wristband. As you head out the door, your Assistant asks if you’d like to listen to the latest episode of your favorite podcast. A small movement of your finger lets you click “play.”

As you enter the cafe, your Assistant asks, “Do you want me to put in an order for a 12-ounce Americano?” Not in the mood for your usual, you again flick your finger to click “no.”

You head to a table, but instead of pulling out a laptop, you pull out a pair of soft, lightweight haptic gloves. When you put them on, a virtual screen and keyboard show up in front of you and you begin to edit a document. Typing is just as intuitive as typing on a physical keyboard and you’re on a roll, but the noise from the cafe makes it hard to concentrate.

Recognizing what you’re doing and detecting that the environment is noisy, the Assistant uses special in-ear monitors (IEMs) and active noise cancellation to soften the background noise. Now it’s easy to focus. A server passing by your table asks if you want a refill. The glasses know to let their voice through, even though the ambient noise is still muted, and proactively enhance their voice using beamforming. The two of you have a normal conversation while they refill your coffee despite the noisy environment — and all of this happens automatically.

A friend calls, and your Assistant automatically sends it to voicemail so as not to interrupt your current conversation. And when it’s time to leave to pick up the kids, you get a gentle visual reminder, timed against your calendar and the current traffic conditions, so you won’t be late.
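The scene above leans on classic array signal processing: active noise cancellation mutes the ambient sound while beamforming boosts one talker’s voice. Here is a minimal delay-and-sum beamformer sketch in Python to make the idea concrete. It is purely illustrative: the four-microphone geometry, sample rate, and steering math are textbook assumptions, not the actual audio pipeline of any FRL device.

```python
import numpy as np

FS = 16_000        # assumed sample rate (Hz)
C = 343.0          # speed of sound (m/s)
MIC_X = np.array([0.00, 0.02, 0.04, 0.06])  # hypothetical 4-mic linear array (m)

def delay_and_sum(signals: np.ndarray, angle_deg: float) -> np.ndarray:
    """signals: (n_mics, n_samples). Steer the array toward angle_deg
    (0 degrees = straight ahead) and average the time-aligned channels,
    reinforcing sound from that direction over diffuse background noise."""
    delays_s = MIC_X * np.sin(np.deg2rad(angle_deg)) / C
    delays_n = np.round(delays_s * FS).astype(int)
    delays_n -= delays_n.min()          # keep all sample shifts non-negative
    n = signals.shape[1]
    out = np.zeros(n)
    for sig, d in zip(signals, delays_n):
        out[d:] += sig[: n - d]         # time-align, then sum
    return out / len(MIC_X)
```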

Building the AR interface

FRL Research has brought together a highly interdisciplinary team made up of research scientists, engineers, neuroscientists, and more, led by Research Science Director Sean Keller, all striving to solve the AR interaction problem and arrive at computing’s next great paradigm shift.

“We classically think of input and output from the computer’s perspective, but AR interaction is a special case where we’re building a new type of wearable computer that’s sensing, learning, and acting in concert with users as they go about their day,” says Keller, who joined FRL Research six years ago to build a five-person team that has since grown to hundreds of world-class experts. “We want to empower people, enabling each and every one of us to do more and to be more — so our AR interaction models are human-centric.”

At Facebook Connect in 2020, Abrash explained that an always-available, ultra-low-friction AR interface will be built on two technological pillars:

The first is ultra-low-friction input, so when you need to act, the path from thought to action is as short and intuitive as possible.

You might gesture with your hand, issue voice commands, or select items from a menu by looking at them — actions enabled by hand-tracking cameras, a microphone array, and eye-tracking technology. But ultimately, you’ll need a more natural, unobtrusive way of controlling your AR glasses. We’ve explored a range of neural input options, including electromyography (EMG). While several directions have potential, wrist-based EMG is the most promising. This approach uses the electrical signals that travel from the spinal cord to the hand, decoding them at the wrist to control the functions of a device. The signals through the wrist are so clear that EMG can detect finger motion of just a millimeter. That means input can be effortless — as effortless as clicking a virtual, always-available button — and ultimately it may even be possible to sense just the intention to move a finger.
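To make “signal decoding at the wrist” concrete, here is a toy EMG click detector in Python: band-pass filter, rectify, smooth, and compare each channel’s activity envelope to its resting baseline. Every number here (the 2 kHz sample rate, the 20–450 Hz passband, the thresholds) is an illustrative assumption, not FRL’s actual decoder.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 2000                # assumed EMG sample rate (Hz)
LOW, HIGH = 20.0, 450.0  # typical surface-EMG passband

def envelope(raw: np.ndarray) -> np.ndarray:
    """raw: (n_samples, n_channels) -> smoothed muscle-activity envelope."""
    b, a = butter(4, [LOW / (FS / 2), HIGH / (FS / 2)], btype="band")
    rect = np.abs(filtfilt(b, a, raw, axis=0))   # band-pass, then rectify
    win = int(0.05 * FS)                         # ~50 ms moving average
    kernel = np.ones(win) / win
    return np.stack([np.convolve(ch, kernel, "same") for ch in rect.T], axis=1)

def detect_click(raw: np.ndarray, gain: float = 3.0) -> bool:
    """Flag a 'virtual click' when any channel's envelope rises to `gain`
    times its resting baseline (estimated from the first 200 ms)."""
    env = envelope(raw)
    baseline = env[: int(0.2 * FS)].mean(axis=0) + 1e-9
    return bool((env / baseline > gain).any())
```

A real system would replace the fixed threshold with a learned classifier over many channels, but the shape of the pipeline (filter, rectify, smooth, decide) is the same.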

The second pillar is the use of AI, context, and personalization to scope the effects of your input actions to your needs at any given moment. This is about building an interface that can adapt to you, and it will require building powerful AI models that can make deep inferences about what information you might need or things you might want to do in various contexts, based on an understanding of you and your surroundings, and that can present you with the right set of choices. Ideally, you’ll only have to click once to do what you want to do or, even better, the right thing may one day happen without you having to do anything at all. Our goal is to keep you in control of the experience, even when things happen automatically.
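One way to picture this scoping is a scorer that ranks a handful of candidate actions against the current context and surfaces at most one suggestion. The sketch below uses hand-written rules where a real system would use learned models; the actions, features, and weights are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Context:
    location: str      # e.g., "cafe", "home", "car"
    hour: int          # local hour of day
    is_walking: bool

# (action, feature predicate, weight) rules standing in for learned models
RULES = [
    ("suggest_podcast",   lambda c: c.is_walking,          2.0),
    ("suggest_podcast",   lambda c: 7 <= c.hour <= 9,      1.0),
    ("offer_usual_order", lambda c: c.location == "cafe",  3.0),
    ("mute_background",   lambda c: c.location == "cafe",  1.5),
]

def best_suggestion(ctx: Context, min_score: float = 2.5):
    """Return the single highest-scoring action, or None if nothing is
    worth interrupting the user for."""
    scores = {}
    for action, predicate, weight in RULES:
        if predicate(ctx):
            scores[action] = scores.get(action, 0.0) + weight
    if not scores:
        return None
    action, score = max(scores.items(), key=lambda kv: kv[1])
    return action if score >= min_score else None

print(best_suggestion(Context(location="cafe", hour=8, is_walking=False)))
# -> offer_usual_order
```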

While the fusion of contextually aware AI with ultra-low-friction input has tremendous potential, important challenges remain — like how to pack the technology into a comfortable, all-day wearable form factor and how to provide the rich haptic feedback needed to manipulate virtual objects. Haptics also let the system communicate back to the user (think of the vibration of a mobile phone).

To address these challenges, we need soft, all-day wearable systems. In addition to their deep work across ultra-low-friction input and contextualized AI, Keller’s team is leveraging soft, wearable electronics — devices worn close to or on the skin’s surface where they detect and transmit data — to develop a wide range of technologies that can be comfortably worn all day on the hand and wrist, and that will give us a much richer bi-directional path for communication. These include EMG sensors and wristbands.

AR glasses interaction will ultimately benefit from a novel integration of multiple new and/or improved technologies, including neural input, hand tracking and gesture recognition, voice recognition, computer vision, and several new input technologies like IMU finger-click and self-touch detection. It will require a broad range of contextual AI capabilities, from scene understanding to visual search, all with the goal of making it easier and faster to act on the instructions that you’d already be sending to your device.

(Video) Facebook showcases wrist-worn AR interface concept

And to truly center human needs in these new interactions, they will need to be built responsibly from the ground up, with a focus on the user’s needs for privacy and security. These devices will change the way we interact with the world and each other, and we will need to give users total control over those interactions.

Building the AR interface is a difficult, long-term undertaking, and there are years of research yet to do. But by planting the seeds now, we believe we can get to AR’s Engelbart moment and then get that interface into people’s hands over the next 10 years, even as it continues to evolve for decades to come.

More Context

The biggest difference between the future AR interface and everything that’s come before is that there will be much more contextual information available to our AR devices. The glasses will see and hear the world from your perspective, just as you do, so they will have vastly more personal context than any previous interface has ever had. Coupled with powerful AI inference models, this context will give them the ability to help you in an ever-increasing variety of personalized ways and free your mind up to do other things.

Imagine having a pair of glasses that could feed you key statistics in a business meeting, guide you to destinations, translate signs on the fly, tell you where you’ve left your car keys, or even help you with almost any sort of task. Asking what else this interface will enable is kind of like asking what the GUI would enable back in 1967 — the possibilities are vast and open-ended.

Another difference is that most existing interfaces are modal. You pick the mode by running an app, and your set of choices is then altered to match that mode. And as you switch from one app to another, the context of what you’re doing at any given moment is lost as you move to your next task. But AR glasses don’t have that luxury. They will work best if they operate seamlessly in all the contexts you encounter in a day — contexts that change constantly and often overlap. This means that the interface will treat every interaction as an intent inference problem. And it can then use its predictions to present you with a simple set of choices, without having you navigate through menu after menu of options to find the information you might be looking for, as today’s interfaces do.

Critically, the interface of the future will be amplified by a key feedback loop. Not only can the AI learn from you, but because the input is ultra-low-friction (and only requires an “intelligent click”), the AI will ask questions to improve its understanding of you and your needs more quickly. The ability to instruct the system in real time will be hugely valuable and will leapfrog systems that rely on traditional data collection and training.
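A toy version of that feedback loop might look like the following: the system acts when confident, stays quiet when confident the other way, and spends one low-friction question on the uncertain middle, updating its estimate from each single-click answer. The thresholds and update rule below are invented for illustration.

```python
class OneClickLearner:
    """Keeps a running estimate of P(user wants this action | context) and
    decides whether to act, stay quiet, or ask a one-click question."""

    def __init__(self):
        self.p_wanted = {}

    def step(self, context_key, ask_band=(0.3, 0.7)):
        p = self.p_wanted.setdefault(context_key, 0.5)
        if p >= ask_band[1]:
            return "act"          # confident enough to just do it
        if p <= ask_band[0]:
            return "stay_quiet"   # confident the user doesn't want it
        return "ask"              # uncertain: one low-friction question

    def feedback(self, context_key, accepted, lr=0.2):
        """Nudge the estimate toward 1 on a 'yes' click, toward 0 on 'no'."""
        p = self.p_wanted[context_key]
        self.p_wanted[context_key] = p + lr * ((1.0 if accepted else 0.0) - p)

learner = OneClickLearner()
for _ in range(10):                        # user clicks "yes" each morning
    if learner.step("cafe_morning") == "ask":
        learner.feedback("cafe_morning", accepted=True)
print(learner.step("cafe_morning"))        # -> act, once confidence builds
```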

The ultimate goal is to build an interface that accurately adapts to you and meets your needs — and is able to ask a simple question to disambiguate when it isn’t sure — but this system is years off. That’s partly because the sensing technology and egocentric data needed to train the AI inference models simply do not yet exist. By collecting first-person perspective data, our recently launched Project Aria will move us one step closer to this goal.

In the nearer term, we’ll see usable but limited contextual AI with predictive features like the ability to proactively suggest a playlist you might want to listen to on your daily jog. Stay tuned to the blog next week, when we’ll pull back the curtain on some of our work with HCI at the wrist and what we call an adaptive interface.

People at the center

Today’s devices have allowed us to connect with people far away from us, unconstrained by time and space, but too often, these connections have come at the expense of the people physically next to us. We tell ourselves that if only we had more willpower we would put down our smartphone and focus on the conversation in front of us. That’s a false choice. Our world is both digital and physical, and we shouldn’t have to sacrifice one to truly embrace the other.

We need to build devices that won’t force us to choose between the people around us and our technology. These future devices will let us look up and stay in the world so that we can do more of what we are built to do as humans — to connect and collaborate.

But for this next great wave of human-oriented computing to come to fruition, we need a paradigm shift that truly places people at the center. That means our devices will need to adapt to us, rather than the other way around. It means AR needs its own Engelbart moment.

Facebook Reality Labs is hiring. Click here to view our current open positions.
