You can use Apple Vision Pro even if you get motion sickness: Here's how

It is even more immersive this way.

Published By: Manik Berry | Published: Jun 12, 2023, 11:49 AM (IST)

Highlights

  • Apple Vision Pro uses the R1 chip to process data from 12 cameras, 5 sensors, and 6 mics.
  • This data is processed to reduce motion sickness while using the headset.
  • The reduction is done in real-time, and is personalised for each user based on their eye, head, and hand movements.

Apple filed over 5,000 patents to make the Apple Vision Pro, and one of them shows how the company has tried to make the headset safe to use for people prone to motion sickness. This means that even if motion sickness keeps you away from other VR/AR headsets, Apple's headset should still be usable.

The patent shows that most of the headset's motion sickness reduction happens at the processor level. The Vision Pro's R1 chip is responsible for processing data from the 12 cameras, 5 sensors, and 6 mics on the device, and Apple has tuned that processing to reduce motion sickness and improve overall immersion at the same time.
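To picture what "processing data from 12 cameras, 5 sensors, and 6 mics" could mean in practice, here is a minimal Swift sketch of a coprocessor gathering every sensor stream into a single timestamped packet per frame. All the type names are made up for illustration; Apple has not published the R1's actual pipeline.

```swift
import Foundation

// Illustrative sketch only: how a dedicated coprocessor might gather the
// headset's sensor streams into one timestamped packet before each frame.

enum SensorStream: Hashable {
    case camera(index: Int)      // 12 cameras
    case sensor(index: Int)      // 5 additional sensors
    case microphone(index: Int)  // 6 microphones
}

struct SensorSample {
    let stream: SensorStream
    let timestamp: TimeInterval
    let payload: [Float]         // placeholder for raw readings
}

struct FusedFrame {
    let timestamp: TimeInterval
    let samples: [SensorSample]
}

/// Collects the latest sample from every stream into one frame-aligned packet.
func fuseLatestSamples(_ latest: [SensorStream: SensorSample],
                       at frameTime: TimeInterval) -> FusedFrame {
    FusedFrame(timestamp: frameTime, samples: Array(latest.values))
}

// Example: one camera sample fused at a (hypothetical) 16 ms frame boundary.
let sample = SensorSample(stream: .camera(index: 0), timestamp: 0.015, payload: [0.42])
let frame = fuseLatestSamples([.camera(index: 0): sample], at: 0.016)
print("Fused \(frame.samples.count) sample(s) at t=\(frame.timestamp)s")
```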

How does Apple Vision Pro motion sickness reduction work?

Apple Vision Pro motion sickness reduction patent

Image: Patently Apple

The European Patent Office has published an Apple patent that details how the Vision Pro's motion sickness reduction works. The headset adjusts the image outside the user's foveated gaze zone, the part of the scene they are actually looking at, to make the experience more immersive. Typical AR/VR headsets simply put a black border around the edges to keep you focused on the content in front of you.

Apple's headset, however, adjusts the contrast and the image at the edges of the user's field of vision. The headset's audio is also tuned to keep the user focused and reduce motion sickness. This combination of audio-visual cues improves immersion and makes the Vision Pro more comfortable to use.
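The patent itself does not include code, but the idea of softening the periphery instead of cutting it off with a hard black border can be sketched in a few lines of Swift. The angles and dimming values below are purely illustrative assumptions, not figures from Apple.

```swift
import Foundation

// Illustrative sketch: contrast fades smoothly as a pixel moves away from the
// foveated gaze zone, instead of hitting a hard black border at the edge.

/// Returns a contrast multiplier in [0, 1] for a pixel `angleFromGaze`
/// degrees away from where the eye is currently looking.
func peripheralContrast(angleFromGaze: Double,
                        fovealRadius: Double = 10,  // full contrast inside this
                        falloffEnd: Double = 45) -> Double {
    if angleFromGaze <= fovealRadius { return 1.0 }
    if angleFromGaze >= falloffEnd { return 0.3 }   // dimmed, never fully black
    // Smoothstep between full contrast and the dimmed periphery.
    let t = (angleFromGaze - fovealRadius) / (falloffEnd - fovealRadius)
    let smooth = t * t * (3 - 2 * t)
    return 1.0 - 0.7 * smooth
}

// Example: a pixel 30 degrees from the gaze point keeps roughly 58% contrast.
print(peripheralContrast(angleFromGaze: 30))
```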

It doesn't end there: the R1 chip also personalises the process for each user. According to the patent, the headset tracks your eye movements, head position, and hand gestures to tailor the experience. The system works with six degrees of freedom (6DoF), tracking user movements and adjusting the cues in real time.
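Here is a rough Swift sketch of what 6DoF tracking data looks like and how a per-user comfort profile might scale the cues. The structures and the scaling rule are assumptions made for illustration; the patent does not spell out Apple's actual model.

```swift
import Foundation

// Illustrative sketch: 6DoF head pose plus a per-user profile that scales
// how strongly the comfort cues are applied.

/// Position (3 DoF) plus orientation (3 DoF) of the user's head.
struct Pose6DoF {
    var x: Double, y: Double, z: Double           // translation in metres
    var pitch: Double, yaw: Double, roll: Double  // rotation in degrees
}

/// Per-user sensitivity inferred from eye, head, and hand movement patterns.
struct ComfortProfile {
    var motionSensitivity: Double   // 0 = not sensitive, 1 = very sensitive
}

/// Picks how aggressively to apply peripheral dimming and audio cues for the
/// current head motion, scaled by the user's personal sensitivity.
func cueStrength(previous: Pose6DoF, current: Pose6DoF,
                 profile: ComfortProfile, deltaTime: Double) -> Double {
    let angularSpeed = abs(current.yaw - previous.yaw) / deltaTime  // deg/s
    let base = min(angularSpeed / 180.0, 1.0)  // faster turns -> stronger cues
    return base * profile.motionSensitivity
}

// Example: a quick 30-degree head turn for a fairly sensitive user.
let before = Pose6DoF(x: 0, y: 1.6, z: 0, pitch: 0, yaw: 0, roll: 0)
let after = Pose6DoF(x: 0, y: 1.6, z: 0, pitch: 0, yaw: 30, roll: 0)
print(cueStrength(previous: before, current: after,
                  profile: ComfortProfile(motionSensitivity: 0.8), deltaTime: 0.1))
```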

What is the role of the R1 chip and M2 chip in Apple Vision Pro?

Apple M2 and R1 chip roles

Image: Apple

It may sound simple, but fitting the M2 chip into a mixed-reality headset is a big deal. The M2 is accompanied by a new R1 chip. Apple calls the Vision Pro its first “spatial computer”, and the M2 is what makes that possible: it runs all the tasks on the headset and lets you use it like a Mac or an iPad in mixed reality.

The R1, meanwhile, is responsible for processing everything you see through the headset. All the data from the Vision Pro's 12 cameras, 5 sensors, and 6 mics goes through this chip. The R1 creates and maintains the immersion, reduces motion sickness, and translates your commands into input for the M2 chip to act on.
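As a rough illustration of that division of labour, the Swift sketch below has an R1-style sensor loop that forwards only compact, high-level commands to an M2-style app layer. The command types and names are hypothetical; Apple has not documented this interface.

```swift
import Foundation

// Illustrative sketch: the sensor coprocessor loop (R1-like) interprets fused
// input and forwards only small, high-level commands, while the application
// processor (M2-like) runs the apps.

enum Command {
    case select(targetID: Int)               // e.g. gaze + pinch on a UI element
    case moveWindow(dx: Double, dy: Double)
}

/// Stands in for the M2-side application layer.
final class AppProcessor {
    func handle(_ command: Command) {
        switch command {
        case .select(let id):
            print("App layer: selected element \(id)")
        case .moveWindow(let dx, let dy):
            print("App layer: window moved by (\(dx), \(dy))")
        }
    }
}

/// Stands in for the R1-side loop: heavy sensor work stays here, and only the
/// resulting commands are passed on to the app processor.
func sensorLoop(forwardingTo app: AppProcessor) {
    // In a real pipeline this would run continuously at the display rate.
    let interpretedCommands: [Command] = [.select(targetID: 42),
                                          .moveWindow(dx: 0.1, dy: -0.05)]
    interpretedCommands.forEach(app.handle)
}

sensorLoop(forwardingTo: AppProcessor())
```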

The Vision Pro won't be available until next year, but it is impressive how much horsepower Apple has packed into this little device.