Apple Watch to add a Sleep Stages feature

Ge Yue, Apple's vice president and managing director of Greater China, delivered a speech at the World Artificial Intelligence Conference in Shanghai, outlining Apple's recent technical progress in accessibility and health.

For example, the Apple Watch's Assistive Touch feature, Conversation Enhancement on AirPods Pro, and the Door Detection feature introduced in iOS 16 and iPadOS 16 are just some of the ways Apple serves people with special needs in the accessibility space. In the health field, Apple has likewise been attending to the wellbeing of the broader population: watchOS 9 introduces a new "Sleep Stages" feature that helps users better understand their sleep.

She believes that machine learning is becoming an important tool for Apple's innovation. At the meeting, she also mentioned that Apple's machine learning models not only rely on chip performance, but also require high-quality input, including touch, motion, sound, and visual information. Apple integrates sensors into hardware devices in order to provide fast and highly accurate signals.

What did Apple share at the World Artificial Intelligence Conference, and what progress has it made in artificial intelligence? This speech offers a look at where Apple currently stands.

The following is the full text of the speech by Ge Yue, Apple's vice president and managing director of Greater China, titled "Machine Learning Creates a Healthier and More Accessible Future":

Distinguished leaders and guests: Good afternoon, everyone. First of all, I am very grateful to the organizers for inviting me back to the World Artificial Intelligence Conference. I am very happy to have the opportunity to speak with you today.

At Apple, we want our products to help people innovate and create, providing the support people need in their daily lives. Machine learning plays a vital role here: it can better harness the power of our combination of hardware and software to improve people's lives in every way. We've seen its enormous capabilities countless times.

Today, I want to dive into two areas where the potential to improve people's lives is particularly clear: accessibility and wellness. We'll explore some of Apple's features that machine learning helps enable: some are designed for people with disabilities and special needs, while others help people from all walks of life lead healthier lives.

But, like any technology, machine learning doesn't work alone, so I'll start with the innovations that have made it such a powerful tool.

At Apple, we have always focused on designing products as an integrated whole. Whether in the hardware or the software of our products, we believe design and integration should go hand in hand.

A great example of this integration is Apple Silicon, whose strong performance and excellent battery life help enable powerful new features. The Neural Engine is a key part of these innovations: built specifically for machine learning, it runs machine learning models with great power and efficiency.

Of course, our cutting-edge machine learning models don't rely on powerful chips alone; they also require high-quality input, including touch, motion, sound, and vision. We integrate powerful sensors into our devices, and these sensors provide fast and highly accurate signals to our machine learning models.

Combining these sensors, state-of-the-art machine learning models, and the power of Apple Silicon, we have designed features that run entirely on-device. Each one runs on our custom hardware, maximizing efficiency and delivering the best performance without consuming too much power. And because no high-speed network connection is required, these features perform more stably and reliably.

Most importantly, because no data needs to leave the device, privacy is better protected. This advantage is especially important for health and accessibility features, where a great user experience is inseparable from efficiency, reliability, and privacy protection.

Let's start with accessibility. We believe that the best products in the world should meet everyone's needs. Accessibility is one of our core values and an important part of all our products; we are committed to making products that truly work for everyone.

We know that machine learning can help provide independence and convenience for users with disabilities, including the visually impaired, the hearing impaired, those with physical and motor impairments, and those with cognitive impairments. Let's look at a few examples, starting with the Apple Watch. Assistive Touch on the Apple Watch allows users with limited mobility to control the Apple Watch through gestures.

Instead of requiring taps on the display, the feature combines on-device machine learning with data from the Apple Watch's built-in sensors to detect subtle differences in muscle movement and tendon activity. Using the gyroscope, accelerometer, and optical heart rate sensor, users can control the Apple Watch with hand gestures such as a pinch or a clench.
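As a rough illustration of how motion signals can be turned into gesture labels, here is a toy, rule-based sketch in Python. It is not Apple's model: the features and thresholds are invented for the example, and a real Assistive Touch pipeline uses trained on-device models over far richer sensor data.

```python
# Illustrative sketch only: label short windows of wrist accelerometer
# samples as "pinch", "clench", or "none". Thresholds are invented.

def extract_features(window):
    """Simple statistics over one window of (x, y, z) samples, in g."""
    mags = [(x * x + y * y + z * z) ** 0.5 for x, y, z in window]
    mean = sum(mags) / len(mags)
    peak = max(mags)
    # "jerk": average change between consecutive magnitude samples
    jerk = sum(abs(b - a) for a, b in zip(mags, mags[1:])) / (len(mags) - 1)
    return mean, peak, jerk

def classify_gesture(window):
    """Toy rule-based stand-in for an on-device ML classifier."""
    mean, peak, jerk = extract_features(window)
    if peak < 1.2:      # barely above gravity: no gesture
        return "none"
    if jerk > 0.5:      # sharp, brief spike: treat as a pinch
        return "pinch"
    return "clench"     # sustained elevated activity: a fist clench
```

A still wrist (magnitudes near 1 g) would classify as "none", a single sharp spike as "pinch", and a sustained elevated reading as "clench". A production system would replace the hand-written rules with a trained model, but the pipeline shape (windowing, feature extraction, classification) is the same.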

Next, let's talk about AirPods Pro. AirPods Pro combines Apple's H1 chip with built-in microphones to deliver a powerful listening experience through machine learning. Conversation Enhancement on AirPods Pro uses machine learning to detect and amplify voices. If you're talking to someone in a noisy restaurant, Conversation Enhancement can focus on the voice of the person in front of you so you can hear it more clearly. It is worth noting that this feature runs entirely on-device.

Finally, let's look at the Door Detection feature recently released in iOS 16 and iPadOS 16. Door Detection combines the LiDAR scanner, the camera, and on-device machine learning to help visually impaired users locate a door, judge how far away it is, and determine whether it is open or closed. It can even read signs and symbols around the door, such as an office's room number or a sign for an accessible entrance.

Users can also combine Door Detection with People Detection to help visually impaired people move more freely in public spaces, identify people nearby, and maintain social distance. These are just a few examples of how machine learning can have a substantial impact on the lives of people with disabilities. Combining advances in chips, sensor technology, and on-device machine learning makes our products easier to use and helps users better interact with the world around them.

Health is another area of our focus with the potential to improve people's lives. Technology can play an important role in making our bodies healthier and encouraging people to live healthier lives.

Our machine learning and sensor technology can provide useful health information, allowing users to gradually achieve overall health through small changes in daily behavior. We always ensure that these health features stand up to rigorous scientific validation, and protecting user privacy is always our top priority.

For example, watchOS 9, launching this fall, includes a new "Sleep Stages" feature that helps users better understand their sleep. Apple Watch has a built-in heart rate sensor and accelerometer; using signals from these sensors, it can detect whether the user is in REM, core, or deep sleep, and provide detailed metrics such as sleep respiratory rate and heart rate.
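To make the idea concrete, here is a minimal sketch of epoch-by-epoch sleep staging from heart rate and movement. This is not the Sleep Stages algorithm: Apple's feature uses trained models, and every cut-off below (the resting-rate offsets, the movement threshold) is an invented assumption for illustration.

```python
# Illustrative sketch only: assign a sleep stage to each epoch from
# heart rate (bpm) and a movement count. All thresholds are invented.

def stage_epoch(heart_rate, movement, resting_hr=55):
    """Toy heuristic stand-in for a trained sleep-staging model."""
    if movement > 10:
        return "awake"                  # lots of motion: not asleep
    if heart_rate <= resting_hr - 5:
        return "deep"                   # lowest heart rate, little motion
    if heart_rate >= resting_hr + 10:
        return "rem"                    # heart rate climbs toward waking levels
    return "core"                       # light sleep: everything in between

def stage_night(epochs, resting_hr=55):
    """Label a sequence of (heart_rate, movement) epochs."""
    return [stage_epoch(hr, mv, resting_hr) for hr, mv in epochs]
```

A real implementation would smooth transitions between stages and personalize the baseline, but the overall flow (per-epoch sensor features in, stage labels out) matches what the speech describes.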

In addition, Fall Detection uses the Apple Watch's accelerometer and gyroscope together with machine learning algorithms to identify hard falls. By analyzing the wrist's trajectory and the acceleration at impact, the Apple Watch can send an alert if the user takes a hard fall. Of course, we also want to go a step further and find ways to support users before a fall happens.
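The "impact followed by stillness" pattern the speech alludes to can be sketched with a simple heuristic. Again, this is not Apple's algorithm; the thresholds and the stillness window are assumptions made up for the example.

```python
# Illustrative sketch only: flag a hard fall when a large acceleration
# spike is followed by a period of stillness. Thresholds are invented.

IMPACT_G = 3.0       # a spike above this many g counts as an impact
STILL_G = 1.1        # magnitudes this close to 1 g count as "still"
STILL_SAMPLES = 5    # still samples required after impact before alerting

def detect_fall(magnitudes):
    """Return True if an impact spike is followed by sustained stillness."""
    for i, m in enumerate(magnitudes):
        if m >= IMPACT_G:
            after = magnitudes[i + 1:i + 1 + STILL_SAMPLES]
            if len(after) == STILL_SAMPLES and all(a <= STILL_G for a in after):
                return True
    return False
```

Requiring stillness after the spike is what separates a fall from, say, clapping or jumping, where vigorous motion continues after the impact; a production detector would add trajectory analysis and a trained classifier on top.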

To that end, we created the Walking Steadiness feature. It's a first-of-its-kind health feature that uses motion data generated by the iPhone as users move about to assess their risk of falling. This important information can help people gradually improve their mobility and thereby reduce that risk.

While our exploration of health is just beginning, we are already seeing tremendous potential for machine learning and sensor technology to provide health insights, encourage healthy lifestyles, and more. All of these features serve our mission of creating a better life. We are hopeful about the future of machine learning: we strongly believe that it can inspire more innovations that improve people's lives.

It can help us understand our physical condition and develop healthier living habits. It can lower the barriers to the use of technology and bring the outside world closer. It also protects our privacy and gives us confidence in the technology.

At Apple, we strive to innovate, empower users, and make technology a force that improves people's lives. We are excited to continue down this path, using machine learning to help create a healthier, more accessible future for everyone. Thank you all!
