Apple previews Door Detection for iPhone, Apple Watch Mirroring, and Live Captions accessibility features

Apple showcased innovative accessibility features that combine powerful hardware and software with machine learning. The features, scheduled to launch later this year through software updates, will give users with disabilities new tools spanning navigation, health, communication, and more.

Apple today showcased innovative software features that give users with disabilities new ways to navigate and communicate and help them get the most out of Apple products. These updates bring together the company's latest technologies to deliver tools that can be personalized, continuing Apple's long-standing commitment to making products that everyone can use.

Using advanced hardware, software, and machine learning, people who are blind or have low vision can use Door Detection on iPhone or iPad to navigate the last few steps to their destination; users with physical or motor disabilities can control Apple Watch entirely from their iPhone with Apple Watch Mirroring, using assistive features such as Voice Control and Switch Control; and users who are deaf or hard of hearing can follow Live Captions on iPhone, iPad, and Mac. Apple is also expanding support for VoiceOver, its industry-leading screen reader, with more than 20 new languages and locales. These features will arrive on Apple platforms later this year through software updates.

Door Detection for Users Who Are Blind or Have Low Vision

Apple is introducing Door Detection, an advanced navigation feature for users who are blind or have low vision. When they arrive at a new destination, Door Detection can help them locate a door, tell them how far away it is, and describe its attributes, including whether it is open or closed and, if closed, whether it can be opened by pushing, pulling, or turning a knob. Door Detection can also read signs and symbols on and around the door, such as an office room number or an accessible-entrance symbol. The feature combines the LiDAR Scanner, camera, and on-device machine learning, and will be available on iPhone and iPad models equipped with a LiDAR Scanner.
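
Apple has not published the internals of Door Detection, but the building blocks it names, the LiDAR Scanner, the camera, and on-device machine learning, are all exposed to developers. The sketch below is a minimal, hypothetical illustration of the depth side of such a pipeline (the `DepthReader` type is an assumption, not an Apple API): it reads ARKit's LiDAR scene-depth stream and samples the distance straight ahead. A real detector would additionally run camera frames through a vision model to recognize doors, handles, and signage.

```swift
import ARKit
import CoreVideo

// Minimal sketch (not Apple's Door Detection implementation): reading the
// LiDAR scene-depth stream that supported iPhone and iPad Pro models expose
// through ARKit, and sampling a rough "distance to whatever is ahead".
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only available on devices with a LiDAR Scanner.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("LiDAR scene depth is not available on this device")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap is a CVPixelBuffer of per-pixel distances in meters
        // (DepthFloat32); the center pixel approximates the distance ahead.
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }

        let centerRow = base.advanced(by: (height / 2) * rowBytes)
            .assumingMemoryBound(to: Float32.self)
        let centerDistance = Double(centerRow[width / 2])
        print(String(format: "Approx. distance ahead: %.2f m", centerDistance))
    }
}
```

Classifying what that nearest surface actually is (a door, a wall, a sign) is where on-device machine learning would come in; the depth data alone only answers "how far".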

Door Detection will be a new capability of Detection Mode in Magnifier, Apple's built-in app for users who are blind or have low vision. Door Detection, People Detection, and Image Descriptions can be used alone or together in Detection Mode, giving users with vision disabilities customizable tools to navigate their surroundings and access rich descriptions of them. Beyond the navigation tools in Magnifier, Apple Maps will also offer sound and haptic feedback for VoiceOver users to identify the starting point for walking directions.

Advancing Physical and Motor Assistive Features for Apple Watch

Apple Watch Mirroring helps users control Apple Watch remotely from their paired iPhone, making Apple Watch more accessible to people with physical and motor disabilities. With Apple Watch Mirroring, users can control Apple Watch using the iPhone's assistive features, such as Voice Control and Switch Control, and can use inputs including voice commands, head tracking, or external Made for iPhone switches as alternatives to tapping the Apple Watch display. Apple Watch Mirroring combines hardware and software, including technology built on AirPlay, to help users who rely on these features take full advantage of Apple Watch apps such as Blood Oxygen, Heart Rate, and Mindfulness.

In addition, users can control Apple Watch with simple hand gestures. New Quick Actions on Apple Watch let users answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout with a double-pinch gesture. This builds on the innovative technology behind AssistiveTouch on Apple Watch, which lets users with upper-body limb differences control Apple Watch with gestures such as a pinch or a clench, without having to tap the display.

Live Captions Come to iPhone, iPad, and Mac for Users Who Are Deaf or Hard of Hearing

Apple is introducing Live Captions on iPhone, iPad, and Mac for users who are deaf or hard of hearing. With Live Captions, users can more easily follow any audio content, whether a phone or FaceTime call, a video conferencing or social media app, streaming media, or a conversation with someone next to them, and they can adjust the caption font size for easier reading. Live Captions in FaceTime attribute automatically transcribed dialogue to call participants, making group video calls more convenient for users with hearing disabilities. During a call on Mac, users can also type a response and have it spoken aloud in real time to the other participants. Because Live Captions are generated on device, users' information stays private and secure.
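
Live Captions itself is a system-level feature with no developer API in this announcement, but the on-device transcription idea it relies on can be illustrated with Apple's public Speech framework. The snippet below is a hedged sketch under that assumption: the function name and audio file URL are placeholders, and the `requiresOnDeviceRecognition` flag is what keeps the audio and transcript on the device.

```swift
import Speech

// Illustrative sketch of on-device transcription with the public Speech
// framework; this is not the Live Captions feature, only an example of the
// privacy property described above (recognition never leaves the device).
func transcribeOnDevice(fileURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.supportsOnDeviceRecognition else {
            print("On-device speech recognition is unavailable")
            return
        }

        let request = SFSpeechURLRecognitionRequest(url: fileURL)
        request.requiresOnDeviceRecognition = true  // keep audio and transcript local

        // The recognizer retains the task; we only need the result callback here.
        _ = recognizer.recognitionTask(with: request) { result, error in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            } else if let error = error {
                print("Recognition failed: \(error)")
            }
        }
    }
}
```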

VoiceOver Adds New Languages and More

VoiceOver, Apple's industry-leading screen reader for users who are blind or have low vision, will add support for more than 20 new languages and locales, including Bengali, Bulgarian, Catalan, and Ukrainian. The new languages, locales, and voices will also be available for other accessibility features, including Speak Selection and Speak Screen. In addition, VoiceOver users on Mac can use the new Text Checker to catch common formatting issues such as duplicate spaces or misplaced capital letters, making it easier to proofread documents or emails.
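
VoiceOver can only speak what apps describe to it, so the expanded language support matters most when developers label their interfaces through the public accessibility APIs. The SwiftUI snippet below is a small illustrative example, not part of this announcement; the view and its wording are assumptions made for the sketch.

```swift
import SwiftUI

// Minimal sketch of exposing UI state to VoiceOver with public SwiftUI
// accessibility modifiers; the screen reader speaks these strings in the
// language and voice the user has selected.
struct EntranceStatusView: View {
    let isOpen: Bool

    var body: some View {
        Image(systemName: isOpen ? "checkmark.circle" : "xmark.circle")
            .accessibilityLabel(isOpen ? "Entrance door is open" : "Entrance door is closed")
            .accessibilityHint("Double-tap for walking directions to this entrance")
    }
}
```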

Additional Features

Buddy Controller lets users ask a care provider or friend to help them play a game; it combines any two game controllers into one, so inputs from both controllers drive a single player.

Siri Pause Time lets users with speech disabilities adjust how long Siri waits before responding to a request.

Voice Control Spelling Mode gives users the option to dictate custom spellings letter by letter.

Sound Recognition can be customized to recognize sounds that are specific to a user's environment, such as their home's alarm, doorbell, or appliances.

Commemorating Global Accessibility Awareness Day

This week, Apple is marking Global Accessibility Awareness Day with special classes, curated collections, and more:

Launching in Canada on May 19, the SignTime service helps Apple Store and Apple Support customers connect with American Sign Language (ASL) interpreters. SignTime is available in the US for customers using ASL, in the UK for customers using British Sign Language (BSL), and in France for customers using French Sign Language (LSF).

Apple Retail Stores around the world will offer live classes throughout the week to help customers learn about iPhone accessibility, and Apple Support social media channels will also feature tutorials.

Accessibility Assistant Shortcuts, coming to the Shortcuts app for Mac and Apple Watch this week, recommend accessibility features based on user preferences.

Apple Maps adds "Park Access for All," a new guide from the National Park Foundation that helps users discover accessible features, programs, and services in national parks across the United States. Gallaudet University, the world's premier institution of higher education for deaf, hard of hearing, and deafblind students, offers a guide that connects users with businesses and organizations that value, embrace, and prioritize the deaf community and signed languages.

On the App Store, users can discover inspiring stories about accessibility-focused apps and the people who create them; on Apple Podcasts, they can learn about the innovative ways technology is making lives more accessible.

Apple Music will feature Saylists playlists, each of which focuses on a different sound; picking a playlist and singing along can be a fun way to practice sounds in speech therapy.

Notes on availability:

Door Detection and People Detection in the Magnifier app require the LiDAR Scanner, available on iPhone 13 Pro, iPhone 13 Pro Max, iPhone 12 Pro, iPhone 12 Pro Max, 11-inch iPad Pro (2nd and 3rd generation), and 12.9-inch iPad Pro (4th and 5th generation). Door Detection should not be relied on in circumstances where the user may be harmed or injured, or in high-risk or emergency situations.

Apple Watch Mirroring is available for Apple Watch Series 6 and later.

Live Captions will be available in beta later this year in English (US and Canada) on iPhone 11 and later, iPad models with the A12 Bionic chip and later, and Macs with Apple silicon. Live Captions accuracy may vary by content and should not be relied upon in high-risk situations.

VoiceOver, Speak Selection, and Speak Screen will add support for Arabic (World), Basque, Bengali (India), Bihari (India), Bulgarian, Catalan, Croatian, Persian, French (Belgium), Galician, Kannada, Chinese dialects (Liaoning, Shaanxi, Shanghai, Sichuan), Marathi, Spanish (Chile), Slovenian, Tamil, Telugu, Ukrainian, Valencian, and Vietnamese.

Voice Control Spelling Mode will be available in English (US).

