Live captioning made waves back in May, when Apple announced the feature alongside the other new accessibility features coming to iOS. If you have a hearing impairment, or simply find yourself using your iPhone with the sound off, you may appreciate the new Live Captions feature (beta) Apple is rolling out with iOS 16. In this walkthrough, we show you how to quickly and easily enable Live Captions on an iPhone running the latest version of iOS 16.
In its accessibility preview ahead of the iOS 16 release, Apple described Live Captions as a tool to help the deaf and hard-of-hearing communities follow audio on their devices, in calls, in video conferencing, and in social media apps. Captions are generated in real time entirely on-device, which keeps the content of a conversation private and secure. With iOS 16, Apple is shipping a new set of accessibility improvements and, finally, adding live captioning for all audio content across its devices.
First Announcement of iOS 16
Apple first announced these features back in May. To mark Global Accessibility Awareness Day, and ahead of WWDC22, the company previewed several accessibility features planned for the next version of iOS, giving them their own moment in the limelight before it unveiled iOS 16 and the rest of its operating systems at the conference.
For Global Accessibility Awareness Day, Apple announced a number of features coming to the iPhone, Apple Watch, and Mac that will benefit users who are deaf, hard of hearing, blind, or have low vision, including a universal real-time captioning tool, improved visual and audio sensing modes, and access to watchOS apps from iOS. Apple regularly adds features like these to iOS and iPadOS to improve the experience for everyone.
Alongside these headline features, Apple has announced a number of settings and options designed to assist accessibility needs, while iOS 16 itself brings additions such as Live Activities, the new Freeform app, Apple Pay Later, and much more. Among the accessibility additions, the new Apple Watch Mirroring function lets you control your Apple Watch entirely from your iPhone and take advantage of assistive features such as Switch Control and Voice Control.
Live Captions on iOS 16
There are many new features in iOS 16, and one of the most notable is Live Captions. The option lets users easily turn on automatic captioning from the Settings menu, and it applies to any audio played on the device, from phone calls to FaceTime sessions to videos.
There is even a dedicated option for FaceTime which, as the name suggests, displays captions for every FaceTime call. To enable Live Captions in FaceTime, open the Settings app on your iPhone, choose Accessibility, tap Live Captions (Beta), and under the IN-APP LIVE CAPTIONS heading, turn on the toggle for Live Captions in FaceTime.
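For app developers, Live Captions itself does not appear to expose a dedicated public API, but UIKit has long offered a way to read and observe the user's system captioning preference. The following is a minimal Swift sketch, with our own illustrative CaptionPreferenceObserver type and onChange callback, showing how an app that plays audio might honor that setting:

```swift
import UIKit

// Minimal sketch, not part of Live Captions itself: as far as we know, Live Captions
// has no dedicated public API, but UIKit does expose the user's long-standing
// closed-captioning preference, which apps that play audio can choose to respect.
final class CaptionPreferenceObserver {
    private var token: NSObjectProtocol?

    // `onChange` is called with the current preference and again whenever it changes.
    init(onChange: @escaping (Bool) -> Void) {
        onChange(UIAccessibility.isClosedCaptioningEnabled)
        token = NotificationCenter.default.addObserver(
            forName: UIAccessibility.closedCaptioningStatusDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            onChange(UIAccessibility.isClosedCaptioningEnabled)
        }
    }
}

// Example usage: toggle your player's built-in subtitles to match the preference.
// let observer = CaptionPreferenceObserver { captionsOn in
//     subtitleLabel.isHidden = !captionsOn
// }
```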
How Live Captions Work
When you are on an audio or video call, or watching a video on your device, Live Captions automatically appear at the top of the screen in a separate pop-up, similar to a push notification.
Real-Time Support
To use Live Captions for real-time conversations, start by tapping anywhere in the easily accessible widget to reveal its four control buttons. The new Live Captions are not just for iPhone users, either; you can enable them from the Settings menu on iPad and Mac as well. A range of font-style and background-color options in the widget's easy-to-access menu helps make the Live Captions experience even more comfortable to use.
Captions appear in real time on top of whatever audio is playing, and users can adjust the caption font size to suit their needs. The captions can also be quickly turned on or off, and the pane they are displayed in expanded or collapsed, from the standard accessibility settings.
Functions
The Live Captions feature uses the device's own intelligence to automatically create captions for speech in audio or video playing on the iPhone, or for real-time conversations happening around you. In iOS 16, iPadOS 16, and macOS Ventura, it lets people with hearing loss use on-device machine learning, on compatible iPhone, iPad, and Mac models, to automatically generate and transcribe captions for on-device media content and real-world conversations in real time.
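Apple has not published details of the engine behind Live Captions, but the public Speech framework gives a rough sense of what on-device, real-time transcription looks like. The sketch below uses our own LiveTranscriber class and start(onUpdate:) method names, and assumes speech-recognition and microphone permissions have already been granted:

```swift
import Speech
import AVFoundation

// Minimal sketch of on-device, real-time transcription using Apple's public
// Speech framework. This is NOT the engine that powers Live Captions; it only
// illustrates the general idea of generating captions without sending audio
// to a server. Assumes SFSpeechRecognizer.requestAuthorization(_:) and
// microphone permission have already been granted.
final class LiveTranscriber {
    private let audioEngine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?

    func start(onUpdate: @escaping (String) -> Void) throws {
        guard let recognizer = recognizer, recognizer.isAvailable else { return }

        // Keep all processing on the device when the installed model allows it.
        if recognizer.supportsOnDeviceRecognition {
            request.requiresOnDeviceRecognition = true
        }
        request.shouldReportPartialResults = true

        // Feed microphone audio into the recognition request.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { [request] buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Each callback delivers the best transcription of the phrase so far.
        task = recognizer.recognitionTask(with: request) { result, _ in
            if let result = result {
                onUpdate(result.bestTranscription.formattedString)
            }
        }
    }
}
```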
For Mac users, the update also includes a live text-to-speech option for calls, similar to a feature in a recent Google Pixel update, that lets users type a reply and have it spoken aloud to the other people in the conversation. Separately, the Live Text upgrade, which lets you copy text from paused videos as well as convert currencies and translate languages with one tap, is available only on iPhones with an A12 Bionic processor or newer.
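As a rough illustration of the type-a-reply-and-have-it-spoken idea, here is a minimal sketch using the public AVSpeechSynthesizer API; the speak(_:) helper is our own name, and this is the underlying text-to-speech concept rather than the system feature itself:

```swift
import AVFoundation

// Rough sketch of the type-a-reply-and-speak idea using the public
// AVSpeechSynthesizer API; the `speak(_:)` helper is illustrative, not the
// actual system feature.
let synthesizer = AVSpeechSynthesizer()

func speak(_ reply: String) {
    let utterance = AVSpeechUtterance(string: reply)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

// speak("I'll call you back in five minutes.")
```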
Also, because Live Captions listens to the end of a phrase, it can work backwards and correct the start of that phrase, adding punctuation or revising words once it has more context, just like Dictation on the iPhone.
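This mirrors how partial results behave in the public Speech framework: each update carries the recognizer's best guess for the whole phrase so far, so a caption line is replaced rather than appended to. A small sketch of that pattern, with our own render(_:into:) helper name:

```swift
import Speech

// Sketch of the replace-don't-append pattern seen with partial results in the
// public Speech framework: each result carries the best guess for the whole
// phrase so far, so earlier words and punctuation can change as more audio
// arrives. The `render(_:into:)` helper is an illustrative name.
// Returns true once the recognizer marks the phrase final and no further
// corrections will arrive for it.
func render(_ result: SFSpeechRecognitionResult, into captionLine: inout String) -> Bool {
    // Replace the entire displayed line rather than appending, so corrections
    // to the start of the phrase (punctuation, re-heard words) show up automatically.
    captionLine = result.bestTranscription.formattedString
    return result.isFinal
}
```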