Apple has officially announced iOS 17. The update brings major upgrades to the communication apps, makes AirDrop sharing easier and text input smarter, and introduces the Journal app and StandBy, along with a number of new accessibility features.
Image: Three iPhone 14 Pro devices showcasing the upgraded Phone, FaceTime, and Messages experiences in iOS 17.


One of the new accessibility features, called “Personal Voice,” helps users at risk of losing their ability to speak create a synthesized voice that sounds like their own so they can keep communicating with others.
Developers can already start testing the feature in the iOS 17 beta, where it lives under Settings > Accessibility > Personal Voice. Creating a Personal Voice takes about an hour and requires recording in a quiet place with no background noise; Apple instructs users to speak naturally at a consistent volume while holding the iPhone about six inches from their face.
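For third-party apps, iOS 17 exposes Personal Voice through the AVSpeechSynthesis APIs in AVFoundation: an app first requests authorization, and any voices the user has created then appear alongside the system voices, tagged with a personal-voice trait. The snippet below is a minimal sketch, assuming iOS 17 and a device on which the user has already created a Personal Voice; the helper name fetchPersonalVoices is our own, not Apple's.

```swift
import AVFoundation

// Minimal sketch: ask for access to the user's Personal Voice and
// collect any personal voices they have created (iOS 17+).
@available(iOS 17.0, *)
func fetchPersonalVoices(completion: @escaping ([AVSpeechSynthesisVoice]) -> Void) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else {
            // .denied, .unsupported, or .notDetermined: no personal voices to use.
            completion([])
            return
        }
        // Personal voices show up as ordinary speech voices flagged by a trait.
        let personalVoices = AVSpeechSynthesisVoice.speechVoices()
            .filter { $0.voiceTraits.contains(.isPersonalVoice) }
        completion(personalVoices)
    }
}
```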

If there is too much background noise, the setup screen warns you to find a quieter place to record.
During setup, Personal Voice asks you to read a series of sentences aloud; your iPhone then generates and stores the voice. It can then be used together with the Live Speech feature, which lets users who cannot speak have typed text read aloud during in-person conversations, phone calls, and FaceTime calls.
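Live Speech itself is a system-level feature, but an app could mirror its behavior, speaking typed text in the user's own voice, with an ordinary AVSpeechUtterance. A minimal sketch, assuming authorization has already been granted and a personal voice was obtained as above:

```swift
import AVFoundation

// Keep a reference: speech stops if the synthesizer is deallocated mid-utterance.
let synthesizer = AVSpeechSynthesizer()

@available(iOS 17.0, *)
func speak(_ text: String, with personalVoice: AVSpeechSynthesisVoice) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = personalVoice   // e.g. a voice carrying the .isPersonalVoice trait
    synthesizer.speak(utterance)
}
```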
Personal Voice will be available to the public when Apple releases the first public beta of iOS 17, which launches next month.
In addition to Personal Voice and Live Speech, other accessibility features added in iOS 17 include Assistive Access, a customizable, simplified interface that helps users with cognitive disabilities use iPhone more easily and independently, and Point and Speak, which helps blind and low-vision users read text on physical objects the device is pointed at.