Apple will soon let iPhone and iPad users operate their devices with their eyes

Using your eyes to operate your smartphone might sound like something straight out of a science fiction movie.

But for millions of iPhone users around the world, it is about to become reality.

Apple has confirmed that eye tracking is coming to the iPhone and iPad.

The feature will allow users to control their Apple devices with just their eyes, powered by artificial intelligence (AI).

It uses the front-facing camera to set up and calibrate in seconds. Thanks to on-device machine learning, all data used to set up and control the feature is kept securely on the device and is not shared with Apple, the company says.

Apple introduced several new accessibility features last week, including Eye Tracking.

Apple’s chief executive said the company believes deeply in the transformative power of innovation to enrich lives.

That’s why Apple has championed inclusive design for nearly 40 years, building accessibility into both its devices and its software.

Eye Tracking works across iPadOS and iOS apps, with no additional hardware or accessories required.

Once it is set up, users can navigate the elements of an app with their eyes and use Dwell Control to activate additional functions, such as physical buttons, swipes, and other gestures.

Although the new feature won’t be available until “later this year,” it has already generated a lot of interest on X, formerly known as Twitter.

One user joked: ‘Wow, this generation is going to be the laziest generation ever.’

Another added: ‘Black Mirror episode makes sense.’

The tech giant also unveiled a feature it claims can reduce motion sickness for passengers in moving vehicles, as part of the same portfolio of new features.

Citing research, Apple says motion sickness is often caused by a sensory conflict between what people see and what they feel.

The new Vehicle Motion Cues feature aims to reduce this by displaying animated dots at the edges of the screen that represent changes in the vehicle’s motion.

Another new feature is Music Haptics, which uses the iPhone’s tiny vibration motors to let people who are deaf or hard of hearing experience music through vibrations synced to the audio.

In addition, Apple said it will introduce a new speech feature for customers with conditions that affect their speech. It will allow users to teach their virtual assistant Siri specific words or phrases that trigger shortcuts to applications and tasks.
