
 Message 1389 
 Mike Powell to All 
 Apple wants to connect thoughts to iPhone control - and there's a very good reason for it 
 15 May 25 09:56:00 
 
TZUTC: -0500
MSGID: 1122.consprcy@1:2320/105 2c8b3c6f
PID: Synchronet 3.20a-Linux master/acc19483f Apr 26 2024 GCC 12.2.0
TID: SBBSecho 3.20-Linux master/acc19483f Apr 26 2024 23:04 GCC 12.2.0
BBSID: CAPCITY2
CHRS: ASCII 1
Apple wants to connect thoughts to iPhone control - and there's a very good
reason for it

Date:
Thu, 15 May 2025 00:00:00 +0000

Description:
Apple's move into brain-computer interfaces could be a boon for those with
disabilities.

FULL STORY

Our smartphones and other devices are key to so many personal and 
professional tasks throughout the day. Using these devices can be difficult 
or outright impossible for those with ALS and other conditions. Apple thinks
it has a possible solution: thinking. Specifically, a brain-computer 
interface (BCI) built with Australian neurotech startup Synchron that could
provide hands-free, thought-controlled versions of the operating systems for
iPhones, iPads, and the Vision Pro headset. 

A brain implant for controlling your phone may seem extreme, but it could be
the key for those with severe spinal cord injuries or similar conditions to
engage with the world. Apple will support Switch Control for those with the
implant embedded near the brain's motor cortex. The implant picks up the
brain's electrical signals when a person thinks about moving, translates that
electrical activity, and feeds it to Apple's Switch Control software, where
it becomes digital actions like selecting icons on a screen or navigating a
virtual environment.
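
The paragraph above describes a three-stage pipeline: decode motor-imagery
signals from the implant, translate them, and hand the result to Switch
Control as switch-style actions. Neither Synchron's decoder nor Apple's
Switch Control hookup is public, so the short Swift sketch below is purely
illustrative; every type in it (MotorIntent, SwitchAction, IntentTranslator)
is hypothetical and only shows how such a translation layer might be shaped.

// Purely illustrative: neither Synchron's SDK nor Apple's Switch Control
// integration is public, so every type here is hypothetical.

// A decoded "motor imagery" event coming out of the implant's signal
// processing stage.
enum MotorIntent {
    case select        // user imagined a "click" movement
    case moveNext      // advance the scanning cursor
    case movePrevious  // move the scanning cursor back
}

// The abstract switch-style actions the article says the intents become.
enum SwitchAction {
    case activateFocusedItem
    case scanForward
    case scanBackward
}

// The translation layer: electrical-activity classifications in,
// switch-style digital actions out.
struct IntentTranslator {
    func action(for intent: MotorIntent) -> SwitchAction {
        switch intent {
        case .select:       return .activateFocusedItem
        case .moveNext:     return .scanForward
        case .movePrevious: return .scanBackward
        }
    }
}

// Example: a stream of decoded intents becomes a stream of UI actions.
let translator = IntentTranslator()
let decoded: [MotorIntent] = [.moveNext, .moveNext, .select]
let actions = decoded.map { translator.action(for: $0) }
print(actions)  // scanForward, scanForward, activateFocusedItem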

Brain implants, AI voices 

Of course, it's still early days for the system. It can be slow compared to
tapping, and it will take time for developers to build better BCI tools. But
speed isn't the point right now. The point is that people could use the brain
implant and an iPhone to interact with a world they were otherwise locked out
of. 

The possibilities are even greater when looking at how it might mesh with
AI-generated personal voice clones. Apple's Personal Voice feature lets users
record a sample of their own speech so that, if they lose their ability to
speak, they can generate synthetic speech that still sounds like them. It's
not quite indistinguishable from the real thing, but it's close, and much more
human than the robotic imitation familiar from old movies and TV shows. 

Right now, those voices are triggered by touch, eye tracking, or other
assistive tech. But with BCI integration, those same people could think their
voice into existence. They could speak just by intending to speak, and the
system would do the rest. Imagine someone with ALS not only navigating their
iPhone with their thoughts but also speaking again through the same device by
"typing" statements for their synthetic voice clone to say. 

While it's incredible that a brain implant can let someone control a computer
with their mind, AI could take it to another level. It wouldn't just help
people use tech; it would help them be themselves in a digital world.

======================================================================
Link to news story:
https://www.techradar.com/computing/artificial-intelligence/apple-wants-to-connect-thoughts-to-iphone-control-and-theres-a-very-good-reason-for-it

$$
--- SBBSecho 3.20-Linux
 * Origin: capitolcityonline.net * Telnet/SSH:2022/HTTP (1:2320/105)
SEEN-BY: 105/81 106/201 128/187 129/305 153/7715 154/110 218/700 226/30
SEEN-BY: 227/114 229/110 111 114 206 300 307 317 400 426 428 470 664
SEEN-BY: 229/700 705 266/512 291/111 320/219 322/757 342/200 396/45
SEEN-BY: 460/58 712/848 902/26 2320/0 105 3634/12 5075/35
PATH: 2320/105 229/426

