George Orwell Would Be Disturbed by Apple’s Billion-Dollar Dystopian Purchase
It’s hard to imagine that this is what Steve Jobs envisioned when he started Apple, but perhaps it’s closer to what George Orwell anticipated years ago.

On January 29, Apple revealed its acquisition of Q.ai, an obscure Israeli startup, for roughly $2 billion. That's a significant sum for a company few people have heard of, and it raises eyebrows. From what I gather, though, Q.ai's technology is genuinely impressive. Some might even call it dystopian.

One patent application indicates that the technology can “generate conversation records using facial movements” and can “determine an individual’s emotional state based on slight facial movements.” Another application suggests the software “synthesizes audio based on the words silently uttered by the subject.” There’s even mention of a sensing device fitted to the user’s ear, which captures light reflected from the face and processes signals to produce audio output.

In essence, before you say something aloud, your brain sends signals to your facial muscles, and Q.ai's sensors pick up the resulting subtle "micro-movements." As the company boasts: "In a world full of noise, we create a new kind of calm. Magic has come true."

You can see what Apple aims to achieve here. Imagine controlling your iPhone without speaking or typing. Envision glasses that can anticipate when a baby is about to cry, or that help medical professionals monitor stroke-prone patients. The concept feels almost like telekinesis: interacting with your environment without any vocal command at all.

It’s honestly astonishing.

But there are also dark implications—it raises a host of privacy issues. With ideas like “silent speech” and “pre-speech,” we might be inching closer to something like “pre-crime.” Just picture a police officer using smart glasses to scan crowds for signs of agitation. They might target individuals who appear angry or aggressive, potentially leading to unnecessary confrontations, even if those individuals haven’t committed any offenses. This could enhance current facial recognition systems already utilized by law enforcement.

Think about someone walking through a rough neighborhood while wearing these glasses. They could read strangers' demeanors to judge which individuals seem threatening and decide when to cross the street. All of that data is collected from passersby without their consent; unlike an iPhone owner, who at least agrees to Apple's terms when setting up an account, these strangers never opted in. This feels like a gross invasion of privacy.

Beyond civilian use, this technology is likely to find its way into military contexts as well. Imagine a soldier patrolling a busy market, scanning faces to identify hidden threats. That could provide a significant advantage, but what happens when an innocent person gets flagged? The consequences of misidentification could be dire.

Regardless of whether Q.ai's patent claims hold water, or whether this technology ever reaches consumers, law enforcement, or the military, the elements at play hint at a rather grim future.
