OT: Is Apple right in refusing to create/provide an iPhone backdoor?

Submitted by paramount on February 29, 2016 - 10:18pm
I'm Team Apple
78% (18 votes)
I'm Team FBI
22% (5 votes)
Total votes: 23
Submitted by no_such_reality on March 1, 2016 - 8:00am.

I've always assumed there's already one there.

While I'm not Team FBI because of the potential for abuse, Apple's privacy claims ring a bit hollow given the amount of data it collects, and the people are sheep.

Submitted by spdrun on March 1, 2016 - 9:35am.

My problem is different. Slippery slope.

If computer firms will be required to install back doors, what's to stop someone from installing a foreign-designed variant of Linux with unbroken crypto instead of the "stock" OS?

In order to guarantee backdoors, either hardware would have to be compromised on an unprecedented scale, or systems will have to be locked down to prevent installation of alternate OS's and truly secure communication software. Neither is good.

I don't want some scum in DC mandating a walled garden.

Submitted by an on March 1, 2016 - 10:03am.

Apple already put all of its Chinese users' data in state-owned data centers. So this is nothing more than a marketing ploy, IMHO.

Submitted by spdrun on March 1, 2016 - 10:30am.

You're not required to cloudfuck your data, though. You can choose not to sign in to iClown, though other OS's are likely better for cloudfree use.

Submitted by an on March 1, 2016 - 10:40am.

spdrun wrote:
You're not required to cloudfuck your data, though. You can choose not to sign in to iClown, though other OS's are likely better for cloudfree use.
How do you know they're not doing that in China for non-cloud data?

Submitted by spdrun on March 1, 2016 - 10:59am.

Because it would likely be meaningless if it's not tied to an identity. Of course, the Chinese themselves can sniff network traffic from devices, be they phones, computers, whatever.

Submitted by an on March 1, 2016 - 11:16am.

spdrun wrote:
Because it would likely be meaningless if it's not tied to an identity. Of course, the Chinese themselves can sniff network traffic from devices, be they phones, computers, whatever.

Not meaningless if they can trace back to who bought the phone and get its lat/long. Then they'd know pretty much who you are. To turn off location reporting, you'd have to turn off Wi-Fi, data, voice, and GPS on the phone. If you do that, then what good is an iPhone?

Submitted by FlyerInHi on March 1, 2016 - 2:16pm.

AN wrote:
Apple already put all of its Chinese users' data in state-owned data centers. So this is nothing more than a marketing ploy, IMHO.

Yeah, marketing to the base.

When it comes to national security, I'm sure the government has ways to access the data, stored on the phone or elsewhere.

But still, unlocking mobile devices should not become routine requests by law enforcement.

Submitted by phaster on March 6, 2016 - 10:39am.

BUSINESSINSIDER.COM wrote:

Ex-NSA chief thinks the government is dead wrong in asking Apple for a backdoor

The former head of the National Security Agency thinks the government is dead wrong in trying to force Apple to build a backdoor into its secure encryption.

In an interview with New America, retired Gen. Michael Hayden said "American security is better served with unbreakable end-to-end encryption than it would be served with one or another front door, backdoor, side door, however you want to describe it."

http://www.businessinsider.com/michael-h...

NPR wrote:

What It Means For Apple To Get Around iPhone's Encryption

...China, for example, is looking at this case with intense interest. They have clearly a, you know, a billion people. Many of them are starting to use Apple smartphones, and they right now don't have access to the information on these phones. That's probably concerning to them. They have notably not been asking for that access yet because they haven't had to.

http://www.npr.org/2016/02/26/468216122/...

Submitted by spdrun on March 6, 2016 - 10:39am.

He knows.

Terrorists can always use things like one-time pads, which aren't breakable by any amount of cryptanalysis when used correctly.
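
To see why, here's a minimal one-time pad sketch in Python (purely illustrative; the function names and the sample message are made up). With a truly random key as long as the message, used exactly once, every possible plaintext of that length is an equally valid decryption, so there is nothing for a code-breaker to find:

    import secrets

    def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
        # The key must be truly random, as long as the message, and never
        # reused; those three conditions are what make the pad unbreakable.
        key = secrets.token_bytes(len(plaintext))
        ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
        return key, ciphertext

    def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
        return bytes(c ^ k for c, k in zip(ciphertext, key))

    key, ct = otp_encrypt(b"attack at dawn")
    assert otp_decrypt(key, ct) == b"attack at dawn"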

But security holes are holes, and will endanger legitimate data.

Submitted by afx114 on March 8, 2016 - 12:18pm.

I wonder if all the "Team FBI" folks keep a key under their doormat.

Submitted by an on March 8, 2016 - 5:34pm.

afx114 wrote:
I wonder if all the "Team FBI" folks keep a key under their doormat.

I don't, but I also don't live in a vault. If the FBI wanted to, they could break down my door pretty easily with a battering ram, and I haven't rigged a trip wire to blow up the house if they do, either.

Submitted by afx114 on March 9, 2016 - 10:00am.

AN wrote:
I don't, but I also don't live in a vault. If the FBI wanted to, they could break down my door pretty easily with a battering ram, and I haven't rigged a trip wire to blow up the house if they do, either.

Just curious, why don't you put a key under your mat?

Submitted by an on March 9, 2016 - 11:40am.

afx114 wrote:
AN wrote:
I don't, but I also don't live in a vault. If the FBI wanted to, they could break down my door pretty easily with a battering ram, and I haven't rigged a trip wire to blow up the house if they do, either.

Just curious, why don't you put a key under your mat?


What would be a reason to do so?

Submitted by an on March 9, 2016 - 11:50am.

afx114 wrote:
I wonder if all the "Team FBI" folks keep a key under their doormat.
On the flip side of that coin: do you booby-trap your house to blow up when an unauthorized person enters? Do you put bars on your windows and doors?

Submitted by spdrun on March 9, 2016 - 12:02pm.

^^^ No. But so long as the booby trap only (say) destroys documents and doesn't injure anyone, doing so should be legally permitted.

Submitted by an on March 9, 2016 - 12:18pm.

spdrun wrote:
^^^ No. But so long as the booby trap only (say) destroys documents and doesn't injure anyone, doing so should be legally permitted.
Is it, though? Say you're a pedophile, there's data on your computer that would incriminate you, and you booby-trap the computer to destroy the evidence. If the cops get a warrant to search your house, trip the booby trap, and the evidence is destroyed, wouldn't you be charged with destruction of evidence or something like that?

Now, if the pedophile is dumb enough to store that evidence in the cloud, would it be OK for law enforcement to ask the cloud service to release the data?

Submitted by spdrun on March 9, 2016 - 12:26pm.

OK to ask. But it should also be legally OK for people to store info on a cloud service encrypted with keys known only to them.
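
As a sketch of what that looks like (assuming the third-party Python "cryptography" package; the filename is made up), the provider only ever stores ciphertext, and the key never leaves the user's machine:

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # stays with the user; the provider never sees it
    with open("key.bin", "wb") as f:   # lose this file and the data is gone for good
        f.write(key)

    token = Fernet(key).encrypt(b"my private notes")
    # 'token' is what gets uploaded; without key.bin it's useless
    # to the provider, a thief, or a subpoena.
    assert Fernet(key).decrypt(token) == b"my private notes"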

Thing is, the user of the iPhone would never trip the booby trap himself. Only an unauthorized party (thief or cop) would, via 10 failed passcode attempts.

Submitted by an on March 9, 2016 - 12:30pm.

What happens if you're stupid and forget your password?
Why would it be OK for the cop to ask for encrypted data on the cloud but not on the device?

Submitted by spdrun on March 9, 2016 - 12:36pm.

The owner of the cloud service is not the accused. And if you use a cloud service with your own key, losing the key means you're SOL.

The owner of the device is the accused, not Apple. Apple has a right to manufacture devices that are resistant to cracking. They have no obligation to make them "hackable."

The accused is dead, so in no position to give out the passcode. If s/he were alive, self-incrimination might apply as well.

Submitted by an on March 9, 2016 - 12:53pm.

We're not talking about Apple building devices that are resistant to cracking/hacking. We're talking about Apple disabling the booby trap that renders the phone's data unrecoverable.
The owner of the cloud service is the same Apple.
FYI, the owner of the phone is the government, not the employee, since the government paid for it. So the things inside it belong to the government.

Submitted by spdrun on March 9, 2016 - 12:57pm.

The city gov't doesn't know the passcode. The person who knows it is dead. Back at Square One.

Apple has a right to build a device where the crypto cannot be disabled.

Submitted by an on March 9, 2016 - 3:17pm.

spdrun wrote:
The city gov't doesn't know the passcode. The person who knows it is dead. Back at Square One.

Apple has a right to build a device where the crypto cannot be disabled.


Are you saying Apple doesn't have an "I forgot my password" option? Are you saying Apple doesn't know how to disable its self-destruct feature? I would guess "NO" on both.

Submitted by spdrun on March 9, 2016 - 3:24pm.

Correct. Unless the device is connected to iClown, there's no magic passcode retrieval option. This one wasn't, or was possibly disconnected at the FBI's behest, depending on whom you ask.

Why should Apple know how to disable the auto-erase feature? To prevent data theft, there SHOULDN'T be a means of disabling it.

If Apple made a legitimately secure device as long as iClown is turned off, more power to 'em!

Submitted by an on March 9, 2016 - 3:27pm.

spdrun wrote:
Correct. Unless the device is connected to iClown, there's no magic passcode retrieval option. This one wasn't, or was possibly disconnected at the FBI's behest, depending on whom you ask.

Why should Apple know how to disable the auto-erase feature? To prevent data theft, there SHOULDN'T be a means of disabling it.

If Apple made a legitimately secure device as long as iClown is turned off, more power to 'em!


Why shouldn't they? Are their engineers that stupid?

Submitted by spdrun on March 9, 2016 - 3:38pm.

No. Their engineers are that smart. They designed a locking system that's extremely difficult to defeat. Beautiful.

Apparently, files are encrypted with a series of randomly generated keys; which key a file gets depends on when access to it needs to be available (say, only while unlocked, or any time after first unlock).

The keys themselves are stored in NVRAM, encrypted using a "master key" that's a mathematical combination of the passcode (or fingerprint hash) and the device's UID. The UID for each device is not recorded by Apple, and is not directly readable via software.

The keys are only decrypted to volatile RAM. Turning off the device (or locking it) makes them go *poof.*

The hardware that generates the "master key" seems to enforce limits on attempts and time between attempts. The firmware of said hardware is not amenable to updating.

Short of disassembling ICs to read data recorded on them (high chance of permanent data loss), good bloody luck.

Decapping. Fun read:
http://arstechnica.com/security/2016/02/...
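
To make the scheme above concrete, here's a rough Python sketch. Emphatically not Apple's code: on a real device the tangling happens inside dedicated silicon using AES, and the PBKDF2 call, the XOR "wrapping," and every name below are illustrative stand-ins.

    import hashlib, secrets

    DEVICE_UID = secrets.token_bytes(32)  # fused into the chip; not readable by software

    def master_key(passcode: str) -> bytes:
        # Every guess has to run through a slow, UID-entangled derivation
        # on the device itself, so off-device brute force gets you nothing.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 200_000)

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    # A per-file key, kept in "NVRAM" only in wrapped (encrypted) form.
    file_key = secrets.token_bytes(32)
    check = hashlib.sha256(file_key).digest()    # lets us verify an unwrap attempt
    wrapped = xor(file_key, master_key("1234"))  # stand-in for real AES key wrapping

    failures = 0
    def try_unlock(guess: str):
        # A successfully unwrapped key lives only in RAM; power off and it's gone.
        global failures, wrapped
        candidate = xor(wrapped, master_key(guess))
        if hashlib.sha256(candidate).digest() == check:
            failures = 0
            return candidate
        failures += 1
        if failures >= 10:
            wrapped = b""  # the booby trap: wrapped keys erased, data unrecoverable
        return None

    assert try_unlock("0000") is None        # wrong guess: counted
    assert try_unlock("1234") is not None    # right guess: key released to RAM

Ten bad guesses and the wrapped keys are gone for good, which is exactly the booby trap being argued about above.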

Submitted by an on March 9, 2016 - 3:54pm.

I don't believe it's impossible. If it really were impossible to access the data, Apple would have said so and this whole thing would be over. They just don't want to do it.

Submitted by spdrun on March 9, 2016 - 4:02pm.

Not Apple's property any more, not their problem. If the gov't wants to do it, let them hire an outsider and try.

Submitted by an on March 9, 2016 - 5:29pm.

spdrun wrote:
Not Apple's property any more, not their problem. If the gov't wants to do it, let them hire an outsider and try.

That's what gun manufacturers say: they only make the gun, it's the crazies who shoot shit up, so it's not their problem.

Submitted by spdrun on March 9, 2016 - 7:00pm.

And I agree with their stance. But secure crypto isn't a weapon.

Submitted by an on March 9, 2016 - 10:30pm.

spdrun wrote:
And I agree with their stance. But secure crypto isn't a weapon.
It can be in a cyber war.

Submitted by ltsddd on March 31, 2016 - 10:28am.

LOL. Now Apple wants to know how the encryption was cracked.

Submitted by spdrun on March 31, 2016 - 2:56pm.

AN wrote:
It can be in a cyber war.

Other than the limited case of ransomware (easily prevented with air-gapped backups), it's more of a defensive tool (think bulletproof vest or gas mask) than a weapon in cyber warfare.

Submitted by an on March 31, 2016 - 6:09pm.

spdrun wrote:

AN wrote:
It can be in a cyber war.

Other than the limited case of ransomware (easily prevented with air-gapped backups), it's more of a defensive tool (think bulletproof vest or gas mask) than a weapon in cyber warfare.

More like a bulletproof body suit with a nuclear bomb underneath, set to trigger when a bullet hits it.