Apple's Message To Its Customers Regarding an FBI Proposal


KyleKeeling


I don't see why Apple should have to incur the cost, time and security implications of having to discover and exploit a vulnerability in their product so that law enforcement can access it. If law enforcement want to subvert Apple phones, they can do so the same way they always have- via side-channel attacks. Want to decrypt the phone of a dead man but don't have the capability? Tough sh*t.

 

The question arises whether it's justifiable to force a service provider with no actual responsibility over the content and data stored on their product to backdoor it- because that's what this is, no matter how you dress it up- at the behest of law enforcement. And the simple answer is that it's not. And don't think the precedent will somehow be isolated to the US; once it happens, every despotic or authoritarian regime that lets Apple operate on its territory will demand the same access.

 

Let's make this clear- Apple is, at absolute worst, holding the data in question, and even that's arguable. The reason the FBI are resorting to invoking a 200-year-old law to issue this demand is that Apple has no legal responsibility towards or control over the actual content of the phone.


I don't see why Apple should have to incur the cost, time and security implications of having to discover and exploit a vulnerability in their product so that law enforcement can access it. If law enforcement want to subvert Apple phones, they can do so the same way they always have- via side-channel attacks. Want to decrypt the phone of a dead man but don't have the capability? Tough sh*t.

 

What it sounds like is that Apple is backtracking on their ability to provide a fail-safe in the event that law enforcement needs access. Looking back at the chart in that article, some of the most recent models with the most recent software provide "Law Enforcement/Legal Process Options". So who screwed up here? It was Apple, because they failed to consider the possibility that a suspect/customer may be, as you mentioned, dead.

 

And I'd rather see Apple fork over the money and time needed to do so; why must this be done at the expense of taxpayers?

 

The question arises whether it's justifiable to force a service provider with no actual responsibility over the content and data stored on their product to backdoor it- because that's what this is, no matter how you dress it up- at the behest of law enforcement. And the simple answer is that it's not.

 

It's not? Exigent circumstances are a necessary part of the 'right to privacy'. The issue is whether it would be objectively reasonable to consider that you have a right to privacy for the data on your phone, and if so, whether it's a right that the public is willing to recognize. I'd argue that it is not; anyone dumb enough to think that their information is safe on any device from any entity is incredibly naive, and I think that's a reality we're all willing to accept, even when we refuse to use those particular devices. Also, I'm not exactly sure what's absolving them from responsibility. Is there even any precedent for this?

 

This can get even more complicated, though. We could also discuss a cost-benefit analysis. Consider for a moment the number of iPhones prior to the A6 chip and iOS 8 or 9, which are, in theory, all exploitable in some way, shape or form. Then consider the percentage of phones that are actually running these chips with a recently upgraded OS. Do the handful of iPhone users with these particular hardware specs and software outweigh the legal implications of failing to comply? I'd be hard pressed to believe they do, for such a small segment of the market.

 

And the simple answer is that it's not. And don't think the precedent will somehow be isolated to the US; once it happens, every despotic or authoritarian regime that lets Apple operate on its territory will demand the same access.

 

This is the one point that gives me hesitation, because, as any business would, Apple acts in its own self-interest, and if I were Tim Cook, I probably would have done the same thing; but that's the power of perspective. It means that, quite apart from the privacy implications for everyday citizens, Apple would be subjected to the laws and procedures of other nations, which would compromise trade secrets, designs, security measures, and a myriad of other things if it were to continue doing business in those jurisdictions.

 

I still have mixed feelings about this whole thing, but I still believe Apple should be more compliant with law enforcement, and rather than tattle-tale on Big Brother to the public, they should have discussed these matters behind closed doors and considered some viable alternatives. Does it need to be a back-door? I seriously doubt that, mostly for the reasons I initially stated above: How did Apple not see this coming?


What it sounds like is that Apple is backtracking on their ability to provide a fail-safe in the event that law enforcement needs access.

 

Looking back at the chart in that article, some of the most recent models with the most recent software provide "Law Enforcement/Legal Process Options". So who screwed up here? It was Apple, because they failed to consider the possibility that a suspect/customer may be, as you mentioned, dead.

It's not a case of ability- I have no doubt that Apple have the physical capability to subvert the security of their product or could develop that capability in short order. It's a case of whether they have any legal responsibility to. As it currently stands, despite your suggestion otherwise, they don't. The issue at hand is whether the FBI can use the legal process to force compliance despite there being no law on the statute books explicitly or implicitly suggesting Apple has to comply, and the legal ramifications of that.

 

This isn't a "screw-up" from Apple; it's an intentional business decision. Those "legal process options" are what's being wrangled over at the moment; the "law enforcement" option is likely a trojanised app or other kind of implant. A great solution, but not really workable if the target isn't breathing.

 

And I'd rather see Apple fork over the money and time needed to do so; why must this be done at the expense of taxpayers?

Because there's no legal basis for compelling Apple to do it? Because if law enforcement want a particular capability, they can bloody well develop it, buy it from a third party, or do without it? We are, in essence, seeing LE try to demand a vendor does their job for them.

 

It's not? Exigent circumstances are a necessary part of the 'right to privacy'.

This isn't exigent circumstances, though- or, more accurately, it might be, but whether or not it is is basically irrelevant. We're not talking about serving a no-knock warrant; we're talking about the legal and moral implications of attempting to compel a company to backdoor their own product. Hence why the whole thing is being conducted under the 1789 All Writs Act, which the FBI are interpreting as allowing them to demand whatever they want from third parties to assist in investigations. It's not a new tactic; they've used it at least twice before.

 

Also, I'm not exactly sure what's absolving them from responsibility. Is there even any precedent for this?

Apple has no legal responsibility over the content of encrypted data held either in iCloud or on the device. They merely act as a supplier for the hardware used to generate that data, and as a service provider for the holding of it. A legal precedent exists which compels service providers to release cryptographic keys used for encrypting data in instances where they own the private "master" key or keys, but Apple does not have any capability to directly decrypt the data and, as a service provider, has no legal requirement to assist a third party in doing so.

 

Apple is already subject to the laws and regulations of places in which it operates.


 

So it will wipe the data if removed or just leave it encrypted?

No, it'll just sit there on a hardware-encrypted filesystem with lots of individual encrypted files.

People are talking about this key as if you are doomed without it.

You are doomed without it. All the encryption on the device- the hardware encryption used to secure the filesystem and decrypt on boot, and the encryption of the individual files on that system, is done with keys derived from the original AES key. Attacking it as a solution is pretty much exactly the same as attacking hardware encryption on, say, an SSD- you either need a side-channel attack, to obtain the key from memory via something like a cold-boot attack, or a firmware rootkit. And the latter normally needs the drive to be decrypted in order to be installed in the first place (well, unless it's placed there before the drive is encrypted such as via the supply chain).
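
To picture that key hierarchy in miniature, here's a rough Python sketch of the idea using the cryptography library. It's purely illustrative- the names (derive_subkey, encrypt_file) and the exact construction are mine, not Apple's actual design- but it shows why everything on the device ultimately hinges on one root key:

```python
# A minimal sketch of the key hierarchy described above, NOT Apple's actual design.
# One root key stands in for the device's hardware AES key; everything else is
# derived from it or wrapped under it.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

root_key = os.urandom(32)  # stand-in for the device's burned-in AES key

def derive_subkey(purpose: bytes) -> bytes:
    # Purpose-specific keys derived from the root key (labels are made up).
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=purpose).derive(root_key)

filesystem_key = derive_subkey(b"filesystem")    # whole-volume encryption
key_wrapping_key = derive_subkey(b"file-keys")   # wraps per-file keys

def encrypt_file(plaintext: bytes) -> dict:
    # Each file gets its own random key; that key is then wrapped under the
    # key-wrapping key, so everything chains back to the root key.
    file_key = os.urandom(32)
    n_data, n_wrap = os.urandom(12), os.urandom(12)
    return {
        "ct": AESGCM(file_key).encrypt(n_data, plaintext, None),
        "n_data": n_data,
        "wrapped_key": AESGCM(key_wrapping_key).encrypt(n_wrap, file_key, None),
        "n_wrap": n_wrap,
    }

def decrypt_file(blob: dict) -> bytes:
    # Without the root key you can't unwrap the file key, and without the
    # file key the ciphertext is indistinguishable from random bytes.
    file_key = AESGCM(key_wrapping_key).decrypt(blob["n_wrap"], blob["wrapped_key"], None)
    return AESGCM(file_key).decrypt(blob["n_data"], blob["ct"], None)

blob = encrypt_file(b"messages, photos, contacts...")
assert decrypt_file(blob) == b"messages, photos, contacts..."
```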

AES functions the same way regardless of implementation, so Apple's fancy key derivation gets turned into a 256-bit key in the end.

The algorithm functions the same regardless of implementation, but the actual implementation of the cryptography is currently the only feasible attack vector against AES of any key size, and varies dramatically. Security vendors have successfully decrypted files encrypted by ransomware using 256-bit AES keys because of the sh*tty implementation of the cryptography (like here), but without a known side-channel you're basically stuffed. Several theoretical attacks exist which are hypothetically faster than brute-forcing, but they're all still computationally infeasible. Hence, all the attacks made against AES in the past have been based on implementation.
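
To make the "it's always the implementation" point concrete, here's a toy Python example in the spirit of those busted ransomware families- entirely hypothetical code, not taken from any real sample. The cipher is genuine AES-256 (via the cryptography library), but the key comes from a time-seeded, non-cryptographic PRNG, so the attacker never touches AES at all and just walks the seed space:

```python
# Hypothetical "ransomware" that uses real AES-256 but generates its key from a
# time-seeded, non-cryptographic PRNG. Illustrative only.
import random
import time
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def weak_key(seed: int) -> bytes:
    rng = random.Random(seed)                   # Mersenne Twister, not a CSPRNG
    return bytes(rng.getrandbits(8) for _ in range(32))

NONCE = bytes(12)                               # fixed nonce: also a bad idea

# The "malware" encrypts at some unknown second during the past day.
infection_time = int(time.time()) - random.randrange(86_400)
ciphertext = AESGCM(weak_key(infection_time)).encrypt(NONCE, b"victim's files", None)

# The "security vendor" recovers the plaintext without ever attacking AES,
# simply by enumerating the tiny space of possible seeds.
now = int(time.time())
for seed in range(now, now - 90_000, -1):       # roughly a day's worth of seeds
    try:
        plaintext = AESGCM(weak_key(seed)).decrypt(NONCE, ciphertext, None)
        print("recovered with seed", seed, ":", plaintext)
        break
    except Exception:                           # InvalidTag on a wrong key
        continue
```

The "256-bit" label tells you nothing if the key generation is garbage, and that's exactly the kind of mistake those decryption tools exploit.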

Why can't they get access to the raw AES data and brute force the actual encryption key?

Because brute-forcing AES is currently impossible. Assuming access to the Fujitsu K (10.5 petaFLOPS) and assuming a very optimistic 1,000 FLOPs per combination check, it would take slightly more than one billion billion (i.e. one quintillion) years to brute-force a 128-bit AES key. Even with the combined power of the ten most powerful supercomputers on Earth, you've only managed to reduce that to one hundred quadrillion years.
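
If anyone wants to check that arithmetic, here's the back-of-the-envelope version. The 10.5 PFLOPS and 1,000-FLOPs-per-guess figures are just the assumptions from this post, not benchmarks:

```python
# Back-of-the-envelope check of the figures above.
K_COMPUTER_FLOPS = 10.5e15      # Fujitsu K, assumed
FLOPS_PER_GUESS = 1_000         # very optimistic cost per key tested
SECONDS_PER_YEAR = 3.156e7

def years_to_exhaust(key_bits: int, flops: float = K_COMPUTER_FLOPS) -> float:
    guesses_per_second = flops / FLOPS_PER_GUESS
    return (2 ** key_bits) / guesses_per_second / SECONDS_PER_YEAR

print(f"AES-128 on the K computer : {years_to_exhaust(128):.1e} years")                        # ~1e18, a quintillion
print(f"AES-128 on 10x that power : {years_to_exhaust(128, 10 * K_COMPUTER_FLOPS):.1e} years")  # ~1e17
print(f"AES-256 on the K computer : {years_to_exhaust(256):.1e} years")
```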

brute forcing raw keys might actually be faster.

It would be, but when it takes all the computing power of the entire earth several orders of magnitude longer than the universe has existed to brute-force a single AES-128 key, it's kind of a moot point.

Oh true I guess it's a little different trying to brute force the raw key.

 

I wonder how Apple could get around the 80ms delay their key derivation takes? Because I don't see any brute-force tool that takes 80ms per attempt working well either.

 

There's a really good article about the "crypto wars" and CALEA that I think is related to this. Plus a Bloomberg article detailing a memo discussing ways for the FBI to gain more authority in the face of more encryption "going dark".


My understanding is that the 80ms delay may actually be a hardware protection. The 5-second one is software, but it's not uncommon for hardware encryption modules to employ a hardware timer to defeat brute-forcing.
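
For a sense of what an 80ms-per-derivation floor alone buys, here's the rough arithmetic for an attacker who could somehow submit guesses back to back with no other protections in play- no escalating delays, no wipe- which is the attacker's best case:

```python
# Rough arithmetic for a guess-every-80ms attacker with no other limits in play.
# The 80ms figure is the one quoted above; everything else is just multiplication.
DELAY_SECONDS = 0.080

def worst_case_hours(keyspace: int, delay: float = DELAY_SECONDS) -> float:
    return keyspace * delay / 3600

print(f"4-digit PIN          : {worst_case_hours(10**4):10.1f} hours")   # ~0.2 h
print(f"6-digit PIN          : {worst_case_hours(10**6):10.1f} hours")   # ~22 h
print(f"6-char [a-z0-9] code : {worst_case_hours(36**6):10.1f} hours")   # ~48,000 h, roughly 5.5 years
```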


wait a tick.

I have a question. sorry if it's already been addressed somewhere.

 

why can't Apple like... take the phone, and privately have their engineers or whatever unlock it, and then just like... give it back to the feds?

does that make sense?

 

when you get locked out of your house, the locksmith guy just gets you back into your house. he doesn't create a special Skeleton Key that you could use to get into ANY house; because then he has to trust you, that you'll only use it on your own door. which is of course ridiculous. does the same principle not apply here? are the feds simply unwilling to physically relinquish the device because it's official evidence?


Well that just isn't what the FBI really wants. In their official request they wanted Apple to create new software. Apple has in the past done pretty much what you said so I don't think they're the stubborn party in this equation.


so it's just about their demand.

this is their demand.

 

no wonder it's so goddamn shady.

if the investigators were truly interested in the data alone, they wouldn't care how they got it. right? they're just trying to prevent terror, blah blah blah. they should just be happy when the door is open for them.

 

they shouldn't need their own special skeleton key.


This takes the cake...

 

http://www.theguardian.com/technology/2016/mar/04/san-bernardino-da-baffles-security-community-lying-dormant-cyber-pathogen-iphone

 

Not only is the idea that a virus sitting on this locked phone could do any damage to anything, with no way to stop it, an insanely idiotic thought, but he then went and called it a "cyber pathogen".


The entire information security industry is laughing at him. What an utter buffoon.


President Obama weighs in:

 

All of us value our privacy, and this is a society that is built on a Constitution and a Bill Of Rights and a healthy skepticism about overreaching government power. Before smartphones were invented and to this day, if there is probable cause to think that you have abducted a child, or that you are engaging in a terrorist plot, or you are guilty of some serious crime, law enforcement can appear at your doorstep and say we have a warrant to search your home and can go into your bedroom and into your bedroom drawers to rifle through your underwear to see if there’s any evidence of wrongdoing.

 

And we agree on that because we recognize that just like all of our other rights, freedom of speech, freedom of religion, etc, that there are going to be some constraints imposed to ensure we are safe, secure and living in a civilized society.

Technology is evolving so rapidly that new questions are being asked, and I am of the view that there are very real reasons why we want to make sure the government can not just willy-nilly get into everyone’s iPhones or smartphones that are full of very personal information or very personal data.

What makes it even more complicated is that we also want really strong encryption because part of us preventing terrorism or preventing people from disrupting the financial system or our air traffic control system or a whole other set of systems that are increasingly digitized, is that hackers, state or non-state, can’t get in there and mess around.

So we have two values, both of which are important.


And the question we now have to ask is if technologically it is possible to make an impenetrable device or system where the encryption is so strong that there is no key there, there’s no door at all? And how do we apprehend the child pornographer? How do we solve or disrupt a terrorist plot? What mechanisms do we have available that even do simple things like tax enforcement? Because if you can’t crack that at all, and government can’t get in, then everybody’s walking around with a Swiss bank account in their pocket. So there has to be some concession to the need to be able to get to that information somehow.

I am way on the civil liberties side of this thing…I anguish a lot over the decisions we make in terms of how we keep this country safe, and I am not interested in overdrawing the values that have made us an exceptional and great nation simply for expediency. But the dangers are real. Maintaining law and order in a civilized society is important. Protecting our kids is important. And so I would just caution against an absolutist perspective on this.

 

Because we make compromises all the time. You know, I haven’t flown commercial in a while. But my understanding is that it’s not great fun going through security. But we make the concession. It’s a big intrusion on our privacy, but we recognize it as important. We have stops for drunk drivers. It’s an intrusion but we think it’s the right thing to do.

 

And this notion that somehow our data is different and can be walled off from those other trade-offs we make, I believe is incorrect. We do have to make sure, given the power of the Internet and how much our lives are digitized, that it is narrow, and is constrained, and that there’s oversight. I’m confident that this is something that we can solve.

 

But we’re going to need the tech community, the software designers, the people who care deeply about this stuff to help us solve it. Because what will happen is if everyone goes to their respective corners and the tech community says ‘Either we have strong, perfect encryption or else it’s Big Brother and an Orwellian world,’ what you’ll find is that after something really bad happens, the politics of this will swing, and they will become sloppy, and rushed, and it will go through Congress in ways that have not been thought through. And then you really will have dangers to our civil liberties because the people who understand this best, who care most about privacy and civil liberties, will have disengaged or taken a position that is not sustainable for the general public as a whole over time.
  • 3 weeks later...

"outside party"

 

If I had to take a wild guess, I'd say they summoned some DARPA alumni, many of whom work in and out of some of Silicon Valley's most reputable companies. It's literally a fraternity of some of the most brilliant engineers and software developers the country has mustered in the name of national defense.

 

In the end, it looks like Apple won the moral victory.


It was probably one of the numerous exploit brokers who buy exploit code from independent researchers and sell to government and law enforcement for CNE (Computer Network Exploitation) purposes. That's where most of the talent in development of exploits and discovery of vulnerabilities lives.

 

Most conventional tech companies are bound by the rules of Responsible Disclosure and never sell exploits- in fact they commonly release details of them publicly once the vendor has patched the vulnerability.


Get a load of this. Now Apple's lawyers want to know how the FBI did it. This is hilarious.

 

 

"To get information about how the FBI cracked the iPhone, Apple might allege that the third party who did the work violated an Apple patent or stole a trade secret from Apple," Wright said. "A substantiated allegation like that might justify a lawsuit by Apple against the third party. Such a lawsuit might justify legal discovery of the technique used by the third party."

 

http://www.techrepublic.com/article/apple-demands-to-know-how-fbi-cracked-san-bernardino-iphone/#ftag=YHF87e0214


Of course they want to know how. It's just been very publicly outed that there's an unpatched vulnerability in, at the very least, all iPhone 5Cs.


It was probably one of the numerous exploit brokers who buy exploit code from independent researchers and sell to government and law enforcement for CNE (Computer Network Exploitation) purposes. That's where most of the talent in development of exploits and discovery of vulnerabilities lives.

Most conventional tech companies are bound by the rules of Responsible Disclosure and never sell exploits- in fact they commonly release details of them publicly once the vendor has patched the vulnerability.

I heard it was an Israeli firm; I bet they just bought it off some hackers. Apparently there's big money in selling exploits over there and in Eastern Europe and such. I think that's where the Stuxnet exploits are suspected to have come from too.


That was Cellebrite; they were going to do a hardware-replication-based attack on it. Based on the reports, I don't think it was them.

