How Google Authenticator Made One Company's Network Breach … – Slashdot

The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
It sounds like this was a very well-executed attack as well.
No doubt all the employee had to miss was a slightly funky email address and/or a slightly funky enrollment website, and the hard part is done. And benefits enrollment emails and websites are often wacky, so I completely understand how a non-expert could fall for exactly this angle of attack.
On top of that, regular users are trained to just enter the codes whenever they see them these days. It’s pretty much the same problem we’ve always had
Deep insider info does make hacking easier + outsourcing can make it harder to find out if someone really works for them or not.
Not only that, but the naivete of a lot of companies' security teams who assume that an attacker won't have this kind of information.
There’s a huge trend towards black box security tests because “an attacker couldn’t possibly know X”, rather than a more sensible assume breach scenario.
Part of the problem is that a lot of companies don't take security seriously until it bites them. At a job I had once, someone from IT whom I had not met before requested my password in an e-mail. I declined. So they came around to my desk and demanded it, but I told them I wouldn't give it to them without written instructions from management. So, they went to my manager and I got my instructions, but I also had to deal with an irritated IT person and got reprimanded by my manager. This despite the fact that they were the ones who were violating security best practices and the very clearly written instructions in the published IT manual. Technically, I shouldn't have handed out my password without written instructions from an actual company officer, but what are you supposed to do when not violating company policy is treated as a form of stubbornness and insubordination? Bear in mind that, while this was not a strict IT job, it was IT-adjacent. Everyone involved should have had some technical chops. The manager wasn't really a tech type per se, but she had been managing technical people for quite a long time. Imagine how much worse it is in departments that aren't particularly technical.
And opening the door for someone with a ladder in their hands who can't get to their card is done without question at some places.
You having a stroke? Or perhaps a random AI conversation?
Yeah because pulling that kind of shit is a phenomenally good way to bypass access controls.
Unless you personally know that the person is still employed at the company, you are giving access to someone that you don’t know should have access. It’s standard pentesting stuff. Dress up in a way which makes you an invisible part of the infrastructure, or in something that makes people liable to help you. Used to be a clipboard, cup of tea and harried expression would get you in the door, then a high vis vest was
someone from IT who I had not met before requested my password in an e-mail.
Why would they be requesting your password in the first place, let alone through email? That gives them direct access to your account. As you said, that is in direct violation of standard IT security policies. I would love to hear the justification why someone from IT would randomly need a user’s password.
The story is that the problem with the Google solution is that one employee was able to provide a One Time Password token to the attackers, which then led to the compromise of other accounts. This was not deep insider info.
I.e., the story was not about hacking through social engineering, or through the lack of MFA, but that the solution they used from Google had flaws that made this worse. I.e., Google Authenticator had a synchronization feature that synced MFA codes to the cloud, a feature that others had warned was insecure.
The whole point of MFA and OTPs is to prevent compromise of one individual from leading to widespread compromise within the organization.

the story was not about hacking through social engineering
What do you mean it wasn’t through social engineering? From TFS:

Shortly afterward, the employee received a phone call from someone who claimed to be an IT team member and had familiarity with the “floor plan of the office, coworkers, and internal processes of our company.” During the call, the employee provided an “additional multi-factor code.”
If that’s not social engineering, I don’t know what is. Well, I do. But so is that.
I didn’t say social engineering wasn’t a part of the story, but this was just part of the preamble to the real issue. But the heart of the story, what the story is ABOUT and why it matters, was the point that the Google tool made this small breach into a huge breach because of a feature that security experts had warned about.

Google Authenticator had a synchronization feature that synced MFA codes to the cloud, a feature that others had warned was insecure.
That’s silly. Oh, I suppose it makes social engineering a bit more effective, because if you can get the target to give you the password and OTP for their Google account, you can get the OTPs for all of their other accounts. But if you’ve got their email, you’re a short step away from access to all of their other accounts anyway, since everything uses email for password reset.
On the other hand, the cloud sync feature provides the user with a simple, automated backup and restore, so whenever they set up a

since everything uses email for password reset.
Not when they’re secured with a tertiary Authenticator app. That’s the whole point of MFA. If they’re secured with an authenticator, it doesn’t matter if you’ve got the email address and password, because you can’t get the OTP. Those OTPs are supposed to be locked away on the user’s device.
Unless, of course, Google makes an incredibly stupid move and syncs those OTP secrets to the cloud, so that compromising a single account exposes the secrets for dozens of accounts that should each be exclusive.
The attackers caught a live one, a sucker willing to provide not just one, but two OTP codes to a malicious party.
Sure, they can be all pissed that the user’s google account had stored credentials, including OTP shared secrets, but ultimately the user allowed the attacker to authenticate as them, and there’s lots of general precedent in the industry to allow an authorized user to “extend” a second factor to another device if they can prove who they are.
Even if you don’t allow a shared secret to roam, most f

If you can get a single TOTP code sample with the current UTC time you can generate a shortlist of about 20 private keys.
This would be a gigantic citation needed. Unless you are speaking of some bizarre implementation, TOTP is an HMAC of a 128 bit or higher key. I’ve never seen a hint of the shared secret being at risk simply by knowing one or two of the codes.
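The skepticism is warranted. Per RFC 6238, a TOTP code is just a truncated HMAC of the shared secret over the current 30-second time step, so a single 6-digit code is consistent with roughly 2^160 / 10^6 ≈ 10^42 candidate 160-bit keys, nowhere near a "shortlist of 20". A minimal stdlib sketch of the computation (the 8-digit secret/time pair below is the RFC's published test vector):

```python
# Minimal RFC 6238 TOTP computation (illustration only). The code is a
# truncated HMAC-SHA1 of the secret over the time-step counter; knowing
# one or two codes does not let you recover the key.
import base64, hashlib, hmac, struct, time

def totp(secret_b32, for_time=None, step=30, digits=6):
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if for_time is None else for_time) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 Appendix B test vector: SHA1 key "12345678901234567890", T=59
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59, digits=8))  # → "94287082"
```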
I remember that many people here (me included) thought that quietly updating the Google Authenticator app to sync to the cloud was a very, very bad idea. It did not take long to backfire badly.
To re-iterate, authenticator apps should only keep secrets locally unless you really understand what you are doing, and this feature (if present) should always be default-off. Obviously, you can still get an exceptionally stupid employee to send you a picture of the clone codes (for moving the authentication codes to a different device), but that is highly suspicious and requires some work. With the cloud sync Google uses, stealing the secrets is apparently very easy. I cannot really tell how easy, since I refused that update. With Authy, for example, if you choose online backup there is at least a passphrase protection and you are warned to not ever share that passphrase.
To re-iterate, authenticator apps should only keep secrets locally
If you need that, then use a security key. Phones are not security keys. No matter how locally you store them: malware on the phone can exploit them, and maybe even leak them.
It's a bad idea to keep them only locally on a phone, because phones have a relatively short lifetime and need to be replaced, or you need a backup phone, which is the purpose of the sync feature. It's not really acceptable to have to manually go and reset 50 credentials.
Maybe true, but no one wants to carry around a dozen authentication dongles
no one wants to carry around a dozen authentication dongles
I said A security key: specifically, one personal FIDO2 key to carry around for all authentication purposes, and perhaps two backup keys to lock up where you won't lose them. Your work and each service ought to let you register as many keys as you need, but not 12.

There are limitations with the YubiKey in terms of supported accounts. It can store up to 25 FIDO2 credentials for password-free logins, two OTP credentials, 32 OATH credentials for one-time passwords (when paired with the Yubico Authenticator), and an unlimited number of U2F credentials.
For me, I put my important stuff on the key, the less important stuff in Google Authenticator. Combined with Microsoft stuff in its authenticator, Duo, Ping, and I don't remember all of them, but the point is there are A LOT of authenticator apps I need to use.
This is patently false: you can have more than one key on an IAM user. Our root account has 5 YubiKeys on it.
From the IAM page: “Use MFA to increase the security of your AWS environment. Signing in with MFA requires an authentication code from an MFA device. Each user can have a maximum of 8 MFA devices assigned. “

Maybe true, but no one wants to carry around a dozen authentication dongles
FIDO security keys let you enroll many authentication credentials onto one device. I think mine has a dozen. So you only need one security key per computer, if you want to just leave it plugged into the port, or maybe one on a keychain and a backup in a drawer, each with all of your FIDO-capable accounts enrolled.
You can transfer your codes from Google Authenticator by QR code. That’s how I always move them from an old phone to a new one.
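For context, the enrollment and transfer QR codes are just encodings of a plain-text "Key Uri Format" otpauth:// URI (Google Authenticator's bulk export uses a related otpauth-migration:// scheme). A minimal sketch of constructing one, with made-up account details, which you could render as a QR code to re-enroll a secret on a new device without any cloud sync:

```python
# Sketch: build a Key Uri Format otpauth:// URI for a TOTP credential.
# The account name, issuer, and secret below are made-up examples.
from urllib.parse import quote, urlencode

def otpauth_uri(secret_b32, account, issuer, digits=6, period=30):
    label = quote(f"{issuer}:{account}")  # e.g. "ExampleCorp:alice@example.com"
    params = urlencode({"secret": secret_b32, "issuer": issuer,
                        "digits": digits, "period": period})
    return f"otpauth://totp/{label}?{params}"

uri = otpauth_uri("JBSWY3DPEHPK3PXP", "alice@example.com", "ExampleCorp")
print(uri)
```

Feed the resulting string into any QR generator and the new phone's authenticator app can scan it directly.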
Bullshit. This is a risk-management question. I have a tablet with no network connections for some high-security TOTP codes. (Just put it into airplane mode, and, by law, the RF part must be off.) I have the regular ones on my phone. I do not install random crap on that phone. Even if somebody does, that phone still needs to be linked to systems and accounts. Sure, if somebody is abysmally stupid and downloads malware on their phone and then logs into things from that phone using the authenticator app on th

I have a tablet with no network connections for some high-security TOTP codes. (Just put it into airplane mode, and, by law, the RF part must be off.)
I’d disable bluetooth too.
I do the exact same thing with an old iPod Touch. The reason I have it in airplane mode with BT disabled is both security and the fact that I've had PW managers sync corrupt data before, rendering all 2FA keys useless. With the offline device, it wasn't too hard to recover, but without it, it would have been impossible in some cases, like some NAS machines.
“Just put it into airplane mode, and, by law, the RF part must be off”
BWAHAHAHAHA nope, on modern phones you can enable airplane mode and then turn the wifi back on to get access to the in-plane wifi network.
At the time this tablet was manufactured, it was the law.
Exactly. And there’s nothing on that feature to warn you of what it actually means, even if you wanted to make a decision about balancing convenience. It just offers it as a “backup”.
Bullshit. This is a risk-management question.
Nonsense.
I have a tablet with no network connections for some high-security TOTP codes.
No. We were having a serious discussion here. Why does some fruit loop always gotta barge in with frivolous claims based on their one-of-a-kind system. Let me guess… you opened up the tablet and removed the antenna leads. If not, then it’s a lot less secure than you think it is, and Airplane mode is no assurance that the OS is untampered with and couldn’t be hacke

Bullshit. This is a risk-management question.
Nonsense.
Ah, no? All IT security is applied risk management. If you do not understand that, you have no place in this discussion. The rest of your statement is just as bereft of insight, so I will not even bother to answer it.

If you need that, then use a security key. Phones are not security keys. No matter how locally you store them: malware on the phone can exploit them, and maybe even leak them.
Chasing perfect security is a fool's errand. Phones are an order of magnitude better than single-factor authentication, and an order of magnitude more convenient (read: more likely to be used rather than deactivated) than carrying around one or more special-purpose devices.
The news story here isn’t that an employee’s phone was hacked, it’s that an employee handed over their credentials. You can throw as many authenticators at them as you want, you won’t solve the issue that way.
Phones are an order of magnitude better than single factor authentication
False, they’re not an order of magnitude better. The reason that is false, is because App authenticators are still Single-factor. Just like
“security questions” are still Single-factor.
What makes you think there is a second factor, when you launch an e-mail app on your phone, Type your password into the App, then approve the login through an authenticator on the phone?
A malware bug that gains root on your phone can exfiltrate the Sha
Given they pretty much were able to get the employee to hand over as much authentication material as they liked, there’s not a whole lot of blame left for the sync feature on OTP…
I’m reasonably confident that even if they couldn’t get the OTP secret, they could have accessed an “enroll a new code” page and gotten that exact same employee to just feed them however much authentication material needed to enroll a new authenticator. It doesn’t matter if it was synced shared secret, new shared secret, or even
The goal of security engineering is not to make it impossible to do attacks. Here it is to make insecure behavior cumbersome and high-effort compared to secure behavior. That way fewer of these attacks succeed and the attacker has to invest more and has a higher risk to get caught. That “sync” feature makes it very easy to do this attack and hence it is a brazen violation of sound security engineering principles.
So, yes, there is a lot of blame on Google here. Sure, there is also a lot of blame on that utte
I agree, esp. your point that it works to make empty promises.
(And I wonder if, in a very general way, there’s a theme around how the internet may usher in an age of transparency and integrity, if ever humanity is to survive! but I digress.)
It’s also a link to the notion that corporations are machines to make money, and as machines, workers are just cogs, and nobody is responsible for anything.
So I don’t know, security can only go so far, because it takes human effort, and that effort will most of the time
I think the point is that the way the "sync" is implemented in Google Authenticator intentionally obscures what is happening in order to make it seem more secure and to make it "scary" to disable it. The "dark pattern" mentioned in the article refers to a user interface design where the company (Google) uses warning messages and dialog structures to mislead the user into doing what the company prefers rather than assisting the user in making an informed decision. It seems like there is a training issue at Retool
Authenticator apps need to be built around the concept that the shared secret needs to be as protected, if not more than passwords. This means:
The shared secrets need to be encrypted. Every field.
If stored/synced, it needs a sync key that is separate from the login info, and is manually copied to endpoints. 1Password’s secret key, and KeePass’s keyfile are good examples of this. This ensures that the secrets cannot be easily decoded from a backend cloud server.
It needs an ability to be exported, so one can back up the keys unencrypted, so one can go to another PW manager. Backups are important because if a sync error causes corruption, it will be a show stopper, and many 2FA keys can’t easily be regenerated if lost.
Ideally, the user should have an option for a PIN and/or fingerprint and/or face identification, depending on what the device offers, as well as the pass phrase.
Another nice option would be either piggybacking off an existing cloud provider, or having its own cloud provider, and with a secondary key, this will ensure that the backend can’t be brute-forced.
Finally, the app needs both hammering protection, and a duress code, so it would erase itself and require re-syncing with the secondary key on an endpoint, just in case the phone was stolen while unlocked and an attacker is trying to guess their way in.
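The separate-sync-key requirement in that list can be sketched roughly as follows. This is an illustration only, not production crypto (the XOR-with-SHAKE keystream stands in for a vetted AEAD like AES-GCM, and all names and values are made up): the point is that the key derived from the user's passphrase never leaves the device, so the cloud copy alone is useless to an attacker.

```python
# Toy sketch of the "separate sync key" idea (like 1Password's secret
# key): the vault is encrypted client-side with a key derived from a
# passphrase + salt that is never sent to the server. NOT production
# crypto -- a real app would use a vetted AEAD such as AES-GCM.
import hashlib, os

def derive_sync_key(passphrase, salt):
    # Slow KDF so the passphrase can't be cheaply brute-forced server-side.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 600_000)

def xor_keystream(key, nonce, data):
    # Symmetric: applying it twice with the same key/nonce decrypts.
    stream = hashlib.shake_256(key + nonce).digest(len(data))
    return bytes(a ^ b for a, b in zip(data, stream))

salt, nonce = os.urandom(16), os.urandom(16)
key = derive_sync_key(b"correct horse battery staple", salt)
vault = b'{"totp_secret": "JBSWY3DPEHPK3PXP"}'
ciphertext = xor_keystream(key, nonce, vault)
# Only salt, nonce, and ciphertext go to the cloud; the key stays local.
assert xor_keystream(key, nonce, ciphertext) == vault
```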
Well, I’m no security expert, but I kinda get the non sequitur of “something unique you have (only you)” with “…aaaand which the cloud has.”
Headline should read: How One Extremely Stupid Employee Made Company’s Data Breach Much Much Worse
What?
Bahahahahhahahah
Agreed. This is very much like a “Man uses drill to drill through board freehand, accidentally drills through self, sues drill maker for having drills that can hurt people when not used carefully”. Unfortunately there are a lot of lawsuits like this…
Feature worked as intended – someone with appropriate account credentials was able to access multiple accounts that were associated with the convenience feature.
Are there risks to doing this? Yes, if someone other than the intended person gets access. Plan
No, not agreed. If this is the feature working as intended then it is a BAD FEATURE.
That's the point. Secure systems need to be built on the idea that (a) most people don't understand security at all, (b) malicious people out there trying to break your security exist and (c) everyone fucks up sooner or later.
Anything that violates one of those three assumptions is a bad feature. People know and they warned about this.
Google IIUC changed it from “you have to physically give your phone away permanently to a s
I've looked through the roster of people working at Retool… this is expected.
I’m surprised it took this long for them to get hacked.
But it sounds like “User A”‘s OTP allowed the attacker to compromise other accounts in the organisation?

a sync feature Google added to its authenticator in April magnified the severity of the breach because it allowed the attackers to compromise not just the employee’s account but a host of other company accounts as well
How is that a useful feature?
My brother-in-law had the opposite problem, and really, really wished that cloud sync had been enabled.
He’s a freelance sysadmin, so has dozens, maybe more than a hundred, accounts on a whole bunch of different systems. He uses randomly-generated passwords, stored in a cloud-synced password manager, and Google Authenticator for TOTP-based MFA on nearly all of them, but had cloud sync disabled.
A few weeks ago, he switched phones, and didn’t sync his Google Authenticator config. Oops. He also didn’t have
I nearly fell off this cliff with a PW manager that synced… but it synced corrupt data, so my smartphone and tablet were unusable. All my MFA keys were useless, and I would have had to do that recovery process, one by one, perhaps in some cases it wouldn’t have been possible because some accounts didn’t have the option to save recovery codes.
What saved me? An offline iPod Touch which stays in airplane mode. After I powered that on and redid 2FA with everything, I now use a PW manager that I can dump th
Bitwarden lets you dump your saved data into a JSON-formatted file. I do that every few months – dump the data into a file stored on an encrypted disk image I’ve saved locally.
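That routine can be sketched as a few lines that refuse to archive a dump that doesn't parse, so a corrupt export never silently replaces a good backup (all paths and names here are made-up examples):

```python
# Sketch: validate an exported vault dump as JSON before archiving it
# under a dated name. Paths are hypothetical examples.
import datetime, json, pathlib, shutil

def archive_export(export_path, backup_dir):
    # json.loads raises on a truncated or corrupt export, aborting the copy.
    data = json.loads(pathlib.Path(export_path).read_text())
    assert isinstance(data, dict) and data, "unexpected export structure"
    stamp = datetime.date.today().isoformat()
    dest = pathlib.Path(backup_dir) / f"vault-{stamp}.json"
    shutil.copy2(export_path, dest)
    return dest
```

Run against the JSON file Bitwarden produces, ideally writing into an encrypted volume as described above.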

He understands the importance of backups! But, he didn’t have one.
If he didn’t have a backup, it’s arguable whether or not he actually understands the importance of backups.

He understands the importance of backups! But, he didn’t have one.

If he didn’t have a backup, it’s arguable whether or not he actually understands the importance of backups.
Nonsense. He just hadn’t considered the need to back up this particular item. I’m sure even an infallible expert like you has one or two things you haven’t backed up.
That's the thing with security, everyone fucks up. A lot of the comments on this article are people pointing and laughing or being angry at people fucking up, but it happens and it happens to everyone sooner or later. Incidentally, this is why so many here think C is still the best thing since sliced bread: because they cannot imagine fucking up, but the rest of us know what will happen sooner or later.
That chap had of course fucked up, but because he had good security that meant a tedious recovery process, wh
Phones are computers, computers get hacked, so authentication based upon a phone is fundamentally flawed. Yes, it's convenient, but so is leaving your key under the mat. If you're using a phone as an authentication credential for something of value, it's dumb. Yes, computers can have secure elements, but ones like phones are always connected to the Internet, so it's just a dumb way of doing things.
While the article recommends FIDO tokens, a smartcard-based system also provides strong protection for a sig

… poorly written disclosure.
Google authenticator worked as intended, don’t blame it. So “give me your password” phishing has escalated to “give me your OTP code”: There will always be some idiot who obeys an invisible and unverified authority figure (See “Compliance”, 2007).
The problem isn’t Google Authenticator or even Google making a copy of that database, it’s Google encouraging multiple devices to log into the one account: If the wrong device gains access, they have the keys to the kingdom: Authority over all other devices. Convenience (and profiling), contrary to Google’s/Microsoft’s claims, doesn’t increase security.
This stupidity continues most recently with new proof-of-identity laws: Criminals are paradoxically targeting anti-crime databases so Amazon and PayPal are demanding they hold more personal details to prove you are not a criminal. No, just no.
Google authenticator worked as intended [ … ]
“NOTABUG: Working as designed.”
Yeah, we know, Sparky… The design is fucking idiotic!
It seems clear that one of the OTP codes got them into the rube’s account — the second OTP code allowed them to copy out his Google Authenticator database. If that copy hadn’t existed — and indeed did not exist until Google decided to make copies for itself — then they would have had to keep pumping him for OTP codes, and the damage would likely have been more limit

… a STUPID FUCKING IDEA!
Anyone with access to that data file, or access to your phone, also gets access to your OTP secrets: That’s the point of failure, which happens because most authenticator apps aren’t password protected.

… Google bears partial responsibility
Since the criminal actually phoned the dope who assisted with further breaches, no level of password/OTP security would have prevented this cyber-crack.

… a sync feature Google added
It sounds like Google put all OTP secrets into the one database, because remember, it’s multiple people using the one account, thus making all services sha
Anyone with access to that data file, or access to your phone, also gets access to your OTP secrets: That’s the point of failure, which happens because most authenticator apps aren’t password protected.
Most phones are encrypted and protected with a password. In order to get all of the OTP secrets, I have to find who you are, physically go to where you are, steal your phone, crack it and then get your OTP codes.
Or
Use the back door that allows me to spear phish the secret from someone already proven vulnerabl
It’s like people can’t think one step ahead. Companies send warning messages to indicate there’s been suspicious activity on your account, so scammers start sending warning messages to say there’s been suspicious activity on your account and you must act urgently.
Is it really so hard to foresee, is it really so hard to use a bit of empathy and imagination to think a step ahead?
I’m not quite sure what the key principle is here, but surely there’s something to be said for segregation and isolation. Interes