Millions of Subarus could be remotely unlocked, tracked due to security flaws
-
[email protected] replied to [email protected]
Shah and Curry's research that led them to the discovery of Subaru's vulnerabilities began when they found that Curry's mother's Starlink app connected to the domain SubaruCS.com, which they realized was an administrative domain for employees. Scouring that site for security flaws, they found that they could reset employees' passwords simply by guessing their email address, which gave them the ability to take over any employee's account whose email they could find. The password reset functionality did ask for answers to two security questions, but they found that those answers were checked with code that ran locally in a user's browser, not on Subaru's server, allowing the safeguard to be easily bypassed. “There were really multiple systemic failures that led to this,” Shah says.
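To make "checked with code that ran locally in a user's browser" concrete, here is a minimal hypothetical sketch of that anti-pattern -- the endpoint, variable names, and logic are invented for illustration, not taken from Subaru's actual code:

```typescript
// Hypothetical sketch of client-side-only validation. The "correct" answers
// are shipped to the browser along with the page, and only the browser
// checks them -- so an attacker can skip the check by editing the script or
// by sending the reset request to the server directly.
async function submitReset(answers: string[], newPassword: string): Promise<void> {
  // BROKEN: expected answers live in the client, visible to anyone.
  const expected = (window as any).expectedAnswers as string[];
  const ok = answers.every((a, i) => a === expected[i]);
  if (!ok) {
    alert("Security answers incorrect");
    return; // only an honest user's browser ever stops here
  }
  // The server trusts this request without re-checking the answers.
  await fetch("/api/resetPassword", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ newPassword }),
  });
}
```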
Yeah, this kinda bothers me with computer security in general. So, the above is really poor design, right? But that emerges from the following:
- Writing secure code is hard. Writing bug-free code in general is hard, haven't even solved that one yet, but specifically for security bugs you have someone down the line potentially actively trying to exploit the code.
- It's often not very immediately visible to anyone how secure code actually is. Not to customers, not to people at the company using the code, and sometimes not even to the code's author. It's not even very easy to quantify security -- I mean, there are attempts to do things like security certification of products, but... they're all kind of limited.
- Cost -- and thus limitations on time expended and the knowledge base of whoever you have working on the thing -- is always going to be present. That's very much going to be visible to the company. Insecure code is cheaper to write than secure code.
In general, if you can't evaluate something, it's probably not going to be very good, because it won't be taken into account in purchasing decisions. If a consumer buys a car, they can realistically evaluate its 0-60 time or the trunk space it has. But they cannot realistically evaluate how secure the protection of their data is. And it's kinda hard to evaluate how secure code is. Even if you look at a history of exploits (software package X has had more reported security issues than software package Y), different code gets different levels of scrutiny.
You can disincentivize it via regulation with fines. But that's got its own set of issues, like encouraging companies not to report actual problems where they can get away with it. And it's not totally clear to me that companies are even able to effectively evaluate the security of the code they have.
And I've not been getting more comfortable with this over time, as compromises have gotten worse and worse.
*thinks*
Maybe do something like we have with whistleblower rewards.
https://www.whistleblowers.org/whistleblower-protections-and-rewards/
- The False Claims Act, which requires payment to whistleblowers of between 15 and 30 percent of the government’s monetary sanctions collected if they assist with prosecution of fraud in connection with government contracting and other government programs;
- The Dodd-Frank Act, which requires payment to whistleblowers of between 10 percent and 30 percent of monetary sanctions collected if they assist with prosecution of securities and commodities fraud; and
- The IRS whistleblower law, which requires payment to whistleblowers of 15 to 30 percent of monetary sanctions collected if they assist with prosecution of tax fraud.
So, okay. Say we set up fines for security flaws that expose certain data or provide access to certain controls, and white-hat hackers get a mandatory N percent of that fine if they report the flaw to the appropriate government agency. That creates an incentive for unaffiliated third parties to go looking for problems. It's a more antagonistic relationship with the target than currently exists -- today, we just expect white hats to report bugs for reputation or, for companies that offer one, a bug bounty. This shifts things so that you have a bunch of people effectively working for the government. But it's also a market-based approach -- the government is just setting incentives.
Because otherwise, the incentives are set for the company involved not to care all that much, and for the hackers out there to go do black-hat stuff -- things like ransomware and espionage.
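As a toy illustration of the mandatory-cut idea (the percentage band is just borrowed from the False Claims Act figures above; everything else -- the numbers, the function, the name -- is invented):

```typescript
// Hypothetical sketch: a mandatory whistleblower cut of a security fine,
// reusing the 15-30% band from the False Claims Act figures quoted above.
// All values are illustrative, not from any real statute.
function reporterPayoutUsd(fineUsd: number, sharePct: number): number {
  if (sharePct < 15 || sharePct > 30) {
    throw new Error("share must fall within the 15-30% band");
  }
  return fineUsd * (sharePct / 100);
}

// Even at the 15% floor, a $10M fine pays the reporter $1.5M -- an
// above-board payday meant to outcompete selling the bug to criminals.
console.log(reporterPayoutUsd(10_000_000, 15)); // 1500000
```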
I'd imagine an insurance market covering fines of this sort could also show up, with insurers developing and mandating their own best practices for customers.
The status quo for computer security is just horrendous, and as more data is logged and computers become increasingly present everywhere, the issue is only going to get worse. If not this, then something else really does need to change.
-
[email protected] replied to [email protected]
Someone is going to exploit this to make a great lesbian dating app
-
[email protected] replied to [email protected]
We should all start asking around at our local auto shops that handle software, to see whether they can disable the GPS or internet services.
It's not illegal to modify your own vehicle (yet), so jailbreaking these shitty cars would be an awesome service.
-
[email protected] replied to [email protected]
I doubt there are any auto shops other than the dealership that can reliably deal with the software side, and the dealership would refuse.
-
[email protected] replied to [email protected]
Car manufacturers are required by law to offer independent repair shops the same tools that dealers use to repair their vehicles.
Some cars are more programmable than others. With BMWs, for example, you can change pretty much anything about the car. But most cars aren't as modifiable.
-
[email protected] replied to [email protected]
"Privacy researchers at the Mozilla Foundation in September warned in a report that “modern cars are a privacy nightmare,” noting that 92 percent give car owners little to no control over the data they collect, and 84 percent reserve the right to sell or share your information. (Subaru tells WIRED that it “does not sell location data.”)"
Such a statement about not selling data can be very misleading, because the essential claim -- "we do not share your location data" -- does not seem to have been made. Let's stop falling for the trick of treating a company's "we don't sell your data" as if it were equivalent to respecting our privacy, because it isn't.
“While we worried that our doorbells and watches that connect to the Internet might be [are] spying on us, car brands quietly entered the data business by turning their vehicles into powerful data-gobbling machines,” Mozilla's report reads. “People are being tracked in ways that they have no idea are happening.”
"the minute you hook up your phone to Bluetooth, it automatically downloads all the information off your phone, which is sent back to the vehicle manufacturer."
"if you want to protect the data on your phone, don't connect it to the car."
-
[email protected] replied to [email protected]
Yeah, this kinda bothers me with computer security in general. So, the above is really poor design, right? But that emerges from the following:
- Writing secure code is hard. Writing bug-free code in general is hard, haven’t even solved that one yet, but specifically for security bugs you have someone down the line potentially actively trying to exploit the code.
- It’s often not very immediately visible to anyone how secure code actually is. Not to customers, not to people at the company using the code, and sometimes not even to the code’s author. It’s not even very easy to quantify security – I mean, there are attempts to do things like security certification of products, but…they’re all kind of limited.
- Cost – and thus limitations on time expended and the knowledge base of whoever you have working on the thing – is always going to be present. That’s very much going to be visible to the company. Insecure code is cheaper to write than secure code.
There is nothing wrong with your three points, in general. But I think there are some things in this given case that are very visible weak points before getting into the source code:
- You should not have connections from the cars to the customer-support domain at all. There should be a clear delineation between functions, and a single (redundant if necessary) connection gateway for the cars. This is to keep the attack surface small.
- Authentication is always server-side; passwords and reset-question answers are the same in that regard. Even writing that code on the client was the wrong place from the start.
- Resetting a password should involve verifying continued access to the associated email account (this and the previous point are sketched in code below).
So it seems to me that here the fundamental design was not done securely, far before we get into the hard part of avoiding writing bugs or finding written bugs.
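As a rough sketch of what those last two points look like (all names, the token scheme, and the stubs are invented for illustration -- not a claim about how Subaru's systems work or should be rebuilt):

```typescript
// Hypothetical sketch: verification happens only on the server, and a
// password reset requires proving continued access to the account's email
// via a one-time token. Everything here is illustrative.
import { randomBytes } from "node:crypto";

const pendingResets = new Map<string, { email: string; expiresAt: number }>();

// Step 1: the user requests a reset; the server emails a one-time token.
function requestReset(email: string): void {
  const token = randomBytes(32).toString("hex");
  pendingResets.set(token, { email, expiresAt: Date.now() + 15 * 60_000 });
  sendEmail(email, `Reset link: https://example.com/reset?token=${token}`);
}

// Step 2: the token comes back; the server (never the browser) decides.
function completeReset(token: string, newPassword: string): boolean {
  const entry = pendingResets.get(token);
  if (!entry || entry.expiresAt < Date.now()) return false; // server-side check
  pendingResets.delete(token); // one-time use
  setPassword(entry.email, newPassword);
  return true;
}

// Stubs standing in for real mail and credential infrastructure.
function sendEmail(to: string, body: string): void {
  console.log(`mail to ${to}: ${body}`);
}
function setPassword(email: string, _newPassword: string): void {
  console.log(`updated password hash for ${email}`);
}
```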
This could have something to do with the existing structures. E.g. the CS platform was an external product and someone bolted the password reset onto it later in a bad way. And the CS department needed to access details on cars during support calls, and instead of going through the service that usually communicates with the cars, it was simpler to implement a separate connection directly to them. (I'm just guessing, of course.)
Maybe, besides cost, there's also the issue that nobody in the organization has overall responsibility for, or the power to enforce, a sensible design for the interactions between the various systems.
-
[email protected] replied to [email protected]
I assume “does not sell location data” is like the government, which "does not sell laws" -- but 'our 884 lobbying partners' can have influence.
the minute you hook up your phone to Bluetooth, it automatically downloads all the information off your phone
But this I'm skeptical of. What data does it actually get from the phone? Over Bluetooth you can allow it to have your call history, right? And maybe contacts? At least you get to choose, I think?
-
[email protected] replied to [email protected]
"Intercept hot singles in your area! On their way home from work!
No, it's not creepy at all!"
-
[email protected] replied to [email protected]
I doubt car manufacturers offer the ability to jailbreak their car OS to independent repair shops.
You're looking for a hacker, not a dude who changes oil.