A 19-year-old, David Colombo, was thrust into the spotlight after discovering a potential security issue for Tesla owners running a third-party application, TeslaMate. David made the discovery while performing an initial enterprise network sweep for a corporate customer. He didn’t find the usual vulnerabilities, such as an unprotected backup database, but something different: an unusually named Docker instance running within the corporate network. After some diligence, David confirmed an application called TeslaMate was running within his customer’s network. The experience is documented in David’s Medium post.
TeslaMate
The TeslaMate application is a data logger for Tesla vehicles that collects useful high-precision drive and charging data. The intention was for the application to run on a personal home network – it wasn’t built to protect against the threats of running on a public or enterprise network. David’s curiosity led him to discover TeslaMate instances on the public Internet, available to any hacker for exploitation.
TeslaMate requires access to the Tesla vehicle API to acquire vehicle data for reporting. The vehicle’s API is protected by an access token. Tokens are commonly used in web-based applications for single sign-on (SSO), and SSO provides a better user experience. This convenience has a dangerous edge: a token is meant to be disclosed only over an end-to-end secure connection. If a token is exposed through a log file or some other mechanism, it can be used to impersonate the legitimate owner. This potential vulnerability is not a novel attack – it is a known risk of using tokens. Tokens can be revoked when a leak is discovered, but discovering leaks is nearly impossible unless someone comes forward. Tokens typically have a limited lifetime, requiring the user to re-authenticate to obtain a new valid token.
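The impersonation risk can be made concrete with a minimal sketch. All names here (the token string, account, and handler) are hypothetical, not Tesla’s actual API: the point is that a bearer-token server checks only the token itself, so the real owner and an attacker holding a leaked copy are indistinguishable.

```python
# Hypothetical bearer-token handler: possession of the token IS the identity.
VALID_TOKENS = {"tok-123": "owner@example.com"}  # token -> account (illustrative)

def handle_request(token: str, command: str) -> str:
    """Server-side check: the server cannot tell owner from attacker."""
    account = VALID_TOKENS.get(token)
    if account is None:
        return "401 Unauthorized"
    # No further proof of identity is demanded; the bearer is trusted.
    return f"200 OK: executed {command!r} for {account}"

# The legitimate owner and an attacker send byte-identical requests,
# and both succeed -- which is exactly the problem with a leaked token.
handle_request("tok-123", "honk_horn")
```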
Tokens
Tokens are pervasive. Anyone that has linked their social network account to a game has likely used a token. A game will ask for different permissions: May I see your timeline? May I see your friend list? May I update your timeline? As a player, you can typically decide what you authorize. In the real world, some cars include a valet key limiting the vehicle’s speed or access to the glove box – same concept, limit access based on possession of the key.
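The valet-key idea maps directly onto token scopes. The sketch below is illustrative only – the scope names and tokens are invented, not Tesla’s or any social network’s real permissions – but it shows how a token can carry only the authorizations its owner granted.

```python
# Valet-key concept as token scopes: each token carries only the
# permissions its owner granted. Scope names are illustrative.
TOKEN_SCOPES = {
    "tok-readonly": {"read_drive_data", "read_charge_data"},
    "tok-full":     {"read_drive_data", "read_charge_data", "remote_command"},
}

def authorize(token: str, required_scope: str) -> bool:
    """Allow an action only if the token was granted that scope."""
    return required_scope in TOKEN_SCOPES.get(token, set())

authorize("tok-readonly", "read_drive_data")   # True  -- data logging allowed
authorize("tok-readonly", "remote_command")    # False -- the valet key stops here
```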
The same Tesla token used to collect data can also be used to perform remote commands – umm, one token to rule the car? The wrong person with access to the token can make alarming real-time changes to a connected Tesla.
This is not a massive breach of Tesla’s infrastructure, and Tesla didn’t make a mistake exposing tokens; the number of affected Tesla tokens is tiny. Tokens can be leaked in many ways: transport over insecure links, such as HTTP; a developer inadvertently exposing a token in debug log output, which an owner then shares on a public support board; or, as in this case, an owner entering valid credentials into a third-party application to retrieve their hidden token.
The reverse-engineered Tesla API (at the time of the discovery) did not support fine-grained control – a token had complete API access. I would expect Tesla to add fine-grained support to its API so that the owner may grant specific authorizations. And I hope Tesla will prevent the API from taking particular actions while a car is in motion – opening a vehicle’s doors, or aggressively adjusting audio volume (percent limits). Limits should be placed within the app, within the cloud relaying the information, and finally within the vehicle itself. This is similar to a web browser and web server: a naïve web server relies on the browser to filter and present proper input, but an attacker can simply impersonate a browser to bypass any client-side checks.
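The last line of that defense-in-depth chain – checks inside the vehicle itself – can be sketched as a simple rule filter. The rules and thresholds below are my own illustrative assumptions, not anything Tesla actually implements: the vehicle rejects unsafe commands even if the app and cloud layers were bypassed.

```python
# Sketch of in-vehicle sanity checks (defense in depth). The specific
# rules and thresholds are illustrative assumptions, not Tesla's.
def vehicle_accepts(command: str, speed_kmh: float, volume_step: int = 0) -> bool:
    """Final check inside the vehicle, regardless of what the cloud relayed."""
    if command == "open_doors" and speed_kmh > 0:
        return False                      # never open doors while in motion
    if command == "set_volume" and abs(volume_step) > 10:
        return False                      # cap aggressive volume jumps (percent)
    return True

vehicle_accepts("open_doors", speed_kmh=80.0)                  # False
vehicle_accepts("open_doors", speed_kmh=0.0)                   # True
vehicle_accepts("set_volume", speed_kmh=80.0, volume_step=50)  # False
```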
Any device or computer attached to the public Internet without proper security is an easy target for hackers. Tesla did quickly revoke all tokens after being notified by David – this likely caused some inconvenience, but it was the correct response: the exposed tokens are now worthless. Of course, any ill-deployed TeslaMate applications on the Internet are still vulnerable and will likely receive malicious attention.
Things to Consider
Things to consider in your design.
Shorten the lifespan of a token to minimize the window of exposure for a disclosed token. A data leak, such as a Heartbleed-like vulnerability, would likely expose tokens. Tokens should not be logged in a mobile app’s debug log or on the server side – they need to be treated as what they are: an ephemeral username and password.
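A short token lifetime can be sketched in a few lines. The 15-minute TTL and the field names are example choices of mine, not a standard: the point is that a leaked copy of a short-lived token goes stale on its own.

```python
# Sketch of a short-lived token: issued with an expiry, refused afterwards.
# The 15-minute TTL is an example choice, not a recommendation from Tesla.
TOKEN_TTL_SECONDS = 15 * 60

def issue_token(account: str, now: float) -> dict:
    """Mint a token record stamped with its expiry time."""
    return {"account": account, "expires_at": now + TOKEN_TTL_SECONDS}

def is_valid(token: dict, now: float) -> bool:
    """Reject the token once its lifetime has elapsed."""
    return now < token["expires_at"]

t = issue_token("owner@example.com", now=0.0)
is_valid(t, now=60.0)      # True  -- within its lifetime
is_valid(t, now=3600.0)    # False -- a leaked copy is already worthless
```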
Consider multi-factor authentication (MFA) in your design. A car’s key or a phone’s secure enclave can be used as an additional factor to enhance security and prevent third parties from impersonating legitimate apps. Design your API such that a token becomes a nonce token (PKI ensures a token is used once and only once per API call); then an exposed token carries no risk.
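One way to realize the single-use idea is a signed request carrying a fresh nonce: the server verifies the signature and rejects any nonce it has seen before, so an intercepted request cannot be replayed. This sketch uses an HMAC with a shared secret rather than full PKI, and every name in it is a hypothetical stand-in.

```python
import hashlib
import hmac
import secrets

# Sketch of single-use ("nonce") request tokens using an HMAC shared
# secret. A real design might provision the secret into a phone's
# secure enclave; everything here is illustrative.
SECRET = b"shared-device-secret"
seen_nonces: set[str] = set()

def sign(command: str, nonce: str) -> str:
    """Client side: sign this one command + nonce pair."""
    msg = f"{command}:{nonce}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

def verify(command: str, nonce: str, signature: str) -> bool:
    """Server side: accept a valid signature, but each nonce only once."""
    if nonce in seen_nonces:
        return False                                   # replay: nonce spent
    if not hmac.compare_digest(sign(command, nonce), signature):
        return False                                   # forged or tampered
    seen_nonces.add(nonce)
    return True

n = secrets.token_hex(16)
sig = sign("unlock", n)
verify("unlock", n, sig)   # True  -- first use succeeds
verify("unlock", n, sig)   # False -- replaying the same request fails
```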
Conclusion
Today, Tesla may not be ready to open its API. But opening APIs is better than hiding them – the proof is the reverse engineering. Good Apple apps use the official APIs; jailbroken ones that slip by stand out for not using published interfaces. We are very early: every significant device manufacturer, including automotive OEMs, is moving towards the iPhone walled-garden experience – “it just works,” “an app for that” – while weeding out the bad. Apple provides developers everything necessary to release apps. OEMs are already starting to fall across this app-developer spectrum; many will develop a simple remote-API-plus-token model, while others will provide a complete development platform with FAQs, documentation, APIs/libraries, tools, and a vetting process.
It will be a brave new world for automotive makers. OEMs are finding themselves signing up to be software houses.
Takeaways
- Shorten token lifetimes.
- Vehicles should sanity check API commands.
- Add fine-grained API permissions per token.
- Don’t expose tokens. Better yet, adopt multi-factor authentication (MFA) nonce tokens – an exposed token is useless.