LAS VEGAS—The Electron development platform is a key part of many applications, thanks to its cross-platform capabilities. Built on Chromium and Node.js, Electron has been used to create client applications for Internet communications tools (including Skype, WhatsApp, and Slack) and even Microsoft's Visual Studio Code development tool. But Electron can also pose a significant security risk because of how easily Electron-based applications can be modified without triggering warnings.

At the BSides LV security conference on Tuesday, Pavel Tsakalidis demonstrated BEEMKA, a Python-based tool he created that unpacks Electron ASAR archive files and injects new code into Electron's JavaScript libraries and built-in Chrome browser extensions. The vulnerability is not in the applications themselves but in the underlying Electron framework—and it allows malicious activities to be hidden within processes that appear to be benign. Tsakalidis said he had contacted Electron about the vulnerability but had gotten no response—and the vulnerability remains.

While making these changes requires administrator access on Linux and macOS, it requires only local access on Windows. Those modifications can create new event-based "features" that can access the file system, activate a webcam, and exfiltrate information from systems using the functionality of trusted applications—including user credentials and sensitive data. In his demonstration, Tsakalidis showed a backdoored version of Microsoft Visual Studio Code that sent the contents of every code tab opened to a remote website.

A demonstration of a BEEMKA-backdoored version of the Bitwarden application.

It’s not a bug, it’s a feature

The problem lies in the fact that Electron ASAR files themselves are not encrypted or signed, allowing them to be modified without changing the signature of the affected applications. A request from developers to be able to encrypt ASAR files was closed by the Electron team without action.
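
The format itself makes this trivial: an ASAR file is a flat container whose layout can be read and rewritten with nothing but Node's Buffer API. A minimal sketch (the helper names are mine, and the layout described in the comments follows the published asar format; real tooling would use the `asar` npm package):

```javascript
// Sketch: an ASAR archive is just a small header, a JSON index, and
// concatenated file contents -- nothing in the container is signed or
// encrypted. Layout (per the published asar format): bytes 0-3 = 4,
// 4-7 = pickle payload size, 8-11 = header string size, 12-15 = length
// of the JSON index, which starts at byte 16.

// Build a tiny in-memory archive (illustrative helper, not a real tool).
function buildAsar(index, payload) {
  const json = Buffer.from(JSON.stringify(index), 'utf8');
  const header = Buffer.alloc(16);
  header.writeUInt32LE(4, 0);                // pickle: size of a uint32
  header.writeUInt32LE(8 + json.length, 4);  // pickle payload size
  header.writeUInt32LE(4 + json.length, 8);  // header string size
  header.writeUInt32LE(json.length, 12);     // JSON index length
  return Buffer.concat([header, json, payload]);
}

// Read the JSON index back out -- no integrity check stands in the way.
function readAsarIndex(buf) {
  const len = buf.readUInt32LE(12);
  return JSON.parse(buf.slice(16, 16 + len).toString('utf8'));
}

const archive = buildAsar(
  { files: { 'main.js': { size: 5, offset: '0' } } },
  Buffer.from('code!')
);
console.log(readAsarIndex(archive).files['main.js'].size); // 5
```

Anyone who can write to the file can therefore swap out `main.js` and rebuild the index; the surrounding application's code signature never notices, because the ASAR is treated as data, not signed code.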

Code inserted into the ASAR can run either within the application's context or within the context of the Electron framework itself. Application code is "plain old JavaScript," Tsakalidis explained, capable of calling Electron's operating-system-specific modules—including microphone and camera controls, as well as operating system interfaces. Code injected into Electron's internal Chrome extensions can allow attackers to bypass certificate checks: while code may still force communications over HTTPS, an attacker can use a self-signed certificate on a remote system for exfiltration. And Web communications can be altered or completely blocked—including applications' updating features, which would prevent new versions from being automatically installed and displacing the backdoored application.
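
The "event-based features" Tsakalidis describes reduce to ordinary JavaScript monkey-patching: injected code wraps a trusted function so it keeps working while leaking its inputs. A generic, hypothetical sketch (this is not BEEMKA's actual code; `editor`, `save`, and `backdoorMethod` are stand-in names):

```javascript
// Generic shape of an injected backdoor: wrap a trusted function so it
// still works normally but leaks a copy of its input on every call.
function backdoorMethod(obj, name, sink) {
  const original = obj[name];
  obj[name] = function (...args) {
    sink.push(String(args[0]));        // exfiltrate a copy first
    return original.apply(this, args); // then preserve normal behaviour
  };
}

// Demo: a stand-in "editor" object; in a real attack this would be,
// say, whatever function the app calls when a tab's contents change.
const editor = { save: text => text.length };
const stolen = [];
backdoorMethod(editor, 'save', stolen);

editor.save('secret config'); // behaves exactly as before...
console.log(stolen);          // ...but a copy went to the sink
```

Because the wrapper preserves the original return value, nothing visible to the user or to the application changes.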

Tsakalidis said that in order to make modifications to Electron apps, local access is needed, so remote attacks to modify Electron apps aren't (currently) a threat. But attackers could backdoor applications and then redistribute them, and the modified applications would be unlikely to trigger warnings—since their digital signature is not modified.

Sean Gallagher
Sean was previously Ars Technica's IT and National Security Editor. A former Navy officer, systems administrator, and network systems integrator with 20 years of IT journalism experience, he lives and works in Baltimore, Maryland. Email: sean.gallagher@arstechnica.com // Twitter: @thepacketrat

A lot of Electron apps seem to insist on installing into AppData on Windows, so someone with local access can just patch the executable.

Is that really a problem? Yes, its files are accessible to any running code, but the alternative is to grant every application administrative privileges, which is also dangerous in its own way. I'd generally rather have my apps running in user space.

Placing a program in Program Files just makes its executable files write-protected for regular users; it does not elevate the program when it runs. The normal order of things on Windows is to place executable code in Program Files and modifiable data in AppData. Personally, I don't use apps that can't be placed in Program Files, and I periodically run a search for executables in AppData, deleting or uninstalling all that I find.

There's a lot of focus on the "malicious installer" which I addressed during the talk (it should be uploaded by BSidesLV at some point).

The main context for this issue is machines that already have apps installed, and are backdoored by malicious actors.

For example, you've got Slack on all machines in an organisation, Slack runs on start up, and it's installed in %appdata%. The current user clicks on a phishing email or a payload document, and this grants the attacker access. The attacker uses Slack's electron.asar file to install a PowerShell payload that regains access to the internal network every time that user logs in.

I'm a security consultant and have used this technique against a client during a job. So the threat is there; it's the target group that changes. Sure, individual users aren't at risk, but organisations that say "yeah, we trust Slack and vscode" are.

This whole thing would go up in flames if a single integrity check were added - I don't understand the pushback. And to be completely fair, Electron is a great framework and gets the job done; no need to "hate" it.

This is NOT a vulnerability. This is adding new code to existing code. That's like saying with administrator access I can add a .bat file to your Startup folder and thus Windows is susceptible. Or the way one can edit a command string on a Shortcut. Or that I can edit existing .bat files.

Be more critical readers of bullshit security "experts" using journalists and shitty platforms (let me guess, Twitter) to advertise their services.

How is this not a vulnerability?

EXEs and DLLs are signed. If you try to modify them, the signature check will fail. Since the EXEs and DLLs of Electron apps aren't changed, and the app code isn't signed, it can be changed without anything noticing.

Read my edit explaining the second lie that people are falling for.

But you can also modify code already on the users system. Since the signed binaries aren't changing, and that code isn't signed, nothing would know that it was changed.

WITH LOCAL WRITE ACCESS TO EDIT FILES. IF I HAVE THAT I AM JUST INSTALLING MY WHOLE MALWARE SUITE.

Whoa there. No need to get all triggered. We're trying to have a civil discussion here. Do you have stock in Electron?

To answer that: there is a difference between installing easily identifiable malware that runs from the user's startup folder and can be easily found and removed, vs. adding malware to an application the user might use regularly. It may be harder to discover that Slack is being malicious vs. an app called "perfectlysafe.exe" in your startup folder.

Anyway, I'll end this thread here since I fear you might have an aneurysm trying to make your point.

So in a quick check of a system without a lot of software installed: Subnautica, Eve Online, and the Android SDK all have plain-text Python .py files that I could edit and that could potentially be launched arbitrarily. All browsers have an editable default .html file and tons of JavaScript .js files. And you don't even want to get into the amount of software that runs a lightweight HTTP server with various scripts that are, again, plain text and editable.

All those are easily edited files which can run malware when modified. They are NOT checked for signatures or even a simple hash.

No, I don't deal with Electron, but I work for a security vendor, and it's tedious dealing with people with their hair on fire about lies and supposition from fake security researchers. In the industry, we're all getting hammered as it is with dumb questions about SSRF (server-side request forgery) because other frauds out there made a supposition story about them being used in the Capital One breach.

It is normal in modern computing to have THOUSANDS of plain text script files on a given PC, regardless of OS. EVERY SINGLE ONE is attack surface for editing and adding arbitrary malware.

Edit: Downvoters: explain your downvotes. Lots of software runs unsigned, unverified scripts. As to my attitude in the thread -- entirely valid when the story at hand is bullshit. I will blame ignorance (not understanding how his computer works, for Sean) rather than malice (intentional clickbait). Everything I have said is straight facts, ones you can prove yourself.

How exactly does encrypting or signing the app help with anything if it's the app itself that is decrypting or checking the signature?

All an attacker has to do is change the encryption/decryption key(s) and/or redo the signatures, and the user will still be none the wiser. If the OS enforced signing of the executable (and the keys were stored in the executable), it'd be a different story (which you can somewhat get on Windows and Mac, but not as much on Linux).

A lot of Electron apps seem to insist on installing into AppData on Windows, so someone with local access can just patch the executable.

Not so simple. Patching the executable will change its signature, and Windows will catch that the code signature is invalid and warn you.

The important part of the vulnerability exposed here is that it just changes the scripts shipped with the distribution, which are not signed and won't trigger any alert. These scripts have full OS access thanks to Node.js, with the only restrictions being the user's permissions.

Installing a spy trojan to get your di* pics that way is so handy; no need to worry about antivirus catching you, at least for the moment.

This is NOT a vulnerability. This is adding new code to existing code. That's like saying with administrator access I can add a .bat file to your Startup folder and thus Windows is susceptible. Or the way one can edit a command string on a Shortcut. Or that I can edit existing .bat files.

Be more critical readers of bullshit security "experts" using journalists and shitty platforms (let me guess, Twitter) to advertise their services.

How is this not a vulnerability?

EXEs and DLLs are signed. If you try to modify them, the signature check will fail.

Fun fact: You don't have to sign executables for them to run on Windows, so one would think there's an obvious workaround. Also what exactly is stopping the attacker from simply re-signing the executable with their own key?

Or just putting a bat file into the startup folder which can't be signed and will run without troubles.

Or even better: how about continuing to use the mechanism that allowed them to get arbitrary code execution and file system access in the first place?

The only people who have problems from this are those that use enterprise tools to limit code execution to applications signed with specific keys. Rather common in secure enterprise environments, but hardly the issue it's made out to be.

At least in Windows 10, running signed code with a globally untrusted signature will warn you immediately. However, if you actually remove the signature, it will probably run silently (haven't tested it).

Also, is this for Skype or Skype for Business? They are two separate, non-interchangeable products.

Skype for Business is going away, replaced by Microsoft Teams, which is (you guessed it) an ElectronJS app.

Aside from the whole "Microsoft can't be bothered to do a proper win32 and/or 'modern'/'metro'/RT/whatever it's called now application?" issue; what drives me nuts about Teams is the install behavior:

It's one of the per-user, install-to-AppData applications, which I can understand as a "have to beat Slack's marketshare, gotta be easy to install without admin rights" measure; but even the "machine-wide installer" doesn't do a normal C:\Program Files install: it just drops a little autorun into the default user profile (not sure about existing profiles) that silently runs the per-user install on login.

Not a huge deal on more or less single-user machines, just ugly; but it adds significantly to the time between login and usable desktop for people without existing profiles (it doesn't help that OneDrive also has per-user stuff it does on first login, as do assorted provisioned appx packages); and it leads to a great deal of superfluous WAN traffic and wasted disk space if you have tons of users hitting shared computers and generating new profiles frequently.

At least applications like Chrome have the decency to offer either option: run the installer without admin and it goes in AppData; run as admin and it goes in Program Files. But for whatever reason Microsoft has decided that that isn't good enough. Makes handling VDI/terminal services environments a blast, let me tell you...

At least in Windows 10, running signed code with a globally untrusted signature will warn you immediately. However, if you actually remove the signature, it will probably run silently (haven't tested it).

I think it depends on your 'Smartscreen' settings, at least in Win10. MS hasn't pushed it as hard as Apple has with 'Gatekeeper', but if you enable it you'll get Smartscreen warnings, at least for files identified as originating from the internet zone, if the signature is nonexistent, broken, or blacklisted.

It would also likely affect the success of a PowerShell-based attack sequence; I don't think 10 lets you run unsigned PS scripts out of the box, so your life will be easier if you can use some signed malice rather than having to modify or bypass the execution policy (though, unless they've tightened this up, there are a variety of options for doing that, since PowerShell is also a shell, and 'running a script' is pretty similar to feeding a text file to a shell in order...)

Yeah, that was cringeworthy to read, since they don't seem to care about other OSes. Commenters called them out for ignoring the Windows issue when the developers specifically discussed OSX.

One commenter is a security consultant and showed it is possible to do this in Electron. It looks like the Electron developers are trying to sweep it under the rug by keeping the communication outside of the issue tracker, via email. Seriously, dude? You can't own up to your errors and admit it is an issue?

If the Electron app runs under higher privileges than needed to write to the AppData folder, then this can be used for a privilege-escalation hack.

Otherwise, I don't see the problem. If you already have AppData write rights, there are lots of things you can do, like putting something in the start-up folder. I've yet to meet someone who regularly checks the start-up folder.

The current user clicks on a phishing email or a payload document and this grants the attacker access. Attacker uses slack's electron.asar file to install a powershell payload to gain access to the internal network every time that user logs in.

So you already got the user to execute an unknown program downloaded from the internet (for which Windows will certainly warn). The hacker can access the internal network at that point. No need to rewrite the Electron files.

So you already got the user to execute an unknown program downloaded from the internet (for which Windows will certainly warn). The hacker can access the internal network at that point. No need to rewrite the Electron files.

Well, you do need to do that in order to establish persistence for when the user shuts down their machine. That is how you remain on the internal network.

Placing a program in Program Files just makes its executable files write-protected for regular users, it does not elevate it when it runs. The normal order of things on Windows is to place executable code in Program Files and modifiable data in AppData. Personally, I don't use apps that can't be placed in Program Files, and periodically run a search for executables in AppData deleting or uninstalling all that I find.

No, but you have no idea what the installer that you're granting administrative privileges to is doing. If it's a user-level installer, at least it can only mess around with user space.