Give me a specific URL where you have the issue, I will see what is going on.
The download button on the right opens a pop-up with the original-size image. Some images are small enough to download at full size straight from the gallery page, but most are too large and are only available at original size through the download button.
That button makes some kind of third-party request, but I haven't been able to figure it out.
Ok, it took me a while to figure it out: Firefox is pulling the image from a wyciwyg: URI (its cache, I believe), and since that scheme is not whitelisted by default, pulling from the cache was being broken – that is about as much as I understand for now. You need to whitelist the wyciwyg scheme.
So anyway, create this rule in My rules, then persist; the issue should be gone for good. I will add it to uMatrix's default ruleset.
matrix-off: wyciwyg-scheme true
wyciwyg means “what you cache is what you get”.
So far so good. Sweet, thanks. This has fixed some issues I was having on other pop-up sites as well.
I was testing uMatrix, and just now I had an issue where I couldn't log into twitter.com. Even after turning off matrix filtering for all of twitter.com and all per-site switches, I couldn't figure out what was going on.
Turns out that with all my testing I had enabled matrix filtering for the behind-the-scene scope, and this was causing the issue: twitter.com uses a service worker – which is requested in the behind-the-scene scope.
Once I turned off matrix filtering for behind-the-scene (including turning off all per-site switches in that special scope), I could log in fine on twitter.com in default-deny mode.
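For reference, restoring the stock behaviour for that special scope amounts to a single switch rule in My rules – if I recall correctly, this line is part of uMatrix's default ruleset (matrix filtering off for behind-the-scene):

```
matrix-off: behind-the-scene true
```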
So anyway, this is just to point out that sometimes we have to look everywhere when trying to un-break a site.
I would like to see a redirect path on the "uMatrix has prevented the following page from loading:" page. Currently, I copy and paste the blocked redirect link into http://redirectdetective.com/ to see the ultimate destination. Simply adding the redirect path below the blocked link, with an option to click directly on it, would be a useful feature.
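As an aside, the chain-following logic such a feature would need is fairly simple. Here is a rough sketch in JavaScript – `fetchStatus` is a hypothetical pluggable fetcher, not uMatrix code, used so the hop logic can be exercised without network access:

```javascript
// Sketch: compute a redirect chain from a starting URL.
// `fetchStatus` is a pluggable function (hypothetical) that returns
// { status, location } for a URL, so the hop logic below can be
// tested without any network access.
function traceRedirects(url, fetchStatus, maxHops = 10) {
  const chain = [url];
  for (let hop = 0; hop < maxHops; hop++) {
    const { status, location } = fetchStatus(url);
    if (status < 300 || status >= 400 || !location) {
      break; // not a redirect: the chain ends here
    }
    url = new URL(location, url).href; // resolve relative Location headers
    chain.push(url);
  }
  return chain;
}
```

A real implementation would plug in an HTTP request that does not auto-follow redirects and read the Location header of each response.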
I have been using uBO and uMatrix for a while, so first I want to thank you for your work!
Yesterday I noticed in the changelog that you started supporting uMatrix on Firefox for Android.
I immediately installed the WebExtension and noticed the lack of a custom logo on AMO.
Since I can't help with the code, I'd like to suggest a few creations; if you are interested, I can of course modify them for your needs.
Thank you again.
Why is my uMatrix dashboard gray after 10 minutes, and why do pages stop working – even if I reload without closing the browser? When I close the browser and open it again, everything works for another 10 minutes.
Initially uMatrix worked as it should, but for the past few days clicking on the scope selector (the blue bar at the top-left) just highlights the text instead of letting me select from, say, *, domain.com and www.domain.com.
Also, I often find that the refresh button doesn't reload with the new permissions. On Windows, CTRL+R also doesn't work, but CTRL+F5 does.
Amazing add-on, BTW. The user-interface is perfectly designed. It’s so intuitive to use, and more flexible than NoScript.
Is there anything I can try to get the scope selector working?
The highlighted text is the current scope – there is no more dropdown list. See release notes.
When there is such an issue, try holding the Shift key down when clicking reload; it may happen that the browser's cached responses cause the webRequest API to be skipped.
FF 57: Even if I disable the default "Block all hyperlink auditing attempts", "network-prefetch" = false and "network.dns.disablePrefetch" = true. If I now set "network-prefetch" = true and "network.dns.disablePrefetch" = false and restart FF 57, the settings are again "network-prefetch" = false and "network.dns.disablePrefetch" = true (even though "Block all hyperlink auditing attempts" is still disabled). If I disable uMatrix itself completely, set my default settings (true & false) in FF again and restart, my FF settings no longer change. Is this by design or a bug?
By design. uMatrix by its nature will not let your browser establish connections to the outside world if network requests are meant to be blocked – hence it internally disables pre-fetching, pre-connecting and pre-DNS lookups.
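If I understand that correctly (and assuming "network-prefetch" above refers to Firefox's network.prefetch-next pref), the state uMatrix enforces corresponds to these about:config values:

```
network.prefetch-next = false
network.dns.disablePrefetch = true
```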
By the way this is after I reset uMatrix back to its default settings since I had imported a bunch of rules I had built up in Chrome over the years and thought it was best to start from a clean slate.
I see a <meta http-equiv="refresh"> in the <noscript> tag. However, as noted in the release notes, for some reason Firefox does not execute the redirection.
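The pattern in question looks roughly like this – a generic sketch with a placeholder URL, not Google's actual markup:

```html
<noscript>
  <!-- with scripting blocked or unavailable, the page asks the
       browser to redirect via a meta refresh instead -->
  <meta http-equiv="refresh" content="0; url=https://example.com/fallback">
</noscript>
```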
I thought Google would be considerate enough to put in a link that can be clicked manually, but I see it’s not the case in the current scenario.
So I will need to execute the redirection manually – but only for Firefox, since it works fine with Chromium. I will try to fix this for next release. Another solution is to insert a link to be clicked manually.
<meta http-equiv="refresh"> in it? I use multiple Gmail accounts that are all logged into during a browser session. I just wanted to make sure that it was not an oversight on my part, with some settings in uMatrix that I did not set properly.
I just published a release candidate with a fix for pages with a <meta http-equiv="refresh" ...> in them:
See if this solves your blank-page issue.
Yes, it solves the issue with blocking accounts.google.com. Now I see the login boxes as one should. Thanks for the quick fix.
Hi, simple question:
A few days ago Gorhill mentioned he would prefer uMatrix for mobile.
I am comfortable with, and can handle, both uMatrix and uBlock.
But I don't care at all about things like hosts files, cosmetic filtering, the popup blocker, the element zapper/picker, etc.
I do care about RAM, system performance, firewall functions, etc.
In this context, is there any reason to prefer uBlock over uMatrix?
I have read a lot of stuff, but the pros & cons of uMatrix vs uBlock still confuse me.
Considering that I am not interested in ad-blocking functions, anti-malware, cosmetics, hosts files, etc., and considering that I focus only on firewall functions and system performance… am I right in choosing uMatrix over uBlock?
I understand that uBlock is more suitable for most users.
However, in my ignorance, a firewall function seems many times better than any hosts file or cosmetic filtering. And uMatrix's friendly granular control avoids webpage breakage: I can block almost all the garbage with uMatrix without breaking pages. Why would someone prefer uBlock (less friendly granular control) with tons of hosts lists, filters, etc., rather than a clean, simple, friendly, strong firewall?
But Gorhill over the past two years seemed so passionate about uBlock, always arguing in its favor, that I have the feeling I am missing something.
Every time someone tries to explain why to prefer uMatrix, Gorhill appears, saying that the same could be done with uBlock in a better and more efficient way. I can't see that! What am I missing? Why use uBlock rather than uMatrix? Which one consumes fewer system resources?
Very odd issue with the site breitbart.com.
It loads very slowly (display takes an additional 3-4 seconds) vs. instant.
Here's the rub: with uMatrix running, even when the site is unblocked, I have the issue.
But if I run uBlock alongside, the site loads and displays instantly.
So uBlock is doing something to speed things up that I can't get uMatrix to do. Or uMatrix is causing a problem – hard to tell.
Well, I can no longer log in on the Discourse forum here – trying to log in always errors. So I have to reply through email; no idea how this is going to be rendered on Mozilla Discourse.
So anyway… The issue is not the page loading slowly with uMatrix; the issue is that the page hides its own content until some condition is triggered. uBO does not suffer this because it uses a neutered script for googletagmanager.com, and this neutered script contains code to prevent the delay in making the page visible.
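The idea behind a neutered script is to define just enough of the real script's API for the page to proceed: such pages hide their content behind an "anti-flicker" timeout and register a dataLayer.hide.end callback that the real gtm.js eventually fires. A minimal sketch of that idea (not uBO's actual resource), written against an explicit global object so it can be tested outside a browser:

```javascript
// Minimal sketch of a "neutered" gtm.js replacement: no tracking,
// but it still fires the callback the page uses to unhide content.
// `global` stands in for the page's window object.
function neuteredGtm(global) {
  const noop = function () {};
  global.ga = global.ga || noop; // stub out the analytics entry point
  const dl = global.dataLayer;
  // Pages hide their content and wait for gtm.js to call
  // dataLayer.hide.end(); firing it here removes the multi-second
  // "anti-flicker" delay when the real script is blocked.
  if (dl && dl.hide && typeof dl.hide.end === "function") {
    dl.hide.end();
  }
}
```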