webRequest.StreamFilter, Blocking, and Prioritization Schemes

Hello! I’ve been developing a plugin that filters NSFW images purely client-side using webRequest.StreamFilter, and so far it’s been going great! I have my plugin working and published here as Wingman Jr.; now I’ve begun to turn my attention to how to improve performance.

Currently, I hook all inbound images via webRequest.StreamFilter and run them through a detection algorithm using a neural net. This works great, but it is of course a bit slow. So I’ve been trying to get a bit creative. In particular, I’d like to make image-heavy sites such as Pinterest or DeviantArt work well for users of my plugin.
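
For context, the hookup looks roughly like this (a simplified sketch; classifyImage is just a stand-in name for the actual neural-net scan):

```js
// Buffer each image response, run it through the classifier, then either
// pass the original bytes through or suppress them.
browser.webRequest.onBeforeRequest.addListener(
  (details) => {
    const filter = browser.webRequest.filterResponseData(details.requestId);
    const chunks = [];

    filter.ondata = (event) => {
      // Hold the data back until the whole image has arrived.
      chunks.push(event.data);
    };

    filter.onstop = async () => {
      const blob = new Blob(chunks);
      const isSafe = await classifyImage(blob); // placeholder for the neural-net scan
      if (isSafe) {
        for (const chunk of chunks) {
          filter.write(chunk);
        }
      }
      // If not safe, write nothing (or a placeholder image) instead.
      filter.close();
    };
  },
  { urls: ["<all_urls>"], types: ["image"] },
  ["blocking"]
);
```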

I’d like to accomplish one of two high-level objectives:

  1. Somehow influence the browser’s fetch order so that images currently in the viewport are requested first.
  2. Prioritize my own handling of open/new requests based on what is currently in the viewport.

I suspect that option 1 is unlikely to be possible; I’m not sure exactly how the browser decides image fetch order, but I’m guessing layout order/priority matters more than the user’s current viewport. I was spiking option 2 last night by creating a content script that uses IntersectionObserver on all img elements to push prioritization hints back to the background script. While this is working at some level, it has surfaced a different problem: with webRequest and blocking, it appears that you can really only get one resource moving through the chain at a time. (Although I’d love to hear somebody elaborate on what “blocking” precisely means in this context with respect to open HTTP requests.) This means that while the web page can request prioritization, each image is basically fated to wait its turn in the priority queue.
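
The content-script piece is roughly the following (the message shape is just something I sketched up, nothing special):

```js
// Report which image URLs are currently in the viewport so the background
// script can bump them in its priority queue.
const observer = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    if (entry.isIntersecting && entry.target.currentSrc) {
      browser.runtime.sendMessage({
        type: "prioritize",
        url: entry.target.currentSrc,
      });
    }
  }
});

// Observe images present at load time as well as ones added later.
function observeImages(root) {
  for (const img of root.querySelectorAll("img")) {
    observer.observe(img);
  }
}
observeImages(document);

new MutationObserver((mutations) => {
  for (const mutation of mutations) {
    for (const node of mutation.addedNodes) {
      if (node.nodeType === Node.ELEMENT_NODE) {
        if (node.tagName === "IMG") {
          observer.observe(node);
        } else {
          observeImages(node);
        }
      }
    }
  }
}).observe(document.documentElement, { childList: true, subtree: true });
```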

So how to solve this? Here are my crazy ideas so far.

One idea is to fetch the prioritized resource myself as soon as its URL arrives from the content script and scan it immediately. Then I can either cache the result for when the StreamFilter catches up or just bite the bullet and fetch the image twice. This idea involves some nasty caching and/or being a resource hog.
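
Roughly what I have in mind on the background side (verdictCache and classifyImage are placeholder names; a real version would need cache eviction and would have to cope with credentialed or Vary-ing responses):

```js
const verdictCache = new Map(); // url -> Promise<boolean>

browser.runtime.onMessage.addListener((message) => {
  if (message.type === "prioritize" && !verdictCache.has(message.url)) {
    // Second fetch of the same image: wasteful, but it lets me scan ahead
    // of the StreamFilter and have an answer ready when it catches up.
    verdictCache.set(
      message.url,
      fetch(message.url)
        .then((response) => response.blob())
        .then((blob) => classifyImage(blob)) // placeholder for the scan
        .catch(() => undefined) // fall back to scanning in the filter
    );
  }
});

// Then, inside filter.onstop, check verdictCache.get(details.url) first and
// only run the classifier if there is no cached (or failed) verdict.
```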

Another idea is to somehow increase the number of HTTP requests that can be open simultaneously while in “blocking” mode. It is unclear to me whether this is even possible, and it certainly seems a bit ill-advised; but if more requests were in flight, I could prioritize scanning once each transfer completes.

Yet another idea is to see if I can somehow force a retry to act as a deferral mechanism, perhaps through return codes or some sneaky redirect scheme. The logic might be something like this: 1) receive a stream filter request, 2) check for any open prioritization requests, 3) if there are any, defer; otherwise service the request as normal.
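
A related thought: instead of a redirect-based retry, Firefox lets a blocking onBeforeRequest listener return a Promise, so maybe low-priority image requests could simply be parked until the prioritized ones finish. A rough sketch of what I mean (unverified, and I have no idea yet whether it actually helps throughput):

```js
const prioritizedUrls = new Set(); // filled from the content-script messages
const parked = []; // resolvers for deferred low-priority requests

// Call this once a prioritized image has been scanned.
function markScanned(url) {
  prioritizedUrls.delete(url);
  if (prioritizedUrls.size === 0) {
    // Release everything that was parked while priority work was outstanding.
    while (parked.length > 0) {
      parked.pop()({}); // empty BlockingResponse: let the request proceed
    }
  }
}

browser.webRequest.onBeforeRequest.addListener(
  (details) => {
    if (prioritizedUrls.size === 0 || prioritizedUrls.has(details.url)) {
      return {}; // proceed immediately
    }
    // Defer: the request is held until this promise resolves.
    return new Promise((resolve) => parked.push(resolve));
  },
  { urls: ["<all_urls>"], types: ["image"] },
  ["blocking"]
);
```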

I understand that what I’m looking for here is a bit far-fetched, but I’m curious if anyone has any ideas. Thanks in advance!

I ended up getting a proof of concept working, but… it was so hacky, and the performance improvement so negligible, that I decided to focus my performance efforts elsewhere. Thanks to everyone who thought about my question!