Due to server maintenance, the servers backing the chrome.storage.sync API will be unavailable on 2019-05-20, starting at 10 AM UTC. We expect the maintenance to last a maximum of 8 hours. We expect the API to be in read-only mode for most of this time and to be completely unavailable for a short period.
Please let us know if you have any questions or concerns.
We will keep this thread updated with new developments.
This announcement is a little confusing as it is so specific.
AFAIK the browser.storage.sync API just uses Firefox Sync, doesn’t it?
If so, does that mean the whole Firefox Sync infrastructure is down during that period?
If not, I have many questions about how the browser.storage.sync API actually works. Especially: is the data end-to-end encrypted (as with Firefox Sync) when it is synced between devices?
Good questions! It does not use the same backend servers as Firefox Sync uses. The Firefox Sync servers will not be down during that time. The data is indeed end-to-end encrypted as with Firefox Sync. Let me know if you have any other questions.
Thanks, but then which servers does it use? Separate ones, really?
And when is it enabled? When you, as a user, enable Firefox Sync and enable syncing of “add-ons”, I guess?
Seeing your reply, this does not seem to be entirely wrong. Am I right that the “frontend” for the user is basically Firefox Sync, and only the backend servers are separate?
As far as I know, the fact that this is a separate service isn’t exposed through the Firefox UI (other than through a pref in about:config, where you can configure the backend server name), and it’s all treated as part of the “Firefox Sync” umbrella.
That’s good then, because that means you can describe it like that to your add-on users.
The only disadvantage I could imagine is potential metadata being leaked through the traffic size alone, since all of that traffic goes to a separate domain. An attacker could, for example, estimate how many add-ons you have installed if there is obviously a lot of traffic to that domain.
This gets a little off-topic (feel free to split the thread if needed), but…
If the API will be read-only while the servers are down, does that mean that:
1. saving data via that API will also fail during an (unplanned) server outage?
2. (basically the same question:) Firefox does not cache the data locally before uploading it to this API?
You know, I am asking about 1. especially because you never know what incidents may happen, and you had one quite recently with #armagadd-on-2.0. So, just in case, you may want to limit the damage when that server is down.
Also related to this: will browser.storage.sync.set() reject the Promise if the server is down/read-only?
That could be quite bad if your add-on then shows an error without further explanation. (AFAIK mine would.) I have not implemented a local fallback, as I thought the API/Firefox already does that.
So as a proposal: why not cache the changes written to that API locally, and then upload them once the server is back online?
This would actually help against a “server is down” scenario.
Thinking about it, I can likewise be offline at some point; would this behave in the same way then?
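To illustrate what I mean by a local fallback, here is a rough sketch of what an add-on could do today. This is purely my own idea, not how Firefox works internally, and the names `saveSettings`, `flushPendingChanges` and `pendingSyncChanges` are made up:

```js
// Rough sketch of an add-on-side fallback, in case browser.storage.sync.set()
// really can reject while the server is down/read-only.
async function saveSettings(changes) {
  try {
    await browser.storage.sync.set(changes);
  } catch (e) {
    // Keep the changes locally, merged into any earlier pending ones.
    const { pendingSyncChanges = {} } =
      await browser.storage.local.get("pendingSyncChanges");
    await browser.storage.local.set({
      pendingSyncChanges: { ...pendingSyncChanges, ...changes }
    });
  }
}

// Retry later, e.g. from an alarm or at startup.
async function flushPendingChanges() {
  const { pendingSyncChanges } =
    await browser.storage.local.get("pendingSyncChanges");
  if (!pendingSyncChanges) {
    return;
  }
  await browser.storage.sync.set(pendingSyncChanges);
  await browser.storage.local.remove("pendingSyncChanges");
}
```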
My understanding is that Firefox does cache the data locally, so from the addon’s point of view, everything will be normal. So .get and .set don’t actually hit the network, but write to a local cache that is then synced periodically. I don’t see why the promise would be rejected in that case.
The main reason I think it’s worth it to let addon developers know about this is that people might complain to you if your addon data doesn’t sync properly between their devices.
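So, if I understand it correctly, code like the following should keep working as usual during the maintenance window. This is just a sketch on my part; I have not tested it against a downed server:

```js
// As I understand it, these calls go to the local database rather than
// directly to the network, so they behave normally even while the sync
// servers are unavailable; only the background synchronization between
// devices would be delayed until the servers are writable again.
async function demo() {
  await browser.storage.sync.set({ theme: "dark" });
  const { theme } = await browser.storage.sync.get("theme");
  console.log(theme); // "dark", served from the local copy
}
demo();
```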
Only this sentence made me think that the Promise could be rejected:
So do you mean the server API here? If so, add-on devs hardly care about it. The question would be whether the browser.storage.sync.set() WebExtension API shows any different behaviour.
(Also a little off-topic, but I would be curious to know how it syncs data when conflicts appear between the online data and locally cached settings.)
Yes, I mean the server API. This could be relevant to addon developers who are testing whether sync works on multiple devices at that time, or addon developers who have users complain about issues.
I don’t work on the client, and know very little about it, so I can’t answer that question unfortunately. My guess is that whatever has the newest timestamp wins.
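Just to illustrate what I mean by that guess, here is a toy example of a “last write wins” merge. This is not the actual client code, and the record shape with a `lastModified` timestamp is only an assumption for the example:

```js
// Toy illustration of "newest timestamp wins", NOT the real sync client logic.
// Records are assumed to look like { value, lastModified } with a ms timestamp.
function mergeLastWriteWins(localRecord, remoteRecord) {
  if (!localRecord) return remoteRecord;
  if (!remoteRecord) return localRecord;
  return remoteRecord.lastModified >= localRecord.lastModified
    ? remoteRecord
    : localRecord;
}

// Example: the remote write is newer, so it wins.
const merged = mergeLastWriteWins(
  { value: "dark", lastModified: 1558000000000 },
  { value: "light", lastModified: 1558000100000 }
);
console.log(merged.value); // "light"
```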