Planned downtime of the API servers on 2019-05-20

Cross-posted from the dev-addons mailing list.

Due to server maintenance, the servers backing the API will be unavailable on 2019-05-20, starting at 10 AM UTC. We expect the maintenance to last at most 8 hours. The API should be in read-only mode for most of this time, and completely unavailable for a short period.

Please let us know if you have any questions or concerns.

We will keep this thread updated with new developments.


This announcement is a little confusing, as it is so specific.
AFAIK the API just uses Firefox Sync, doesn't it?
If so, does that mean the whole set of Sync servers will be down during that period?
If not, I have many questions about how the API actually works. In particular: is the data end-to-end encrypted (as with Firefox Sync) when being synced between devices?

Good questions! It does not use the same backend servers as Firefox Sync. The Firefox Sync servers will not be down during that time. The data is indeed end-to-end encrypted, as with Firefox Sync. Let me know if you have any other questions.


Thanks, but then which servers does it use? Separate ones, really?
And when is it enabled? When you, as a user, enable Firefox Sync and enable syncing of “add-ons”, I guess?

Note that these questions likely matter for what you, as an add-on dev, write in your privacy policy. Currently I write, e.g., that the data there is synced via Firefox Sync, and I also link to Mozilla’s privacy policy for Firefox Sync.

Seeing your reply, this does not seem to be really wrong. Am I right that the “frontend” for the user is basically Firefox Sync, just with separate backend servers behind it?

Thanks, but then which servers does it use? Separate ones, really?
And when is it enabled? When you, as a user, enable Firefox Sync and enable syncing of “add-ons”, I guess?

Not sure if this answers your question, but yes, it’s a different set of servers. They’re accessible on the domain name

As far as I know, the fact that this is a separate service isn’t exposed through the Firefox UI (other than through a pref in about:config, where you can configure the backend server name), and it’s all treated as part of the “Firefox Sync” umbrella.


Yes, thanks, this answers it totally.

That’s good then, because that means you can describe it like that to your add-on users.

The only disadvantage I could imagine is the potential metadata leaked through traffic size alone, as all that traffic is transmitted to that separate domain. An attacker could, e.g., estimate how many add-ons you have installed: if there is much traffic to that domain, that is obvious. :smile:

This gets a little off-topic (I guess you can split this thread if needed), but…

If the API will be read-only while the servers are down, does that mean that:

  1. if you have a (non-planned) server outage, saving of data via that API will also fail?
  2. (basically the same question:) Does that mean Firefox does not cache the data locally before uploading it to this API?

I am asking question 1 especially because you never know what incidents may happen. And you had one recently: #armagadd-on-2.0. So, just in case, you may want to limit the damage when that server is down.

  3. Also related to this: Will the API reject the Promise if it is down/read-only?
    That could be kinda bad if your add-on then shows an error without further explanation. (AFAIK mine would do so.) I really have not implemented a local fallback, as I thought the API/Firefox already handles that.
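For illustration, a defensive wrapper along those lines might look roughly like this. This is only a sketch under the assumption that the promise can reject; `syncArea` and `localArea` are injected stand-ins for `browser.storage.sync` and `browser.storage.local`, so the helper can also be exercised outside the browser:

```javascript
// Sketch: try to store items in sync storage, falling back to local
// storage if the sync write rejects (e.g. backend down/read-only).
// syncArea/localArea are stand-ins for browser.storage.sync/.local.
async function setWithFallback(syncArea, localArea, items) {
  try {
    await syncArea.set(items);
    return { storedIn: "sync" };
  } catch (err) {
    // Keep the data locally instead of surfacing a bare error
    // to the user; it could be re-synced later.
    await localArea.set(items);
    return { storedIn: "local", reason: String(err) };
  }
}
```

Whether this is actually needed depends on whether the client caches writes itself, of course.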

So, as a proposal: why not cache the changes that are written to that API locally, and then upload them when the server is back online?
This would actually help against a “server is down” scenario.

Thinking about it: I can likewise be offline at some point; would this behave in the same way then?
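The proposed queue-locally-and-replay behaviour could be sketched like this. All names here are hypothetical; the `upload` function stands in for whatever transport the real client uses, and rejects while the server is down:

```javascript
// Sketch of the proposal: queue writes while the backend is
// unreachable, then replay them once it is writable again.
class WriteQueue {
  constructor(upload) {
    this.upload = upload; // hypothetical transport function
    this.pending = [];
  }

  // Record a change locally; it survives until a flush succeeds.
  enqueue(change) {
    this.pending.push(change);
  }

  // Try to push all queued changes; keep whatever still fails.
  // Returns true once the queue is fully drained.
  async flush() {
    const remaining = [];
    for (const change of this.pending) {
      try {
        await this.upload(change);
      } catch {
        remaining.push(change); // server still down: retry later
      }
    }
    this.pending = remaining;
    return this.pending.length === 0;
  }
}
```

The same mechanism would cover both the “server is down” and the “I am offline” case, since the queue does not care why an upload fails.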

My understanding is that Firefox does cache the data locally, so from the addon’s point of view, everything will be normal. So .get and .set don’t actually hit the network, but write to a local cache that is then synced periodically. I don’t see why the promise would be rejected in that case.

The main reason I think it’s worth it to let addon developers know about this is that people might complain to you if your addon data doesn’t sync properly between their devices.

Okay, that’s good to hear.

Only this sentence made me think that the Promise could be rejected:

So do you mean the server API here? If so, add-on devs hardly care about it. The question would be whether the WebExtension API shows any different behaviour.

(Also a little off-topic, but I would be curious to know how it then syncs data in case conflicts appear between the online data and locally cached settings.)

Yes, I mean the server API. This could be relevant to addon developers who are testing whether sync works on multiple devices at that time, or addon developers who have users complain about issues.

I don’t work on the client, and know very little about it, so I can’t answer that question unfortunately. My guess is that whatever has the newest timestamp wins.
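That newest-timestamp-wins guess could be sketched like this. This is purely illustrative: the per-key record shape `{ value, modified }` is made up for the example, not the real Sync data format:

```javascript
// Sketch of last-write-wins conflict resolution, per key: whichever
// side modified a key most recently keeps its value.
// Record shape { value, modified } is hypothetical.
function mergeLastWriteWins(remote, local) {
  const merged = { ...remote };
  for (const [key, rec] of Object.entries(local)) {
    // Take the local record if the key is new or was changed later.
    if (!(key in merged) || rec.modified > merged[key].modified) {
      merged[key] = rec;
    }
  }
  return merged;
}
```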

May 20, 9:26am PST:

The migration is done and the service is 100% available again. Please let us know if you encounter any issues.