I need to develop a browser addon that scrapes certain pages in the private area of our clients' websites once the user has logged in (always with the user's approval, informing them at all times of the data that will be collected). Each of our clients has a different website, and the scraping code needed is different for each client. Every month we gain new clients, so we have to develop a new scraping script for each of them. The information obtained by scraping is then sent by the addon to our REST web service for storage in a database.
To avoid having to release a new version of the addon every time we develop a scraping script for a new client, I thought of building a REST web service that the addon would query on startup. It would return the URLs of all the clients, along with the scraping script associated with each one. That way we would ship only one version of the addon: the scraping scripts would be stored in a database, returned to the addon by that web service, and injected as content scripts using the browser.tabs.executeScript() method. Once the scraping is done, the addon would send the result to the corresponding web service.
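A minimal sketch of what the addon's background script might look like under this design. The endpoint URL and the response shape (`urlPrefix`, `script` fields) are assumptions for illustration, not part of any real API; the URL-matching helper is kept as a pure function so it can be tested outside the browser:

```javascript
// Hypothetical config endpoint -- replace with your real REST service.
const CONFIG_URL = "https://api.example.com/clients";

// Pick the client config whose URL prefix matches the current tab URL.
// Pure function: no browser APIs, so it is easy to unit-test.
function findClientConfig(clients, tabUrl) {
  return clients.find((c) => tabUrl.startsWith(c.urlPrefix)) || null;
}

// Fetch all client configs, find the one matching this tab, and inject
// its scraping code as a content script.
async function injectForTab(tab) {
  const response = await fetch(CONFIG_URL);
  const clients = await response.json(); // e.g. [{ urlPrefix, script }, ...]
  const client = findClientConfig(clients, tab.url);
  if (client === null) return; // not one of our clients' sites
  await browser.tabs.executeScript(tab.id, { code: client.script });
}
```

Note that this is only a sketch of the pattern described above; Mozilla's add-on policies place restrictions on executing remotely loaded code, so distribution through addons.mozilla.org would need review on that point.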
Is the architecture I have described feasible? If not, what options do I have? This project is very important for my company, so I need to choose the right approach.
Is there any problem with sending the information obtained by scraping to our external server for storage?
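For context, the upload step I have in mind would be a plain POST from the addon to our service. This is only a sketch; the endpoint URL and payload fields are hypothetical, and the payload builder is pure so it can be tested on its own:

```javascript
// Hypothetical results endpoint -- replace with your real REST service.
const RESULTS_URL = "https://api.example.com/results";

// Build the JSON payload sent back to the server.
// Pure function: returns a string, no network access.
function buildResultPayload(clientId, data) {
  return JSON.stringify({
    clientId,                              // which client's site was scraped
    scrapedAt: new Date().toISOString(),   // when the scrape ran
    data,                                  // the scraped fields
  });
}

// POST the scraped data to the storage service.
async function sendResults(clientId, data) {
  await fetch(RESULTS_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildResultPayload(clientId, data),
  });
}
```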
I’m very worried about all this, can someone help me?