APIs, structure & speed issues with Covid Map

Developing my Covid map using API data pulled in JSON format has caused me some grief. I spent a couple of days with no progress at all trying to get variable data passed through a function for use in a JS template literal. In the end I had to wrap the whole thing in a function to make it work: some very, very messy code.
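As a rough sketch of that pattern: because fetched data arrives asynchronously, the template literal that uses it has to live inside a function that is only called once the data exists. The function name and field names below are hypothetical, not the map's actual code.

```javascript
// Wrap the template literal in a function so it only runs once data
// is available. `renderSummary` and the fields are example names.
function renderSummary(data) {
  // The template literal can now safely reference the data argument.
  return `<h2>Total cases: ${data.total}</h2>
<p>Updated: ${data.updated}</p>`;
}

// With fetch, the render function is called inside the .then() chain:
// fetch('https://api.covid19map.nz/summary.json')
//   .then(res => res.json())
//   .then(data => { document.body.innerHTML = renderSummary(data); });
```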

I finally got some things working and then compared load times with the original map and charts below. 1/ The first image is the original file, which has the data embedded in the file itself; I rebuild it daily with updated data and then put it on my server. It takes 2.18s to load and calls a chart & table from amCharts.js. 2/ The second image is the map built by calling API data, downloading it as JSON, processing it and rendering the result. It calls 3 APIs for the data, then generates the resulting page. There is only the map on the page so far. It takes 4.18s to render, almost twice as long as the HTML page.

Build or call API?

The HTML website is definitely faster, as it has everything on hand. The API version has to fetch the data before it can display it, but it does so on demand, so there is no need to rebuild the page with fresh data daily. It is more automated.

API info

I looked at building my own API by scraping the web to get the data; see this article: Web Scraping with Browser Development Console & JavaScript.

It was a bit like what I was doing with Excel, in that I had to go and scrape the data daily and add it to my API JSON file, so on reflection I used API data that had already been scraped by others. Dixon Cheng allowed me to use the API endpoints for his NZ Covid Map.
One of his endpoints is https://api.covid19map.nz/summary.json .

Now, using his APIs, the data is structured the way he wants it. For me, that means writing longer functions/formulas to reach in and grab the data: first I make a call to the API, then I find its length at the top level, then I get an object's length at the next level, then I find the last one in the array, then I get an item in that last array. That is a lot of drilling down and very inefficient calls, which is why the page is taking so long to load.
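The drilling-down described above looks roughly like this. The structure below is an assumption for illustration; the real summary.json fields may differ.

```javascript
// Assumed shape of an API response, for illustration only.
const summary = {
  days: [
    { date: '2021-09-01', locations: [{ name: 'Auckland', cases: 10 }] },
    { date: '2021-09-02', locations: [{ name: 'Auckland', cases: 14 }] }
  ]
};

// Every lookup repeats the full path from the top level down:
const dayCount = summary.days.length;                  // top-level length
const lastDay = summary.days[summary.days.length - 1]; // last item in array
const lastCases = lastDay.locations[0].cases;          // item inside that
```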

I wonder if I should just find the last items, put them into an array and then interrogate that array; that may speed the process up. So: load the APIs, put the data into arrays, then use the arrays. But I only want to load the minimum onto the site or that will slow it down.
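The idea above can be sketched as a single pass that copies just the latest values into small flat arrays, which the page then reads instead of re-walking the full response. The field names here are assumed, not the real feed.

```javascript
// Walk the API response once, extracting only what the map needs
// into flat arrays. `days`, `date`, `locations`, `cases` are assumed names.
function extractLatest(days) {
  const dates = [];
  const cases = [];
  for (const day of days) {
    dates.push(day.date);
    // Sum cases across locations for that day.
    cases.push(day.locations.reduce((sum, loc) => sum + loc.cases, 0));
  }
  return { dates, cases };
}
```

Rendering code can then index into `dates` and `cases` directly, with no repeated drilling.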

API building with Python

When looking at building an API (basically scraping and collecting data for use), I'd thought of using JavaScript, as I was consuming the API with JS. On reflection I think I'd use Python and import something like Beautiful Soup to do the scraping.

That is maybe a follow on project that I want to explore.

End comment

I've got bogged down cleaning up my API-calling code, and in fact I'm currently pulling more data into the page than I need.

I did a video about the issues here:

I have had some thoughts about ways to speed up the process, and may explore these in time, as listed below:

1. Have the API files on my server, so I'm reading a local file rather than calling over the web?

An automated method to copy the API files from their current location onto my server. Something like Automation Workshop could do this, with a timer set to run daily so the files are always up to date. I haven't explored that process yet.

2. Rebuild API’s to make them more efficient?

Is there a way to automate rebuilding the original APIs so that only the relevant data is kept in a smaller JSON file, giving less load time? An automated method to restructure JSON files on the server?
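A sketch of what that restructuring might look like: trim the full API file down to just the fields the map uses, then serve the smaller file. Field names here are assumed for illustration.

```javascript
// Reduce a full API response to only the latest day's figures,
// keeping just the fields the map renders. Names are assumed.
function trimSummary(full) {
  const lastDay = full.days[full.days.length - 1];
  return {
    date: lastDay.date,
    locations: lastDay.locations.map(loc => ({
      name: loc.name,
      cases: loc.cases
    }))
  };
}
```

The trimmed object could then be written out as a small JSON file by the same daily job that fetches the originals.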

3. Databases?

Another method is to import the API data into a database on the server. If that is refreshed at intervals, then the latest data is in a table, in a database on the same server as the