Using data from database collections or from a 3rd-party source can be a powerful tool to enhance your site's functionality. However, sending a lot of data to the browser from the server can be a time-consuming operation and negatively affect your site's loading time. Therefore, you want to minimize the amount of data that is sent from the server to the browser. This article lists a number of approaches you can use, whether you're using a dataset or the Data API, to improve your site's performance.
When retrieving data on a page, you should only retrieve the items that you need. If you need to display a large number of items, you should consider only retrieving some of the items at first and then more items when needed.
For example, suppose you want to display items from a collection in a repeater. Instead of showing all the items from your collection when the page loads, you can start by showing only some of the items. If necessary, you can add a way to load more items or to navigate through pages of items.
Limiting the amount of data you retrieve when using datasets is easy.
First, set the number of items to show in your repeater using the Number of items to display setting in your dataset's settings. This not only limits the number of items displayed but also limits the number of items retrieved by the dataset at one time. The number of items retrieved at one time is also known as the dataset's page size.
If applicable, you should also set a filter on the dataset so it only retrieves the items you need.
Next, you can optionally set up a way for site visitors to load more items or navigate through pages of items. This allows you to speed up the initial loading time of your page while still eventually displaying all items your site visitor wants to see.
To do so, add a button or buttons to your page and set their click actions to the Load More, Previous Page, or Next Page actions.
You can also limit the amount of data you retrieve when using code to populate elements.
Start by using filtering functions, such as eq() and gt(), to only query for relevant items. Then you can add the limit() function to your query chain to only retrieve some of the relevant items at first. The limit you set also defines how many items are retrieved in each page of query results. When you need more items later, you can use the various paging functions of the query result, such as hasNext() and next(), to retrieve additional pages of items.
For example, here we populate a repeater with data from a query that only retrieves "active" items. We begin by only retrieving the first six items and show a "load more" button if there are additional items to show. When there are no more items to show, the "load more" button is collapsed.
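Here's a minimal sketch of what that page code might look like. The collection name ("Items"), the "status" field, and the element IDs (#itemsRepeater, #loadMoreButton) are placeholders for your own, and the repeater's onItemReady handler is assumed to be set up elsewhere.

```javascript
import wixData from "wix-data";

let results; // holds the most recent page of query results

$w.onReady(async () => {
  // Query only "active" items, six at a time.
  results = await wixData.query("Items")
    .eq("status", "active")
    .limit(6)
    .find();

  // The repeater's onItemReady handler (set up elsewhere) populates each item.
  $w("#itemsRepeater").data = results.items;

  // Only show the "load more" button if there are more items to retrieve.
  if (!results.hasNext()) {
    $w("#loadMoreButton").collapse();
  }

  $w("#loadMoreButton").onClick(async () => {
    // Retrieve the next page of results and append it to the repeater's data.
    results = await results.next();
    $w("#itemsRepeater").data = $w("#itemsRepeater").data.concat(results.items);

    if (!results.hasNext()) {
      $w("#loadMoreButton").collapse();
    }
  });
});
```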
Another approach that can be used when you have a lot of data you want to display is delayed loading. When using delayed loading, you first download a small number of items, which will load quickly, and present those items to site visitors. You can then download the rest of the data at a later time.
For example, suppose you want to display items from a collection in a repeater. Instead of showing all the items from your collection when the page loads, you can start by showing only the items site visitors see at first. Typically, you want to download enough data to populate what a site visitor sees when the page loads (known as "above the fold"). Additional items, those that a visitor would have to scroll to see (known as "below the fold"), you can download in the background after the page has loaded.
Delayed loading of data when using datasets requires you to change the settings of the dataset and to add a little bit of code to your page.
First, set the number of items to show in your repeater using the Number of items to display setting in your dataset's settings. Set it to the number of items that are shown above the fold. This not only limits the number of items displayed but also limits the number of items retrieved by the dataset at one time. The number of items retrieved at one time is also known as the dataset's page size.
If applicable, you should also set a filter on the dataset so it only retrieves the items you need.
Next, add code to your page that incrementally downloads the rest of the items from your collection one page at a time and displays them in the repeater.
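Here's a minimal sketch of that code, assuming the dataset connected to the repeater has the ID #itemsDataset.

```javascript
$w.onReady(() => {
  // Run once the dataset has finished loading its first page of data.
  $w("#itemsDataset").onReady(loadRemainingPages);
});

async function loadRemainingPages() {
  // Keep loading pages in the background until there are none left. The
  // connected repeater updates automatically as each page is loaded.
  while ($w("#itemsDataset").hasNextPage()) {
    await $w("#itemsDataset").loadMore();
  }
}
```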
This code sets an event handler that runs when the dataset has loaded its first set of data. The event handler checks whether there are any additional pages of data to download. If there are, it downloads a page of data and then checks again, until there are no more pages left.
Start by using filtering functions, such as eq() and gt(), to only query for relevant items. Then you can add the limit() function to your query chain to only retrieve some of the relevant items at first. The limit you set also defines how many items are retrieved in each page of query results. Limit your query to the number of items that are shown above the fold. Once those items are loaded, you can use the various paging functions of the query result, such as hasNext() and next(), to retrieve additional pages of items.
For example, here we populate a repeater with data from a query that only retrieves "active" items. We begin by only retrieving the first six items, which take up all the space above the fold on our site. After those items are loaded, we query the remaining items and add them to the repeater as they are retrieved.
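Here's a minimal sketch of that approach. As above, the collection name, the "status" field, and the #itemsRepeater ID are placeholders.

```javascript
import wixData from "wix-data";

$w.onReady(async () => {
  // Retrieve just enough "active" items to fill the space above the fold.
  let results = await wixData.query("Items")
    .eq("status", "active")
    .limit(6)
    .find();

  $w("#itemsRepeater").data = results.items;

  // Once the first items are displayed, load the remaining pages in the
  // background and append them to the repeater as they arrive.
  while (results.hasNext()) {
    results = await results.next();
    $w("#itemsRepeater").data = $w("#itemsRepeater").data.concat(results.items);
  }
});
```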
When retrieving data on a page, you can retrieve only the specific fields that you need. Often, you retrieve data from a collection or 3rd-party source but only use a subset of it. On the server, in backend code, you can map the large set of fields to the smaller subset that you actually need, and then send only that smaller subset to the browser.
For example, suppose you have a collection that contains a large number of fields. To display the information, you have an index page that shows just a bit of information about each item. Then, when a site visitor clicks a link, you display the rest of the data on a page dedicated to displaying one item at a time.
When loading the index page, there is no reason to retrieve all of the data that exists in the collection for each item since you will only be displaying part of that data.
Instead, for the index page, you can retrieve the collection data in a backend web method. Then, in backend code on the server, you can map the retrieved data objects to objects with fewer properties, holding only the data you need. That way, a smaller amount of data needs to make the time-consuming trip from the server to the browser.
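Here's a minimal sketch of such a backend web method. The file name (itemData.web.js), the collection name, and the fields that are kept are placeholders.

```javascript
// Backend file: itemData.web.js
import { Permissions, webMethod } from "wix-web-module";
import wixData from "wix-data";

export const getIndexItems = webMethod(Permissions.Anyone, async () => {
  const results = await wixData.query("Items").find();

  // Map each full item to a smaller object containing only the fields the
  // index page actually displays, before sending it to the browser.
  return results.items.map(({ _id, title, image }) => ({ _id, title, image }));
});
```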
Now, when you call your backend function from the browser, only the data you need is returned. You can use that data as usual to populate your repeater.
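A sketch of the corresponding page code, assuming the backend file above:

```javascript
import { getIndexItems } from "backend/itemData.web";

$w.onReady(async () => {
  // Only the reduced objects returned by the web method travel to the browser.
  const items = await getIndexItems();
  $w("#indexRepeater").data = items;
});
```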
When loading the item page, you don't need to add any extra code since you'll only be retrieving one item from your collection.
Sometimes, when a page is loading, it is better to download all the data you will eventually need and store it for later instead of downloading it incrementally as it is needed. This approach is often used when the data that is displayed changes based on how a visitor interacts with it. Since you've downloaded all the data upfront, when you need to change the data that is displayed, the page transitions smoothly. If you had to download new data every time a visitor interacted with the page, the page would react slowly and you might end up downloading the same data several times.
To save some of the initial page loading time, you can combine this approach with only downloading the fields you need, as described above.
For example, suppose you have an index page that displays a large number of items, but it is filterable, so they are not all shown at once. When a site visitor chooses a filter from a dropdown, the subset of items that you display changes.
Here we can download all of the items when the page loads and store them in a global variable. When the filter changes, we don't have to download any new data. Instead, we just pull the relevant items from the data we already downloaded. Switching between filters is very fast since all the work is done in the browser.
Again, we retrieve the collection data in a backend web method and map the retrieved data objects to get only the fields we need. In this case, we also include the department field so the data can be filtered in the browser.
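A sketch of what that backend web method might look like, reusing the placeholder names from the earlier example:

```javascript
// Backend file: itemData.web.js
import { Permissions, webMethod } from "wix-web-module";
import wixData from "wix-data";

export const getAllItems = webMethod(Permissions.Anyone, async () => {
  // Retrieve up to 1,000 items in one query.
  const results = await wixData.query("Items").limit(1000).find();

  // The reduced objects now also keep the department field for filtering.
  return results.items.map(({ _id, title, image, department }) =>
    ({ _id, title, image, department })
  );
});
```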
Then, when the page loads, we call the backend function from the browser and store the retrieved data in a global variable. Every time we need to get a new subset of the data, we call the getDepartmentData() function, which filters out the data we don't need. We call it once when the page loads and then every time the value in the dropdown element changes.
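A sketch of the corresponding page code, assuming a dropdown with the ID #departmentDropdown and a repeater with the ID #indexRepeater:

```javascript
import { getAllItems } from "backend/itemData.web";

let allItems = []; // all of the index data, downloaded once when the page loads

$w.onReady(async () => {
  allItems = await getAllItems();

  // Populate the repeater with the initially selected department, then again
  // whenever the dropdown value changes, without downloading anything new.
  $w("#indexRepeater").data = getDepartmentData($w("#departmentDropdown").value);

  $w("#departmentDropdown").onChange((event) => {
    $w("#indexRepeater").data = getDepartmentData(event.target.value);
  });
});

// Filters the stored data in the browser to the selected department.
function getDepartmentData(department) {
  return allItems.filter((item) => item.department === department);
}
```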
Often you use the same data on more than one page of your site. Instead of retrieving the data on each page that you need it, you can retrieve it just once the first time you need it. Then, you can use the wix-storage-frontend API to store the data to be used on other pages.
For example, here we get some data that will be used on multiple pages. This code can be used on each of those pages, or you can add it to the site code if you need the data on all pages. When the page is loading, we check the local storage to see if it already contains the data. If it does, we simply parse the string data into a JSON object and store it in the data variable to be used elsewhere on the page. If the data doesn't already exist in storage, we query it from a collection, stringify it, and store it both in the data variable to be used elsewhere on the page and in local storage to be used on other pages.
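Here's a minimal sketch of that code. The storage key ("indexData") and the collection name are placeholders.

```javascript
import { local } from "wix-storage-frontend";
import wixData from "wix-data";

let data; // the data used by other code on this page

$w.onReady(async () => {
  // Check whether a previous page already stored the data locally.
  const stored = local.getItem("indexData");

  if (stored) {
    // Reuse the stored data instead of querying the collection again.
    data = JSON.parse(stored);
  } else {
    // First time: query the collection, then keep the result both in memory
    // and in local storage so other pages can reuse it.
    const results = await wixData.query("Items").find();
    data = results.items;
    local.setItem("indexData", JSON.stringify(data));
  }
});
```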
In the example above we use local storage. Depending on your site's specific needs, you may want to use another type of storage. To do so, you simply substitute the type you want to use in the import statement. To learn more about the types of storage, see the API Reference.
Sometimes, you create a page that displays data but doesn't necessarily need to be a dynamic page. Instead, you can use a regular page and add a dataset to it. For example, if you have an index page that shows all the items from a collection, you can use a regular page with a dataset to retrieve and display your collection data.
In such cases, it is often better to use a dynamic page anyway. You can set up the dynamic page without any fields added to its URL so it does not filter the data, and the page receives all your collection items. Because your page is a dynamic page, it is known to expect data, so the data is retrieved while the page is being rendered. When you use a regular page, on the other hand, the data is retrieved much later.
When possible, use the bulk operations of the wix-data API instead of repeatedly calling the single-item version of the function.
For example, instead of repeatedly calling insert() to add an array of items to a collection one-by-one, you can call bulkInsert() and add all of the items at once.
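For instance, here's a sketch comparing the two approaches, with "Items" as a placeholder collection name:

```javascript
import wixData from "wix-data";

// Instead of inserting the items one at a time...
async function addItemsOneByOne(newItems) {
  for (const item of newItems) {
    await wixData.insert("Items", item); // one server call per item
  }
}

// ...insert them all with a single bulk operation.
async function addItemsInBulk(newItems) {
  return wixData.bulkInsert("Items", newItems); // one server call for all items
}
```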
Important:
Adding an index to a collection is in open beta and not yet available to all users.
You can add indexes to your data collection to optimize the performance of database queries and improve data retrieval speed. Without indexes, a query runs filter and sort operations that iterate through every item in the collection. For a small collection, the query time might be negligible. But as the quantity of data increases, each query takes longer to process.
When you add an index, you are providing a map of the collection's data based on specific database fields. The database uses this map to quickly identify and retrieve the rows that match a particular value, significantly reducing query response times.
You can also improve the performance of sorting operations with indexes. When you create an index on the column being sorted, you enable the database to arrange data in the desired order without performing a full table scan, reducing sort times and improving query performance.
Note that while indexes can enhance data retrieval speeds, they can slightly slow down write operations because these operations include updating indexes for each change in addition to making the actual change.