I have a WebSocket that receives data from a WebSocket server every 100 to 200ms (I have tried both with a shared web worker and with everything in the main.js file).
When new JSON data arrives, my main.js runs filter_json_run_all(json_data), which updates Tabulator.js tables and Dygraph.js graphs with custom color coding based on whether values are increasing or decreasing.
1) WebSocket JSON data arrives (every 100ms or less) -> 2) run filter_json_run_all(json_data) (takes 150 to 200ms) -> 3) repeat 1 & 2 forever
Quickly, the timestamp of the incoming JSON data falls behind the actual time (json_time 15:30:12 vs actual time 15:31:30), since filter_json_run_all is causing a backlog of operations.
This causes users on different PCs to have WebSocket sync issues, depending on when they opened or refreshed the website.
This is caused only by the long filter_json_run_all() function; if all I did was console.log(json_data), the clients would stay perfectly in sync.
I would be very grateful if anyone has any ideas on how I can prevent this sort of blocking/backlog of incoming JSON WebSocket data caused by a slow-running JavaScript function 🙂
I tried using a shared web worker, which works, but it doesn't get around main.js being blocked by filter_json_run_all(). I don't think I can move filter_json_run_all() into the worker, since all the graph and table objects are defined in main.js, and I also have callbacks for when I click on a table to update a value manually (bidirectional WebSocket).
If you have any ideas or tips at all I will be very grateful 🙂
worker.js:
```javascript
const connectedPorts = [];

// Create socket instance.
var socket = new WebSocket('ws://' + 'ip:port' + '/ws/');

// Send initial subscribe message on open.
// (Renamed from `package`, which is a reserved word in strict mode.)
socket.addEventListener('open', () => {
  const msg = JSON.stringify({
    "time": 123456,
    "channel": "futures.tickers",
    "event": "subscribe",
    "payload": ["BTC_USD", "ETH_USD"]
  });
  socket.send(msg);
});

// Send data from socket to all open tabs.
socket.addEventListener('message', ({ data }) => {
  const parsed = JSON.parse(data);
  connectedPorts.forEach(port => port.postMessage(parsed));
});

/**
 * When a new thread is connected to the shared worker,
 * start listening for messages from the new thread.
 */
self.addEventListener('connect', ({ ports }) => {
  const port = ports[0];

  // Add this new port to the list of connected ports.
  connectedPorts.push(port);

  /**
   * Receive data from the main thread and determine which
   * actions to take based on the received data.
   */
  port.addEventListener('message', ({ data }) => {
    const { action, value } = data;

    if (action === 'send') {
      // Send message to socket.
      socket.send(JSON.stringify(value));
    } else if (action === 'unload') {
      // Remove port from connected ports list.
      const index = connectedPorts.indexOf(port);
      connectedPorts.splice(index, 1);
    }
  });

  // Required when using addEventListener on a MessagePort
  // (this call and the closing `});` were missing in the original snippet).
  port.start();
});
```
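For completeness, the main-thread side of this shared-worker wiring would look roughly like the sketch below. This is an assumption based on the snippet above, not code from the post: the `worker.js` filename, the `onData` callback, and the `unload` message shape are taken from context. Note that `port.start()` is required when using `addEventListener` rather than assigning `onmessage`.

```javascript
// Pure helper: build the subscribe message the worker sends on open.
// Field shape is copied from the worker snippet above.
function buildSubscribeMessage(channels) {
  return {
    time: Date.now(),
    channel: 'futures.tickers',
    event: 'subscribe',
    payload: channels,
  };
}

// Browser-only wiring (sketch): connect a tab to the shared worker and
// forward incoming ticks to a handler such as filter_json_run_all.
function connectToWorker(onData) {
  const webSocketWorker = new SharedWorker('worker.js');
  webSocketWorker.port.addEventListener('message', ({ data }) => onData(data));
  // start() is required when using addEventListener on a MessagePort.
  webSocketWorker.port.start();
  // Ask the worker to drop this port when the tab closes.
  window.addEventListener('beforeunload', () => {
    webSocketWorker.port.postMessage({ action: 'unload' });
  });
  return webSocketWorker;
}
```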
main.js — this is only part of filter_json_run_all, which continues on for another 6 or 7 Tabulator & Dygraph objects. I wanted to give an idea of some of the operations called with setTimeout() etc.
```javascript
function filter_json_run_all(json_str) {
  const startTime = performance.now();
  const data_in_array = json_str; // JSON.parse(json_str.data);

  // if ('DATETIME' in data_in_array) {
  //   var milliseconds = (new Date()).getTime() - Date.parse(data_in_array['DATETIME']);
  //   console.log("milliseconds: " + milliseconds);
  // }

  if (summary in data_in_array) {
    if ("DATETIME" in data_in_array) {
      var time_str = data_in_array["DATETIME"];
      element_time.innerHTML = time_str;
    }

    // Summary data
    const summary_array = data_in_array[summary];
    var old_sum_arr_krw = [];
    var old_sum_arr_irn = [];
    var old_sum_arr_ntn = [];
    var old_sum_arr_ccn = [];
    var old_sum_arr_ihn = [];
    var old_sum_arr_ppn = [];

    var filtered_array_krw_summary = filterByProperty_summary(summary_array, "KWN");
    old_sum_arr_krw.unshift(Table_summary_krw.getData());
    Table_summary_krw.replaceData(filtered_array_krw_summary);
    // Colour table
    color_table(filtered_array_krw_summary, old_sum_arr_krw, Table_summary_krw);

    var filtered_array_irn_summary = filterByProperty_summary(summary_array, "IRN");
    old_sum_arr_irn.unshift(Table_summary_inr.getData());
    Table_summary_inr.replaceData(filtered_array_irn_summary);
    // Colour table
    color_table(filtered_array_irn_summary, old_sum_arr_irn, Table_summary_inr);

    var filtered_array_ntn_summary = filterByProperty_summary(summary_array, "NTN");
    old_sum_arr_ntn.unshift(Table_summary_twd.getData());
    Table_summary_twd.replaceData(filtered_array_ntn_summary);
    // Colour table
    color_table(filtered_array_ntn_summary, old_sum_arr_ntn, Table_summary_twd);

    // Remove formatting on fwds curves
    setTimeout(() => {
      g_fwd_curve_krw.updateOptions({
        'file': dataFwdKRW,
        'labels': ['Time', 'Bid', 'Ask'],
        strokeWidth: 1,
      });
    }, 200);
    setTimeout(() => {
      g_fwd_curve_inr.updateOptions({
        'file': dataFwdINR,
        'labels': ['Time', 'Bid', 'Ask'],
        strokeWidth: 1,
      });
    }, 200);

    // remove_colors
    // ([askTable_krw, askTable_inr, askTable_twd, askTable_cny, askTable_idr, askTable_php])
    setTimeout(() => {
      askTable_krw.getRows().forEach(function (item, index) {
        row = item.getCells();
        row.forEach(function (value_tmp) {
          value_tmp.getElement().style.backgroundColor = '';
        });
      });
    }, 200);
    setTimeout(() => {
      askTable_inr.getRows().forEach(function (item, index) {
        row = item.getCells();
        row.forEach(function (value_tmp) {
          value_tmp.getElement().style.backgroundColor = '';
        });
      });
    }, 200);
    // ... continues similarly for the remaining Tabulator & Dygraph objects
```
color_table Function
```javascript
function color_table(new_arr, old_array, table_obj) {
  // If lengths are not equal
  if (new_arr.length != old_array[0].length) {
    console.log("Diff length");
  } else {
    // Compare each element of the array
    for (var i = 0; i < new_arr.length; i++) {
      // Iterate over the old row's entries
      for (const [key, value] of Object.entries(old_array[0][i])) {
        if (value == new_arr[i][key]) {
          // unchanged
        } else {
          // console.log("Different element");
          if (key != "TENOR")
            // console.log(table_obj)
            table_obj.getRows()[i].getCell(key).getElement().style.backgroundColor = 'yellow';
          if (key != "TIME")
            if (value < new_arr[i][key])
              // green, going up
              // text_to_speech(new_arr[i]['CCY'] + ' ' + new_arr[i]['TENOR'] + ' getting bid')
              table_obj.getRows()[i].getCell(key).getElement().style.backgroundColor = 'Chartreuse';
          if (key != "TIME")
            if (value > new_arr[i][key])
              // red, going down
              table_obj.getRows()[i].getCell(key).getElement().style.backgroundColor = 'Crimson';
        }
      }
    }
  }
}
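One way to make a function like this cheaper is to separate the comparison (pure data work) from the painting (DOM work). The sketch below is an assumption about how that split could look, not code from the post: `diffTableColors` and `paintTable` are hypothetical helpers, and the diff reproduces the final color each cell ends up with under the rules above (green when a value rises, red when it falls, yellow for other changes, TENOR/TIME excluded as in the original).

```javascript
// Pure diff: compare old vs new rows and return { rowIndex, key, color }
// paint jobs instead of touching the DOM inside the comparison loop.
function diffTableColors(newArr, oldArr) {
  const jobs = [];
  if (!oldArr || newArr.length !== oldArr.length) return jobs;
  for (let i = 0; i < newArr.length; i++) {
    for (const [key, oldVal] of Object.entries(oldArr[i])) {
      const newVal = newArr[i][key];
      if (newVal === oldVal) continue;
      if (key !== 'TIME' && oldVal < newVal) {
        jobs.push({ rowIndex: i, key, color: 'Chartreuse' }); // going up
      } else if (key !== 'TIME' && oldVal > newVal) {
        jobs.push({ rowIndex: i, key, color: 'Crimson' }); // going down
      } else if (key !== 'TENOR') {
        jobs.push({ rowIndex: i, key, color: 'yellow' }); // changed, no direction
      }
    }
  }
  return jobs;
}

// DOM paint, batched into a single animation frame (browser-only sketch).
function paintTable(tableObj, jobs) {
  requestAnimationFrame(() => {
    const rows = tableObj.getRows();
    for (const { rowIndex, key, color } of jobs) {
      rows[rowIndex].getCell(key).getElement().style.backgroundColor = color;
    }
  });
}
```

The benefit is that the expensive per-cell DOM writes all land in one frame instead of being interleaved with the comparison work.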
Potential fudge / solution, thanks Aaron :):
```javascript
function limiter(fn, wait) {
  let isCalled = false,
      calls = [];

  let caller = function () {
    if (calls.length && !isCalled) {
      isCalled = true;
      if (calls.length > 2) {
        // Drop queued calls 0 .. n-2, keeping only the most recent one
        calls.splice(0, calls.length - 1);
      }
      calls.shift().call();
      setTimeout(function () {
        isCalled = false;
        caller();
      }, wait);
    }
  };

  return function () {
    calls.push(fn.bind(this, ...arguments));
    // let args = Array.prototype.slice.call(arguments);
    // calls.push(fn.bind.apply(fn, [this].concat(args)));
    caller();
  };
}
```
This is then defined as a constant for a web worker to call:
```javascript
const filter_json_run_allLimited = limiter(data => {
  filter_json_run_all(data);
}, 300); // 300ms for example
```
Web worker calls the limited function when new web socket data arrives:
```javascript
// Listen for incoming data from the worker and update the DOM.
webSocketWorker.port.addEventListener('message', ({ data }) => {
  // Limited function
  filter_json_run_allLimited(data);
});
```
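A simpler variant of the same idea, as a hedged alternative to `limiter()` above (names here are hypothetical, not from the post): instead of keeping a short queue of calls, keep only the newest payload. While one render is in flight, later messages just overwrite the pending one, so the UI always renders the freshest data and stale frames are dropped entirely. The scheduler is injectable so the sketch is testable; in the page you would pass `requestAnimationFrame` or a `setTimeout` wrapper.

```javascript
// Coalesce bursts of calls down to one call with the latest payload.
function makeCoalescer(fn, schedule) {
  let pending = null;
  let scheduled = false;

  function flush() {
    scheduled = false;
    const data = pending;
    pending = null;
    if (data !== null) fn(data);
  }

  return function (data) {
    pending = data; // newer messages overwrite older, unrendered ones
    if (!scheduled) {
      scheduled = true;
      schedule(flush);
    }
  };
}

// Hypothetical usage in the page:
// const render = makeCoalescer(filter_json_run_all, requestAnimationFrame);
// webSocketWorker.port.addEventListener('message', ({ data }) => render(data));
```

The difference from `limiter()` is that this never lets a backlog form at all: no matter how fast messages arrive, at most one render per scheduled flush is performed, always with the latest data.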
If anyone knows how websites like TradingView or other real-time, high-performance data-streaming sites achieve low-latency visualisation updates, please comment or reply below 🙂
Answer
I’m reticent to take a stab at answering this for real without knowing what’s going on in `color_table`. My hunch, based on the behavior you’re describing, is that `filter_json_run_all` is being forced to wait on a congested DOM manipulation/render pipeline as HTML is updated to achieve the color coding for your updated table elements.
I see you’re already taking some measures to prevent some of these DOM manipulations from blocking this function’s execution (via `setTimeout`). If `color_table` isn’t already employing a similar strategy, that’d be the first thing I’d focus on refactoring to unclog things here.
It might also be worth throwing these DOM updates for processed events into a simple queue, so that if slow browser behavior creates a rendering backlog, the function actually responsible for invoking pending DOM manipulations can elect to skip outdated render operations to keep the UI acceptably snappy.
Edit: a basic queueing system might involve the following components:
- The queue itself (this can be a simple array; it just needs to be accessible to both of the components below).
- A queue appender, which runs during `filter_json_run_all`, simply adding objects to the end of the queue representing each DOM manipulation job you plan to complete using `color_table` or one of your `setTimeout` callbacks. These objects should contain the operation to be performed (i.e. the function definition, uninvoked) and the parameters for that operation (i.e. the arguments you’re passing into each function).
- A queue runner, which runs on its own interval and invokes pending DOM manipulation tasks from the front of the queue, removing them as it goes. Since this operation has access to all of the objects in the queue, it can also take steps to optimize/combine similar operations to minimize the amount of repainting it’s asking the browser to do before subsequent code can be executed. For example, if you’ve got several `color_table` operations that color the same cell multiple times, you can simply perform that operation once, using the data from the last `color_table` item in the queue involving that cell. Additionally, you can further optimize your interaction with the DOM by invoking the aggregated DOM manipulation operations, themselves, inside a `requestAnimationFrame` callback, which ensures that scheduled reflows/repaints happen only when the browser is ready, and is preferable from a performance perspective to DOM manipulation queueing via `setTimeout`/`setInterval`.
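A minimal sketch of that queue, under assumptions not stated in the answer (the key format and helper names are hypothetical): jobs are keyed by cell, so repeated colorings of the same cell collapse to the most recent one, and everything is flushed in a single animation frame. The scheduler falls back to `setTimeout` outside the browser purely so the sketch stays runnable; in the page, `requestAnimationFrame` would be used.

```javascript
const domJobs = new Map(); // key -> () => void  (one pending DOM write per cell)
let flushScheduled = false;

// Prefer requestAnimationFrame in the browser; fall back to setTimeout elsewhere.
const scheduleFlush =
  typeof requestAnimationFrame === 'function'
    ? requestAnimationFrame
    : fn => setTimeout(fn, 16);

function enqueueDomJob(key, run) {
  domJobs.set(key, run); // a later job for the same cell replaces the earlier one
  if (!flushScheduled) {
    flushScheduled = true;
    scheduleFlush(flushDomJobs);
  }
}

function flushDomJobs() {
  flushScheduled = false;
  const jobs = [...domJobs.values()];
  domJobs.clear();
  for (const run of jobs) run(); // all DOM writes land in one repaint
}

// Hypothetical usage inside color_table — the cell identity is the dedup key:
// enqueueDomJob(`summary_krw:${i}:${key}`, () => {
//   table_obj.getRows()[i].getCell(key).getElement().style.backgroundColor = color;
// });
```

Because `Map.set` on an existing key keeps the key's original position but replaces its value, only the newest coloring for each cell survives until the flush, which is exactly the "skip outdated render operations" behavior described above.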