AJAX: stability and reliability issues under heavy server load

    For the last couple of months I have been writing a small Ajax application.

    In short, it is a simplified browser-based Excel: a filter at the top and a data grid at the bottom. The user selects what to edit in the filter, the data is loaded into the grid, and the user edits it. After every change the data is sent to the server, processed there, and written to the database; the server then generates new data and graphs and sends them back, with the graphs displayed separately in an iframe. Before me this application was not Ajax-based but built on ordinary forms, so the user had to wait for a full page load after every filter or data change (since the data in the cells depended on other cells, a recount was needed after each change), and I was asked to rewrite it all with Ajax. I am not a JavaScript programmer; I come from the Delphi / C++ Builder world, where the whole interface is drawn in half an hour without any trouble, and then the logic is written. I had to learn a lot on the go and write almost everything myself: draw the controls and handle events by hand in JavaScript, all while fighting the incompatibilities and differing reactions of various browsers to the same code (brrr, I would not wish that on an enemy).

    At first everything was fine, while only the two of us worked with the program: me as the Ajax programmer and our PHP programmer who implemented the server logic (everything is much more complicated there than in the browser: large amounts of computation and communication with different data sources).
    Then we finished debugging the logic and loaded real data; testers joined the project, and the customer began testing as well. And that is where the horror began...

    The load on the server increased many times over, and the processor is often at 100%, especially when several users work at the highest level, where changing one value in the grid forces a recount at all the levels below and the changes must be written to the database (yes, not ideal planning and design). But that did not concern me much; I am only responsible for the client interface.

    What did concern me was that the Ajax started to fail: sometimes the connection would break, sometimes there would be a 504 error, sometimes the data would arrive corrupted and be assembled incorrectly by eval() (I use the JSON format, chosen for ease of use: json_encode($data) on the server and var data = eval("(" + json_string + ")") in the browser). Worse, the errors are irregular and hard to track down. Sometimes several Ajax requests sent in a row would return in the wrong order. Or the user changes five values one after another and the system processes everything except the fourth, because that request got lost along the way; naturally, none of this pleased the customer.
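    A side note on one of those symptoms: eval() on a truncated or garbled response either throws a cryptic error or silently builds a broken object. A minimal sketch of a safer parse, assuming JSON.parse is available (native in modern browsers, or via Crockford's json2.js in older ones); the function name parseResponse is illustrative, not from the original app:

```javascript
// Safer alternative to eval(): JSON.parse rejects anything that is not
// valid JSON, so a truncated or corrupted body fails loudly here
// instead of half-evaluating into a broken data structure.
function parseResponse(jsonString) {
  try {
    return JSON.parse(jsonString);
  } catch (e) {
    // Caller sees null and can re-request the data instead of
    // rendering garbage into the grid.
    return null;
  }
}
```

    As a bonus, JSON.parse never executes the response as code, which matters if the channel can ever be tampered with.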

    So a need arose for some kind of Connection Pool that would manage all the requests (currently I create an Ajax object for each request and process it completely independently of the others; with a small load and a small amount of data everything worked quickly and without failures). It should:

    - distinguish between request types (data changes go in strict order and may depend on the results of previous requests; filter changes no more than one at a time, etc.; it would also be good to block input while important requests are being processed);
    - process requests in the order they were sent;
    - retry on recoverable communication errors;
    - cancel the pending requests on fatal errors, and so on.
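    For what it is worth, the core of such a manager can be sketched as a serialized queue with retry. This is only an outline under my own assumptions; the names (RequestQueue, transport) are hypothetical, and the transport callback stands in for whatever XMLHttpRequest wrapper is actually used:

```javascript
// Hypothetical sketch: requests are sent strictly one at a time, in the
// order they were enqueued; recoverable errors are retried, fatal errors
// cancel everything still pending.
function RequestQueue(transport, maxRetries) {
  this.transport = transport;        // function(request, onSuccess, onError)
  this.maxRetries = maxRetries || 2; // attempts per request before giving up
  this.queue = [];
  this.busy = false;
}

// Add a request; it will be sent only after all earlier ones complete.
RequestQueue.prototype.enqueue = function (request, onDone) {
  this.queue.push({ request: request, onDone: onDone, attempts: 0 });
  this._next();
};

RequestQueue.prototype._next = function () {
  if (this.busy || this.queue.length === 0) return;
  var self = this;
  var item = this.queue[0];
  this.busy = true;
  item.attempts += 1;
  this.transport(item.request, function (result) {
    // Success: deliver the result, then move on to the next request.
    self.queue.shift();
    self.busy = false;
    if (item.onDone) item.onDone(null, result);
    self._next();
  }, function (err) {
    self.busy = false;
    if (item.attempts < self.maxRetries) {
      self._next(); // recoverable error: retry the same request
    } else {
      self.queue.length = 0; // fatal: drop this and all pending requests
      if (item.onDone) item.onDone(err, null);
    }
  });
};
```

    Blocking input during "important" requests then becomes simple: disable the form before enqueue() and re-enable it in onDone. Distinguishing request types could be done with one queue per type, with only the data-change queue strictly serialized.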

    I think many of you have already faced a similar problem and solved it somehow. I am interested not so much in ready-made code as in an algorithm, a description of the technique. And any advice from experienced Ajax programmers is welcome.

    Please keep answers to the merits of the question.

    Thanks for your attention.
