
The story of one little experiment with Node.js
Here is what can happen if you build an application on unfamiliar technologies with limited time.
This article will most likely interest beginners, or anyone reading just for fun. Be warned: it is full of the author's subjective assessments and silly reasoning.
It all started with this article and a long-standing desire to get acquainted with Node.js. Such a combination of circumstances was impossible to pass up :). What came of it, you will find under the cut.
So, let's begin.
Task
The decision was to make a small and friendly web application for hosting formulas. That is, you come in, enter a formula, press a button, get a link to the rendered formula, and rejoice. Formulas are entered in MathML. It would also be nice to be able to find a formula of interest and get its render or its MathML. Minimizing resource consumption is important.
The task is defined; next, the architecture.
From all of the above, the following emerged:
- The application is strictly divided into two parts: client and server
- The client talks to the server over an "almost" :) RESTful API (a rough sketch follows this list)
- Node.js on the server
- MongoDB as the database
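To give an idea of what that "almost" RESTful surface might look like, here is a rough sketch in TypeScript on top of express and body-parser; the routes, paths and handler bodies are my illustration, not the project's actual code.
var express = require('express');
var bodyParser = require('body-parser');

var app = express();
app.use(bodyParser.json());

// POST a new formula (MathML in the request body) and queue it for rendering
app.post('/api/formulas', (req, res): void => {
    // ... validate the MathML, save it via mongoose, add an Agenda job ...
    res.send(201);
});

// GET one of the rendered pictures of a formula by its id
app.get('/api/formulas/:id/png', (req, res): void => {
    // ... look the formula up and send the stored PNG ...
    res.send(200);
});

app.listen(3000);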
Now a little more detail about everything.
Client / UI / Browser
The client is fairly standard AngularJS. In my opinion, there is nothing better for data binding and building the interface; to me that is the main problem when writing web applications, and Angular handles it with flying colours. Besides Angular, on the client I used TypeScript, Jade and Sass.
- TypeScript - the ability to split the code into modules relatively simply, and of course nothing replaces typed code and proper classes, no matter how loudly the JS die-hards scream.
- Jade - beautiful: modules, functions, minimalism.
- Sass - the ability to split the code into modules, plus variables and other little things that make life a bit nicer.
We take the icons from Font Awesome and the palette from Google Material (from here), and so far we manage without jQuery.
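Just to illustrate the binding point, here is a tiny controller in TypeScript of the kind I have in mind; the module, controller and field names are mine and purely illustrative, not the project's code.
declare var angular: any; // provided by the angular.js script on the page

class FormulaController {
    public mathml: string = '';
    public link: string = '';

    constructor(private $http: any) {}

    // bound to the "get a link" button via ng-click (controllerAs syntax assumed)
    public submit(): void {
        this.$http.post('/api/formulas', { data: this.mathml })
            .then((response: any): void => { this.link = response.data.url; });
    }
}

angular.module('formulaApp', [])
    .controller('FormulaController', ['$http', FormulaController]);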
Server
Here things are much worse than they might seem at first glance. I had never written anything for Node before, so this is my first pancake, and it turned out not so lumpy after all.
The first thing that surprised me was the number of different libraries. Of course, this is not Java, but I found everything the task could require:
- intel - a rather convenient and functional logging library for Node
- config - an easy way to work with configuration files
- mongoose - a kind of Hibernate between Node and Mongo
- express - a web framework for Node; it simplifies the already uncomplicated process of creating a web server on Node
- body-parser - handles POST request bodies, as the name implies
- Agenda - job scheduling
- Libxmljs - working with XML, and importantly, without a JVM among the dependencies
The quality of these modules is sometimes not what one would like :). Take libxmljs, for example: on Ubuntu it would not install without a bit of tambourine dancing, something like this
cp build/Release/lib.target/xmljs.node build/xmljs.node
On the server side I also used TypeScript, for exactly the same reasons as on the client. But there were problems here... On the client I managed to find an approach where the code did not get more complicated; on Node, TypeScript made the code somewhat more complicated and confusing, though not enough to give it up.
mongoose
An interesting library that lets you define data schemas and then automatically validate, save and read data from Mongo according to that schema. A very useful and convenient thing: mongoose simplifies working with Mongo from Node to an indecent degree...
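The snippets below use an entity.Formula model; my guess at the glue between it and the schema shown later in the "Data schema" section looks roughly like this (mongoose.connect() and mongoose.model() are the real API, the connection string and the exact wiring are my assumptions):
var mongoose = require('mongoose');

mongoose.connect('mongodb://localhost/formulas'); // placeholder connection string
// the schema itself is listed in the "Data schema" section below
var Formula: any = mongoose.model('Formula', entity.schema.IFormula);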
For example, this is how you can save a new formula to the database
var formula: any = new entity.Formula();
formula.name = req.body.name;
formula.description = req.body.descr;
formula.mathml = req.body.data;
formula.save();
Or find a formula
// bdKey holds the name of one of the pngNNN fields from the schema below
entity.Formula.find( {_id: req.params.id} )
    .select(bdKey)
    .exec( 'find', (err, formula): void => {
        if ( err || (formula.length !== 1) ) {
            res.send( 404 );
        } else {
            // the rendered picture lives right in the document,
            // so it can be sent as image/png straight from Mongo
            res.header("Content-Type", "image/png");
            res.send(formula[0][bdKey], {}, function (err) {});
        }
    });
Agenda
Because of performance concerns, a queue is an integral part of an algorithm like this formula-conversion one: a large stream of conversion requests could kill the application very quickly. The most logical and simplest answer is a queue.
It is worth noting that there are several Node.js libraries for organizing queues. I chose Agenda for the following reasons:
- Jobs can be persisted in MongoDB
- The library lets you organize not just a queue in the usual sense, but also jobs that run on a schedule, which is a very pleasant bonus
Add job to queue
service.agenda.now( 'process new formula', {fid: fid} );
Register job handler
service.agenda.define('process new formula', (job, done): void => {
    var data = job.attrs.data;
    log.debug( "process new formula: " + data.fid );
    // done is passed along so the rendering code can signal when the job is finished
    AgendaService.processRendering( data.fid, done );
});
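The two snippets above assume an Agenda instance that has already been created and started; a minimal sketch of that wiring might look like this (service is the article's own wrapper, the connection string and collection name are placeholders):
var Agenda = require('agenda');

var agenda = new Agenda({ db: { address: 'mongodb://localhost/formulas', collection: 'jobs' } });

// handlers are registered with agenda.define(...) as shown above,
// then the queue is started
agenda.start();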
Libxmljs
It is important, even critical, to validate the data passed in a request. For general validation I googled the quite interesting express-validator and validator.js libraries, which handle that sort of thing efficiently enough. But I needed to validate MathML, and for that I found two libraries: xsd-schema-validator and Libxmljs. The first uses Java/SAX, which means you need a JVM, and something went wrong with it from the first minutes, so in the end I used Libxmljs. It needs no Java, but it is the one that brought the bugs on which I spent a lot of time.
To validate MathML I used the version 2 schemas, which I took from here.
Validation looks something like this
fs.readFile('mathml2.xsd', {encoding: 'utf-8'}, (err, data): void => {
    try {
        if (err) { process.chdir(cwd); fail(); return; }
        // data is the MathML 2 XSD; xsd (despite its name) holds the incoming MathML document
        var xsdDoc = libxmljs.parseXmlString(data);
        var xmlDoc = libxmljs.parseXmlString(xsd);
        var valid = xmlDoc.validate(xsdDoc);
        log.debug( "xsd validation: " + valid );
        // process.chdir(cwd) restores the working directory on every exit path
        process.chdir(cwd);
        if ( valid ) ok(); else fail();
    } catch (e) {
        process.chdir(cwd);
        fail();
    }
});
Rendering
To render the formulas we use MathJax in conjunction with PhantomJS. Not the fastest or the most stable option, but this combination came together quickly enough, which with limited time is an undeniable advantage ;).
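I did not show how Node drives PhantomJS, so here is only a sketch of how such a call might look; render.js stands for a PhantomJS script that loads MathJax, typesets the MathML and writes a PNG, and the function, paths and timeout are my assumptions.
var fs = require('fs');
var childProcess = require('child_process');

function renderFormula(mathml: string, outputPath: string, done: (err: any) => void): void {
    var inputPath = outputPath + '.mathml';
    // hand the MathML over through a temporary file to avoid huge command lines
    fs.writeFile(inputPath, mathml, (err): void => {
        if (err) { done(err); return; }
        childProcess.execFile(
            'phantomjs',
            ['render.js', inputPath, outputPath],
            { timeout: 30000 }, // a hung render must not block the queue worker
            (execErr, stdout, stderr): void => { done(execErr); }
        );
    });
}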
MongoDB
MongoDB is used to store the formulas. Formula hosting implies quick access to them, so not only the MathML is stored in Mongo, but also the renders of the formulas (not so much for quick access as because they have to be stored somewhere). The pictures do not take up much space and fit the recommendation to store them as a document field, without extra libraries and frameworks such as GridFS. On top of that, storing rendered images in Mongo gives all the advantages of keeping data in a database rather than plainly on the file system.
Data schema
module entity.schema
{
    export var IFormula: any = new mongoose.Schema({
        name: {type: String, default: 'Формула'},
        description: {type: String, default: ''},
        created: {type: Date, default: Date.now},
        modified: {type: Date, default: Date.now},
        mathml: {type: String, default: ''},
        // pre-rendered PNGs in several sizes (the *t variants presumably with a transparent background)
        png200: {type: Buffer},
        png100: {type: Buffer},
        png50: {type: Buffer},
        png200t: {type: Buffer},
        png100t: {type: Buffer},
        png50t: {type: Buffer},
        ready: {type: Boolean, default: false},
        error: {type: Boolean, default: false}
    });
}
Build the project
To build the project I used Gulp, and it was a first and 'strange' experience... I was pleased with the quantity and quality of its plugins. But the first time I wrote a Gulp script after Grunt, I got a script that lived a life of its own and did nothing of what I expected. The reason is my brain, not yet oriented to Gulp's parallelism after Grunt. After realizing the mistakes I started reworking the build script, but it was a bit late, and at the moment the work is not finished. For now the build is done by a bash script that calls a couple of Gulp tasks. Not pretty at all, but it works, and the rework is almost complete. The remaining problem is mainly the TypeScript compilation: with the chosen approach (compiling everything into a single file) I cannot catch the end of the compilation process. I would be glad of help here.
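One possible way out, assuming the gulp-typescript plugin (which may well not be what this project uses): if the task returns its stream, Gulp knows when the single-file compilation has finished and dependent tasks can wait for it. File names here are placeholders.
var gulp = require('gulp');
var ts = require('gulp-typescript');

gulp.task('compile-server', (): any => {
    return gulp.src('server/**/*.ts')
        .pipe(ts({ out: 'server.js' })) // everything compiled into a single file
        .js
        .pipe(gulp.dest('build'));
});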
Application hosting
I hosted the application on two micro instances in the Google cloud, using the generous gift of $300 for experiments, by the way. The cloud left me with a twofold feeling. On the one hand it is very, very cool, and I thoroughly enjoyed using cloud technology (at least at the current stage). And naturally, no barrel of honey comes without a fly in the ointment. The problem is that so far I have not managed to write a decent deployment script that would install the application onto a cloud instance together with the node modules. The procedure works fine on Debian in a local virtual machine, but on the Debian instance created in the cloud, npm install fails with file access errors: the new files created during the install belong to the user and not to root, even when the command is run as root. Perhaps the problem is my crooked hands, but for now the installation is done in two stages: first all the modules are installed as the user, and then the application is installed as root. In general, the cloud is good, especially if it is a Google cloud ;)
And so that life would not seem too sweet, we pick a domain in the .рф zone. There were no particular problems; the only thing is that in the Nginx settings you must specify the punycode name rather than put Russian letters there.
For my Nginx it looks like this
listen 80;
# the Cyrillic domain spelled in punycode, not in Russian letters
server_name xn--c1ajpkj4do.xn--p1ai;
Application and JSON
The result is a rather interesting system in which data is represented as JSON objects at every stage: from the interface in the browser, through Node, down to the database. This is my first experience with such a system and, to be honest, I rather enjoyed it. It is a small thing when the application has no data transformations, but for me personally this small thing brings a transparency that just feels good :).
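As a small illustration of that transparency (my own example, not the project's code): one and the same shape travels from the Angular form, through the Express handler, into the mongoose document.
// the interface and field names are mine, modelled on the schema above
interface IFormulaDto {
    name: string;
    description: string;
    mathml: string;
}

// client: $http.post('/api/formulas', dto) serializes it as JSON;
// server: body-parser hands it back as req.body, and the same fields are
// assigned straight onto the mongoose document, as in the save snippet.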
Well, what came out of it in the end
What came out is a simple web application for hosting formulas. With comparatively modest hardware resources (two Google Cloud micro instances), it proved possible to build a fairly simple application that holds up under load.
On the whole I was satisfied with the technologies described above. I was pleasantly surprised by the number and quality of libraries for Node and by how easy the code is to write. A couple of bugs in the Node libraries did leave an unpleasant aftertaste; at times everything even seemed terribly raw and unfinished. But the errors turned out to be solvable and did not take too much time. Incidentally, I spent less time tracking down and fixing errors (including one fix in a library's sources) than I have spent configuring and understanding some libraries in Java. Please don't throw rotten tomatoes: I understand the difference in maturity between Node and Java, I am only talking about impressions.
Result
Code
I would be glad to hear constructive criticism, or just your feedback.