
Hardening node.js for production part 2: using nginx to avoid node.js load

This is part 2 of a quasi-series on hardening node.js for production systems (e.g. the Silly Face Society). The previous article covered a process supervisor that creates multiple node.js processes listening on different ports for load balancing. This article focuses on HTTP: how to lighten the incoming load on the node.js processes. Update: I've also posted part 3, on zero-downtime deployments in this setup. Our stack consists of nginx serving external traffic by proxying to upstream node.js processes running express.js. As I'll explain, nginx is used for almost everything: gzip encoding, static file serving, HTTP caching, SSL handling, load balancing and spoon-feeding clients. Too much talk: the full configuration is also available as a gist, and a representative sketch appears below. Perhaps this code dump isn't particularly enlightening on its own: I'll try to step through the config and give pointers on how it balances the express.js code. The upstream directive specifies that the two node.js instances work in tandem as an upstream server for nginx.
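The gist itself isn't reproduced in this excerpt; here is a hedged sketch of the kind of configuration being stepped through (the upstream name, ports and paths are assumptions for illustration), living inside nginx's http block:

# two express.js processes started by the supervisor from part 1 (ports assumed)
upstream silly_face_society_upstream {
    server 127.0.0.1:61337;
    server 127.0.0.1:61338;
    keepalive 64;
}

server {
    listen 80;
    server_name example.com;

    # nginx handles gzip so the node.js processes don't have to
    gzip on;
    gzip_types text/plain text/css application/json application/javascript;

    # serve static assets straight from disk instead of hitting express.js
    location /static/ {
        root /var/www/app;
        expires max;
    }

    # everything else is proxied to the upstream node.js processes
    location / {
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_http_version 1.1;
        proxy_set_header Connection "";
        proxy_pass http://silly_face_society_upstream;
    }
}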

Relative Media Blog | Getting sailsjs and ghost to play nice on DigitalOcean through nginx So my front-end website uses SailsJs 0.9.3 and my blog uses Ghost. Since both are nodejs based, I didn't want to have to host them on separate droplets. I had never tinkered with nginx or reverse proxying before, but decided it was time. For those who aren't aware what reverse proxying is, Wikipedia explains it quite horribly: "In computer networks, a reverse proxy is a type of proxy server that retrieves resources on behalf of a client from one or more servers." Basically, nginx can listen on port 80 for incoming requests to example.com and then forward them to some other server:port without the "client" knowing. This allows me to forward gorelative.com:80 to localhost:1337 and blog.gorelative.com:80 to localhost:2368. It's very easy to set up, and these steps assume a few prerequisites. First install nginx and the required dependencies: sudo apt-get install nginx; Now that nginx is installed, it's time to write up some configuration to reverse proxy things (a sketch of the two server blocks follows below), then restart nginx: sudo service nginx restart;
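The post's actual server blocks aren't reproduced in this excerpt; here is a minimal sketch of the two virtual hosts it describes, with standard proxy headers assumed:

server {
    listen 80;
    server_name gorelative.com www.gorelative.com;

    location / {
        # hand the request off to the Sails.js app on port 1337
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_pass http://127.0.0.1:1337;
    }
}

server {
    listen 80;
    server_name blog.gorelative.com;

    location / {
        # hand the request off to the Ghost blog on port 2368
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_pass http://127.0.0.1:2368;
    }
}

On Ubuntu's default nginx layout these would go in /etc/nginx/sites-available/ with symlinks in sites-enabled/, and the restart above picks them up.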

Nginx Load Balancer and Reverse Proxy for Node.js Applications On Digital Ocean Joe McCann Digital Ocean is rad. A modern VPS with SSD servers for super cheap. Easy to spin up or down. I recently moved a bunch of my static sites to one machine on Digital Ocean. However, to make these sites highly available, I needed to reconfigure my infrastructure a bit: Machine 1 with Nginx installed (to act as load balancer and reverse proxy); Machine 2 with Node.js installed (to serve up the static sites); and Machine 3, which is an exact clone of Machine 2. I created these "droplets", all running Ubuntu 13.04 x64, on Digital Ocean pretty easily and installed Nginx on machine 1 and node.js on machines 2 and 3. For all seven of the websites, I updated their respective A records to point to the load balancer's (machine 1) IP address. Machines 2 and 3 have their own respective IP addresses, which are referenced in the Nginx configuration files. For every site, I have an Nginx config file similar to the sketch below. And voilà, there you have it.
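The per-site config isn't included in the excerpt; a minimal sketch of what such a file might look like, with placeholder backend IPs and ports standing in for machines 2 and 3:

upstream example_com_backend {
    server 10.0.0.2:8080;   # machine 2 (placeholder IP and port)
    server 10.0.0.3:8080;   # machine 3, an exact clone of machine 2
}

server {
    listen 80;
    server_name example.com www.example.com;

    location / {
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_pass http://example_com_backend;   # nginx round-robins between the two machines
    }
}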

Deploy multiple Node applications on one web server in subdomains with Nginx Earlier I wrote about deploying multiple Node applications on one web server in subfolders with Nginx. Even though this approach is fully viable, you should not use it unless there are some really important reasons forcing you to go for it. Given that the application is mounted to a subfolder, you should use relative URLs only in its pages; otherwise the application location must be configured in both Nginx and the application itself. Use of relative URLs has a couple of major drawbacks; for example, if you want to move a page within the hierarchy, you need to update its content. To minimize the maintenance complexity and avoid a performance downgrade, I decided to deploy Node applications in subdomains. The Nginx configuration for this setup is very similar to the subfolder one and even a little simpler (a sketch follows below). Finally, you need to configure a DNS record for pet-project.myhost pointing to your server.
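The excerpt omits the configuration itself; a minimal sketch of a subdomain server block, assuming the application listens on a local port such as 3000:

server {
    listen 80;
    server_name pet-project.myhost;

    location / {
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_pass http://127.0.0.1:3000;   # assumed port for the pet-project app
    }
}

Because the app sits at the root of its own subdomain, no path rewriting is needed and relative URLs resolve naturally.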

Blazing fast node.js: 10 performance tips from LinkedIn Mobile In a previous post, we discussed how we test LinkedIn's mobile stack, including our Node.js mobile server. Today, we'll tell you how we make this mobile server fast. Here are our top 10 performance takeaways for working with Node.js: 1. Avoid synchronous code. By design, Node.js is single threaded. Unfortunately, it is still possible to make synchronous/blocking calls; our initial logging implementation accidentally included a synchronous call to write to disk. 2. The Node.js http client automatically uses socket pooling: by default, this limits you to 5 sockets per host, which is far too few for a busy server, so raise the limit or turn pooling off (a sketch follows below). 3. For static assets, such as CSS and images, use a standard web server instead of Node.js. 4. Let's quickly compare rendering a page server-side vs. client-side. Note that everything on the page in question, except for the user's name, is static: that is, it's identical for every user and page reload. The rest of the page - all the static HTML markup - can be put into a JavaScript template (such as an underscore.js template) and rendered in the browser.
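The excerpt doesn't show how the pooling default is changed; here is a minimal Node.js sketch of the two usual approaches from that era's API, where the downstream host name is a made-up placeholder:

var http = require('http');

// Option 1: raise the per-host socket limit (the old default was 5).
http.globalAgent.maxSockets = 100;

// Option 2: opt out of pooling entirely for a single request.
http.get({
  host: 'backend.example.com',   // hypothetical downstream service
  path: '/profile',
  agent: false                   // disable socket pooling for this request
}, function (res) {
  console.log('status:', res.statusCode);
  res.resume();                  // drain the response so the socket is released
});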

Optimising NginX, Node.JS and networking for heavy workloads | GoSquared Engineering Used in conjunction, NginX and Node.JS are the perfect partnership for high-throughput web applications. They're both built using event-driven design principles and are able to scale to levels far beyond the classic C10K limitations afflicting standard web servers such as Apache. Out-of-the-box configuration will get you pretty far, but when you need to start serving upwards of thousands of requests per second on commodity hardware, there's some extra tweaking you must perform to squeeze every ounce of performance out of your servers. This article assumes you're using NginX's HttpProxyModule to proxy your traffic to one or more upstream node.js servers. Tuning the network: meticulous configuration of Nginx and Node.js would be futile without first understanding and optimising the transport mechanism over which traffic is sent. Your system imposes a variety of thresholds and limits on TCP traffic, dictated by its kernel parameter configuration. Highlighting a few of the important ones…
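The excerpt cuts off before the parameters themselves; as a hedged sketch, these are the kinds of sysctl knobs such tuning typically touches (the values are illustrative, not the article's):

# /etc/sysctl.conf (illustrative values)
net.core.somaxconn = 4096                      # longer accept backlog for bursts of new connections
net.ipv4.ip_local_port_range = 10240 65535     # more ephemeral ports for nginx-to-node connections
net.ipv4.tcp_tw_reuse = 1                      # reuse TIME_WAIT sockets for new outbound connections

Apply with sysctl -p after editing, and measure before and after; the right values depend entirely on the workload.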
