The ServiceStack website shows that ServiceStack can run on Mono with any of the following:
- XSP
- mod_mono
- FastCgi
- Console
What are these different configurations and which is preferred for Web Services on Mono?
From the v4.5.2 Release, ServiceStack now supports .NET Core, which offers significant performance and stability improvements over Mono, derived from a shared cross-platform code-base and supported by Microsoft's well-resourced, active and responsive team. If you're currently running ServiceStack on Mono, we strongly recommend upgrading to .NET Core to take advantage of its superior performance, stability and top-to-bottom supported technology stack.
Our recommended setup for hosting ASP.NET sites on Linux and Mono is to use nginx/HyperFastCgi. We've published a step-by-step guide to creating an Ubuntu VM from scratch, complete with deploy / install / conf / init scripts, at mono-server-config.
We're no longer recommending MonoFastCGI after noticing several stability and performance issues. This blog post provides a good analysis of the performance, memory usage and stability of the different ASP.NET Hosting options in Mono.
XSP is similar to the VS.NET WebDev server - a simple standalone ASP.NET web server written in C#. It is suitable for development or small workloads. You just run it from the root directory of your ServiceStack ASP.NET host, which will make it available at http://localhost:8080.
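For example, assuming xsp4 is installed alongside Mono (the application path below is hypothetical), trying it out is just:

cd /path/to/MyServiceStackApp   # the directory containing your Web.config (hypothetical path)
xsp4 --port 8080 --nonstop      # serves the site at http://localhost:8080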
For external internet services you generally want to host ServiceStack web services as part of a full-featured web server. The 2 most popular full-featured web servers for Linux are:
- Nginx: use Mono FastCGI to host ServiceStack ASP.NET hosts in Nginx.
- Apache: use mod_mono to host ServiceStack ASP.NET hosts in an Apache HTTP Server.
ServiceStack also supports self-hosting, which lets you run your ServiceStack web services on their own in a standalone Console application (i.e. without a web server). This is a good idea when you don't need the services of a full-featured web server (e.g. you just need to host web services internally on an intranet).
By default the same ServiceStack Console app binary runs on both Windows/.NET and Mono/Linux as-is. Although if you wish, you can easily daemonize your application to run as a Linux daemon as outlined here. The wiki page also includes instructions for configuring your self-hosted web service to run behind an Nginx or Apache reverse proxy.
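As a rough sketch of what such a self-hosted Console app can look like (this uses the ServiceStack v4 HttpListener-based AppHostHttpListenerBase API; the service, DTO names and port are made up for illustration):

using System;
using Funq;
using ServiceStack;

// Hypothetical request/response DTOs for illustration
public class Hello { public string Name { get; set; } }
public class HelloResponse { public string Result { get; set; } }

public class HelloService : Service
{
    public object Any(Hello request)
    {
        return new HelloResponse { Result = "Hello, " + request.Name };
    }
}

// AppHostHttpListenerBase wraps .NET's HttpListener, so no external web server is required
public class AppHost : AppHostHttpListenerBase
{
    public AppHost() : base("Self-Hosted Services", typeof(HelloService).Assembly) { }

    public override void Configure(Container container) { }
}

class Program
{
    static void Main()
    {
        new AppHost().Init().Start("http://*:8080/");
        Console.WriteLine("Listening on http://localhost:8080/ - press Enter to exit");
        Console.ReadLine();
    }
}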
Since it provides a good fit for Heroku's concurrency model as detailed in their 12 Factor App, self-hosting will be an area we'll be looking to provide increased support around in the near future.
The servicestack.net website itself (inc. all live demos) runs on an Ubuntu Hetzner vServer using Nginx + Mono FastCGI.
This command is used to start the FastCGI background process:
fastcgi-mono-server4 --appconfigdir /etc/rc.d/init.d/mono-fastcgi \
    /socket=tcp:127.0.0.1:9000 /logfile=/var/log/mono/fastcgi.log &
Which hosts all applications defined in *.webapp files in the /etc/rc.d/init.d/mono-fastcgi folder, specified using XSP's WebApp File Format, e.g:
ServiceStack.webapp:
<apps>
    <web-application>
        <name>ServiceStack.Northwind</name>
        <vhost>*</vhost>
        <vport>80</vport>
        <vpath>/ServiceStack.Northwind</vpath>
        <path>/home/mythz/src/ServiceStack.Northwind</path>
    </web-application>
</apps>
This runs the FastCGI Mono process in the background which you can get Nginx to connect to by adding this rule to nginx.conf:
location ~ /(ServiceStack|RedisAdminUI|RedisStackOverflow|RestFiles)\.* {
    root /usr/share/nginx/mono/servicestack.net/;
    index index.html index.htm index.aspx default.htm Default.htm;
    fastcgi_index /default.htm;
    fastcgi_pass 127.0.0.1:9000;
    fastcgi_param SCRIPT_FILENAME /usr/share/servicestack.net$fastcgi_script_name;
    include /etc/nginx/fastcgi_params;
}
Which will forward any route starting with /ServiceStack or /RedisAdminUI, etc. to the FastCGI Mono server process for processing. The live demos on servicestack.net are examples of apps hosted this way.
For those interested, the full Nginx + FastCGI configuration files for servicestack.net are available for download.
In production we use Nginx with Unix file sockets.
We found a bug/memory leak when using socket communication with Nginx, ServiceStack and Mono. This was with 500 concurrent requests; while you'd expect a spike in CPU and memory, it never came back down again. We didn't do any further testing to discover where the problem was, but there is a bug logged with Xamarin Bugzilla that seems similar to the issues we had. Essentially we tried the following and it was good enough for us.
We switched to using Unix sockets with the following command params:
fastcgi-mono-server4 /filename=/tmp/something.socket /socket=unix /applications=/var/www/
The problem we had with this method is that the permissions of the socket file change every time you run fastcgi-mono-server4, so you have to correct them after you've started fastcgi-mono-server4! The other downside is that on our boxes it could only handle about 120 concurrent requests. However, this isn't really an issue for us at the moment, and you can always spawn more processes.
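For reference, a sketch of the kind of permission fix-up meant here (the socket path matches the command above; the www-data user/group is an assumption about the account Nginx runs under):

# run after starting fastcgi-mono-server4; adjust the user/group to whatever Nginx runs as
chown www-data:www-data /tmp/something.socket
chmod 660 /tmp/something.socket

# and point Nginx at the Unix socket instead of a TCP port:
# fastcgi_pass unix:/tmp/something.socket;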
Hope this helps
Disclaimer: I'm the author of the HyperFastCgi server and of the blog post mentioned in ceco's answer.
Nginx with HyperFastCgi does this job. HyperFastCgi does not leak memory like the Mono FastCGI server and performs much faster, because it uses the low-level Mono API to pass data between application domains instead of Mono's slow JIT implementation of cross-domain calls. It also has an option to use the native libevent library for socket communications, which is roughly 1.5-2x faster than the current Mono System.Net.Sockets implementation.
Key features of HyperFastCgi:
- Managed Listener with Managed Transport (uses only managed code, asynchronous System.Net.Sockets; slow in Mono due to slow JIT cross-domain calls)
- Managed Listener with Combined Transport (uses async System.Net.Sockets as the listener and the low-level Mono API for cross-domain calls; much, much faster)
- Native Listener (uses native libevent as the socket library and the low-level Mono API to make cross-domain calls; the best performance)
Native Listener makes the web server work like NodeJS: all requests are processed in a single thread in an asynchronous way.

There is a helpful and relatively recent blog post regarding the performance of Mono using ServiceStack. I thought it could be of use to some who are about to decide how to host their services: Servicestack performance in Mono.
As it says - the FastCGI Mono server has tons of memory leaks, which I can confirm. I ran ab -n 100000 -c 10 http://myurl on Ubuntu Desktop 14.04 using Mono 3.2.8, Nginx 1.4.6, FastCGI Mono Server 3.0.11 and a service written using ServiceStack 3.9.71. I don't think it matters which version of ServiceStack I am using, since the FastCGI Mono Server is the leaky bit. It ate all the memory available - about 1GB out of 2GB in total.
Also, the performance of Nginx + FastCGI Mono Server is bad, at least when compared to other solutions. My sample REST service handled about 275 requests per second. The author of the blog reviewed the code of the FastCGI Mono Server and decided to write his own implementation. For some reason it's not working, though, at least on my machine.
So the point, I guess, is that you should not use the FastCGI Mono Server. Unless you want to reboot your box often.
As this post is mostly negative, I should say what my intentions are regarding hosting my services. I will probably go for self-hosting with an AppHost inheriting AppHostHttpListenerLongRunningBase behind Nginx. Using the same sample REST service above I get about 1100 requests per second. The better news is that the process had no apparent leaks; I tested it with about 1,000,000 requests and the process consumed < 100MB RAM.
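For context, putting a self-hosted AppHost behind Nginx is just a plain HTTP proxy_pass; a minimal sketch, assuming the self-hosted service listens on port 8080 (the port and upstream name are made-up examples):

upstream selfhosted_servicestack {
    server 127.0.0.1:8080;   # port the self-hosted AppHost listens on (assumption)
}

server {
    listen 80;

    location / {
        proxy_pass http://selfhosted_servicestack;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}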
P.S. I am not the author of the blog post :)
evhttp-sharp - an HTTP server with a host for NancyFx:
https://github.com/kekekeks/evhttp-sharp
Very fast, almost 4 times faster than nancy-libevent2.
http://www.techempower.com/benchmarks/#section=data-r8&hw=i7&test=json&s=2&l=2
There are test results for different configurations (JSON responses per second) at the benchmarks link above.