Passing data from Django to C++ application and back

We are creating a trading application where the backend is entirely in C++ (using the QuickFIX engine). We would like to build a web application in Django on top of this backend, where the user can place orders. The Django (Python) application and the C++ application will each run in their own process and address space. What do you think would be the best way to pass orders/messages from Django to C++?

Also, this is a trading application, so latency is the biggest concern. I therefore do not want to write orders to a database from Django and then fetch them from the C++ application.

I'm currently looking at doing it via shared memory or some other IPC mechanism. Is this a good idea?

Fango asked 1/9, 2011 at 18:59 Comment(0)

Well, you have to use some IPC method. One that you don't mention here is having the C++ process listen on a socket. That would add flexibility (at a slight speed cost): the processes wouldn't even need to be on the same machine.
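For illustration, a rough sketch of that socket approach, assuming a plain POSIX TCP socket and newline-delimited order strings (the port and message format are placeholders, not anything from the question):

```cpp
// Minimal C++ listener: accepts connections from the web tier and reads
// newline-delimited order strings. Error handling is kept to a minimum.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>

int main() {
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    if (srv < 0) { perror("socket"); return 1; }

    int yes = 1;
    setsockopt(srv, SOL_SOCKET, SO_REUSEADDR, &yes, sizeof(yes));

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);  // loopback only; use INADDR_ANY across machines
    addr.sin_port = htons(9100);                    // placeholder port

    if (bind(srv, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) { perror("bind"); return 1; }
    listen(srv, 16);

    for (;;) {
        int client = accept(srv, nullptr, nullptr);
        if (client < 0) continue;

        char buf[1024];
        ssize_t n;
        while ((n = read(client, buf, sizeof(buf) - 1)) > 0) {
            buf[n] = '\0';
            std::printf("order from web tier: %s", buf);
            // ...translate into a FIX message and hand it to the QuickFIX session...
        }
        close(client);
    }
}
```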

I've been doing something similar, coming from C++ but wanting to write the UX in Python. My computational backend is C++; I compile a Python module from it and generate HTML with Flask for the UX. My C++ and Python live in the same process, so I haven't addressed your core question in practice yet.

One piece of advice I would give is to keep all of your IPC code in C++, and write a small Python module in C++ using Boost.Python. This lets the Python process do 95% of the work in a Pythonic world, while giving you the bit-level confidence I would want as a C++ dev over the data you are sending to C++. Boost.Python has made bridging C++ and Python web frameworks a breeze for me.
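As a concrete (if simplified) sketch of the Boost.Python approach, with class and method names that are illustrative rather than from the original answer:

```cpp
// order_gateway.cpp -- build as a shared library named order_gateway.so/.pyd.
#include <boost/python.hpp>
#include <iostream>
#include <string>

class OrderGateway {
public:
    // In a real backend this would build a FIX NewOrderSingle and hand it
    // to the QuickFIX session; here it just logs the order and acknowledges.
    bool place_order(const std::string& symbol, double price, int quantity) {
        std::cout << "order: " << quantity << " " << symbol << " @ " << price << "\n";
        return true;
    }
};

BOOST_PYTHON_MODULE(order_gateway) {
    using namespace boost::python;
    class_<OrderGateway>("OrderGateway")
        .def("place_order", &OrderGateway::place_order);
}
```

From a Django view it is then just `from order_gateway import OrderGateway` followed by `OrderGateway().place_order("ABC", 10.5, 100)`, with Boost.Python doing the type conversions at the boundary.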

Frederico answered 1/9, 2011 at 19:13 Comment(3)
I'm just curious: how do your C++ and Python live in the same process? Boost.Python?Fango
@Lazylabs: Yes, Boost.Python (and even Python itself, but the C API is a bit.. well.. C) allows for embedding, i.e. launching the Python interpreter from C/C++. You could also write a Python wrapper over your C++'s "main" (which would presumably create some threads and return) and thus have the app launched from Python instead. It makes little difference either way (unless both your libs require an event loop on the main thread.)Hypogeal
Actually @Hypogeal I am doing extending rather than embedding. Python is the OS entry point to my process, and I have Boost.Python output a DLL that works as a .pyd Python module. Python loads the PYD (== C++ DLL) dynamically when my module is imported in Python. From there, Python can instantiate wrapped C++ classes and call wrapped C++ methods.Frederico

You have to pick an existing protocol, or create your own, that allows communication between C++ and Python. The easiest way, I believe, would be to use an IPC framework like ZeroC Ice or CORBA. Alternatively, you can pull native C++ code into Python and use it from Django; that code could use QuickFIX as well.

And if you are really concerned about latency (where at least milliseconds matter, never mind nanoseconds), you shouldn't be using QuickFIX or Python at all.

Twitt answered 1/9, 2011 at 19:7 Comment(0)

I would use ZeroMQ for the IPC.
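For what it's worth, a minimal sketch of the receiving side in C++, assuming the cppzmq header (zmq.hpp); the endpoint and the plain-text message format are placeholders:

```cpp
// C++ backend side: REP socket that receives order messages from the Django
// process (which would use pyzmq with a matching REQ socket) and acknowledges them.
#include <zmq.hpp>
#include <cstring>
#include <iostream>
#include <string>

int main() {
    zmq::context_t ctx{1};
    zmq::socket_t responder{ctx, zmq::socket_type::rep};
    responder.bind("ipc:///tmp/orders");   // or "tcp://*:5555" to span machines

    while (true) {
        zmq::message_t order;
        if (!responder.recv(order, zmq::recv_flags::none))
            continue;

        std::string payload{static_cast<char*>(order.data()), order.size()};
        std::cout << "received order: " << payload << "\n";
        // ...hand the order to the QuickFIX session here...

        zmq::message_t ack{3};
        std::memcpy(ack.data(), "ACK", 3);
        responder.send(ack, zmq::send_flags::none);
    }
}
```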

Reiss answered 1/9, 2011 at 19:24 Comment(1)
Thanks. ZeroMQ looks promising; I'll investigate this further. And scaling across multiple systems would not be a problem with this setup, which might be an issue with shared memory.Fango

I'd probably go for something like JSON-RPC and communicate over local sockets or named pipes.

Shared memory is faster, but trickier to get right if you have to do it yourself (it implies concurrency and locking, which, IMO, one should avoid if possible.)

It depends on message sizes and latency requirements. And you could always try an IPC mechanism that can work over shared memory, like vlad mentions in the comment below.

(Do note that having an IPC system that can fall back on pipes/sockets might be a good thing, in case you need to cluster your system in the future.)
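A rough sketch of the JSON-RPC-over-a-local-socket idea from the web-tier side, assuming a Unix-domain socket and a hypothetical "place_order" method (both are placeholders, not part of the answer):

```cpp
// Client side: connect to the backend's Unix-domain socket and send one
// newline-delimited JSON-RPC 2.0 request, then read the reply.
#include <sys/socket.h>
#include <sys/un.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>
#include <string>

int main() {
    int fd = socket(AF_UNIX, SOCK_STREAM, 0);
    if (fd < 0) { perror("socket"); return 1; }

    sockaddr_un addr{};
    addr.sun_family = AF_UNIX;
    std::strncpy(addr.sun_path, "/tmp/order_gateway.sock", sizeof(addr.sun_path) - 1);

    if (connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) {
        perror("connect");
        return 1;
    }

    const std::string request =
        R"({"jsonrpc":"2.0","method":"place_order",)"
        R"("params":{"symbol":"ABC","side":"buy","qty":100},"id":1})" "\n";
    write(fd, request.data(), request.size());

    char buf[4096];
    ssize_t n = read(fd, buf, sizeof(buf) - 1);
    if (n > 0) {
        buf[n] = '\0';
        std::printf("response: %s\n", buf);
    }
    close(fd);
    return 0;
}
```

In a real deployment the Django side would build this request with Python's json module rather than a raw string; the point is only that the framing stays simple.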

Hypogeal answered 1/9, 2011 at 19:8 Comment(2)
As a matter of fact, shared memory is faster than sockets etc. Many good OSes implement local IPC through shared memory.Twitt
@Vlad: I was a bit unclear. I chiefly meant that implementing your own IPC layer in shared memory is more difficult than doing so over sockets or pipes.Hypogeal

You should/can use a microservice architecture with message brokers so you can easily connect your apps to each other.

Cuneiform answered 23/7, 2022 at 9:58 Comment(0)
