Why is Sparkjava not suitable for production? [closed]

No, not Apache Spark. I have found SparkJava to be the simplest approach to building APIs and the one that clicks the most with me, but everyone states that it shouldn't be used for production, and no one has given me a good answer why. Do any of you know of any instability, security flaws, or other issues?

Enscroll answered 10/11, 2017 at 19:48 Comment(10)
Do you have any sources on who says it shouldn't be used for prod?Regelation
I use spark-java in production and am happy with it!Appomattox
I went to an NFJS conference where they showed this framework off alongside others, but immediately stated it is not for prod.Enscroll
@Appomattox Can you give more details on the setup you use, so I have an example of a real-life team using it well?Enscroll
A bit late to the party, but my team has been running a system in production using SparkJava since 2015. Over 30 million registered users and 50 thousand signups a day. Who needs dependency injection and all that bloat anyway? Simplicity over runtime «magic» any day.Tacnaarica
I am in agreement with you.Enscroll
I use SparkJava in production, and it will hopefully be rolled out to a whole country for one app. If you know your app's requirements and you don't need Spring modules, you can use it in prod. Dependency injection is a design pattern, so you can still use DI without a framework (see the sketch after these comments). I actually still use Spring (not Spring Boot) in my app, though even Spring is overkill for a few injections; I'm planning to remove it as well, because my app is not a large-scale J2EE system and I would like to reduce jar dependencies. Spring Boot brings hundreds of MB of dependencies, but my SparkJava app is only a few MB and is easy to distribute and deploy. Verdict: it depends on the app.Neurovascular
Another interesting library I have found, and one that seems slightly more friendly to Java modules, is Helidon by Oracle. It seems SparkJava is not going to be ready for Java modules for a while.Enscroll
I think Spark's big problem is its name; it is easily confused with Apache's Spark framework. They could have named it Alspark or Spock, or anything other than Spark.Philips
@LEMUELADANE I agree, having to search for Sparkjava or Javaspark or whatever gets old really quickly. But Spock is also taken; it's a Groovy test framework.Lucylud
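
As a rough illustration of the point above that dependency injection is just a design pattern, here is a minimal sketch of a SparkJava app with hand-wired constructor injection and no DI framework at all. The GreetingService and GreetingController names are made up for the example, and it only assumes spark-core is on the classpath.

    import static spark.Spark.get;
    import static spark.Spark.port;

    public class App {

        // Hypothetical service, created and wired by hand instead of by a container.
        interface GreetingService {
            String greet(String name);
        }

        static class SimpleGreetingService implements GreetingService {
            @Override
            public String greet(String name) {
                return "Hello, " + name + "!";
            }
        }

        // "Injection" here is nothing more than a constructor parameter.
        static class GreetingController {
            private final GreetingService service;

            GreetingController(GreetingService service) {
                this.service = service;
            }

            void registerRoutes() {
                get("/greet/:name", (req, res) -> service.greet(req.params(":name")));
            }
        }

        public static void main(String[] args) {
            port(8080); // embedded Jetty, no external container needed
            new GreetingController(new SimpleGreetingService()).registerRoutes();
        }
    }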

Thanks for asking such a good question; I don't suppose there is a simple yes-or-no answer to it. Let me start by saying that we have been using Java Spark for microservice development for quite some time now. These are the challenges we faced during our run with the framework:

  1. There is not a lot of material available on the internet for Java Spark compared to other frameworks like Spring Boot or Restlet, so most of the time, if you are stuck, you are the one who has to solve the problem.

  2. There is no dependency injection. We had to use Spring in conjunction with Java Spark to get this feature (a minimal sketch of that wiring is shown below).

  3. There is not enough information available on the internet about integrating your microservice with service discovery tools or an API gateway.

  4. Integrating Spark with Swagger was a mess. It took us days to figure out how to do it.

  5. As it is still evolving and has a long way to go, we don't prefer a framework like this in prod.

All that said, it is super easy to create a microservice application on Spark Java. Kudos to them!
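
As a rough illustration of point 2, here is a minimal sketch of how plain Spring (not Spring Boot) can be combined with SparkJava to get dependency injection. The UserService bean and AppConfig class are hypothetical names for the example, and it assumes spark-core and spring-context are on the classpath.

    import org.springframework.context.annotation.AnnotationConfigApplicationContext;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    import static spark.Spark.get;
    import static spark.Spark.port;

    public class UserApi {

        // Hypothetical service whose lifecycle is owned by Spring.
        public static class UserService {
            public String findUser(String id) {
                return "{\"id\": \"" + id + "\"}";
            }
        }

        @Configuration
        public static class AppConfig {
            @Bean
            public UserService userService() {
                return new UserService();
            }
        }

        public static void main(String[] args) {
            // Spring builds the object graph; Spark only maps HTTP routes onto it.
            AnnotationConfigApplicationContext ctx =
                    new AnnotationConfigApplicationContext(AppConfig.class);
            UserService users = ctx.getBean(UserService.class);

            port(8080);
            get("/users/:id", (req, res) -> {
                res.type("application/json");
                return users.findUser(req.params(":id"));
            });
        }
    }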

Please refer to the links below for more info:

Lashawnda answered 10/11, 2017 at 20:36 Comment(3)
Thank you, that helps me understand the real-life challenges a real team faced, rather than the speculation I have heard from others.Enscroll
You can trash Spring Boot and just use Spark.Philips
See SparkJava + Swagger solutions at serol.ro/posts/2016/swagger_sparkjava and github.com/manusant/spark-swaggerArbutus

That's all a matter of choice. Spark is intended to be very simple, and minimal dependencies are required to get a web app up and running. Spark lets us build a web app using only the Java SE 8 platform, while most of the other existing technologies would require Java EE, which would considerably increase the learning curve.

When it comes to deploying your app, be aware that most cloud servers and hosters don't support the lean Spark framework out of the box, so you'll end up having to implement a deployment strategy yourself. But you can easily deploy an Apache or Java EE app on such services, and Spark can easily be wrapped in an Apache or Java EE web server, as described in the documentation.
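
For example, the documented route for running SparkJava inside an existing servlet container (rather than on its embedded Jetty) is to declare the routes in an implementation of spark.servlet.SparkApplication and register spark.servlet.SparkFilter in web.xml with an applicationClass init-param pointing at that class. The class and route below are only an illustrative sketch:

    import spark.servlet.SparkApplication;

    import static spark.Spark.get;

    // Packaged as a WAR: the servlet container invokes init() through
    // spark.servlet.SparkFilter (configured in web.xml), so the application
    // itself starts no embedded Jetty server.
    public class HelloSparkApp implements SparkApplication {

        @Override
        public void init() {
            get("/hello", (request, response) -> "Hello from a servlet container");
        }
    }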

Remark answered 11/11, 2017 at 11:40 Comment(1)
Thank you, that helps me expect that setup will be an issue if I go with a traditional style of setting up a service. I will probably find a simple approach, or something to do with scalable containers.Enscroll
