If set up as described below, you can manage your project via sbt and run your test cases via sbt test. Also, after setting things up this way, the project can be opened and used in IntelliJ IDEA 14 as an sbt project.
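A minimal build.sbt along these lines is enough for sbt and IntelliJ IDEA to pick the project up; the project name, version, and ScalaTest version below are placeholders, not taken from the original setup:

```scala
// build.sbt — minimal sketch; name, version, and dependency versions are examples
name := "analyze"

version := "0.1.0-SNAPSHOT"

// Scala 2.11 to match the custom Spark build described below
scalaVersion := "2.11.7"

// A test framework so that `sbt test` actually has something to run;
// ScalaTest is one common choice, the exact version here is an assumption
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % "test"
```

With this file in the project root, `sbt test` compiles the sources and runs any test suites under src/test/scala, and IntelliJ IDEA 14 can import the directory directly as an sbt project.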
I'm currently in the process of creating a new Analyze component, and I want to build the Spark setup and the jobs using Scala 2.11. Therefore, I had to compile my own version of Spark 1.5.1, put it onto the systems, and run a cluster from that. This post describes what worked for me.
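For reference, building Spark 1.5.x against Scala 2.11 roughly looks like the following; the Hadoop profile is an example and should be matched to your cluster, and the distribution name is a placeholder:

```shell
# From a Spark 1.5.1 source checkout: switch the build to Scala 2.11
# using the helper script that ships with Spark
./dev/change-scala-version.sh 2.11

# Build a deployable .tgz distribution against Scala 2.11.
# -Phadoop-2.6 is an example profile; --name just labels the archive.
./make-distribution.sh --name scala-2.11 --tgz -Dscala-2.11 -Phadoop-2.6 -DskipTests
```

The resulting tarball can then be unpacked on the cluster machines and used to start the master and workers as usual.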
The good news is that it's actually quite simple, but the bad news is that it works completely differently from object-orientation in languages like C++, Java, Ruby, Python, or PHP, making it not quite so simple to understand. But fear not, we are going to take it step by step.