Do I have to kill the existing spark-shell session before loading again with --packages?
I doubt it is possible in spark-shell, given all the "goodies" it sets up to make Spark work nicely with Scala implicits and the like.
Even if you managed to create a SparkContext or SparkSession with the new jar file loaded, what about the data structures you have already created? They would still be tied to the old, "incompatible" session and hence become unusable (or lead to hard-to-trace classloader issues).
So, yes, I'd recommend quitting the existing spark-shell session and starting over.
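For example, a minimal restart could look like the sketch below; the spark-avro coordinate is only an illustrative assumption, so substitute whichever package you actually want to load:

```
# inside the running spark-shell, exit the REPL first
scala> :quit

# then relaunch with the extra package on the classpath
# (the spark-avro coordinate below is just an example)
$ spark-shell --packages org.apache.spark:spark-avro_2.12:3.5.0
```

Any DataFrames or other state from the old session are gone after the restart, so you re-run your setup code in the new session with the package already available.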