
In general, Spark submission arguments
(--submission_args) are translated into system
properties (-Dname=value) and other JVM parameters such as the classpath. Application
arguments (-app_args) are passed directly to
the application.
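For instance, the two kinds of arguments might appear together on one command line (the property value, JAR name, and application argument below are illustrative, not defaults):

```shell
# Submission argument: becomes the system property -Dspark.ui.port=4041 in the JVM.
# Application argument: input.csv is passed through to the application's main().
dse spark-submit --conf spark.ui.port=4041 myApp.jar input.csv
```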

Configure the Spark shell with these arguments:

--conf name=spark.value|sparkproperties.conf

An arbitrary Spark option that is added to the Spark configuration, prefixed by spark.

name=spark.value - a single configuration property.

sparkproperties.conf - a Spark configuration file.

--executor-memory mem

The amount of memory that each executor can consume for the application. The Spark
default is 512 MB. Specify the memory argument in JVM format using the k, m, or g
suffix.
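The k, m, and g suffixes follow the JVM's -Xmx convention (powers of 1024). As a rough sketch of that interpretation, the helper below (illustrative, not part of DSE) resolves such a value to bytes:

```python
def parse_jvm_memory(value: str) -> int:
    """Convert a JVM-style memory string such as '512m' or '2g' to bytes."""
    multipliers = {"k": 1024, "m": 1024 ** 2, "g": 1024 ** 3}
    value = value.strip().lower()
    if value and value[-1] in multipliers:
        return int(value[:-1]) * multipliers[value[-1]]
    return int(value)  # no suffix: plain byte count

# Spark's 512 MB default expressed in bytes
print(parse_jvm_memory("512m"))  # 536870912
```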

-framework dse|spark-2.0

The classpath for the Spark shell. When not set, the default is dse.

dse - Sets the Spark classpath to the same classpath that is used by the DSE
server.

spark-2.0 - Sets a classpath that is used by the open source Spark (OSS) 2.0
release to accommodate applications originally written for open source Apache Spark.
Uses a BYOS (Bring Your Own Spark) JAR with shaded references to internal
dependencies to eliminate complexity when porting an app from OSS Spark.

Note: Applications that already run on the DSE classpath do not require the
spark-2.0 framework. Full support in the spark-2.0 framework might require
specifying additional dependencies. For example, hadoop-aws is included on the DSE
server classpath but is not present on the OSS spark-2.0 classpath, so applications
that use S3 or other AWS APIs must include their own aws-sdk on the runtime
classpath. This additional runtime dependency is required only for applications
that cannot run on the DSE classpath.
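For example, an application written for OSS Apache Spark might be started under the alternate framework (the flag spelling follows the option listing above):

```shell
# Start the Spark shell with the OSS-compatible classpath instead of the DSE default
dse spark -framework spark-2.0
```

Any extra runtime dependencies, such as aws-sdk in the note above, would then be supplied through the application's own packaging or classpath settings.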