Metrics

Reporting metrics is supported via Dropwizard and the default Spark metrics setup. The metrics can be enabled by adding the following lines to your /etc/spark/conf/metrics.properties file:

driver.source.io.tiledb.spark.class=org.apache.spark.metrics.TileDBMetricsSource
executor.source.io.tiledb.spark.class=org.apache.spark.metrics.TileDBMetricsSource
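If editing metrics.properties on every node is inconvenient, Spark also accepts metrics configuration as application properties with the spark.metrics.conf. prefix. The sketch below shows the same two source registrations passed at launch time; the jar path is a placeholder you would replace with your own build artifact.

```shell
# Sketch: register the TileDB metrics source via --conf instead of
# editing metrics.properties. The --jars path is a placeholder.
spark-shell \
  --jars path/to/TileDB-Spark/build/libs/tiledb-spark-<version>.jar \
  --conf spark.metrics.conf.driver.source.io.tiledb.spark.class=org.apache.spark.metrics.TileDBMetricsSource \
  --conf spark.metrics.conf.executor.source.io.tiledb.spark.class=org.apache.spark.metrics.TileDBMetricsSource
```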

When loading an application jar (e.g. via the --jars CLI flag when launching a Spark shell), the metrics are available on the master node and the driver metrics report correctly. The executors, however, fail with a class-not-found error, because each worker node needs a jar containing org.apache.spark.metrics.TileDBMetricsSource on its classpath. To address this, copy the dedicated path/to/TileDB-Spark/build/libs/tiledb-spark-metrics-<version>.jar to $SPARK_HOME/jars/ on every worker node.
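One way to distribute the jar is a simple scp loop over the worker nodes. This is a sketch: the hostnames worker1..worker3 are hypothetical, and <version> is a placeholder for the version you built.

```shell
# Hypothetical worker hostnames; substitute your own cluster nodes.
# Copies the metrics jar into Spark's jar directory on each worker so
# executors can load org.apache.spark.metrics.TileDBMetricsSource.
for host in worker1 worker2 worker3; do
  scp path/to/TileDB-Spark/build/libs/tiledb-spark-metrics-<version>.jar \
      "$host:$SPARK_HOME/jars/"
done
```

After copying, restart the workers (or relaunch the application) so the new jar is picked up.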