import numpy as np
from scipy import ndimage

im = np.array(...)  # binary input image
# Label connected components (scipy.ndimage.measurements is deprecated;
# use scipy.ndimage.label directly).
labels, num_features = ndimage.label(im)
# threshold=np.inf disables summarization; nan is rejected by newer NumPy,
# and linewidth must be an int.
np.set_printoptions(threshold=np.inf, linewidth=10000)
print(labels[0:100, 0:100])
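As a self-contained illustration (using a tiny synthetic binary image in place of the real data), ndimage.label assigns a distinct integer to each connected component:

```python
import numpy as np
from scipy import ndimage

# A small binary image with two separate blobs of ones.
im = np.array([
    [1, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 0, 1, 1],
])

# Default connectivity is 4-connected (cross-shaped structuring element).
labels, num_features = ndimage.label(im)
print(num_features)  # two connected components
print(labels)        # blob pixels labeled 1 and 2, background stays 0
```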
MNIST LeNet
import org.apache.sysml.api.mlcontext._
val ml = new MLContext(spark)
val clf = ml.nn.examples.Mnist_lenet
// Generate dummy training and validation data.
val dummy = clf.generate_dummy_data
val dummyVal = clf.generate_dummy_data
// train(X, Y, X_val, Y_val, C, Hin, Win, epochs) -- here a single epoch.
val params = clf.train(dummy.X, dummy.Y, dummyVal.X, dummyVal.Y, dummy.C, dummy.Hin, dummy.Win, 1)
val probs = clf.predict(dummy.X, dummy.C, dummy.Hin, dummy.Win, params.W1, params.b1, params.W2, params.b2, params.W3, params.b3, params.W4, params.b4)
Recursively remove .DS_Store files
find . -name '.DS_Store' -type f -delete
Launch spark-shell with SystemML on the driver classpath (packaged jar, or exploded classes plus dependencies)
spark-shell --executor-memory 4G --driver-memory 4G --driver-class-path=target/SystemML.jar
spark-shell --executor-memory 4G --driver-memory 4G --driver-class-path="target/classes:target/lib/antlr4-runtime-4.5.3.jar:target/lib/wink-json4j-1.4.jar"
Deploying project artifacts to the Apache Snapshot Repository takes a little setup. These are the steps I followed.
First, set up Maven password encryption as described here: https://maven.apache.org/guides/mini/guide-encryption.html
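Condensed from that guide, the setup looks roughly like this (the server id shown is the one conventionally used for Apache snapshot deploys; adjust it to whatever the project's POM references):

```shell
# 1. Create a master password; paste the output into ~/.m2/settings-security.xml:
#      <settingsSecurity>
#        <master>{encrypted-master-password}</master>
#      </settingsSecurity>
mvn --encrypt-master-password

# 2. Encrypt your Apache account password, then reference it from a
#    <server> entry in ~/.m2/settings.xml:
#      <servers>
#        <server>
#          <id>apache.snapshots.https</id>
#          <username>your-apache-id</username>
#          <password>{encrypted-password}</password>
#        </server>
#      </servers>
mvn --encrypt-password
```

With that in place, `mvn deploy` can authenticate against the snapshot repository without a plaintext password on disk.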
pip3 install -U -e src/main/python
mvn clean package
PYSPARK_PYTHON=python3 spark-submit --master local[*] --driver-class-path target/SystemML.jar src/main/python/tests/test_mllearn.py
PYSPARK_PYTHON=python3 spark-submit --master local[*] --driver-class-path target/SystemML.jar src/main/python/tests/test_mlcontext.py