[    0.000000] Booting Linux on physical CPU 0x0
[    0.000000] Linux version 3.14.0-xilinx-gf387dab-dirty (lzq@armdev01) (gcc version 4.8.3 20140320 (prerelease) (Sourcery CodeBench Lite 2014.05-23) ) #38 SMP PREEMPT Fri Jun 17 20:02:51 CST 2016
[    0.000000] CPU: ARMv7 Processor [413fc090] revision 0 (ARMv7), cr=18c5387d
[    0.000000] CPU: PIPT / VIPT nonaliasing data cache, VIPT aliasing instruction cache
[    0.000000] Machine model: Xilinx Zynq
[    0.000000] cma: CMA: reserved 128 MiB at 27800000
[    0.000000] Memory policy: Data cache writealloc
[    0.000000] On node 0 totalpages: 258048
[    0.000000] free_area_init_node: node 0, pgdat c06e4600, node_mem_map e6fd8000
[    0.000000] Normal zone: 1520 pages used for memmap
library(text2vec)
library(xgboost)
library(pdp)
# Create the document term matrix (bag of words) using the movie_review data frame
# provided in the text2vec package (sentiment analysis problem)
data("movie_review")
# Tokenize the movie reviews and create a vocabulary of tokens including document counts
it <- itoken(movie_review$review, preprocessor = tolower,
             tokenizer = word_tokenizer)
vocab <- create_vocabulary(it)
ERROR: Bundle org.protege.common [1] Error starting file:/home/davek/apps/Protege_4.3/bundles/org.protege.common.jar (org.osgi.framework.BundleException: Unresolved constraint in bundle org.protege.common [1]: Unable to resolve 1.0: missing requirement [1.0] osgi.wiring.package; (&(osgi.wiring.package=org.w3c.dom)(version>=0.0.0)))
org.osgi.framework.BundleException: Unresolved constraint in bundle org.protege.common [1]: Unable to resolve 1.0: missing requirement [1.0] osgi.wiring.package; (&(osgi.wiring.package=org.w3c.dom)(version>=0.0.0))
	at org.apache.felix.framework.Felix.resolveBundleRevision(Felix.java:3818)
	at org.apache.felix.framework.Felix.startBundle(Felix.java:1868)
	at org.apache.felix.framework.Felix.setActiveStartLevel(Felix.java:1191)
	at org.apache.felix.framework.FrameworkStartLevelImpl.run(FrameworkStartLevelImpl.java:295)
	at java.lang.Thread.run(Thread.java:744)
Error: Could not parse XML contribution for "org.eclipse.equinox.registry//plugin.xml". Any contributed extensions and exte
library(shiny)
fileUrl <- url("http://s3.amazonaws.com/data-excursions/states_cases.Rda")
load(fileUrl)
diseases <- unique(as.character(states_cases$disease))
states <- unique(as.character(states_cases$state))
shinyServer(function(input, output) {
cascading.pipe.OperatorException: [d9b750b1-28f6-4c53-a32...][cascading.pipe.assembly.AggregateBy.initialize(AggregateBy.java:564)] operator Each failed executing operation
	at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:107)
	at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:39)
	at cascading.flow.stream.FilterEachStage.receive(FilterEachStage.java:73)
	at cascading.flow.stream.FilterEachStage.receive(FilterEachStage.java:34)
	at cascading.flow.stream.FunctionEachStage$1.collect(FunctionEachStage.java:80)
	at cascading.tuple.TupleEntryCollector.safeCollect(TupleEntryCollector.java:119)
	at cascading.tuple.TupleEntryCollector.add(TupleEntryCollector.java:107)
	at com.idexx.lambda.hadoop.jobs.assembler.EntityAssembler$ExtractFields.operate(EntityAssembler.java:121)
	at cascalog.CascalogFunctionExecutor.operate(CascalogFunctionExecutor.java:41)
grammar MinRtf ;
document : (control | text)+ ;
text : TEXT ;
control : KEYWORD INT? SPACE? ;
KEYWORD : '\\' ASCIILETTER+ ;
// The lexer rules below are not in the original snippet; they are assumed
// definitions for a minimal RTF subset so the grammar compiles.
INT : [0-9]+ ;
SPACE : ' ' ;
TEXT : ~'\\'+ ;
fragment ASCIILETTER : [a-zA-Z] ;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
/* If I execute only the clientQuery or only the emailQuery by itself, everything
   works correctly. I set breakpoints inside the ExtractClientEdgeFields() and
   ExtractClientId() functions, and each is called only with Data objects of the
   correct property types. However, if I execute the query as shown here, only one
   of the two functions is called, and it receives all of the Data objects from
   both taps. */
public static Subquery clientEmail(String pailPath) {
    PailTap clientEdgeTap = clientEdgeTap(pailPath);
    PailTap clientTap = petOwnerTap(pailPath);
public MutableClass changeName(MutableClass oldNameClass, String newName) {
    MutableClass newNameClass = new MutableClass();
    // This assignment discards the instance created above and aliases the
    // caller's object, so setName below also mutates oldNameClass.
    newNameClass = oldNameClass;
    newNameClass.setName(newName);
    return newNameClass;
}
@Test
public void changeNameTest() {
    MutableClass original_name = new MutableClass("my name");
    MutableClass expected_name = new MutableClass("my name");
    NameFilter filter = new NameFilter();
    MutableClass new_name = filter.changeName(original_name, "new name");
    assertEquals(new_name, expected_name);
}
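Because `changeName` aliases its argument, the call above renames the caller's `original_name` as well, so `original_name` no longer matches `expected_name` after the call. A side-effect-free variant copies the argument's state before mutating. The sketch below is a minimal, self-contained illustration, assuming `MutableClass` also exposes a `getName()` accessor (not shown in the snippet) alongside the constructor and `setName` used above:

```java
// Minimal stand-ins for the classes in the snippet. getName() is an
// assumed accessor; the original gist only shows the constructor and setName.
class MutableClass {
    private String name;
    MutableClass() {}
    MutableClass(String name) { this.name = name; }
    void setName(String name) { this.name = name; }
    String getName() { return name; }
}

class NameFilter {
    // Defensive copy: build a new instance from the argument's state,
    // then mutate only the copy. The caller's object is left untouched.
    MutableClass changeName(MutableClass oldNameClass, String newName) {
        MutableClass newNameClass = new MutableClass(oldNameClass.getName());
        newNameClass.setName(newName);
        return newNameClass;
    }
}

public class ChangeNameDemo {
    public static void main(String[] args) {
        MutableClass original = new MutableClass("my name");
        MutableClass renamed = new NameFilter().changeName(original, "new name");
        System.out.println(original.getName()); // my name
        System.out.println(renamed.getName());  // new name
    }
}
```

With this version the original object keeps its name, so an assertion comparing the untouched argument against a fresh `MutableClass("my name")` would hold (provided `MutableClass` implements a name-based `equals`).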