Impatient part 6
Created August 28, 2012 17:58
Paul-Lams-computer:part6 paullam$ hadoop jar target/impatient.jar data/rain.txt output/wc data/en.stop output/tfidf output/trap output/check | |
2012-08-28 18:52:15.457 java[16966:1903] Unable to load realm info from SCDynamicStore | |
12/08/28 18:52:16 INFO util.HadoopUtil: using default application jar, may cause class not found exceptions on the cluster | |
12/08/28 18:52:16 INFO planner.HadoopPlanner: using application jar: /Users/paullam/Dropbox/Projects/Impatient/part6/target/impatient.jar | |
12/08/28 18:52:16 INFO property.AppProps: using app.id: D5424D7B027EC9418FCADE8F3552429B | |
12/08/28 18:52:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable | |
12/08/28 18:52:16 WARN snappy.LoadSnappy: Snappy native library not loaded | |
12/08/28 18:52:16 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:16 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:16 INFO util.Version: Concurrent, Inc - Cascading 2.0.0 | |
12/08/28 18:52:16 INFO flow.Flow: [] starting | |
12/08/28 18:52:16 INFO flow.Flow: [] source: Hfs["TextDelimited[['stop']->[ALL]]"]["data/en.stop"]"] | |
12/08/28 18:52:16 INFO flow.Flow: [] source: Hfs["TextDelimited[['doc_id', 'text']->[ALL]]"]["data/rain.txt"]"] | |
12/08/28 18:52:16 INFO flow.Flow: [] sink: Hfs["TextDelimited[[UNKNOWN]->['?doc-id', '?word']]"]["tmp/checkpoint/data/etl-stage"]"] | |
12/08/28 18:52:16 INFO flow.Flow: [] parallel execution is enabled: false | |
12/08/28 18:52:16 INFO flow.Flow: [] starting jobs: 1 | |
12/08/28 18:52:16 INFO flow.Flow: [] allocating threads: 1 | |
12/08/28 18:52:16 INFO flow.FlowStep: [] starting step: (1/1) ...checkpoint/data/etl-stage | |
12/08/28 18:52:16 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId= | |
12/08/28 18:52:17 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:17 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:17 INFO flow.FlowStep: [] submitted hadoop job: job_local_0001 | |
12/08/28 18:52:17 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:17 INFO io.MultiInputSplit: current split input path: file:/Users/paullam/Dropbox/Projects/Impatient/part6/data/en.stop | |
12/08/28 18:52:17 INFO mapred.MapTask: numReduceTasks: 1 | |
12/08/28 18:52:17 INFO mapred.MapTask: io.sort.mb = 100 | |
12/08/28 18:52:17 INFO mapred.MapTask: data buffer = 79691776/99614720 | |
12/08/28 18:52:17 INFO mapred.MapTask: record buffer = 262144/327680 | |
12/08/28 18:52:17 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:17 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:18 INFO hadoop.FlowMapper: sourcing from: Hfs["TextDelimited[['stop']->[ALL]]"]["data/en.stop"]"] | |
12/08/28 18:52:18 INFO hadoop.FlowMapper: sinking to: CoGroup(e0e52018-cfda-4a01-bc7f-c9b35751c3b5*6ef25d09-8b10-474c-8c50-c67faf71b490)[by:e0e52018-cfda-4a01-bc7f-c9b35751c3b5:[{1}:'?word']6ef25d09-8b10-474c-8c50-c67faf71b490:[{1}:'?word']] | |
12/08/28 18:52:18 INFO hadoop.FlowMapper: trapping to: Hfs["TextLine[['line']->[ALL]]"]["tmp/trap"]"] | |
12/08/28 18:52:18 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:18 INFO mapred.MapTask: Starting flush of map output | |
12/08/28 18:52:18 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:18 INFO mapred.MapTask: Finished spill 0 | |
12/08/28 18:52:18 INFO mapred.Task: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:18 INFO mapred.LocalJobRunner: file:/Users/paullam/Dropbox/Projects/Impatient/part6/data/en.stop:0+544 | |
12/08/28 18:52:18 INFO mapred.Task: Task 'attempt_local_0001_m_000000_0' done. | |
12/08/28 18:52:18 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:18 INFO io.MultiInputSplit: current split input path: file:/Users/paullam/Dropbox/Projects/Impatient/part6/data/rain.txt | |
12/08/28 18:52:18 INFO mapred.MapTask: numReduceTasks: 1 | |
12/08/28 18:52:18 INFO mapred.MapTask: io.sort.mb = 100 | |
12/08/28 18:52:18 INFO mapred.MapTask: data buffer = 79691776/99614720 | |
12/08/28 18:52:18 INFO mapred.MapTask: record buffer = 262144/327680 | |
12/08/28 18:52:18 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:18 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:18 INFO hadoop.FlowMapper: sourcing from: Hfs["TextDelimited[['doc_id', 'text']->[ALL]]"]["data/rain.txt"]"] | |
12/08/28 18:52:18 INFO hadoop.FlowMapper: sinking to: CoGroup(e0e52018-cfda-4a01-bc7f-c9b35751c3b5*6ef25d09-8b10-474c-8c50-c67faf71b490)[by:e0e52018-cfda-4a01-bc7f-c9b35751c3b5:[{1}:'?word']6ef25d09-8b10-474c-8c50-c67faf71b490:[{1}:'?word']] | |
12/08/28 18:52:18 INFO hadoop.FlowMapper: trapping to: Hfs["TextLine[['line']->[ALL]]"]["tmp/trap"]"] | |
12/08/28 18:52:18 INFO util.Hadoop18TapUtil: setting up task: 'attempt_local_0001_m_000001_0' - file:/Users/paullam/Dropbox/Projects/Impatient/part6/tmp/trap/_temporary/_attempt_local_0001_m_000001_0 | |
12/08/28 18:52:18 WARN stream.TrapHandler: exception trap on branch: 'dcb3c45e-74ae-41a9-b401-7a585a4dffef', for fields: [{1}:'?doc-id'] tuple: ['zoink'] | |
cascading.pipe.OperatorException: [dcb3c45e-74ae-41a9-b40...][sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)] operator Each failed executing operation | |
at cascading.flow.stream.FilterEachStage.receive(FilterEachStage.java:68) | |
at cascading.flow.stream.FilterEachStage.receive(FilterEachStage.java:33) | |
at cascading.flow.stream.FunctionEachStage$1.collect(FunctionEachStage.java:67) | |
at cascading.tuple.TupleEntryCollector.safeCollect(TupleEntryCollector.java:93) | |
at cascading.tuple.TupleEntryCollector.add(TupleEntryCollector.java:86) | |
at cascading.operation.Identity.operate(Identity.java:110) | |
at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:86) | |
at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:38) | |
at cascading.flow.stream.FilterEachStage.receive(FilterEachStage.java:60) | |
at cascading.flow.stream.FilterEachStage.receive(FilterEachStage.java:33) | |
at cascading.flow.stream.FunctionEachStage$1.collect(FunctionEachStage.java:67) | |
at cascading.tuple.TupleEntryCollector.safeCollect(TupleEntryCollector.java:93) | |
at cascading.tuple.TupleEntryCollector.add(TupleEntryCollector.java:86) | |
at cascalog.ClojureMap.operate(Unknown Source) | |
at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:86) | |
at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:38) | |
at cascading.flow.stream.FunctionEachStage$1.collect(FunctionEachStage.java:67) | |
at cascading.tuple.TupleEntryCollector.safeCollect(TupleEntryCollector.java:93) | |
at cascading.tuple.TupleEntryCollector.add(TupleEntryCollector.java:86) | |
at cascading.operation.Identity.operate(Identity.java:110) | |
at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:86) | |
at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:38) | |
at cascading.flow.stream.FilterEachStage.receive(FilterEachStage.java:60) | |
at cascading.flow.stream.FilterEachStage.receive(FilterEachStage.java:33) | |
at cascading.flow.stream.FunctionEachStage$1.collect(FunctionEachStage.java:67) | |
at cascading.tuple.TupleEntryCollector.safeCollect(TupleEntryCollector.java:93) | |
at cascading.tuple.TupleEntryCollector.add(TupleEntryCollector.java:86) | |
at cascalog.ClojureMapcat.operate(Unknown Source) | |
at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:86) | |
at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:38) | |
at cascading.flow.stream.FilterEachStage.receive(FilterEachStage.java:60) | |
at cascading.flow.stream.FilterEachStage.receive(FilterEachStage.java:33) | |
at cascading.flow.stream.FunctionEachStage$1.collect(FunctionEachStage.java:67) | |
at cascading.tuple.TupleEntryCollector.safeCollect(TupleEntryCollector.java:93) | |
at cascading.tuple.TupleEntryCollector.add(TupleEntryCollector.java:86) | |
at cascading.operation.Identity.operate(Identity.java:110) | |
at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:86) | |
at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:38) | |
at cascading.flow.stream.SourceStage.map(SourceStage.java:102) | |
at cascading.flow.stream.SourceStage.run(SourceStage.java:58) | |
at cascading.flow.hadoop.FlowMapper.run(FlowMapper.java:124) | |
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:391) | |
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325) | |
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:210) | |
Caused by: java.lang.AssertionError: Assert failed: unexpected doc-id | |
(pred x) | |
at impatient.core$assert_tuple.invoke(core.clj:29) | |
at clojure.lang.AFn.applyToHelper(AFn.java:167) | |
at clojure.lang.AFn.applyTo(AFn.java:151) | |
at clojure.core$apply.invoke(core.clj:605) | |
at clojure.core$partial$fn__446.doInvoke(core.clj:2345) | |
at clojure.lang.RestFn.invoke(RestFn.java:408) | |
at clojure.lang.Var.invoke(Var.java:415) | |
at clojure.lang.AFn.applyToHelper(AFn.java:161) | |
at clojure.lang.Var.applyTo(Var.java:532) | |
at cascalog.ClojureCascadingBase.applyFunction(Unknown Source) | |
at cascalog.ClojureFilter.isRemove(Unknown Source) | |
at cascading.flow.stream.FilterEachStage.receive(FilterEachStage.java:57) | |
... 43 more | |
12/08/28 18:52:18 INFO io.TapOutputCollector: closing tap collector for: tmp/trap/part-m-00001-00001 | |
12/08/28 18:52:18 INFO util.Hadoop18TapUtil: committing task: 'attempt_local_0001_m_000001_0' - file:/Users/paullam/Dropbox/Projects/Impatient/part6/tmp/trap/_temporary/_attempt_local_0001_m_000001_0 | |
12/08/28 18:52:18 INFO util.Hadoop18TapUtil: saved output of task 'attempt_local_0001_m_000001_0' to file:/Users/paullam/Dropbox/Projects/Impatient/part6/tmp/trap | |
12/08/28 18:52:18 INFO mapred.MapTask: Starting flush of map output | |
12/08/28 18:52:18 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:18 INFO mapred.MapTask: Finished spill 0 | |
12/08/28 18:52:18 INFO mapred.Task: Task:attempt_local_0001_m_000001_0 is done. And is in the process of commiting | |
12/08/28 18:52:18 INFO mapred.LocalJobRunner: file:/Users/paullam/Dropbox/Projects/Impatient/part6/data/rain.txt:0+521 | |
12/08/28 18:52:18 INFO mapred.Task: Task 'attempt_local_0001_m_000001_0' done. | |
12/08/28 18:52:18 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:18 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:18 INFO mapred.Merger: Merging 2 sorted segments | |
12/08/28 18:52:18 INFO mapred.Merger: Down to the last merge-pass, with 2 segments left of total size: 4413 bytes | |
12/08/28 18:52:18 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:18 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:18 INFO hadoop.FlowReducer: sourcing from: CoGroup(e0e52018-cfda-4a01-bc7f-c9b35751c3b5*6ef25d09-8b10-474c-8c50-c67faf71b490)[by:e0e52018-cfda-4a01-bc7f-c9b35751c3b5:[{1}:'?word']6ef25d09-8b10-474c-8c50-c67faf71b490:[{1}:'?word']] | |
12/08/28 18:52:18 INFO hadoop.FlowReducer: sinking to: Hfs["TextDelimited[[UNKNOWN]->['?doc-id', '?word']]"]["tmp/checkpoint/data/etl-stage"]"] | |
12/08/28 18:52:18 INFO hadoop.FlowReducer: trapping to: Hfs["TextLine[['line']->[ALL]]"]["tmp/trap"]"] | |
12/08/28 18:52:18 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:18 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:23 INFO collect.SpillableTupleList: attempting to load codec: org.apache.hadoop.io.compress.GzipCodec | |
12/08/28 18:52:23 INFO collect.SpillableTupleList: found codec: org.apache.hadoop.io.compress.GzipCodec | |
12/08/28 18:52:23 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:23 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:23 INFO mapred.Task: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:23 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:23 INFO mapred.Task: Task attempt_local_0001_r_000000_0 is allowed to commit now | |
12/08/28 18:52:23 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/Users/paullam/Dropbox/Projects/Impatient/part6/tmp/checkpoint/data/etl-stage | |
12/08/28 18:52:23 INFO mapred.LocalJobRunner: reduce > reduce | |
12/08/28 18:52:23 INFO mapred.Task: Task 'attempt_local_0001_r_000000_0' done. | |
12/08/28 18:52:23 INFO util.Hadoop18TapUtil: deleting temp path tmp/checkpoint/data/etl-stage/_temporary | |
12/08/28 18:52:23 INFO util.Hadoop18TapUtil: deleting temp path tmp/trap/_temporary | |
12/08/28 18:52:23 INFO util.Hadoop18TapUtil: deleting temp path tmp/trap/_temporary | |
12/08/28 18:52:23 INFO util.HadoopUtil: using default application jar, may cause class not found exceptions on the cluster | |
12/08/28 18:52:23 INFO planner.HadoopPlanner: using application jar: /Users/paullam/Dropbox/Projects/Impatient/part6/target/impatient.jar | |
12/08/28 18:52:23 INFO flow.Flow: [] starting | |
12/08/28 18:52:23 INFO flow.Flow: [] source: Hfs["TextDelimited[[UNKNOWN]->[ALL]]"]["tmp/checkpoint/data/etl-stage"]"] | |
12/08/28 18:52:23 INFO flow.Flow: [] sink: Hfs["TextDelimited[[UNKNOWN]->['?word', '?count']]"]["output/wc"]"] | |
12/08/28 18:52:23 INFO flow.Flow: [] parallel execution is enabled: false | |
12/08/28 18:52:23 INFO flow.Flow: [] starting jobs: 1 | |
12/08/28 18:52:23 INFO flow.Flow: [] allocating threads: 1 | |
12/08/28 18:52:23 INFO flow.FlowStep: [] starting step: (1/1) output/wc | |
12/08/28 18:52:23 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized | |
12/08/28 18:52:23 INFO util.HadoopUtil: using default application jar, may cause class not found exceptions on the cluster | |
12/08/28 18:52:23 INFO planner.HadoopPlanner: using application jar: /Users/paullam/Dropbox/Projects/Impatient/part6/target/impatient.jar | |
12/08/28 18:52:23 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:23 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:24 INFO flow.Flow: [] starting | |
12/08/28 18:52:24 INFO flow.Flow: [] source: Hfs["TextDelimited[['doc02', 'air']->[ALL]]"]["tmp/checkpoint/data/etl-stage"]"] | |
12/08/28 18:52:24 INFO flow.Flow: [] sink: Hfs["SequenceFile[[UNKNOWN]->['?n-docs']]"]["/tmp/cascalog_reserved/ad21b9d9-b856-45a5-8692-4f57f0852633/2f2bb941-0e57-43ed-94f7-517ccf3af7f7"]"] | |
12/08/28 18:52:24 INFO flow.Flow: [] parallel execution is enabled: false | |
12/08/28 18:52:24 INFO flow.Flow: [] starting jobs: 2 | |
12/08/28 18:52:24 INFO flow.Flow: [] allocating threads: 1 | |
12/08/28 18:52:24 INFO flow.FlowStep: [] starting step: (1/2) | |
12/08/28 18:52:24 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized | |
12/08/28 18:52:24 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:24 INFO flow.FlowStep: [] submitted hadoop job: job_local_0002 | |
12/08/28 18:52:24 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:24 INFO io.MultiInputSplit: current split input path: file:/Users/paullam/Dropbox/Projects/Impatient/part6/tmp/checkpoint/data/etl-stage/part-00000 | |
12/08/28 18:52:24 INFO mapred.MapTask: numReduceTasks: 1 | |
12/08/28 18:52:24 INFO mapred.MapTask: io.sort.mb = 100 | |
12/08/28 18:52:24 INFO mapred.MapTask: data buffer = 79691776/99614720 | |
12/08/28 18:52:24 INFO mapred.MapTask: record buffer = 262144/327680 | |
12/08/28 18:52:24 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:24 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:24 INFO hadoop.FlowMapper: sourcing from: Hfs["TextDelimited[[UNKNOWN]->[ALL]]"]["tmp/checkpoint/data/etl-stage"]"] | |
12/08/28 18:52:24 INFO hadoop.FlowMapper: sinking to: GroupBy(0a61c499-7a88-4e2d-bc1f-3a09ea9d577e)[by:[{1}:'?word']] | |
12/08/28 18:52:24 INFO mapred.MapTask: Starting flush of map output | |
12/08/28 18:52:24 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:24 INFO mapred.MapTask: Finished spill 0 | |
12/08/28 18:52:24 INFO mapred.Task: Task:attempt_local_0002_m_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:24 INFO mapred.LocalJobRunner: file:/Users/paullam/Dropbox/Projects/Impatient/part6/tmp/checkpoint/data/etl-stage/part-00000:0+605 | |
12/08/28 18:52:24 INFO mapred.Task: Task 'attempt_local_0002_m_000000_0' done. | |
12/08/28 18:52:24 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:24 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:24 INFO mapred.Merger: Merging 1 sorted segments | |
12/08/28 18:52:24 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 561 bytes | |
12/08/28 18:52:24 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:24 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:24 INFO hadoop.FlowReducer: sourcing from: GroupBy(0a61c499-7a88-4e2d-bc1f-3a09ea9d577e)[by:[{1}:'?word']] | |
12/08/28 18:52:24 INFO hadoop.FlowReducer: sinking to: Hfs["TextDelimited[[UNKNOWN]->['?word', '?count']]"]["output/wc"]"] | |
12/08/28 18:52:24 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:24 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:24 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:24 INFO mapred.Task: Task:attempt_local_0002_r_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:24 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:24 INFO mapred.Task: Task attempt_local_0002_r_000000_0 is allowed to commit now | |
12/08/28 18:52:24 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0002_r_000000_0' to file:/Users/paullam/Dropbox/Projects/Impatient/part6/output/wc | |
12/08/28 18:52:25 INFO mapred.LocalJobRunner: reduce > reduce | |
12/08/28 18:52:25 INFO mapred.Task: Task 'attempt_local_0002_r_000000_0' done. | |
12/08/28 18:52:25 INFO flow.FlowStep: [] submitted hadoop job: job_local_0003 | |
12/08/28 18:52:25 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:25 INFO io.MultiInputSplit: current split input path: file:/Users/paullam/Dropbox/Projects/Impatient/part6/tmp/checkpoint/data/etl-stage/part-00000 | |
12/08/28 18:52:25 INFO mapred.MapTask: numReduceTasks: 1 | |
12/08/28 18:52:25 INFO mapred.MapTask: io.sort.mb = 100 | |
12/08/28 18:52:25 INFO util.Hadoop18TapUtil: deleting temp path output/wc/_temporary | |
12/08/28 18:52:25 INFO mapred.MapTask: data buffer = 79691776/99614720 | |
12/08/28 18:52:25 INFO mapred.MapTask: record buffer = 262144/327680 | |
12/08/28 18:52:25 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:25 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:25 INFO hadoop.FlowMapper: sourcing from: Hfs["TextDelimited[['doc02', 'air']->[ALL]]"]["tmp/checkpoint/data/etl-stage"]"] | |
12/08/28 18:52:25 INFO hadoop.FlowMapper: sinking to: GroupBy(584dfc0c-d162-4c9c-962b-ee062993f05a)[by:[{1}:'?__gen24']] | |
12/08/28 18:52:25 INFO mapred.MapTask: Starting flush of map output | |
12/08/28 18:52:25 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:25 INFO mapred.MapTask: Finished spill 0 | |
12/08/28 18:52:25 INFO mapred.Task: Task:attempt_local_0003_m_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:25 INFO mapred.LocalJobRunner: file:/Users/paullam/Dropbox/Projects/Impatient/part6/tmp/checkpoint/data/etl-stage/part-00000:0+605 | |
12/08/28 18:52:25 INFO mapred.Task: Task 'attempt_local_0003_m_000000_0' done. | |
12/08/28 18:52:25 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:25 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:25 INFO mapred.Merger: Merging 1 sorted segments | |
12/08/28 18:52:25 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 707 bytes | |
12/08/28 18:52:25 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:25 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:25 INFO hadoop.FlowReducer: sourcing from: GroupBy(584dfc0c-d162-4c9c-962b-ee062993f05a)[by:[{1}:'?__gen24']] | |
12/08/28 18:52:25 INFO hadoop.FlowReducer: sinking to: TempHfs["SequenceFile[['!__gen26', '!__gen27']]"][62c07e1f-09e6-4aaf-8d4c-c/59034/] | |
12/08/28 18:52:25 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:25 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:25 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:25 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:25 INFO mapred.Task: Task:attempt_local_0003_r_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:25 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:25 INFO mapred.Task: Task attempt_local_0003_r_000000_0 is allowed to commit now | |
12/08/28 18:52:25 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0003_r_000000_0' to file:/tmp/hadoop-paullam/62c07e1f_09e6_4aaf_8d4c_c_59034_157F9E4DBD5C6DCC79A67F642F295396 | |
12/08/28 18:52:25 INFO mapred.LocalJobRunner: reduce > reduce | |
12/08/28 18:52:25 INFO mapred.Task: Task 'attempt_local_0003_r_000000_0' done. | |
12/08/28 18:52:25 INFO flow.FlowStep: [] starting step: (2/2) ...57-43ed-94f7-517ccf3af7f7 | |
12/08/28 18:52:25 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized | |
12/08/28 18:52:25 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:25 INFO flow.FlowStep: [] submitted hadoop job: job_local_0004 | |
12/08/28 18:52:25 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:25 INFO io.MultiInputSplit: current split input path: file:/tmp/hadoop-paullam/62c07e1f_09e6_4aaf_8d4c_c_59034_157F9E4DBD5C6DCC79A67F642F295396/part-00000 | |
12/08/28 18:52:25 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:25 INFO mapred.MapTask: numReduceTasks: 1 | |
12/08/28 18:52:25 INFO mapred.MapTask: io.sort.mb = 100 | |
12/08/28 18:52:26 INFO mapred.MapTask: data buffer = 79691776/99614720 | |
12/08/28 18:52:26 INFO mapred.MapTask: record buffer = 262144/327680 | |
12/08/28 18:52:26 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:26 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:26 INFO hadoop.FlowMapper: sourcing from: TempHfs["SequenceFile[['!__gen26', '!__gen27']]"][62c07e1f-09e6-4aaf-8d4c-c/59034/] | |
12/08/28 18:52:26 INFO hadoop.FlowMapper: sinking to: GroupBy(62c07e1f-09e6-4aaf-8d4c-c0cd0e7cd473)[by:[{1}:'!__gen26']] | |
12/08/28 18:52:26 INFO mapred.MapTask: Starting flush of map output | |
12/08/28 18:52:26 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:26 INFO mapred.MapTask: Finished spill 0 | |
12/08/28 18:52:26 INFO mapred.Task: Task:attempt_local_0004_m_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:26 INFO mapred.LocalJobRunner: file:/tmp/hadoop-paullam/62c07e1f_09e6_4aaf_8d4c_c_59034_157F9E4DBD5C6DCC79A67F642F295396/part-00000:0+84 | |
12/08/28 18:52:26 INFO mapred.Task: Task 'attempt_local_0004_m_000000_0' done. | |
12/08/28 18:52:26 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:26 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:26 INFO mapred.Merger: Merging 1 sorted segments | |
12/08/28 18:52:26 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 11 bytes | |
12/08/28 18:52:26 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:26 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:26 INFO hadoop.FlowReducer: sourcing from: GroupBy(62c07e1f-09e6-4aaf-8d4c-c0cd0e7cd473)[by:[{1}:'!__gen26']] | |
12/08/28 18:52:26 INFO hadoop.FlowReducer: sinking to: Hfs["SequenceFile[[UNKNOWN]->['?n-docs']]"]["/tmp/cascalog_reserved/ad21b9d9-b856-45a5-8692-4f57f0852633/2f2bb941-0e57-43ed-94f7-517ccf3af7f7"]"] | |
12/08/28 18:52:26 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:26 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:26 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:26 INFO mapred.Task: Task:attempt_local_0004_r_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:26 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:26 INFO mapred.Task: Task attempt_local_0004_r_000000_0 is allowed to commit now | |
12/08/28 18:52:26 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0004_r_000000_0' to file:/tmp/cascalog_reserved/ad21b9d9-b856-45a5-8692-4f57f0852633/2f2bb941-0e57-43ed-94f7-517ccf3af7f7 | |
12/08/28 18:52:26 INFO mapred.LocalJobRunner: reduce > reduce | |
12/08/28 18:52:26 INFO mapred.Task: Task 'attempt_local_0004_r_000000_0' done. | |
12/08/28 18:52:26 INFO util.Hadoop18TapUtil: deleting temp path /tmp/cascalog_reserved/ad21b9d9-b856-45a5-8692-4f57f0852633/2f2bb941-0e57-43ed-94f7-517ccf3af7f7/_temporary | |
12/08/28 18:52:26 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:26 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:26 INFO util.HadoopUtil: using default application jar, may cause class not found exceptions on the cluster | |
12/08/28 18:52:26 INFO planner.HadoopPlanner: using application jar: /Users/paullam/Dropbox/Projects/Impatient/part6/target/impatient.jar | |
12/08/28 18:52:26 INFO flow.Flow: [] starting | |
12/08/28 18:52:26 INFO flow.Flow: [] source: Hfs["TextDelimited[['doc02', 'air']->[ALL]]"]["tmp/checkpoint/data/etl-stage"]"] | |
12/08/28 18:52:26 INFO flow.Flow: [] sink: Hfs["TextDelimited[[UNKNOWN]->['?doc-id', '?tf-idf', '?tf-word']]"]["output/tfidf"]"] | |
12/08/28 18:52:26 INFO flow.Flow: [] parallel execution is enabled: false | |
12/08/28 18:52:26 INFO flow.Flow: [] starting jobs: 5 | |
12/08/28 18:52:26 INFO flow.Flow: [] allocating threads: 1 | |
12/08/28 18:52:26 INFO flow.FlowStep: [] starting step: (1/5) | |
12/08/28 18:52:26 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized | |
12/08/28 18:52:26 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:27 INFO flow.FlowStep: [] submitted hadoop job: job_local_0005 | |
12/08/28 18:52:27 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:27 INFO io.MultiInputSplit: current split input path: file:/Users/paullam/Dropbox/Projects/Impatient/part6/tmp/checkpoint/data/etl-stage/part-00000 | |
12/08/28 18:52:27 INFO mapred.MapTask: numReduceTasks: 0 | |
12/08/28 18:52:27 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:27 INFO hadoop.FlowMapper: sourcing from: Hfs["TextDelimited[['doc02', 'air']->[ALL]]"]["tmp/checkpoint/data/etl-stage"]"] | |
12/08/28 18:52:27 INFO hadoop.FlowMapper: sinking to: TempHfs["SequenceFile[['?doc-id', '?word']]"][3b2fecf6-e588-401d-8f2e-4/43648/] | |
12/08/28 18:52:27 INFO mapred.Task: Task:attempt_local_0005_m_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:27 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:27 INFO mapred.Task: Task attempt_local_0005_m_000000_0 is allowed to commit now | |
12/08/28 18:52:27 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0005_m_000000_0' to file:/tmp/hadoop-paullam/3b2fecf6_e588_401d_8f2e_4_43648_AA98EA2E3D867CF29ADA8F5ABC248CAC | |
12/08/28 18:52:27 INFO mapred.LocalJobRunner: file:/Users/paullam/Dropbox/Projects/Impatient/part6/tmp/checkpoint/data/etl-stage/part-00000:0+605 | |
12/08/28 18:52:27 INFO mapred.Task: Task 'attempt_local_0005_m_000000_0' done. | |
12/08/28 18:52:27 INFO flow.FlowStep: [] starting step: (2/5) | |
12/08/28 18:52:27 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized | |
12/08/28 18:52:27 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:27 INFO flow.FlowStep: [] submitted hadoop job: job_local_0006 | |
12/08/28 18:52:27 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:27 INFO io.MultiInputSplit: current split input path: file:/tmp/hadoop-paullam/3b2fecf6_e588_401d_8f2e_4_43648_AA98EA2E3D867CF29ADA8F5ABC248CAC/part-00000 | |
12/08/28 18:52:27 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:27 INFO mapred.MapTask: numReduceTasks: 1 | |
12/08/28 18:52:27 INFO mapred.MapTask: io.sort.mb = 100 | |
12/08/28 18:52:27 INFO mapred.MapTask: data buffer = 79691776/99614720 | |
12/08/28 18:52:27 INFO mapred.MapTask: record buffer = 262144/327680 | |
12/08/28 18:52:27 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:27 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:27 INFO hadoop.FlowMapper: sourcing from: TempHfs["SequenceFile[['?doc-id', '?word']]"][3b2fecf6-e588-401d-8f2e-4/43648/] | |
12/08/28 18:52:27 INFO hadoop.FlowMapper: sinking to: GroupBy(0881370d-4014-4270-bec8-5d99bd333c53)[by:[{2}:'?tf-word', '?doc-id']] | |
12/08/28 18:52:27 INFO mapred.MapTask: Starting flush of map output | |
12/08/28 18:52:27 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:27 INFO mapred.MapTask: Finished spill 0 | |
12/08/28 18:52:27 INFO mapred.Task: Task:attempt_local_0006_m_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:27 INFO mapred.LocalJobRunner: file:/tmp/hadoop-paullam/3b2fecf6_e588_401d_8f2e_4_43648_AA98EA2E3D867CF29ADA8F5ABC248CAC/part-00000:0+1511 | |
12/08/28 18:52:27 INFO mapred.Task: Task 'attempt_local_0006_m_000000_0' done. | |
12/08/28 18:52:27 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:27 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:27 INFO mapred.Merger: Merging 1 sorted segments | |
12/08/28 18:52:27 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 1295 bytes | |
12/08/28 18:52:27 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:27 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:27 INFO hadoop.FlowReducer: sourcing from: GroupBy(0881370d-4014-4270-bec8-5d99bd333c53)[by:[{2}:'?tf-word', '?doc-id']] | |
12/08/28 18:52:27 INFO hadoop.FlowReducer: sinking to: TempHfs["SequenceFile[['?tf-word', '?doc-id', '?tf-count']]"][2a3894d4-1535-4e87-b9c6-c/23026/] | |
12/08/28 18:52:27 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:27 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:27 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:27 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:27 INFO mapred.Task: Task:attempt_local_0006_r_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:27 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:27 INFO mapred.Task: Task attempt_local_0006_r_000000_0 is allowed to commit now | |
12/08/28 18:52:27 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0006_r_000000_0' to file:/tmp/hadoop-paullam/2a3894d4_1535_4e87_b9c6_c_23026_C449FBD2EE9AC4860DD1C27EF2446BFF | |
12/08/28 18:52:27 INFO mapred.LocalJobRunner: reduce > reduce | |
12/08/28 18:52:27 INFO mapred.Task: Task 'attempt_local_0006_r_000000_0' done. | |
12/08/28 18:52:27 INFO flow.FlowStep: [] starting step: (3/5) | |
12/08/28 18:52:27 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized | |
12/08/28 18:52:27 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:28 INFO flow.FlowStep: [] submitted hadoop job: job_local_0007 | |
12/08/28 18:52:28 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:28 INFO io.MultiInputSplit: current split input path: file:/tmp/hadoop-paullam/3b2fecf6_e588_401d_8f2e_4_43648_AA98EA2E3D867CF29ADA8F5ABC248CAC/part-00000 | |
12/08/28 18:52:28 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:28 INFO mapred.MapTask: numReduceTasks: 1 | |
12/08/28 18:52:28 INFO mapred.MapTask: io.sort.mb = 100 | |
12/08/28 18:52:28 INFO mapred.MapTask: data buffer = 79691776/99614720 | |
12/08/28 18:52:28 INFO mapred.MapTask: record buffer = 262144/327680 | |
12/08/28 18:52:28 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:28 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:28 INFO hadoop.FlowMapper: sourcing from: TempHfs["SequenceFile[['?doc-id', '?word']]"][3b2fecf6-e588-401d-8f2e-4/43648/] | |
12/08/28 18:52:28 INFO hadoop.FlowMapper: sinking to: GroupBy(3f626e55-f546-402b-9384-6e9853219c26)[by:[{2}:'?__gen30', '?__gen29']] | |
12/08/28 18:52:28 INFO mapred.MapTask: Starting flush of map output | |
12/08/28 18:52:28 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:28 INFO mapred.MapTask: Finished spill 0 | |
12/08/28 18:52:28 INFO mapred.Task: Task:attempt_local_0007_m_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:28 INFO mapred.LocalJobRunner: file:/tmp/hadoop-paullam/3b2fecf6_e588_401d_8f2e_4_43648_AA98EA2E3D867CF29ADA8F5ABC248CAC/part-00000:0+1511 | |
12/08/28 18:52:28 INFO mapred.Task: Task 'attempt_local_0007_m_000000_0' done. | |
12/08/28 18:52:28 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:28 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:28 INFO mapred.Merger: Merging 1 sorted segments | |
12/08/28 18:52:28 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 1255 bytes | |
12/08/28 18:52:28 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:28 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:28 INFO hadoop.FlowReducer: sourcing from: GroupBy(3f626e55-f546-402b-9384-6e9853219c26)[by:[{2}:'?__gen30', '?__gen29']] | |
12/08/28 18:52:28 INFO hadoop.FlowReducer: sinking to: TempHfs["SequenceFile[['?df-word', '!__gen31']]"][3985e8fe-df89-4eee-9c9e-9/63524/] | |
12/08/28 18:52:28 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:28 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:28 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:28 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:28 INFO mapred.Task: Task:attempt_local_0007_r_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:28 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:28 INFO mapred.Task: Task attempt_local_0007_r_000000_0 is allowed to commit now | |
12/08/28 18:52:28 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0007_r_000000_0' to file:/tmp/hadoop-paullam/3985e8fe_df89_4eee_9c9e_9_63524_343B3D95D2B0A6143A4BF92B8E32A90A | |
12/08/28 18:52:28 INFO mapred.LocalJobRunner: reduce > reduce | |
12/08/28 18:52:28 INFO mapred.Task: Task 'attempt_local_0007_r_000000_0' done. | |
12/08/28 18:52:28 INFO flow.FlowStep: [] starting step: (5/5) | |
12/08/28 18:52:28 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized | |
12/08/28 18:52:28 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:28 INFO flow.FlowStep: [] submitted hadoop job: job_local_0008 | |
12/08/28 18:52:28 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:28 INFO io.MultiInputSplit: current split input path: file:/tmp/hadoop-paullam/3985e8fe_df89_4eee_9c9e_9_63524_343B3D95D2B0A6143A4BF92B8E32A90A/part-00000 | |
12/08/28 18:52:28 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:28 INFO mapred.MapTask: numReduceTasks: 1 | |
12/08/28 18:52:28 INFO mapred.MapTask: io.sort.mb = 100 | |
12/08/28 18:52:29 INFO mapred.MapTask: data buffer = 79691776/99614720 | |
12/08/28 18:52:29 INFO mapred.MapTask: record buffer = 262144/327680 | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO hadoop.FlowMapper: sourcing from: TempHfs["SequenceFile[['?df-word', '!__gen31']]"][3985e8fe-df89-4eee-9c9e-9/63524/] | |
12/08/28 18:52:29 INFO hadoop.FlowMapper: sinking to: GroupBy(3985e8fe-df89-4eee-9c9e-97407c4edd78)[by:[{1}:'?df-word']] | |
12/08/28 18:52:29 INFO mapred.MapTask: Starting flush of map output | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO mapred.MapTask: Finished spill 0 | |
12/08/28 18:52:29 INFO mapred.Task: Task:attempt_local_0008_m_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:29 INFO mapred.LocalJobRunner: file:/tmp/hadoop-paullam/3985e8fe_df89_4eee_9c9e_9_63524_343B3D95D2B0A6143A4BF92B8E32A90A/part-00000:0+764 | |
12/08/28 18:52:29 INFO mapred.Task: Task 'attempt_local_0008_m_000000_0' done. | |
12/08/28 18:52:29 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO mapred.Merger: Merging 1 sorted segments | |
12/08/28 18:52:29 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 546 bytes | |
12/08/28 18:52:29 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO hadoop.FlowReducer: sourcing from: GroupBy(3985e8fe-df89-4eee-9c9e-97407c4edd78)[by:[{1}:'?df-word']] | |
12/08/28 18:52:29 INFO hadoop.FlowReducer: sinking to: TempHfs["SequenceFile[['?df-count', '?tf-word']]"][75666488-7d53-4334-8eb1-5/38952/] | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO mapred.Task: Task:attempt_local_0008_r_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:29 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:29 INFO mapred.Task: Task attempt_local_0008_r_000000_0 is allowed to commit now | |
12/08/28 18:52:29 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0008_r_000000_0' to file:/tmp/hadoop-paullam/75666488_7d53_4334_8eb1_5_38952_9283E396B43699A1CC818E039BF16D61 | |
12/08/28 18:52:29 INFO mapred.LocalJobRunner: reduce > reduce | |
12/08/28 18:52:29 INFO mapred.Task: Task 'attempt_local_0008_r_000000_0' done. | |
12/08/28 18:52:29 INFO flow.FlowStep: [] starting step: (4/5) output/tfidf | |
12/08/28 18:52:29 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized | |
12/08/28 18:52:29 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:29 INFO mapred.FileInputFormat: Total input paths to process : 1 | |
12/08/28 18:52:29 INFO flow.FlowStep: [] submitted hadoop job: job_local_0009 | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO io.MultiInputSplit: current split input path: file:/tmp/hadoop-paullam/2a3894d4_1535_4e87_b9c6_c_23026_C449FBD2EE9AC4860DD1C27EF2446BFF/part-00000 | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO mapred.MapTask: numReduceTasks: 1 | |
12/08/28 18:52:29 INFO mapred.MapTask: io.sort.mb = 100 | |
12/08/28 18:52:29 INFO mapred.MapTask: data buffer = 79691776/99614720 | |
12/08/28 18:52:29 INFO mapred.MapTask: record buffer = 262144/327680 | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO hadoop.FlowMapper: sourcing from: TempHfs["SequenceFile[['?tf-word', '?doc-id', '?tf-count']]"][2a3894d4-1535-4e87-b9c6-c/23026/] | |
12/08/28 18:52:29 INFO hadoop.FlowMapper: sinking to: CoGroup(2a3894d4-1535-4e87-b9c6-cd485df181e8*75666488-7d53-4334-8eb1-535d97c10f6b)[by:2a3894d4-1535-4e87-b9c6-cd485df181e8:[{1}:'?tf-word']75666488-7d53-4334-8eb1-535d97c10f6b:[{1}:'?tf-word']] | |
12/08/28 18:52:29 INFO mapred.MapTask: Starting flush of map output | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO mapred.MapTask: Finished spill 0 | |
12/08/28 18:52:29 INFO mapred.Task: Task:attempt_local_0009_m_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:29 INFO mapred.LocalJobRunner: file:/tmp/hadoop-paullam/2a3894d4_1535_4e87_b9c6_c_23026_C449FBD2EE9AC4860DD1C27EF2446BFF/part-00000:0+1543 | |
12/08/28 18:52:29 INFO mapred.Task: Task 'attempt_local_0009_m_000000_0' done. | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO io.MultiInputSplit: current split input path: file:/tmp/hadoop-paullam/75666488_7d53_4334_8eb1_5_38952_9283E396B43699A1CC818E039BF16D61/part-00000 | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO mapred.MapTask: numReduceTasks: 1 | |
12/08/28 18:52:29 INFO mapred.MapTask: io.sort.mb = 100 | |
12/08/28 18:52:29 INFO mapred.MapTask: data buffer = 79691776/99614720 | |
12/08/28 18:52:29 INFO mapred.MapTask: record buffer = 262144/327680 | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO hadoop.FlowMapper: sourcing from: TempHfs["SequenceFile[['?df-count', '?tf-word']]"][75666488-7d53-4334-8eb1-5/38952/] | |
12/08/28 18:52:29 INFO hadoop.FlowMapper: sinking to: CoGroup(2a3894d4-1535-4e87-b9c6-cd485df181e8*75666488-7d53-4334-8eb1-535d97c10f6b)[by:2a3894d4-1535-4e87-b9c6-cd485df181e8:[{1}:'?tf-word']75666488-7d53-4334-8eb1-535d97c10f6b:[{1}:'?tf-word']] | |
12/08/28 18:52:29 INFO mapred.MapTask: Starting flush of map output | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO mapred.MapTask: Finished spill 0 | |
12/08/28 18:52:29 INFO mapred.Task: Task:attempt_local_0009_m_000001_0 is done. And is in the process of commiting | |
12/08/28 18:52:29 INFO mapred.LocalJobRunner: file:/tmp/hadoop-paullam/75666488_7d53_4334_8eb1_5_38952_9283E396B43699A1CC818E039BF16D61/part-00000:0+764 | |
12/08/28 18:52:29 INFO mapred.Task: Task 'attempt_local_0009_m_000001_0' done. | |
12/08/28 18:52:29 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO mapred.Merger: Merging 2 sorted segments | |
12/08/28 18:52:29 INFO mapred.Merger: Down to the last merge-pass, with 2 segments left of total size: 1946 bytes | |
12/08/28 18:52:29 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:29 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:29 INFO hadoop.FlowReducer: sourcing from: CoGroup(2a3894d4-1535-4e87-b9c6-cd485df181e8*75666488-7d53-4334-8eb1-535d97c10f6b)[by:2a3894d4-1535-4e87-b9c6-cd485df181e8:[{1}:'?tf-word']75666488-7d53-4334-8eb1-535d97c10f6b:[{1}:'?tf-word']] | |
12/08/28 18:52:29 INFO hadoop.FlowReducer: sinking to: Hfs["TextDelimited[[UNKNOWN]->['?doc-id', '?tf-idf', '?tf-word']]"]["output/tfidf"]"] | |
12/08/28 18:52:30 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:30 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:30 INFO collect.SpillableTupleList: attempting to load codec: org.apache.hadoop.io.compress.GzipCodec | |
12/08/28 18:52:30 INFO collect.SpillableTupleList: found codec: org.apache.hadoop.io.compress.GzipCodec | |
12/08/28 18:52:30 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:30 INFO hadoop.TupleSerialization: using default comparator: cascalog.hadoop.DefaultComparator | |
12/08/28 18:52:30 INFO mapred.Task: Task:attempt_local_0009_r_000000_0 is done. And is in the process of commiting | |
12/08/28 18:52:30 INFO mapred.LocalJobRunner: | |
12/08/28 18:52:30 INFO mapred.Task: Task attempt_local_0009_r_000000_0 is allowed to commit now | |
12/08/28 18:52:30 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0009_r_000000_0' to file:/Users/paullam/Dropbox/Projects/Impatient/part6/output/tfidf | |
12/08/28 18:52:30 INFO mapred.LocalJobRunner: reduce > reduce | |
12/08/28 18:52:30 INFO mapred.Task: Task 'attempt_local_0009_r_000000_0' done. | |
12/08/28 18:52:30 INFO util.Hadoop18TapUtil: deleting temp path output/tfidf/_temporary | |
12/08/28 18:52:30 INFO checkpointed-workflow: Workflow completed successfully |
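The WARN at 18:52:18 and the stack trace above show the failure trap doing its job: one input record carries the doc-id 'zoink', the custom filter impatient.core/assert-tuple (core.clj:29) throws an AssertionError ("unexpected doc-id"), and Cascading diverts that tuple to the trap tap instead of failing the step, which is why the flow still ends with "Workflow completed successfully". A minimal sketch of what such an assertion filter could look like, assuming plain Clojure assert semantics (the function name comes from the stack trace; the body is a guess, not this project's source):

(defn assert-tuple
  "Throws AssertionError when (pred x) fails, so Cascading routes the
   offending tuple to the step's trap tap instead of killing the job."
  [pred msg x]
  (when (nil? (assert (pred x) msg))
    true))

;; Used as a Cascalog filter predicate, e.g. (the regex is illustrative):
;; (assert-tuple (partial re-matches #"doc\d+") "unexpected doc-id" ?doc-id)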
$ cat output/tfidf/part-00000
doc02 0.22314355131420976 area
doc01 0.44628710262841953 area
doc03 0.22314355131420976 area
doc05 0.9162907318741551 australia
doc05 0.9162907318741551 broken
doc04 0.9162907318741551 california's
doc04 0.9162907318741551 cause
doc02 0.9162907318741551 cloudcover
doc04 0.9162907318741551 death
doc04 0.9162907318741551 deserts
doc03 0.9162907318741551 downwind
doc01 0.22314355131420976 dry
doc02 0.22314355131420976 dry
doc03 0.22314355131420976 dry
doc05 0.9162907318741551 dvd
doc04 0.9162907318741551 effect
doc04 0.9162907318741551 known
doc03 0.5108256237659907 land
doc05 0.5108256237659907 land
doc01 0.5108256237659907 lee
doc02 0.5108256237659907 lee
doc04 0.5108256237659907 leeward
doc03 0.5108256237659907 leeward
doc02 0.9162907318741551 less
doc03 0.9162907318741551 lies
doc02 0.22314355131420976 mountain
doc03 0.22314355131420976 mountain
doc04 0.22314355131420976 mountain
doc01 0.9162907318741551 mountainous
doc04 0.9162907318741551 primary
doc02 0.9162907318741551 produces
doc04 0.0 rain
doc01 0.0 rain
doc02 0.0 rain
doc03 0.0 rain
doc04 0.9162907318741551 ranges
doc05 0.9162907318741551 secrets
doc01 0.0 shadow
doc02 0.0 shadow
doc03 0.0 shadow
doc04 0.0 shadow
doc02 0.9162907318741551 sinking
doc04 0.9162907318741551 such
doc04 0.9162907318741551 valley
doc05 0.9162907318741551 women |
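The scores above are consistent with tf-idf = tf-count * ln(n-docs / (1 + df-count)) over the five rain documents; this formula is inferred from the numbers themselves (and from earlier parts of the series), not read out of this project's source. For example, "area" occurs in three of the five documents, so each occurrence contributes ln(5/4) ≈ 0.2231, and doc01, where it occurs twice, scores ≈ 0.4463; "rain" and "shadow" occur in four documents, so they score ln(5/5) = 0.0 everywhere. A quick Clojure check:

;; assumed formula, matching the output above
(defn tf-idf-formula [tf-count df-count n-docs]
  (* tf-count (Math/log (/ n-docs (+ 1.0 df-count)))))

(tf-idf-formula 1 3 5) ;=> ~0.2231  ("area" in doc02/doc03)
(tf-idf-formula 2 3 5) ;=> ~0.4463  ("area" in doc01)
(tf-idf-formula 1 4 5) ;=> 0.0      ("rain", "shadow")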
$ cat output/trap/part-m-00001-00001
zoink |
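The only record in the trap file is the deliberately malformed input line whose doc-id is 'zoink': it tripped the assertion during the ETL step and was written here instead of aborting the workflow. A sketch of how a trap tap can be attached to a Cascalog query (the query shape, field names, and regex are illustrative, and assert-tuple is the helper sketched after the run log above):

(ns impatient.trap-sketch
  (:use cascalog.api))

(defn checked-docs [src]
  (<- [?doc-id ?word]
      (src ?doc-id ?word)
      ;; tuples that throw here (e.g. the 'zoink' record) are trapped
      (assert-tuple (partial re-matches #"doc\d+") "unexpected doc-id" ?doc-id)
      ;; any exception raised in this query's predicates lands in output/trap
      (:trap (hfs-textline "output/trap"))))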
$ cat output/wc/part-00000
air 1
area 4
australia 1
broken 1
california's 1
cause 1
cloudcover 1
death 1
deserts 1
downwind 1
dry 3
dvd 1
effect 1
known 1
land 2
lee 2
leeward 2
less 1
lies 1
mountain 3
mountainous 1
primary 1
produces 1
rain 5
ranges 1
secrets 1
shadow 4
sinking 1
such 1
valley 1
women 1 |
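These counts come from a plain Cascalog aggregation over the checkpointed (?doc-id ?word) tuples produced by the ETL stage. A minimal word-count sketch, assuming that source shape (names are illustrative):

(ns impatient.wc-sketch
  (:use cascalog.api)
  (:require [cascalog.ops :as c]))

(defn word-count [src]
  (<- [?word ?count]
      (src _ ?word)       ;; ignore the doc-id, group by word
      (c/count ?count)))

;; e.g. (?- (stdout) (word-count etl-src)) would print pairs like the ones above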