Hi, I'm getting a strange error when parsing dates.
Format: yyyy-MM-dd'T'HH:mm:ss.SSS
Only one specific timestamp fails; shifting either the month or the hour by one makes the error go away:
2010-02-28T03:36:13.990 - no error
2010-03-28T03:36:13.990 - error
2010-04-28T03:36:13.990 - no error
2010-03-28T02:36:13.990 - no error
2010-03-28T03:36:13.990 - error
2010-03-28T04:36:13.990 - no error
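For what it's worth, the failure can be reproduced outside of Kettle with a plain SimpleDateFormat in non-lenient mode (which, as far as I can tell, is what the Select values step uses by default). The timezone below is an assumption on my part: in a UTC+2 zone following the EU daylight-saving rule (Europe/Kiev is just an example), clocks jumped straight from 03:00 to 04:00 on 2010-03-28, so 03:36 local time never existed on that day, and a strict parser rejects it:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class DateParseCheck {
    // Returns true if the timestamp parses under a strict (non-lenient)
    // SimpleDateFormat in the given timezone.
    static boolean parses(String s, TimeZone tz) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS");
        fmt.setLenient(false);
        fmt.setTimeZone(tz);
        try {
            fmt.parse(s);
            return true;
        } catch (ParseException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Assumption: a UTC+2 zone on the EU DST rule. On 2010-03-28 the
        // local clock skipped from 03:00 to 04:00, so 03:36 never existed.
        TimeZone eet = TimeZone.getTimeZone("Europe/Kiev");
        System.out.println("02:36 -> " + parses("2010-03-28T02:36:13.990", eet));
        System.out.println("03:36 -> " + parses("2010-03-28T03:36:13.990", eet));
        System.out.println("04:36 -> " + parses("2010-03-28T04:36:13.990", eet));
    }
}
```

If this is indeed the cause, the same value should parse fine with the timezone set to UTC (or with the date format's lenient flag enabled), since UTC has no DST gap.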
Here is the full Spoon log, including the stack trace:
2013/10/21 13:33:06 - Transformation metadata - The shared object fie [null] is empty!
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation is pre-loaded.
2013/10/21 13:33:06 - Spoon - Transformation opened.
2013/10/21 13:33:06 - Spoon - Launching transformation [load_posts_to_hadoop]...
2013/10/21 13:33:06 - Spoon - Started the transformation execution.
2013/10/21 13:33:06 - load_posts_to_hadoop - Dispatching started for transformation [load_posts_to_hadoop]
2013/10/21 13:33:06 - load_posts_to_hadoop - Nr of arguments detected:0
2013/10/21 13:33:06 - load_posts_to_hadoop - This is not a replay transformation
2013/10/21 13:33:06 - Transformation metadata - Natural sort of steps executed in 0 ms (9 time previous steps calculated)
2013/10/21 13:33:06 - load_posts_to_hadoop - I found 9 different steps to launch.
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocating rowsets...
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocating rowsets for step 0 --> Filter rows 2 2
2013/10/21 13:33:06 - load_posts_to_hadoop - prevcopies = 1, nextcopies=1
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation allocated new rowset [Filter rows 2 2.0 - Dummy (do nothing).0]
2013/10/21 13:33:06 - load_posts_to_hadoop - prevcopies = 1, nextcopies=1
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation allocated new rowset [Filter rows 2 2.0 - get_needed_values 2.0]
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocated 2 rowsets for step 0 --> Filter rows 2 2
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocating rowsets for step 1 --> Dummy (do nothing)
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocated 2 rowsets for step 1 --> Dummy (do nothing)
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocating rowsets for step 2 --> get_needed_values 2
2013/10/21 13:33:06 - load_posts_to_hadoop - prevcopies = 1, nextcopies=1
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation allocated new rowset [get_needed_values 2.0 - Row denormaliser.0]
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocated 3 rowsets for step 2 --> get_needed_values 2
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocating rowsets for step 3 --> Row denormaliser
2013/10/21 13:33:06 - load_posts_to_hadoop - prevcopies = 1, nextcopies=1
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation allocated new rowset [Row denormaliser.0 - Strings cut.0]
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocated 4 rowsets for step 3 --> Row denormaliser
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocating rowsets for step 4 --> input_posts
2013/10/21 13:33:06 - load_posts_to_hadoop - prevcopies = 1, nextcopies=1
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation allocated new rowset [input_posts.0 - Filter rows 2 2.0]
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocated 5 rowsets for step 4 --> input_posts
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocating rowsets for step 5 --> Strings cut
2013/10/21 13:33:06 - load_posts_to_hadoop - prevcopies = 1, nextcopies=1
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation allocated new rowset [Strings cut.0 - date_parse.0]
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocated 6 rowsets for step 5 --> Strings cut
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocating rowsets for step 6 --> date_parse
2013/10/21 13:33:06 - load_posts_to_hadoop - prevcopies = 1, nextcopies=1
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation allocated new rowset [date_parse.0 - Select values 2.0]
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocated 7 rowsets for step 6 --> date_parse
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocating rowsets for step 7 --> Select values 2
2013/10/21 13:33:06 - load_posts_to_hadoop - prevcopies = 1, nextcopies=1
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation allocated new rowset [Select values 2.0 - Hadoop File Output.0]
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocated 8 rowsets for step 7 --> Select values 2
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocating rowsets for step 8 --> Hadoop File Output
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocated 8 rowsets for step 8 --> Hadoop File Output
2013/10/21 13:33:06 - load_posts_to_hadoop - Allocating Steps & StepData...
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation is about to allocate step [Filter rows 2 2] of type [FilterRows]
2013/10/21 13:33:06 - Filter rows 2 2.0 - distribution activated
2013/10/21 13:33:06 - Filter rows 2 2.0 - Starting allocation of buffers & new threads...
2013/10/21 13:33:06 - Filter rows 2 2.0 - Step info: nrinput=1 nroutput=2
2013/10/21 13:33:06 - Filter rows 2 2.0 - Got previous step from [Filter rows 2 2] #0 --> input_posts
2013/10/21 13:33:06 - Filter rows 2 2.0 - input rel is 1:1
2013/10/21 13:33:06 - Filter rows 2 2.0 - Found input rowset [input_posts.0 - Filter rows 2 2.0]
2013/10/21 13:33:06 - Filter rows 2 2.0 - output rel. is 1:1
2013/10/21 13:33:06 - Filter rows 2 2.0 - Found output rowset [Filter rows 2 2.0 - Dummy (do nothing).0]
2013/10/21 13:33:06 - Filter rows 2 2.0 - output rel. is 1:1
2013/10/21 13:33:06 - Filter rows 2 2.0 - Found output rowset [Filter rows 2 2.0 - get_needed_values 2.0]
2013/10/21 13:33:06 - Filter rows 2 2.0 - Finished dispatching
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation has allocated a new step: [Filter rows 2 2].0
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation is about to allocate step [Dummy (do nothing)] of type [Dummy]
2013/10/21 13:33:06 - Dummy (do nothing).0 - distribution activated
2013/10/21 13:33:06 - Dummy (do nothing).0 - Starting allocation of buffers & new threads...
2013/10/21 13:33:06 - Dummy (do nothing).0 - Step info: nrinput=1 nroutput=0
2013/10/21 13:33:06 - Dummy (do nothing).0 - Got previous step from [Dummy (do nothing)] #0 --> Filter rows 2 2
2013/10/21 13:33:06 - Dummy (do nothing).0 - input rel is 1:1
2013/10/21 13:33:06 - Dummy (do nothing).0 - Found input rowset [Filter rows 2 2.0 - Dummy (do nothing).0]
2013/10/21 13:33:06 - Dummy (do nothing).0 - Finished dispatching
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation has allocated a new step: [Dummy (do nothing)].0
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation is about to allocate step [get_needed_values 2] of type [SelectValues]
2013/10/21 13:33:06 - get_needed_values 2.0 - Starting allocation of buffers & new threads...
2013/10/21 13:33:06 - get_needed_values 2.0 - Step info: nrinput=1 nroutput=1
2013/10/21 13:33:06 - get_needed_values 2.0 - Got previous step from [get_needed_values 2] #0 --> Filter rows 2 2
2013/10/21 13:33:06 - get_needed_values 2.0 - input rel is 1:1
2013/10/21 13:33:06 - get_needed_values 2.0 - Found input rowset [Filter rows 2 2.0 - get_needed_values 2.0]
2013/10/21 13:33:06 - get_needed_values 2.0 - output rel. is 1:1
2013/10/21 13:33:06 - get_needed_values 2.0 - Found output rowset [get_needed_values 2.0 - Row denormaliser.0]
2013/10/21 13:33:06 - get_needed_values 2.0 - Finished dispatching
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation has allocated a new step: [get_needed_values 2].0
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation is about to allocate step [Row denormaliser] of type [Denormaliser]
2013/10/21 13:33:06 - Row denormaliser.0 - Starting allocation of buffers & new threads...
2013/10/21 13:33:06 - Row denormaliser.0 - Step info: nrinput=1 nroutput=1
2013/10/21 13:33:06 - Row denormaliser.0 - Got previous step from [Row denormaliser] #0 --> get_needed_values 2
2013/10/21 13:33:06 - Row denormaliser.0 - input rel is 1:1
2013/10/21 13:33:06 - Row denormaliser.0 - Found input rowset [get_needed_values 2.0 - Row denormaliser.0]
2013/10/21 13:33:06 - Row denormaliser.0 - output rel. is 1:1
2013/10/21 13:33:06 - Row denormaliser.0 - Found output rowset [Row denormaliser.0 - Strings cut.0]
2013/10/21 13:33:06 - Row denormaliser.0 - Finished dispatching
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation has allocated a new step: [Row denormaliser].0
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation is about to allocate step [input_posts] of type [XMLInputStream]
2013/10/21 13:33:06 - input_posts.0 - distribution activated
2013/10/21 13:33:06 - input_posts.0 - Starting allocation of buffers & new threads...
2013/10/21 13:33:06 - input_posts.0 - Step info: nrinput=0 nroutput=1
2013/10/21 13:33:06 - input_posts.0 - output rel. is 1:1
2013/10/21 13:33:06 - input_posts.0 - Found output rowset [input_posts.0 - Filter rows 2 2.0]
2013/10/21 13:33:06 - input_posts.0 - Finished dispatching
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation has allocated a new step: [input_posts].0
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation is about to allocate step [Strings cut] of type [StringCut]
2013/10/21 13:33:06 - Strings cut.0 - distribution activated
2013/10/21 13:33:06 - Strings cut.0 - Starting allocation of buffers & new threads...
2013/10/21 13:33:06 - Strings cut.0 - Step info: nrinput=1 nroutput=1
2013/10/21 13:33:06 - Strings cut.0 - Got previous step from [Strings cut] #0 --> Row denormaliser
2013/10/21 13:33:06 - Strings cut.0 - input rel is 1:1
2013/10/21 13:33:06 - Strings cut.0 - Found input rowset [Row denormaliser.0 - Strings cut.0]
2013/10/21 13:33:06 - Strings cut.0 - output rel. is 1:1
2013/10/21 13:33:06 - Strings cut.0 - Found output rowset [Strings cut.0 - date_parse.0]
2013/10/21 13:33:06 - Strings cut.0 - Finished dispatching
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation has allocated a new step: [Strings cut].0
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation is about to allocate step [date_parse] of type [SelectValues]
2013/10/21 13:33:06 - date_parse.0 - distribution activated
2013/10/21 13:33:06 - date_parse.0 - Starting allocation of buffers & new threads...
2013/10/21 13:33:06 - date_parse.0 - Step info: nrinput=1 nroutput=1
2013/10/21 13:33:06 - date_parse.0 - Got previous step from [date_parse] #0 --> Strings cut
2013/10/21 13:33:06 - date_parse.0 - input rel is 1:1
2013/10/21 13:33:06 - date_parse.0 - Found input rowset [Strings cut.0 - date_parse.0]
2013/10/21 13:33:06 - date_parse.0 - output rel. is 1:1
2013/10/21 13:33:06 - date_parse.0 - Found output rowset [date_parse.0 - Select values 2.0]
2013/10/21 13:33:06 - date_parse.0 - Finished dispatching
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation has allocated a new step: [date_parse].0
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation is about to allocate step [Select values 2] of type [SelectValues]
2013/10/21 13:33:06 - Select values 2.0 - distribution activated
2013/10/21 13:33:06 - Select values 2.0 - Starting allocation of buffers & new threads...
2013/10/21 13:33:06 - Select values 2.0 - Step info: nrinput=1 nroutput=1
2013/10/21 13:33:06 - Select values 2.0 - Got previous step from [Select values 2] #0 --> date_parse
2013/10/21 13:33:06 - Select values 2.0 - input rel is 1:1
2013/10/21 13:33:06 - Select values 2.0 - Found input rowset [date_parse.0 - Select values 2.0]
2013/10/21 13:33:06 - Select values 2.0 - output rel. is 1:1
2013/10/21 13:33:06 - Select values 2.0 - Found output rowset [Select values 2.0 - Hadoop File Output.0]
2013/10/21 13:33:06 - Select values 2.0 - Finished dispatching
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation has allocated a new step: [Select values 2].0
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation is about to allocate step [Hadoop File Output] of type [HadoopFileOutputPlugin]
2013/10/21 13:33:06 - Hadoop File Output.0 - distribution activated
2013/10/21 13:33:06 - Hadoop File Output.0 - Starting allocation of buffers & new threads...
2013/10/21 13:33:06 - Hadoop File Output.0 - Step info: nrinput=1 nroutput=0
2013/10/21 13:33:06 - Hadoop File Output.0 - Got previous step from [Hadoop File Output] #0 --> Select values 2
2013/10/21 13:33:06 - Hadoop File Output.0 - input rel is 1:1
2013/10/21 13:33:06 - Hadoop File Output.0 - Found input rowset [Select values 2.0 - Hadoop File Output.0]
2013/10/21 13:33:06 - Hadoop File Output.0 - Finished dispatching
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation has allocated a new step: [Hadoop File Output].0
2013/10/21 13:33:06 - load_posts_to_hadoop - This transformation can be replayed with replay date: 2013/10/21 13:33:06
2013/10/21 13:33:06 - load_posts_to_hadoop - Initialising 9 steps...
2013/10/21 13:33:06 - Filter rows 2 2.0 - Running on slave server #0/1.
2013/10/21 13:33:06 - get_needed_values 2.0 - Running on slave server #0/1.
2013/10/21 13:33:06 - Row denormaliser.0 - Running on slave server #0/1.
2013/10/21 13:33:06 - Select values 2.0 - Running on slave server #0/1.
2013/10/21 13:33:06 - Dummy (do nothing).0 - Running on slave server #0/1.
2013/10/21 13:33:06 - Hadoop File Output.0 - Running on slave server #0/1.
2013/10/21 13:33:06 - date_parse.0 - Running on slave server #0/1.
2013/10/21 13:33:06 - Strings cut.0 - Running on slave server #0/1.
2013/10/21 13:33:06 - input_posts.0 - Running on slave server #0/1.
2013/10/21 13:33:06 - Hadoop File Output.0 - Opening output stream in nocompress mode
2013/10/21 13:33:06 - Hadoop File Output.0 - Opening output stream in encoding: UTF-8
2013/10/21 13:33:06 - Hadoop File Output.0 - Opened new file with name [hdfs://192.168.160.134:54310/user/paul/so_posts.csv]
2013/10/21 13:33:06 - load_posts_to_hadoop - Step [Filter rows 2 2.0] initialized flawlessly.
2013/10/21 13:33:06 - load_posts_to_hadoop - Step [Dummy (do nothing).0] initialized flawlessly.
2013/10/21 13:33:06 - load_posts_to_hadoop - Step [get_needed_values 2.0] initialized flawlessly.
2013/10/21 13:33:06 - load_posts_to_hadoop - Step [Row denormaliser.0] initialized flawlessly.
2013/10/21 13:33:06 - load_posts_to_hadoop - Step [input_posts.0] initialized flawlessly.
2013/10/21 13:33:06 - load_posts_to_hadoop - Step [Strings cut.0] initialized flawlessly.
2013/10/21 13:33:06 - load_posts_to_hadoop - Step [date_parse.0] initialized flawlessly.
2013/10/21 13:33:06 - load_posts_to_hadoop - Step [Select values 2.0] initialized flawlessly.
2013/10/21 13:33:06 - load_posts_to_hadoop - Step [Hadoop File Output.0] initialized flawlessly.
2013/10/21 13:33:06 - Filter rows 2 2.0 - Starting to run...
2013/10/21 13:33:06 - Dummy (do nothing).0 - Starting to run...
2013/10/21 13:33:06 - get_needed_values 2.0 - Starting to run...
2013/10/21 13:33:06 - Row denormaliser.0 - Starting to run...
2013/10/21 13:33:06 - input_posts.0 - Starting to run...
2013/10/21 13:33:06 - Strings cut.0 - Starting to run...
2013/10/21 13:33:06 - input_posts.0 - Finished processing (I=6, O=0, R=0, W=20, U=0, E=0)
2013/10/21 13:33:06 - load_posts_to_hadoop - Transformation has allocated 9 threads and 8 rowsets.
2013/10/21 13:33:06 - Select values 2.0 - Starting to run...
2013/10/21 13:33:06 - Hadoop File Output.0 - Starting to run...
2013/10/21 13:33:06 - date_parse.0 - Starting to run...
2013/10/21 13:33:06 - Filter rows 2 2.0 - Finished processing (I=0, O=0, R=20, W=20, U=0, E=0)
2013/10/21 13:33:06 - get_needed_values 2.0 - Finished processing (I=0, O=0, R=17, W=17, U=0, E=0)
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Unexpected error
2013/10/21 13:33:06 - Dummy (do nothing).0 - Finished processing (I=0, O=0, R=3, W=3, U=0, E=0)
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : org.pentaho.di.core.exception.KettleValueException:
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : LastEditDate String : couldn't convert string [2010-03-28T03:36:13.990] to a date using format [yyyy-MM-dd'T'HH:mm:ss.SSS]
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Unparseable date: "2010-03-28T03:36:13.990"
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.core.row.ValueMeta.convertStringToDate(ValueMeta.java:619)
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.core.row.ValueMeta.getDate(ValueMeta.java:1670)
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.steps.selectvalues.SelectValues.metadataValues(SelectValues.java:345)
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.steps.selectvalues.SelectValues.processRow(SelectValues.java:394)
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.step.RunThread.run(RunThread.java:50)
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at java.lang.Thread.run(Thread.java:724)
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Caused by: java.text.ParseException: Unparseable date: "2010-03-28T03:36:13.990"
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at java.text.DateFormat.parse(DateFormat.java:357)
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.core.row.ValueMeta.convertStringToDate(ValueMeta.java:614)
2013/10/21 13:33:06 - date_parse.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : ... 5 more
2013/10/21 13:33:06 - date_parse.0 - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)
2013/10/21 13:33:06 - Select values 2.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=0)
2013/10/21 13:33:06 - load_posts_to_hadoop - load_posts_to_hadoop
2013/10/21 13:33:06 - Strings cut.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2013/10/21 13:33:06 - load_posts_to_hadoop - load_posts_to_hadoop
2013/10/21 13:33:06 - Row denormaliser.0 - Finished processing (I=0, O=0, R=17, W=1, U=0, E=0)
2013/10/21 13:33:06 - Hadoop File Output.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=0)
2013/10/21 13:33:06 - Spoon - The transformation has finished!!
2013/10/21 13:33:06 - load_posts_to_hadoop - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Errors detected!
2013/10/21 13:33:06 - load_posts_to_hadoop - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Errors detected!
Thanks for any help.