2022-07-17 22:23:09.903 [main] INFO VMInfo - VMInfo# operatingSystem class => sun.management.OperatingSystemImpl
2022-07-17 22:23:09.913 [main] INFO Engine - the machine info =>

    osInfo:    Oracle Corporation 1.8 25.262-b10
    jvmInfo:   Linux amd64 3.10.0-957.el7.x86_64
    cpu num:   128

    totalPhysicalMemory:    -0.00G
    freePhysicalMemory:     -0.00G
    maxFileDescriptorCount: -1
    currentOpenFileDescriptorCount: -1

    GC Names    [PS MarkSweep, PS Scavenge]

    MEMORY_NAME            | allocation_size | init_size
    PS Eden Space          | 256.00MB        | 256.00MB
    Code Cache             | 240.00MB        | 2.44MB
    Compressed Class Space | 1,024.00MB      | 0.00MB
    PS Survivor Space      | 42.50MB         | 42.50MB
    PS Old Gen             | 683.00MB        | 683.00MB
    Metaspace              | -0.00MB         | 0.00MB

2022-07-17 22:23:09.945 [main] INFO Engine -
{
    "content":[
        {
            "reader":{
                "name":"hdfsreader",
                "parameter":{
                    "column":[
                        "*"
                    ],
                    "defaultFS":"hdfs://hadoop01:8020/",
                    "encoding":"UTF-8",
                    "fieldDelimiter":"\t",
                    "fileType":"text",
                    "path":"/user/hive/warehouse/user_info/user_info_data.txt"
                }
            },
            "writer":{
                "name":"hdfswriter",
                "parameter":{
                    "column":[
                        {
                            "name":"user_id",
                            "type":"string"
                        },
                        {
                            "name":"age",
                            "type":"int"
                        }
                    ],
                    "compress":"",
                    "defaultFS":"hdfs://hadoop01:8020/",
                    "fieldDelimiter":"\t",
                    "fileName":"user_info_data_1.txt",
                    "fileType":"text",
                    "path":"/user/hive/warehouse/user_info/",
                    "writeMode":"append"
                }
            }
        }
    ],
    "setting":{
        "speed":{
            "channel":"1"
        }
    }
}

2022-07-17 22:23:09.976 [main] WARN Engine - priority set to 0, because NumberFormatException, the value is: null
2022-07-17 22:23:09.980 [main] INFO PerfTrace - PerfTrace traceId=job_-1, isEnable=false, priority=0
2022-07-17 22:23:09.980 [main] INFO JobContainer - DataX jobContainer starts job.
2022-07-17 22:23:09.983 [main] INFO JobContainer - Set jobId = 0
2022-07-17 22:23:10.019 [job-0] INFO HdfsReader$Job - init() begin...
2022-07-17 22:23:10.430 [job-0] INFO HdfsReader$Job - hadoopConfig details:{"finalParameters":[]}
2022-07-17 22:23:10.431 [job-0] INFO HdfsReader$Job - init() ok and end...
2022-07-17 22:23:11.634 [job-0] INFO JobContainer - jobContainer starts to do prepare ...
2022-07-17 22:23:11.634 [job-0] INFO JobContainer - DataX Reader.Job [hdfsreader] do prepare work .
2022-07-17 22:23:11.634 [job-0] INFO HdfsReader$Job - prepare(), start to getAllFiles...
2022-07-17 22:23:11.635 [job-0] INFO HdfsReader$Job - get HDFS all files in path = [/user/hive/warehouse/user_info/user_info_data.txt]
2022-07-17 22:23:12.585 [job-0] INFO HdfsReader$Job - [hdfs://hadoop01:8020/user/hive/warehouse/user_info/user_info_data.txt] is a [text] file; adding it to the source files list
2022-07-17 22:23:12.587 [job-0] INFO HdfsReader$Job - number of files about to be read: [1], list: [hdfs://hadoop01:8020/user/hive/warehouse/user_info/user_info_data.txt]
2022-07-17 22:23:12.587 [job-0] INFO JobContainer - DataX Writer.Job [hdfswriter] do prepare work .
2022-07-17 22:23:12.677 [job-0] INFO HdfsWriter$Job - because writeMode is append, no cleanup is done before writing; files with the name prefix [user_info_data_1.txt] will be written under the [/user/hive/warehouse/user_info/] directory
2022-07-17 22:23:12.678 [job-0] INFO JobContainer - jobContainer starts to do split ...
2022-07-17 22:23:12.679 [job-0] INFO JobContainer - Job set Channel-Number to 1 channels.
2022-07-17 22:23:12.680 [job-0] INFO HdfsReader$Job - split() begin...
2022-07-17 22:23:12.681 [job-0] INFO JobContainer - DataX Reader.Job [hdfsreader] splits to [1] tasks.
2022-07-17 22:23:12.681 [job-0] INFO HdfsWriter$Job - begin do split...
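The job configuration above is worth a second look before the run proceeds: the reader's "column":["*"] makes each record exactly as wide as the number of tab-separated fields actually present in user_info_data.txt, while hdfswriter pins the output schema to two columns (user_id, age). hdfsreader can instead select fields explicitly by position, which decouples the record width from the file's contents. A minimal sketch of the reader block in that style (the index/type form is standard hdfsreader configuration; keeping only fields 0 and 1 is an assumption about what this job intends):

    "reader":{
        "name":"hdfsreader",
        "parameter":{
            "column":[
                { "index": 0, "type": "string" },
                { "index": 1, "type": "long" }
            ],
            "defaultFS":"hdfs://hadoop01:8020/",
            "encoding":"UTF-8",
            "fieldDelimiter":"\t",
            "fileType":"text",
            "path":"/user/hive/warehouse/user_info/user_info_data.txt"
        }
    }

With an explicit selection like this, any extra fields in the source rows are dropped by the reader instead of being handed to the writer, which matters for the failure recorded below.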
2022-07-17 22:23:12.689 [job-0] INFO HdfsWriter$Job - split write file name:[hdfs://hadoop01:8020//user/hive/warehouse/user_info__89e76b8e_4d1d_4098_abd2_b284680d4027/user_info_data_1.txt__97b736bd_0183_4af1_a817_d16087c405ed]
2022-07-17 22:23:12.689 [job-0] INFO HdfsWriter$Job - end do split.
2022-07-17 22:23:12.689 [job-0] INFO JobContainer - DataX Writer.Job [hdfswriter] splits to [1] tasks.
2022-07-17 22:23:12.701 [job-0] INFO JobContainer - jobContainer starts to do schedule ...
2022-07-17 22:23:12.706 [job-0] INFO JobContainer - Scheduler starts [1] taskGroups.
2022-07-17 22:23:12.804 [job-0] INFO JobContainer - Running by standalone Mode.
2022-07-17 22:23:12.811 [taskGroup-0] INFO TaskGroupContainer - taskGroupId=[0] start [1] channels for [1] tasks.
2022-07-17 22:23:12.819 [taskGroup-0] INFO Channel - Channel set byte_speed_limit to -1, No bps activated.
2022-07-17 22:23:12.820 [taskGroup-0] INFO Channel - Channel set record_speed_limit to -1, No tps activated.
2022-07-17 22:23:12.833 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] taskId[0] attemptCount[1] is started
2022-07-17 22:23:12.889 [0-0-0-writer] INFO HdfsWriter$Task - begin do write...
2022-07-17 22:23:12.889 [0-0-0-writer] INFO HdfsWriter$Task - write to file : [hdfs://hadoop01:8020//user/hive/warehouse/user_info__89e76b8e_4d1d_4098_abd2_b284680d4027/user_info_data_1.txt__97b736bd_0183_4af1_a817_d16087c405ed]
2022-07-17 22:23:12.893 [0-0-0-reader] INFO HdfsReader$Job - hadoopConfig details:{"finalParameters":["mapreduce.job.end-notification.max.retry.interval","mapreduce.job.end-notification.max.attempts"]}
2022-07-17 22:23:12.896 [0-0-0-reader] INFO Reader$Task - read start
2022-07-17 22:23:12.896 [0-0-0-reader] INFO Reader$Task - reading file : [hdfs://hadoop01:8020/user/hive/warehouse/user_info/user_info_data.txt]
2022-07-17 22:23:12.927 [0-0-0-reader] INFO UnstructuredStorageReaderUtil - CsvReader is using default values [{"captureRawRecord":true,"columnCount":0,"comment":"#","currentRecord":-1,"delimiter":"\t","escapeMode":1,"headerCount":0,"rawRecord":"","recordDelimiter":"\u0000","safetySwitch":false,"skipEmptyRecords":true,"textQualifier":"\"","trimWhitespace":true,"useComments":false,"useTextQualifier":true,"values":[]}], csvReaderConfig is [null]
2022-07-17 22:23:12.934 [0-0-0-reader] INFO Reader$Task - end read source files...
2022-07-17 22:23:12.972 [0-0-0-writer] ERROR HdfsWriter$Job - an IO exception occurred while writing file [hdfs://hadoop01:8020//user/hive/warehouse/user_info__89e76b8e_4d1d_4098_abd2_b284680d4027/user_info_data_1.txt__97b736bd_0183_4af1_a817_d16087c405ed]; please check whether your network is working!
2022-07-17 22:23:12.972 [0-0-0-writer] INFO HdfsWriter$Job - start delete tmp dir [hdfs://hadoop01:8020/user/hive/warehouse/user_info__89e76b8e_4d1d_4098_abd2_b284680d4027] .
2022-07-17 22:23:12.982 [0-0-0-writer] INFO HdfsWriter$Job - finish delete tmp dir [hdfs://hadoop01:8020/user/hive/warehouse/user_info__89e76b8e_4d1d_4098_abd2_b284680d4027] .
2022-07-17 22:23:12.986 [0-0-0-writer] ERROR WriterRunner - Writer Runner Received Exceptions:
com.alibaba.datax.common.exception.DataXException: Code:[HdfsWriter-04], Description:[An IO exception occurred while writing your configured file.].
 - java.lang.IndexOutOfBoundsException: Index: 2, Size: 2
    at java.util.ArrayList.rangeCheck(ArrayList.java:659)
    at java.util.ArrayList.get(ArrayList.java:435)
    at com.alibaba.datax.plugin.writer.hdfswriter.HdfsHelper.transportOneRecord(HdfsHelper.java:495)
    at com.alibaba.datax.plugin.writer.hdfswriter.HdfsHelper.transportOneRecord(HdfsHelper.java:323)
    at com.alibaba.datax.plugin.writer.hdfswriter.HdfsHelper.textFileStartWrite(HdfsHelper.java:306)
    at com.alibaba.datax.plugin.writer.hdfswriter.HdfsWriter$Task.startWrite(HdfsWriter.java:360)
    at com.alibaba.datax.core.taskgroup.runner.WriterRunner.run(WriterRunner.java:56)
    at java.lang.Thread.run(Thread.java:748)
    at com.alibaba.datax.common.exception.DataXException.asDataXException(DataXException.java:40) ~[datax-common-0.0.1-SNAPSHOT.jar:na]
    at com.alibaba.datax.plugin.writer.hdfswriter.HdfsHelper.textFileStartWrite(HdfsHelper.java:317) ~[hdfswriter-0.0.1-SNAPSHOT.jar:na]
    at com.alibaba.datax.plugin.writer.hdfswriter.HdfsWriter$Task.startWrite(HdfsWriter.java:360) ~[hdfswriter-0.0.1-SNAPSHOT.jar:na]
    at com.alibaba.datax.core.taskgroup.runner.WriterRunner.run(WriterRunner.java:56) ~[datax-core-0.0.1-SNAPSHOT.jar:na]
    at java.lang.Thread.run(Thread.java:748) [na:1.8.0_262]
Caused by: java.lang.IndexOutOfBoundsException: Index: 2, Size: 2
    at java.util.ArrayList.rangeCheck(ArrayList.java:659) ~[na:1.8.0_262]
    at java.util.ArrayList.get(ArrayList.java:435) ~[na:1.8.0_262]
    at com.alibaba.datax.plugin.writer.hdfswriter.HdfsHelper.transportOneRecord(HdfsHelper.java:495) ~[hdfswriter-0.0.1-SNAPSHOT.jar:na]
    at com.alibaba.datax.plugin.writer.hdfswriter.HdfsHelper.transportOneRecord(HdfsHelper.java:323) ~[hdfswriter-0.0.1-SNAPSHOT.jar:na]
    at com.alibaba.datax.plugin.writer.hdfswriter.HdfsHelper.textFileStartWrite(HdfsHelper.java:306) ~[hdfswriter-0.0.1-SNAPSHOT.jar:na]
    ... 3 common frames omitted
2022-07-17 22:23:22.825 [job-0] INFO StandAloneJobContainerCommunicator - Total 4 records, 79 bytes | Speed 7B/s, 0 records/s | Error 0 records, 0 bytes | All Task WaitWriterTime 0.000s | All Task WaitReaderTime 0.000s | Percentage 0.00%
2022-07-17 22:23:22.826 [job-0] ERROR JobContainer - error running the scheduler in [standalone] mode.
2022-07-17 22:23:22.827 [job-0] ERROR JobContainer - Exception when job run
com.alibaba.datax.common.exception.DataXException: Code:[HdfsWriter-04], Description:[An IO exception occurred while writing your configured file.].
    [stack trace identical to the one above]
2022-07-17 22:23:22.829 [job-0] INFO StandAloneJobContainerCommunicator - Total 4 records, 79 bytes | Speed 79B/s, 4 records/s | Error 0 records, 0 bytes | All Task WaitWriterTime 0.000s | All Task WaitReaderTime 0.000s | Percentage 0.00%
2022-07-17 22:23:22.957 [job-0] ERROR Engine - DataX's analysis indicates the most likely cause of this job's error is:
com.alibaba.datax.common.exception.DataXException: Code:[HdfsWriter-04], Description:[An IO exception occurred while writing your configured file.].
    [stack trace identical to the one above]
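Despite the wording of HdfsWriter-04 ("check whether your network is working"), the Caused by frame is the real story. Inside HdfsHelper.transportOneRecord, hdfswriter walks the fields of each incoming record and looks up the matching entry in the writer's column list, so IndexOutOfBoundsException: Index: 2, Size: 2 means the reader (configured with "column":["*"]) delivered rows with at least three tab-separated fields while the writer declares only two. The straightforward fix is to make the writer's column list match the file's real schema; in the sketch below the third entry ("gender", string) is a hypothetical placeholder, since the log does not show the actual contents of user_info_data.txt:

    "writer":{
        "name":"hdfswriter",
        "parameter":{
            "column":[
                { "name": "user_id", "type": "string" },
                { "name": "age", "type": "int" },
                { "name": "gender", "type": "string" }
            ],
            "compress":"",
            "defaultFS":"hdfs://hadoop01:8020/",
            "fieldDelimiter":"\t",
            "fileName":"user_info_data_1.txt",
            "fileType":"text",
            "path":"/user/hive/warehouse/user_info/",
            "writeMode":"append"
        }
    }

Alternatively, keep the two-column writer and switch the reader from "*" to an explicit index-based selection (as sketched after the job configuration above), so extra source fields never reach the writer. Note that the run summary still reports "Error 0 records" because the task aborts on the exception rather than routing rows to dirty-data handling.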