Load Failure Response
Response when the requested COB Date folder does not exist
The server returns an HTTP 500 error with a message indicating that the directory for the specified topic and CobDate does not exist.
Response with Parsing Exceptions
Note: The status field below reflects only the status of the current request. To check whether the data load itself had errors, inspect the eventType of each entry in the "events" list.
{
"status": "success",
"data": {
"taskName": "7a8c982a87b050f2",
"timeTakenMs": 45957500,
"events": [
{
"threadName": "csv-worker-CsvSource-Bulk-2",
"threadId": 65,
"clazz": "com.qfs.msg.csv.impl.Parser",
"method": "onTranslationFailure",
"lineNumber": 477,
"timeStamp": 1584052234702,
"taskName": "7a8c982a87b050f2",
"cause": {
"type": "com.quartetfs.fwk.QuartetRuntimeException",
"message": "Exception while translating column ProductPrice with parser Plugin Value (key=double, class com.quartetfs.fwk.format.impl.DoubleParser) in file FileSystemFileInfo [fullName=C:\\source\\tools\\data-connectors\\data-connectors-csv\\target\\test-classes\\data-samples\\data\\2019-01-01\\badproduct.csv, lastModifiedTime=Thu Mar 12 18:30:15 EDT 2020]. Parsed text was \"abc\"",
"stackTrace": [
{
"methodName": "compute",
"fileName": "CSVColumnParser.java",
"lineNumber": 165,
"className": "com.qfs.msg.csv.impl.CSVColumnParser",
"nativeMethod": false
},
{
"methodName": "processColumn",
"fileName": "ColumnCalculationContext.java",
"lineNumber": 159,
"className": "com.qfs.msg.impl.ColumnCalculationContext",
"nativeMethod": false
},
{
"methodName": "process",
"fileName": "ColumnCalculationContext.java",
"lineNumber": 113,
"className": "com.qfs.msg.impl.ColumnCalculationContext",
"nativeMethod": false
},
...
],
"cause": {
"type": "java.lang.NumberFormatException",
"message": "Digit or '.' required",
"stackTrace": [
{
"methodName": "parseDouble",
"fileName": "TypeFormat.java",
"lineNumber": 491,
"className": "javolution.text.TypeFormat",
"nativeMethod": false
},
{
"methodName": "parseDouble",
"fileName": "TypeFormat.java",
"lineNumber": 562,
"className": "javolution.text.TypeFormat",
"nativeMethod": false
},
{
"methodName": "getParsedDouble",
"fileName": "DoubleParser.java",
"lineNumber": 54,
"className": "com.quartetfs.fwk.format.impl.DoubleParser",
"nativeMethod": false
},
...
],
"cause": null
}
},
"upTime": 17315,
"file": "C:\\source\\tools\\data-connectors\\data-connectors-csv\\target\\test-classes\\data-samples\\data\\2019-01-01\\badproduct.csv",
"line": "[soap_21, GreenSoap, Green, abc, 2019-01-01]",
"fileLineNumber": 18,
"tags": [
"health",
"dlc",
"source",
"parsing"
],
"eventType": "PARSING_EXCEPTION"
},
{
"threadName": "csv-worker-CsvSource-Bulk-2",
"threadId": 65,
"clazz": "com.qfs.msg.csv.impl.PublicationQueue$1",
"method": "onCompletion",
"lineNumber": 159,
"timeStamp": 1584052234708,
"taskName": "7a8c982a87b050f2",
"level": {
"name": "INFO",
"resourceBundleName": "sun.util.logging.resources.logging",
"localizedName": "INFO"
},
"upTime": 17321,
"file": "C:\\source\\tools\\data-connectors\\data-connectors-csv\\target\\test-classes\\data-samples\\data\\2019-01-01\\badproduct.csv",
"parsingInfo": {
"publishedObjectCount": 31,
"characterCount": 1390,
"lineCount": 34,
"byteCount": 1392,
"elapsedTime": "PT0.01S",
"aggregatedTime": 9051401,
"aggregatedReadingTime": 565000,
"aggregatedDecodingTime": 224301,
"aggregatedParsingTime": 8257599,
"aggregatedPublicationTime": 4501,
"exceptions": [],
"creationTime": "2020-03-12T22:30:34.698Z",
"numberSkippedLines": 1
},
"fileLineNumber": 0,
"tags": [
"parsing",
"csv",
"source"
],
"eventType": "LOAD_SUMMARY"
},
{
"threadId": 0,
"lineNumber": 0,
"timeStamp": 1584052234713,
"taskName": "7a8c982a87b050f2",
"level": {
"name": "INFO",
"resourceBundleName": "sun.util.logging.resources.logging",
"localizedName": "INFO"
},
"upTime": 0,
"message": "[LOAD_SUMMARY][CSV_SOURCE] CsvSource-Bulk completed fetch batch CsvSource-Bulk-1 in 15ms ( 1 channels { StoreMessageChannel [topic=BadProducts] } and scope {CobDate=2019-01-01} ). 1 KiB 368 bytes (1392) parsed into 34 lines (31 records published)). Average throughput: 87 KiB 400 bytes (89488)/s, 2,185 lines parsed/s, 1,992 published records/s.",
"tags": [
"summary",
"health",
"dlc",
"source"
],
"eventType": "SUMMARY"
}
],
"dlcLoadingStatus": "FAILED"
}
}
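Because the top-level status can be "success" even when the load failed, a client should check dlcLoadingStatus and scan the events list. A minimal sketch, assuming the JSON response above has already been parsed into a Python dict:

```python
from collections import Counter

def load_failed(response: dict) -> bool:
    """Return True when the DLC reports that the load itself failed,
    even though the top-level "status" field says "success"."""
    data = response["data"]
    # Tally event types to see what happened during the load.
    counts = Counter(e["eventType"] for e in data["events"])
    return data.get("dlcLoadingStatus") == "FAILED" or counts["PARSING_EXCEPTION"] > 0

# Trimmed-down response shaped like the example above:
response = {
    "status": "success",
    "data": {
        "events": [
            {"eventType": "PARSING_EXCEPTION"},
            {"eventType": "LOAD_SUMMARY"},
            {"eventType": "SUMMARY"},
        ],
        "dlcLoadingStatus": "FAILED",
    },
}
print(load_failed(response))  # True: status is "success" but the load failed
```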
In this example, two CSV rows with errors resulted in two error events (only one is shown above). Look at the event with eventType "PARSING_EXCEPTION".
The details to look for in such exception events are file, line, fileLineNumber, eventType, and the message inside cause.
Here is an excerpt from one of the error events:
"file": "C:\\source\\tools\\data-connectors\\data-connectors-csv\\target\\test-classes\\data-samples\\data\\2019-01-01\\badproduct.csv",
"line": "[soap_21, GreenSoap, Green, abc, 2019-01-01]",
"fileLineNumber": 18,
"message": "Exception while translating column ProductPrice with parser Plugin Value (key=double, class com.quartetfs.fwk.format.impl.DoubleParser) in file FileSystemFileInfo [fullName=C:\\source\\tools\\data-connectors\\data-connectors-csv\\target\\test-classes\\data-samples\\data\\2019-01-01\\badproduct.csv, lastModifiedTime=Thu Mar 12 18:30:15 EDT 2020]. Parsed text was \"abc\"",
"eventType": "PARSING_EXCEPTION"
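Those diagnostic fields can be pulled out programmatically for logging or alerting. A sketch, assuming the full response has been parsed into a dict named response (the sample data below is abbreviated from the excerpt above):

```python
def parsing_errors(response: dict) -> list:
    """Collect the diagnostic fields from each PARSING_EXCEPTION event."""
    errors = []
    for event in response["data"]["events"]:
        if event.get("eventType") != "PARSING_EXCEPTION":
            continue
        cause = event.get("cause") or {}
        errors.append({
            "file": event.get("file"),
            "line": event.get("line"),
            "fileLineNumber": event.get("fileLineNumber"),
            "message": cause.get("message"),
        })
    return errors

# Trimmed-down event shaped like the excerpt above:
response = {"data": {"events": [{
    "eventType": "PARSING_EXCEPTION",
    "file": "badproduct.csv",
    "line": "[soap_21, GreenSoap, Green, abc, 2019-01-01]",
    "fileLineNumber": 18,
    "cause": {"message": "Exception while translating column ProductPrice"},
}]}}
for err in parsing_errors(response):
    print(err["file"], "line", err["fileLineNumber"], "->", err["message"])
```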
The load summary event shows the number of rows that have been loaded:
{
"threadName": "csv-worker-CsvSource-Bulk-2",
"threadId": 65,
"clazz": "com.qfs.msg.csv.impl.PublicationQueue$1",
"method": "onCompletion",
"lineNumber": 159,
"timeStamp": 1584052234708,
"taskName": "7a8c982a87b050f2",
"level": {
"name": "INFO",
"resourceBundleName": "sun.util.logging.resources.logging",
"localizedName": "INFO"
},
"upTime": 17321,
"file": "C:\\source\\tools\\data-connectors\\data-connectors-csv\\target\\test-classes\\data-samples\\data\\2019-01-01\\badproduct.csv",
"parsingInfo": {
"publishedObjectCount": 31,
"characterCount": 1390,
"lineCount": 34,
"byteCount": 1392,
"elapsedTime": "PT0.01S",
"aggregatedTime": 9051401,
"aggregatedReadingTime": 565000,
"aggregatedDecodingTime": 224301,
"aggregatedParsingTime": 8257599,
"aggregatedPublicationTime": 4501,
"exceptions": [],
"creationTime": "2020-03-12T22:30:34.698Z",
"numberSkippedLines": 1
},
"fileLineNumber": 0,
"tags": [
"parsing",
"csv",
"source"
],
"eventType": "LOAD_SUMMARY"
}
The key check is that publishedObjectCount = (lineCount - numberSkippedLines) in the LOAD_SUMMARY event. Here, 31 published records against 34 - 1 = 33 expected lines confirms that two rows failed to parse.
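That consistency check can be automated. A sketch, assuming the LOAD_SUMMARY event above has been parsed into a dict named summary:

```python
def rows_failed(summary: dict) -> int:
    """Number of data rows that were parsed but not published.
    Zero means every non-skipped line became a published record."""
    info = summary["parsingInfo"]
    expected = info["lineCount"] - info["numberSkippedLines"]
    return expected - info["publishedObjectCount"]

# Values from the LOAD_SUMMARY event above:
summary = {"parsingInfo": {
    "publishedObjectCount": 31,
    "lineCount": 34,
    "numberSkippedLines": 1,
}}
print(rows_failed(summary))  # 2: two rows failed to parse
```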