Data360 Analyze

  • 1.  txt file creation fails due to no escapechar error

    Employee
    Posted 12-15-2021 11:22

    Hey support,

    I would like to create a tab delimited txt file on the output, but the Output Delimited node throws me a "need to escape, but no escapechar set" error.

    The output file's structure should be:

    1st row: email address

    2nd row: header string

    3rd row: field names (separated with commas)

    from 4th row: data (tab delimited)

    Thank you for your help.



  • 2.  RE: txt file creation fails due to no escapechar error

    Employee
    Posted 12-15-2021 12:04

    Can you please provide us with an example of the required output file and the corresponding source data (e.g. as a .csv) that would be used as the data payload in the output file?

    Your description does not specify what constitutes the header string on the 2nd row.



  • 3.  RE: txt file creation fails due to no escapechar error

    Employee
    Posted 12-15-2021 12:42

    Hi Adrian,

    I've attached the simplified input and output.

    Every record's last field is filled with END.

    The 2nd header line looks like this: "HEADERBRAIN1" + str(now) + "B" + in1.Fieldxyz + "IMRWEBTAXIER" + recordCount

    The input comes from a database source originally.

    Thanks in advance.

     

    Attached files

    input.xlsx
    output.txt

     



  • 4.  RE: txt file creation fails due to no escapechar error

    Employee
    Posted 12-15-2021 14:02

    The version of Data360 where I'm trying to implement this is 3.4.3.5100.

    Could a solution be to generate the .txt files through a Transform node using Python code?



  • 5.  RE: txt file creation fails due to no escapechar error

    Employee
    Posted 12-16-2021 09:04

    Firstly, I do not have a v.3.4.x instance as this version is no longer supported (see the support lifecycle page). You should upgrade to a supported version at the earliest opportunity.

    The data flow logic can leverage Python code but it is more involved than using a single node. For example:

    [Screenshot: the example data flow]

    The 'Go' Create Data node is configured to output the 'Fieldxyz' field.

    The 'Test Data' node represents your source data (per the input.xlsx file). The Aggregate node is configured to count the number of input data records.

    The Lookup node is left at its default configuration - meaning the fields from the inputs will be merged.

    The 'Add _IsHeader field' Transform node is configured to add the '_IsHeader' field to the input data set.
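
    As an illustration only, the two scripts of such a Transform node might look roughly like the sketch below, assuming the Transform node's usual Python ConfigureFields/ProcessRecords scripting; the field type and default value are assumptions, not taken from the attached data flow:

    ```python
    # ConfigureFields script (sketch) - declares the output metadata.
    # 'in1' and 'out1' are provided by the Transform node at runtime.
    out1 += in1           # pass all input fields through unchanged
    out1._IsHeader = int  # declare the new flag field

    # ProcessRecords script (sketch) - runs once per input record.
    out1 += in1           # copy the input field values
    out1._IsHeader = 0    # mark every source record as a data (non-header) row
    ```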

    The 'Generate Header rows' Transform node is configured to create the first two rows in the output file.
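
    As a rough illustration, the header text itself can be assembled with ordinary Python string concatenation; the values below are placeholders rather than anything taken from the attached files:

    ```python
    import datetime

    # Placeholder inputs - in the data flow these come from the Lookup output.
    fieldxyz_value = "ABC123"   # hypothetical value of the 'Fieldxyz' field
    record_count = 42           # hypothetical record count from the Aggregate node

    # Row 2 of the output file, following the format described earlier in the thread.
    header_line = ("HEADERBRAIN1" + str(datetime.datetime.now()) + "B"
                   + fieldxyz_value + "IMRWEBTAXIER" + str(record_count))
    ```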

    The Cat node is configured to produce the union of the two input data sets.

    The 'Create Output Records' Transform node is configured to generate the data that will be written to the output file:

    #### ConfigureFields Script

    #### ProcessRecords Script

    The above two scripts are included in the 'Create Output Records Transform Node Scripts.txt' file below.
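
    For readers without access to the attachment, the sketch below gives a rough idea of what such scripts could look like; the attached scripts may differ, and the field names 'HeaderText', 'FieldA' and 'FieldB' are hypothetical:

    ```python
    # ConfigureFields script (sketch) - emit a single string field that will
    # hold the fully formatted output line.
    out1.OutputRecord = str

    # ProcessRecords script (sketch) - header rows pass through as-is,
    # data rows are tab-delimited and terminated with the END marker.
    if in1._IsHeader == 1:
        out1.OutputRecord = in1.HeaderText                 # hypothetical field
    else:
        out1.OutputRecord = "\t".join([str(in1.FieldA),    # hypothetical fields
                                       str(in1.FieldB),
                                       str(in1.Fieldxyz),
                                       "END"])
    ```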

    The 'Write Output File' Output Delimited node writes the output data. It is configured with the Filename.

    The node is also configured with a custom FieldDelimiter - here it is set to the hex code for the ASCII 'BEL' character (0x07), but you could use any character that is guaranteed not to appear in the input data. The HeaderMode is set to 'None' to suppress writing the field header record to the output file. The FileExistsBehavior property can also be set to 'Overwrite' if required.

    The output file then contains the following when viewed in Notepad++:

    [Screenshot: output file contents in Notepad++]

    For users of Analyze v.3.6.x / v.3.8.x, see the attached example data flow.

     

    Attached files

    Write_Custom_Delimited_File - 16 Dec 2021.lna
    Create Output Records Transform Node Scripts.txt

     



  • 6.  RE: txt file creation fails due to no escapechar error

    Employee
    Posted 12-28-2021 09:28

    Thanks for the help and the detailed guidance!

    We're in the process of migrating to a higher version of the tool (3.6.8.6779), but we still have outstanding connections to handle between the Azure cloud and the non-Azure environments.