This post has been made on behalf of a user who posted the following question on the (now retired) Lavastorm Forums:
Need help on this date reformat. I have data that has a date & time in the format below, and I've been trying to reformat it to "MMDDCCYY": "Thu Mar 12 00:00:03 2020"
Assuming your input datetime string is in a field named 'testDate', you could use the following script in a Filter node:
## Trim any leading and trailing whitespace chars
dt = testDate.trim()
## Split the string on the space chars
dt_elements = dt.split(" ")
## Construct a new string containing only the required date elements
## (elements 1, 2 and 4 are the month name, day and year)
dateStr = dt_elements[1] + " " + dt_elements[2] + " " + dt_elements[4]
## Convert the new string to a Date type
dateValue = date(dateStr, "m D CCYY")
## Define the formatted date string
outDateStr = pad(str(month(dateValue)), 2, "0") + pad(str(day(dateValue)), 2, "0") + str(year(dateValue))
emit *
emit outDateStr as "outDate"
Note, you will need to either ensure there are no Null values in the input data, or modify the above script to handle the Null values.
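For readers without an LAE canvas to hand, the same logic, including the Null handling mentioned above, can be sketched in Python. The function name `reformat_date` and the use of `datetime.strptime` are illustrative choices here, not part of the original Filter node script:

```python
from datetime import datetime

def reformat_date(raw):
    """Reformat e.g. 'Thu Mar 12 00:00:03 2020' to 'MMDDCCYY' ('03122020').

    Returns None for Null/empty input instead of raising, mirroring the
    Null handling the Filter node script would otherwise need.
    """
    if raw is None or not raw.strip():
        return None
    # Parse the full datetime string: weekday, month name, day, time, year
    parsed = datetime.strptime(raw.strip(), "%a %b %d %H:%M:%S %Y")
    # Emit zero-padded month and day followed by the 4-digit year
    return parsed.strftime("%m%d%Y")
```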
Thank you Adrian... here is the complete data. I'm trying to split it into different columns according to each field name, but at this time it is all under one column named "Record".
Thu Mar 12 00:00:03 2020
Acct-Session-Id = "0/0/3/320_032848C5"
Framed-IP-Address = 126.96.36.199
Framed-Protocol = PPP
User-Name = "email@example.com"
Cisco-AVPair = "connect-progress=LAN Ses Up"
Cisco-AVPair = "nas-tx-speed=1000000000"
Cisco-AVPair = "nas-rx-speed=1000000000"
Acct-Session-Time = 5382
Acct-Input-Octets = 71721063
Acct-Output-Octets = 256278499
Acct-Input-Packets = 175185
Acct-Output-Packets = 238067
Acct-Authentic = RADIUS
Acct-Status-Type = Interim-Update
NAS-Port-Type = Ethernet
NAS-Port = 50331968
NAS-Port-Id = "0/0/3/320"
Connect-Info = "ACCESS-DYNAMIC"
Cisco-AVPair = "client-mac-address=c471.5457.9edd"
Cisco-AVPair = "circuit-id-tag=1 atm 02/34:0.34"
Service-Type = Framed-User
NAS-IP-Address = 10.254.32.14
PMIP6-Home-HN-Prefix = 3546:3732:3343::/50
Event-Timestamp = "Mar 12 2020 00:00:03 +13"
NAS-Identifier = "ha_bng2.kalianet.to"
Acct-Delay-Time = 0
Proxy-State = 0x3836
Acct-Input-Octets64 = 71721063
Acct-Output-Octets64 = 256278499
FreeRADIUS-Acct-Session-Start-Time = "Mar 11 2020 22:30:21 +13"
Tmp-String-9 = "ai:"
Acct-Unique-Session-Id = "242a5d975725a2e3cb54058f74c10900"
Timestamp = 1583924403
Assuming each input data record has the structure shown in your example above, you can use a Filter node to separate out the record datetime element and split out the embedded key/value pairs:
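As a language-neutral sketch of that split step, here is equivalent Python (the function name `parse_record` and the tuple layout are my own, not taken from the node script). Note that values such as the Cisco-AVPair strings can themselves contain '=' characters, so each line is split on the first ' = ' only:

```python
def parse_record(record_text):
    """Split one accounting record into (timestamp, key, value) rows.

    The first line of the record is the datetime element; every following
    line is a 'key = value' pair.  Splitting on the first ' = ' keeps
    embedded '=' characters (e.g. in Cisco-AVPair values) intact.
    """
    lines = [ln.strip() for ln in record_text.strip().splitlines() if ln.strip()]
    timestamp = lines[0]
    rows = []
    for line in lines[1:]:
        key, sep, value = line.partition(" = ")
        if not sep:
            continue  # skip any line that is not a key/value pair
        rows.append((timestamp, key, value.strip('"')))
    return rows
```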
The output of the node would be as follows:
However, the above view of the output data has been sorted by the 'key' field to highlight that each input record contains multiple elements with the same key. Typically, the Pivot Data to Names node would be used to transpose the data so that the values of the 'key' field become field names and the 'value' field's contents become the corresponding values for those new fields. However, in this case the node would generate an error, as you cannot have multiple output fields with the same name ('Cisco-AVPair' in this case).
You would need to disambiguate the key field's value in line with your requirements before the data could be transposed using the Pivot Data to Names node.
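One simple disambiguation scheme, sketched here in Python (the `_2`, `_3` suffix convention is an illustrative choice, not a requirement of the Pivot Data to Names node), is to append an occurrence counter to repeated keys so that every field name is unique before the transpose:

```python
from collections import Counter

def disambiguate(pairs):
    """Make duplicate keys unique by appending an occurrence counter.

    The first occurrence keeps its original name; later occurrences get
    '_2', '_3', ... suffixes, so the rows can be pivoted into one output
    record with unique field names.
    """
    seen = Counter()
    out = {}
    for key, value in pairs:
        seen[key] += 1
        name = key if seen[key] == 1 else f"{key}_{seen[key]}"
        out[name] = value
    return out
```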
The example is attached as a text file (copy the contents onto the canvas) and also as a .brg file (for LAE 6.x).