One option is to keep those .csv files for an arbitrary period of time. If an issue arises you could search them for the data of interest, but you would need a utility that can search for text across many files, and you could eventually run into space issues that force you to delete old data.
I prefer to import the data into a SQL database. The data can be kept as long as needed and is easier to search.
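For example, once the data lands in the AuditRecords table created in the next section, a single query can answer a question that would otherwise mean searching through months of .csv files. Here is a minimal sketch; the user name, account, and date are placeholder values you would replace with whatever you are researching.
SELECT [Timestamp], [UserName], [Activity], [Scenario], [Year], [Period],
       [Entity], [Account], [DataValue]
FROM [dbo].[AuditRecords]
WHERE [UserName] = 'DOMAIN\jsmith'      -- placeholder user
  AND [Account]  = 'NetIncome'          -- placeholder account
  AND [Timestamp] >= '2013-01-01'       -- placeholder start date
ORDER BY [Timestamp] DESC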
With SQL Server we can use the bcp utility to import the .csv files into a database table. The table should match the fields we get in the .csv file; the following is a sample CREATE TABLE statement that matches the .csv file format.
CREATE TABLE [dbo].[AuditRecords](
[UserName] [nvarchar](32) NULL,
[Activity] [nvarchar](16) NULL,
[Timestamp] [datetime] NULL,
[ServerName] [nvarchar](8) NULL,
[Scenario] [nvarchar](16) NULL,
[Year] [smallint] NULL,
[Period] [nvarchar](3) NULL,
[Entity] [nvarchar](32) NULL,
[Value] [nvarchar](16) NULL,
[Account] [nvarchar](40) NULL,
[ICP] [nvarchar](24) NULL,
[Custom1] [nvarchar](32) NULL,
[Custom2] [nvarchar](24) NULL,
[Custom3] [nvarchar](8) NULL,
[Custom4] [nvarchar](32) NULL,
[DataValue] [nvarchar](24) NULL
)
Now that we have a table to hold our data we need to import it. The bcp utility uses a format file to specify how to parse the fields in the file being imported. The format file specification is kind of clunky: each line gives the host field number, the data type, the prefix length, the maximum field length, the field terminator, the target column number, the target column name, and the collation (the full details are in the SQL Server documentation). This is the format file I use:
10.0
16
1 SQLCHAR 0 64 ";" 1 UserName SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 32 ";" 2 Activity SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 24 ";" 3 Timestamp ""
4 SQLCHAR 0 16 ";" 4 ServerName SQL_Latin1_General_CP1_CI_AS
5 SQLCHAR 0 32 ";" 5 Scenario SQL_Latin1_General_CP1_CI_AS
6 SQLCHAR 0 7 ";" 6 Year ""
7 SQLCHAR 0 6 ";" 7 Period SQL_Latin1_General_CP1_CI_AS
8 SQLCHAR 0 64 ";" 9 Entity SQL_Latin1_General_CP1_CI_AS
9 SQLCHAR 0 32 ";" 10 Value SQL_Latin1_General_CP1_CI_AS
10 SQLCHAR 0 80 ";" 11 Account SQL_Latin1_General_CP1_CI_AS
11 SQLCHAR 0 48 ";" 12 ICP SQL_Latin1_General_CP1_CI_AS
12 SQLCHAR 0 64 ";" 13 Custom1 SQL_Latin1_General_CP1_CI_AS
13 SQLCHAR 0 48 ";" 14 Custom2 SQL_Latin1_General_CP1_CI_AS
14 SQLCHAR 0 16 ";" 15 Custom3 SQL_Latin1_General_CP1_CI_AS
15 SQLCHAR 0 64 ";" 16 Custom4 SQL_Latin1_General_CP1_CI_AS
16 SQLCHAR 0 48 "\r\n" 17 DataValue SQL_Latin1_General_CP1_CI_AS
Now we have a place to store our audit data and a way to import it. The last step is to automate the whole process. I run a PowerShell script daily that uses the AuditExport_cmd.exe to export yesterday's audit data to a .csv, and then import it into the database. I also retain the .csv files in case something goes haywire with the bcp import. I organize these retention folders by month and year in a parent folder named D:\LogArchives and periodically purge old retention folders.
The files created by the AuditExport utility have a maximum size of 17 MB, so a single export may produce more than one file. The script therefore needs to import multiple files in a single batch. The full script exports the data to a holding area, creates a retention folder if needed, then loops through the files in the holding area, importing each into the database with bcp and moving it to the retention folder.
One issue I ran into is that the exported data is in Unicode format, but bcp with this format file expects ASCII. Another is that bcp runs on the SQL server rather than the HFM server. I resolved the encoding issue by reading each .csv file with Get-Content and writing it back out with Out-File and the -Encoding ASCII switch. Since I have to run bcp on the SQL server, the Out-File target is a share on the SQL server. That share also holds the bcp format file and a batch file to run the import, which resolves the file format issue and keeps everything tidy on the SQL server.
The batch file on the SQL server is:
bcp DataAuditArchive.dbo.AuditRecords in T:\DataAudit\TempDataAudit.csv -U DataAudit_update -P Password123 -F 2 -f T:\DataAudit\DataAudit.fmt -e T:\DataAudit\DataAudit.err
Here my database is named DataAuditArchive and the table is AuditRecords. The -U and -P switches supply the SQL login, -F 2 starts the import at the second row of the file (skipping the header row), -f points to the format file, and -e writes any rows that fail to import to an error file. The batch file, bcp format file, error file, and temporary copy of the .csv being imported are all in the T:\DataAudit folder. This one-line batch file gets called from a PowerShell script that runs on the HFM server.
I also want to save all of the output in case something goes wrong, so I pipe the results to a log file that gets saved in the monthly archive folder. This gives an audit trail of the script's file processing for troubleshooting. If I find that one or more of the .csv files did not import, I can import them manually from the retention folder to ensure a complete audit record.
In the PowerShell script below, the $Folder variable holds the name of the current month's retention folder, which houses all of the .csv files for the month as well as the log file. The $Working folder holds the initial export from the AuditExport utility. The $BcpWorking folder is the share on the SQL server that holds the batch file, the bcp import format file, and the ASCII copy of the .csv file currently being processed.
# Powershell script to archive and truncate data audit logs ending yesterday.
#
$Utility = 'D:\Oracle\Middleware\EPMSystem11R1\products\FinancialManagement\Utilities\HFMAuditExtractCmdLine_x64.exe'
$AppName = 'COMMA' # Name of the HFM Application
$Parms = ' -u D:\HFM.udl -a '+$AppName+' ' # UDL file and application name
$Data = ' -r ' # flag to extract and truncate the data audit logs
#$Data = ' -x ' # flag to extract and truncate the data audit logs
$Logs = 'D:\LogArchives\' # Parent folder for the log files
$TempFolder = 'Daily' # Store the dump from the HFMAuditExtract utility here
$BcpWorking = '\\SqlSvr\DataAudit' # local folder on SQL server for BCP to use for import
$Start = ' -s 1/1/2013' # Arbitrary start date to make sure we get all previous data
$D = get-date
# New folder for each month that is named with the year and month
#
$Folder = $D.ToString('yyyy')+"_"+$D.ToString('MM')+$D.ToString('MMM')
# Capture everything through yesterday
#
$D = $D.AddDays(-1)
$End = ' -e '+$D.ToString('MM')+'/'+$D.ToString('dd')+'/'+$D.ToString('yyyy')
$Target = $Logs+$Folder # Destination for the logs
if (-not (test-path $Target)) { MkDir $Target } # Create the destination folder if needed
$Working = $Logs+$TempFolder
$Output = ' -d '+$Working # Add the -d so this can be used as a parameter for the utility.
# Extract and truncate the task log
# NOTE: the truncate task for data also does an extract
#
$CMD=$Utility+$Output+$Start+$End+$Parms+$Data # Create the string to execute. Path of utility and all parameters
echo $CMD # Echo so we know what we did
invoke-expression $CMD # Execute the AuditExtract command line
# Process the file(s) that were created in the TempFolder
#
$TempFile = $BCPWorking + "\TempDataAudit_$AppName.csv"
$ScriptLog = $Target + '\BcpImport.log'
$BatchName = "T:\DataAudit\Import$AppName.cmd"
$AuditLogs = get-childitem $Working -filter "Data*$AppName*.csv"
foreach ($Log in $AuditLogs) {
echo $Log.FullName
# Export the file with ASCII encoding because bcp has trouble with format files and Unicode data files
#
get-content $Log.FullName | out-file $TempFile -encoding ASCII
# Run bcp to import the data
#
$BcpResults = invoke-command -computername SqlSvr -scriptblock { Param($Bat) & cmd.exe /C $Bat } -ArgumentList $BatchName
# Move the file to the dated archive folder
#
$MoveResults = Move-Item $Log.FullName $Target -passthru
# Delete the ASCII file we just imported
#
if (test-path $TempFile) { del $TempFile }
# And save the results to the script log
#
(' * * * ') | out-file $ScriptLog -append
$Log.FullName | out-file $ScriptLog -append
(' ') | out-file $ScriptLog -append
$BcpResults | out-file $ScriptLog -append
(' ') | out-file $ScriptLog -append
$MoveResults | out-file $ScriptLog -append
}
This script is scheduled to run daily, which keeps the DataAudit table in the HFM database lean and efficient. The archive table becomes our source of truth for audit history, unless the event happened today, in which case we use the Data Audit tab in Workspace.
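As a final illustration of using the archive as the source of truth, this is the kind of query I mean: for a given point of view, list who changed the value and when, most recent change first. The member names here are placeholders, not values from my application.
SELECT [Timestamp], [UserName], [ServerName], [Activity], [DataValue]
FROM [dbo].[AuditRecords]
WHERE [Scenario] = 'Actual'       -- placeholder members
  AND [Year]     = 2013
  AND [Period]   = 'Jun'
  AND [Entity]   = 'Connecticut'
  AND [Account]  = 'Sales'
ORDER BY [Timestamp] DESC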