May 23, 2016

Auditing Financial Reports

You know how it goes.  Somebody starts a project and then forgets about it.  Or there are multiple versions of a report that were made for various purposes.  Or a report was critical at one point, but interest eventually waned and now it sits unused.  Whatever the cause, you wind up with a bunch of reports, books, and batches that are never used.  But you can't tell which ones aren't used if you don't have an audit trail.

There are other uses for an audit trail beyond finding which reports are used.  Maybe you need to know who runs the reports, how frequently reports are generated, or whether reports run at certain times so you can plan maintenance around them.  Or maybe you're just curious.


Oracle support documents 1353965.1 and 1947774.1 show that the FRLogging.log file holds the audit data for Financial Reporting objects.  FRLogging.log contains a lot of information, including start and stop times for reports and books, the steps taken to produce output, the queries used to pull from the data source, and other details.

The Problem

As noted in document 1947774.1, you configure logging in the MIDDLEWARE_HOME/user_projects/domains/EPM_System/config/fmwconfig/servers/FinancialReporting0/logging.xml file.  By default BI+ limits each log file to 1 MB.  When the log reaches that limit, the FRLogging.log file is renamed by appending a sequential number to the file name and a new log file is created.  By default only 50 MB of log files are retained.

This is good news because BI+ won't create an infinite number of logs, which could fill your drive and crash your system.  But it is also bad news because you can blow through 50 MB of logging in an afternoon of running multiple report batches during month-end close, losing historical data in the process.
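If you want to see what your environment actually allows, the short sketch below reads logging.xml and reports the rotation limits for each handler.  It assumes the usual Oracle Diagnostic Logging layout, where each log_handler element carries property children named maxFileSize (per-file limit, in bytes) and maxLogSize (total space retained); the path, like everything else in it, is an example you should adjust, and the property names are worth verifying against your own file.

# Sketch: report the log rotation limits configured in logging.xml
# (assumes ODL-style <log_handler> elements with <property name=... value=...> children)
$LoggingXml = `
"D:\Oracle\Middleware\user_projects\domains\EPMSystem\config\fmwconfig\servers\FinancialReporting0\logging.xml"

[xml]$cfg = Get-Content $LoggingXml

$cfg.SelectNodes("//*[local-name()='log_handler']") | ForEach-Object {
  $props = @{}
  foreach ($p in $_.SelectNodes("*[local-name()='property']")) {
    $props[$p.GetAttribute('name')] = $p.GetAttribute('value')
  }
  "{0}: maxFileSize={1}  maxLogSize={2}" -f $_.GetAttribute('name'), $props['maxFileSize'], $props['maxLogSize']
}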

Also, the logs contain a lot of information that isn't interesting from an audit standpoint.  We just want the lines that show the name of the report, the time it ran, and the user who ran it.

We have to parse the FRLogging logs to get just the summary information.  But we can't process the FRLogging.log file directly because it is held open by the Financial Reporting services and new records are constantly being appended.

The Solution

The .NET Framework has a class called FileSystemWatcher.  It monitors a file or folder and takes action when file system events occur, such as creating, deleting, or renaming a file.  We can use this to monitor the FRLogging.log file: when it gets renamed, we find the newest numbered file in the folder and process that.  This lets us process a file that is no longer in use and prevents us from processing the same log file more than once.

There are several lines in FRLogging.log that are useful for audits.  The START: record is written when the job starts, and the JOB: record gives a completion summary of the job, including the total time the job took to complete.  In between these are one or more REPORT: records.  If you run a single report you get one REPORT: record, but a book or batch produces multiple records, one for each report generated.


Each of those lines contains the string "start:" (note the colon in the string).  For the JOB: and REPORT: records this string identifies the start time of the event, while for the START: records the string is the record identifier itself.  No other records contain this string, so it is a reliable way to pull the audit data out of the log.
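You can spot-check this by hand on logs that have already been rotated before automating anything.  Here is a quick sketch; the path is an example, and the live FRLogging.log is skipped because it is still held open:

# Quick manual check: show the audit-relevant lines in the rotated logs
$LogDir = "D:\Oracle\Middleware\user_projects\domains\EPMSystem\servers\FinancialReporting0\logs"
Get-ChildItem $LogDir -Filter "FRLogging*.log" |
  Where-Object { $_.Name -ne "FRLogging.log" } |
  Select-String -Pattern "start:"

Select-String is case-insensitive by default, so it catches the START: records as well as the start times inside the JOB: and REPORT: records.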

To put this all together we need a script that will run FileSystemWatcher, wait for the FRLogging.log file to get renamed, find the latest numbered file, do a case-insensitive search for lines with the string "start:", and drop those lines into a file we can retain for as long as we need.  This PowerShell script accomplishes those tasks:

# PoSh script to keep a record of all report usage
#   Use the FileSystemWatcher to monitor the FRLogging.log.  When the file gets renamed
#  we pull the job information from the most recent file and save it to an audit log
#
$Folder = `
"D:\Oracle\Middleware\user_projects\domains\EPMSystem\servers\FinancialReporting0\logs"
$File   = "FRLogging.log" # File to monitor with FileSystemWatcher
$Filter = "FRLogging*.log" # Filter to find the most recently renamed file
$Output = "D:\FRAudit.log" # Running audit trail

# Configure the FileSystemWatcher.  Create the watcher object, define the
# folder and file to watch, and enable event notifications
#
$watcher        = New-Object System.IO.FileSystemWatcher
$watcher.Path   = $Folder
$watcher.Filter = $File
$watcher.EnableRaisingEvents = $true  # events are not raised until this is set

# Define the action to take when the file we are watching gets renamed
#
$renamed = Register-ObjectEvent $watcher "Renamed" -Action {

  # Get the FullName of the most recent numbered FRLogging*.log file
  $LogFile = (Get-ChildItem $Folder $Filter | Sort LastWriteTime -desc)[1].FullName

  # Do a case insensitive search for all lines that contain the string "start:"
  $NewRecs = get-content $LogFile | where {$_ -match "start:"}

  # And append those lines to our running audit trail
  $NewRecs | out-file $Output -append
}

If you use this script you will need to change the value of the $Folder variable to match your environment.  The value of the $Output variable is arbitrary, but it must be the full path of the file that will hold your running audit trail.  The tricky part of the script is the line where we find the most recent log file.  Get-ChildItem returns an array of the files in the folder that match the filter.  We sort that array by LastWriteTime in descending order and take the second element (index [1]) to get the latest numbered file, because the first element (index [0]) is the FRLogging.log file itself, which is always the most recently written.
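If you would rather not rely on FRLogging.log always sorting to the top, a slightly more defensive variant of that line (a sketch, not part of the script above) excludes the live log by name and then takes the newest file that remains:

  # Exclude the live FRLogging.log explicitly, then take the newest numbered file
  $LogFile = Get-ChildItem $Folder $Filter |
    Where-Object { $_.Name -ne $File } |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1 -ExpandProperty FullName

The result is the same; this version just makes the intent explicit instead of depending on the sort order placing the live log first.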

But how do we make sure the script stays running to continuously monitor the FRLogging.log file?

An easy solution is to use Windows Task Scheduler to create a task that runs at system startup.  You can also start the task manually when you first create it or after you adjust it later.  Since nothing terminates the script, the task keeps running in the background; we essentially get a service that starts automatically at every reboot.
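One way to create that task from PowerShell itself is with the ScheduledTasks cmdlets available on Windows Server 2012 and later; the task name, script path, and account below are examples, and older servers can use schtasks.exe to the same effect:

# Register a startup task that launches the watcher script (names and paths are examples)
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
  -Argument "-NoProfile -ExecutionPolicy Bypass -File D:\Scripts\FRAuditWatcher.ps1"
$trigger = New-ScheduledTaskTrigger -AtStartup
Register-ScheduledTask -TaskName "FR Audit Log Watcher" -Action $action -Trigger $trigger -User "SYSTEM"

Depending on how the script is hosted you may also want a simple keep-alive at the end of the watcher script, such as while ($true) { Start-Sleep -Seconds 3600 }, so the process does not exit once the event handler is registered.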


Our new audit log is just a raw dump of the useful lines from the FRLogging.log file.  You will still need to parse that data to isolate the information of interest to you.  (I have another PowerShell script to do that and will send it to you if you are interested.)  But the key point is that we can leverage the transient information Oracle gives us and generate a persistent audit trail.  And we can do it for free, using built-in tools and with minimal overhead.