January 30, 2018

Blind Date

We use FDQM to pull data from our various ERP systems into HFM.  Whenever possible we use import scripts to pull data directly from the ERP system or an intermediary SQL datamart rather than from flat files.  In addition to integration scripts, which do a complete pull of the data from an external source, FDQM can use data pump scripts, which provide a way to customize the processing of a single field when importing a file.

FDQM scripting uses VBScript, so standard scripting features like file access are available, as is ADODB for making database connections and queries.  FDQM also provides its own API libraries to access native features such as the current POV.  These are exposed through the API (application programming interface), DW (Data Window, for database access), and RES (FDQM resources) objects.  They can all be seen in the Object Browser in FDM Workbench.


In our source systems the data is stored with fields that identify the period and year.  The RES.PstrPer property returns the period currently selected in FDQM.  This is the value in the Text Description field of the Periods control table.



To access the period and year we take appropriate substrings of the RES.PstrPer value.

Dim strPer ' Uses the Date POV to find the current period
Dim strYr  ' Uses the Date POV to find the current year
strPer = Left(RES.PstrPer,3) ' Retrieve the period from the POV
strYr  = Right(RES.PstrPer,4) ' Retrieve the year from the POV

If the periods are referenced by number instead of name, there are a variety of ways to accomplish the translation.  I like finding the position of the period in the full list of periods and then doing some math.

' The strAllPers string contains the abbreviations of all the periods
'   Search for the position of the current period, subtract 1, divide by 4 and add 1
'   So for Jan we get ((1-1=0)/4=0)+1 = 1
'      for Feb we get ((5-1=4)/4=1)+1 = 2
'   etc.
'   for Dec we get ((45-1=44)/4=11)+1 = 12
'
Dim strPerNum ' Numeric value of period
Dim strAllPers ' All periods for use in index
strAllPers = "Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec"
strPerNum = ((InStr(strAllPers,strPer)-1)/4)+1

Problem


For one of our ERP systems we have to provide the start and end dates of the period to a SQL stored procedure.  The RES.PdtePerKey property retrieves the end date for the period, which is the Period field in the Periods control table as shown in the previous screen shot.  We then use the VBScript DateAdd function to calculate the start date of the period.

Dim dtePerKey ' Last date of the current POV period
dtePerKey = RES.PdtePerKey

Dim dteStart ' Start date of the current POV period
dteStart = DateAdd("m", -1, DateAdd("d", 1, dtePerKey)) ' Add one day, then back up one month

For Sep - 2016 we get 9/30/16 as dtePerKey.  We add one day to get the first day of October, then subtract one month to get the first day of September.  (The order matters: subtracting the month first would turn 9/30 into 8/30, since August has 31 days, and adding a day would then land on 8/31 instead of 9/1.)  Note that RES.PdtePerKey returns a date value, not a string, so use the appropriate conversion functions as needed.
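
If your stored procedure expects its date parameters as strings, the conversion is only a couple of lines of VBScript.  This is just a sketch; I'm assuming here that the procedure wants yyyy-mm-dd:

Dim strStartDate, strEndDate ' Period start and end dates formatted as yyyy-mm-dd strings
strStartDate = Year(dteStart) & "-" & Right("0" & Month(dteStart), 2) & "-" & Right("0" & Day(dteStart), 2)
strEndDate   = Year(dtePerKey) & "-" & Right("0" & Month(dtePerKey), 2) & "-" & Right("0" & Day(dtePerKey), 2)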

Our problem came when we converted from calendar-month periods to 4-4-5 fiscal periods.  This means we can't guarantee that the period end date is the last day of the month or that the period start date is the first of the month.  What's a mother to do?

First we configure the correct period end dates for our fiscal periods in the Periods control table.  Define the period end date and the prior period end date along with the target period and year.



In the Workbench Client there are a number of Accelerators.  Under the section Point-Of-View Information there is an accelerator named Get Period Prior Date Key.



Sweet.  In the FDQM Periods control table we define the correct end dates for the periods, use the accelerator to get the prior period end date, then add one day to get our period start date.  Double-clicking the accelerator adds the following code to your script.

'Declare Local Variables
Dim dtePriorPeriodKey

'Get prior period date key
dtePriorPeriodKey = API.POVMgr.fPeriodKey(API.POVMgr.PPOVPeriod, True).dteDateKey

Easy-peasy.  But when you run the script you get:

Error: An error occurred importing the file.  Detail: Object required: 'API'

Ain't that a kick in the knickers?  It turns out you can only make calls to the API objects in data pump scripts, not in integration scripts.

Fortunately we still have standard VBScript features available, including ADODB, which allows access to databases.  Any databases.  Even our FDQM database.  FDQM uses the tPOVPeriod table to store the period data.  We can use ADODB to make a connection to the FDQM database, query the tPOVPeriod table for the period in our POV, get the PriorPeriodKey value, then add one day.

The code I use, which includes some error handling, looks like this:

' Find the start date of the period.  We can't use the API in the import scripts
'   The tPOVPeriod table in the FDQM database has the period information which includes
'   the prior period end date.  So we make another ADODB connection to that database
'   and find the record for this period.
' Get the PriorPeriodKey field from the result set and add one day
'
Dim fdmSS    ' Connection to FDQM database
Dim fdmRs    ' Records returned from query
Set fdmSS = CreateObject("ADODB.Connection")
Set fdmRs = CreateObject("ADODB.Recordset")
fdmSS.Open "Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=FDQM;Data Source=MySQLServer;"

' The query should return one record
'
Dim fdmQ
fdmQ = "SELECT * FROM tPOVPeriod where PeriodDesc = '" & RES.PstrPer & "'"
fdmRs.Open fdmQ, fdmSS

' If we get no records then something is broken
'
If fdmRs.BOF And fdmRs.EOF Then
    RES.PstrActionValue = "No Start date found"
    Import_Script = False
    Exit Function
Else
    Dim dtePriPer  ' Prior period end date as datetime value
    dtePriPer = fdmRs.Fields("PriorPeriodKey").Value
    dteStart = DateAdd("d",1,dtePriPer)   ' Add 1 day to get first day of current period
End If

While I used this technique to get the prior period date, we can get any fields we want from any of the FDQM tables.
  • tPOVPeriod has the period control table data
  • tPOVCategory has the Categories control table data
  • tCtrlCurrency has the Currency control table data
  • tPOVPartition has the partition/location data
  • tDataMap has the mapping tables for all partitions and dimensions
  • tLogActivity has the process and error logging data
  • tSecUser lists all provisioned users and their security level
  • tSecUserPartition lists all users, their provisioned partitions, and default partition
I can't think of a good reason why you would want to get to some of this data in an import script, but it is available if you need it.  
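
As a quick illustration, the fdmSS connection opened above can be reused to dump every field of the current period's tPOVPeriod record to a text file for inspection.  This is just a sketch, and the output path is a placeholder you would pick yourself:

' Sketch: write the field names and values of the current POV period record to a file
Dim fso, dumpFile, fld, fdmRs2
Set fso = CreateObject("Scripting.FileSystemObject")
Set dumpFile = fso.CreateTextFile("C:\Temp\tPOVPeriod_dump.txt", True)
Set fdmRs2 = CreateObject("ADODB.Recordset")
fdmRs2.Open "SELECT * FROM tPOVPeriod WHERE PeriodDesc = '" & RES.PstrPer & "'", fdmSS
If Not (fdmRs2.BOF And fdmRs2.EOF) Then
    For Each fld In fdmRs2.Fields
        dumpFile.WriteLine fld.Name & " = " & fld.Value
    Next
End If
fdmRs2.Close
dumpFile.Close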

October 5, 2017

Library Access

Once upon a time several people were working on several different applications in our development environment.  We were adjusting shared dimensions, building and deleting HFM applications, and generally kicking sand around the playground.

http://aminoapps.com/page/cartoon/9847425/forget-the-internet-when-you-have-a-library-card-arthur-meme-contest

Then one day one of the developers couldn't get into the Application Library.  They got an error dialog with the title Requested Service not found and the helpful text:

----
Requested Service not found
----
Code: com.hyperion.awb.web.common.DimensionServiceException

Description: An error occurred processing the result from the server.

Actor: none


We also noticed we couldn't get to the Dimension Library, and our scheduled taskflows had stopped working.  Since we could still log in and open our HFM and Essbase applications, we could rule out issues with foundation services, HFM, and Essbase.  Everything pointed to a problem with EPMA.

Analysis


While there were errors in the EPMA logs, there wasn't an obvious smoking gun.  We engaged both Oracle support and our integration partner.  The odd thing about Oracle support is that there is no option for the EPMA product.  All of the disciplines are cross-trained on EPMA, but you have to really hunt around for someone who is an EPMA expert.

Between the three of us we noticed this set of errors in the DimensionServer.log:

[2017-08-26T23:57:18.764-04:00] [EPMADIM] [NOTIFICATION:32] [EPMADIM-1] [EPMADIM.Hyperion.DimensionServer.LibraryManager] [tid: EPMA Server Startup] [ecid: disabled,0] Starting 11.1.2.2.00 FixInvalidDynamicPropertyReferences task
[2017-08-26T23:57:19.263-04:00] [EPMADIM] [INTERNAL_ERROR:32] [EPMADIM-1] [EPMADIM.Hyperion.DimensionServer.LibraryManager] [tid: EPMA Server Startup] [ecid: disabled,0] Failed 11.1.2.2.00 FixInvalidDynamicPropertyReferences task    at System.String.InternalSubStringWithChecks(Int32 startIndex, Int32 length, Boolean fAlwaysCopy)
   at Hyperion.DimensionServer.DAO.LibraryManagerDAO.FixInvalidDynamicPropReferencesForDimension(Int32 libraryID, Int32 applicationID, Int32 dimensionID, String dynamicProps, Int32 propertiesDimID, Int32 dynamicPropsPropID)
   at Hyperion.DimensionServer.DAO.LibraryManagerDAO.<>c__DisplayClass64.<FixInvalidDynamicPropReferencesForLib>b__63(DALDatasetCommand command)
   at Hyperion.DataAccessLayerCore.DataAccessLayer.ExecLoadSQL(String SQL, Action`1 onSetParams, Action`1 onRead, DatabaseContext context)
   at Hyperion.DataAccessLayerCore.DataAccessLayer.ExecLoadSQL(String SQL, Action`1 onSetParams, Action`1 onRead)
   at Hyperion.DimensionServer.DAO.LibraryManagerDAO.FixInvalidDynamicPropReferencesForLib(Int32 libraryID)
   at Hyperion.DimensionServer.DAO.LibraryManagerDAO.FixInvalidDynamicPropReferences()
   at Hyperion.DimensionServer.LibraryManager.FixInvalidDynamicPropertyReferences()
[2017-08-26T23:57:19.294-04:00] [EPMADIM] [ERROR:32] [EPMADIM-1] [EPMADIM.Hyperion.DimensionServer.Global] [tid: EPMA Server Startup] [ecid: disabled,0] An error occurred during initialization of the Dimension Server Engine:  startIndex cannot be larger than length of string.
Parameter name: startIndex.    at Hyperion.DimensionServer.LibraryManager.FixInvalidDynamicPropertyReferences()
   at Hyperion.DimensionServer.Global.Initialize(ISessionManager sessionMgr, Guid systemSessionID, String sqlConnectionString)
[2017-08-26T23:57:19.294-04:00] [EPMADIM] [NOTIFICATION:32] [EPMADIM-1] [EPMADIM.Hyperion.DimensionServer.Utility.ChannelUtility] [tid: EPMA Server Startup] [ecid: disabled,0] Listening using IPv4

(I'm copying everything in the block above so it gets found by search engines.  But I highlighted the failed task that clued us into the solution.)  These errors were thrown each time EPMA services restarted.  And the errors started around the time the taskflows started failing.  Earlier in the log we just see the FixInvalidDynamicPropertyReferences task start and finish.  This looked like the culprit.

Solution


If you bingoogle “DimensionServer FixInvalidDynamicPropertyReferences” you find a couple of articles that show the error above:


Both articles note the cause as a specific Oracle patch.  It is an older patch that was superseded in our environments.  Since there were no recent patches applied, we initially glossed over this.

But the fix is to run a SQL UPDATE statement to replace blank strings with NULL values:

UPDATE DS_Property_Dimension
   SET c_property_value = null
   FROM DS_Property_Dimension pd
      JOIN DS_Library lib
         ON lib.i_library_id = pd.i_library_id
      JOIN DS_Member prop
         ON prop.i_library_id = pd.i_library_id
            AND prop.i_dimension_id = pd.i_prop_def_dimension_id
            AND prop.i_member_id = pd.i_prop_def_member_id
      JOIN DS_Dimension d
         ON d.i_library_id = pd.i_library_id
            AND d.i_dimension_id = pd.i_dimension_id
  WHERE
      prop.c_member_name = 'DynamicProperties' AND
      pd.c_property_value IS NOT NULL AND pd.c_property_value = '';

Since we had exhausted all other avenues, we first replaced the UPDATE statement with a SELECT statement to see whether the condition actually applied to us:

SELECT c_property_value
   FROM DS_Property_Dimension pd
      JOIN DS_Library lib
         ON lib.i_library_id = pd.i_library_id
      JOIN DS_Member prop
         ON prop.i_library_id = pd.i_library_id
            AND prop.i_dimension_id = pd.i_prop_def_dimension_id
            AND prop.i_member_id = pd.i_prop_def_member_id
      JOIN DS_Dimension d
         ON d.i_library_id = pd.i_library_id
            AND d.i_dimension_id = pd.i_dimension_id
  WHERE
      prop.c_member_name = 'DynamicProperties' AND
      pd.c_property_value IS NOT NULL AND pd.c_property_value = '';

The query returned a handful of records where the c_property_value was blank instead of NULL.  We ran the UPDATE statement, and after restarting the services the libraries, taskflows, and other EPMA features were available again.

Conclusion


Prior to this we had issues with dimensions in one of the applications under development.  Part of fixing that required deleting and importing dimensions and adjusting attributes.  Our suspicion is that somewhere along the line blanks got imported instead of NULLs.

The first lesson is that EPMA issues can be tricky to resolve.  It seems odd that Oracle wouldn't have support staff focused on EPMA, since it binds all of the other EPM features together along with foundation services.

The other lesson is that if you have an issue and find something that looks like it might help, don't automatically discard it just because your situation is different.  Many times there is a way to check whether the solution might apply to you.

June 11, 2017

HFM Data Audit (part 2)

In the previous post I detailed the mechanics of enabling and exporting data audits in HFM.  But what can you do with the data once it's exported?

One option is to keep those .csv files for an arbitrary period of time.  If an issue arises you could search all those files for data of interest.  But you need some utility to search many files for the text of interest, and you could run into space issues that would require deleting old data.

I prefer to import the data into a SQL database.  The data can be kept as long as needed and is easier to search.

With SQL Server we can use the bcp utility to import the .csv files into a database table.  The table should match the fields we get in the .csv file.  The following is a sample CREATE TABLE statement that matches the .csv file format.

CREATE TABLE [dbo].[AuditRecords](
[UserName]   [nvarchar](32) NULL,
[Activity]   [nvarchar](16) NULL,
[Timestamp]  [datetime] NULL,
[ServerName] [nvarchar](8) NULL,
[Scenario]   [nvarchar](16) NULL,
[Year]       [smallint] NULL,
[Period]     [nvarchar](3)  NULL,
[Entity]     [nvarchar](32) NULL,
[Value]      [nvarchar](16) NULL,
[Account]    [nvarchar](40) NULL,
[ICP]        [nvarchar](24) NULL,
[Custom1]    [nvarchar](32) NULL,
[Custom2]    [nvarchar](24) NULL,
[Custom3]    [nvarchar](8)  NULL,
[Custom4]    [nvarchar](32) NULL,
[DataValue]  [nvarchar](24) NULL
)

Now that we have a table to hold our data we need to import it.  The bcp utility uses an import format file to specify how to parse the fields in the file being imported.  The specifications for the format file are kind of klunky, which you can read about here.  But this is the format file I use:

10.0
16
1   SQLCHAR   0   64      ";"      1     UserName           SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   32      ";"      2     Activity           SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   24      ";"      3     Timestamp          ""
4   SQLCHAR   0   16      ";"      4     ServerName         SQL_Latin1_General_CP1_CI_AS
5   SQLCHAR   0   32      ";"      5     Scenario           SQL_Latin1_General_CP1_CI_AS
6   SQLCHAR   0   7       ";"      6     Year               ""
7   SQLCHAR   0   6       ";"      7     Period             SQL_Latin1_General_CP1_CI_AS
8   SQLCHAR   0   64      ";"      8     Entity             SQL_Latin1_General_CP1_CI_AS
9   SQLCHAR   0   32      ";"      9     Value              SQL_Latin1_General_CP1_CI_AS
10  SQLCHAR   0   80      ";"      10    Account            SQL_Latin1_General_CP1_CI_AS
11  SQLCHAR   0   48      ";"      11    ICP                SQL_Latin1_General_CP1_CI_AS
12  SQLCHAR   0   64      ";"      12    Custom1            SQL_Latin1_General_CP1_CI_AS
13  SQLCHAR   0   48      ";"      13    Custom2            SQL_Latin1_General_CP1_CI_AS
14  SQLCHAR   0   16      ";"      14    Custom3            SQL_Latin1_General_CP1_CI_AS
15  SQLCHAR   0   64      ";"      15    Custom4            SQL_Latin1_General_CP1_CI_AS
16  SQLCHAR   0   48      "\r\n"   16    DataValue          SQL_Latin1_General_CP1_CI_AS

Now we have a place to store our audit data and a way to import it.  The last step is to automate the whole process.  I run a PowerShell script daily that uses the HFMAuditExtractCmdLine utility to export the audit data through yesterday to a .csv file and then imports it into the database.  I also retain the .csv files in case something goes haywire with the bcp import.  I organize these retention folders by month and year in a parent folder named D:\LogArchives and periodically purge old retention folders.

The files created by the extract utility have a maximum size of 17 MB, so a single export may produce more than one file.  The script needs the ability to import multiple files in a single batch.  The full script exports the data to a holding area, creates a retention folder if needed, then loops through the files in the holding area, imports each one into the database with bcp, and moves the file to the retention folder.

One issue I ran into is that the exported data is in Unicode format but bcp requires ASCII.  Also, bcp runs on the SQL server.  I resolved this by importing the .csv file using Get-Content and then exporting it using Out-File with the -Encoding ASCII switch.  Since I have to run bcp on the SQL server, the Out-File target is a share on the SQL server.  That share also has the bcp format file and a batch file to run the import.  This resolves the file format issue and keeps everything tidy on the SQL server.

The batch file on the SQL server is:

bcp DataAuditArchive.dbo.AuditRecords in T:\DataAudit\TempDataAudit.csv -U DataAudit_update -P Password123 -F 2 -f T:\DataAudit\DataAudit.fmt -e T:\DataAudit\DataAudit.err

Here my database is named DataAuditArchive and the table is AuditRecords.  The batch file, bcp format file, error file, and temporary copy of the .csv being imported are all in the T:\DataAudit folder.  This one-line batch file gets called from a PowerShell script that runs on the HFM server.
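
If you would rather not maintain a separate batch file, you could call bcp directly over PowerShell remoting.  The sketch below reuses the names from the batch file above and the script that follows, and assumes bcp.exe is on the PATH of the SQL server:

# Sketch: run bcp on the SQL server without an intermediate batch file
$BcpResults = Invoke-Command -ComputerName SqlSvr -ScriptBlock {
    & bcp.exe DataAuditArchive.dbo.AuditRecords in 'T:\DataAudit\TempDataAudit.csv' `
        -U DataAudit_update -P Password123 -F 2 `
        -f 'T:\DataAudit\DataAudit.fmt' -e 'T:\DataAudit\DataAudit.err'
}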

I also want to save all of the output in case something goes wrong.  I do this by piping all of the results to a log file that gets saved in the monthly archive folder.  This gives an audit trail of the script's file processing to allow troubleshooting.  If I find that one or more of the .csv files did not import, I can import those manually from the retention folder to ensure a complete audit record.

In the PowerShell script below the $Folder variable holds the name of the current month's retention folder, which will house all of the .csv files for the month as well as the log file.  The $Working folder houses the initial export from the extract utility.  The $BcpWorking folder is the share on the SQL server that has the batch file, the bcp import format file, and the ASCII version of the .csv file currently being processed.

# Powershell script to archive and truncate data audit logs ending yesterday.
#

$Utility = 'D:\Oracle\Middleware\EPMSystem11R1\products\FinancialManagement\Utilities\HFMAuditExtractCmdLine_x64.exe'
$AppName = 'COMMA' # Name of the HFM Application
$Parms = ' -u D:\HFM.udl -a '+$AppName+' '  # UDL file and application name
$Data = ' -r '                              # flag to extract and truncate the data audit logs
#$Data = ' -x '                             # flag to extract and truncate the data audit logs
$Logs = 'D:\LogArchives\'                   # Parent folder for the log files
$TempFolder = 'Daily' # Store the dump from the HFMAuditExtract utility here
$BcpWorking = '\\SqlSvr\DataAudit'          # Share on the SQL server used to stage files for bcp
$Start = ' -s 1/1/2013'                     # Arbitrary start date to make sure we get all previous data

$D = get-date

# New folder for each month that is named with the year and month
#
$Folder = $D.ToString('yyyy')+"_"+$D.ToString('MM')+$D.ToString('MMM')

# Capture everything through yesterday
#
$D = $D.AddDays(-1)
$End = ' -e '+$D.ToString('MM')+'/'+$D.ToString('dd')+'/'+$D.ToString('yyyy')

$Target = $Logs+$Folder                              # Destination for the logs
if (-not (test-path $Target)) { MkDir $Target }      # Create the destination folder if needed
$Working = $Logs+$TempFolder
$Output = ' -d '+$Working                            # Add the -d so this can be used as a parameter for the utility.

# Extract and truncate the task log
#   NOTE: the truncate task for data also does an extract
#
$CMD=$Utility+$Output+$Start+$End+$Parms+$Data       # Create the string to execute.  Path of utility and all parameters
echo $CMD                                            # Echo so we know what we did
invoke-expression $CMD                               # Execute the AuditExtract command line

# Process the file(s) that were created in the TempFolder
#
$TempFile = $BCPWorking + "\TempDataAudit_$AppName.csv"
$ScriptLog = $Target + '\BcpImport.log'
$BatchName = "T:\DataAudit\Import$AppName.cmd"

$AuditLogs = get-childitem $Working -filter "Data*$AppName*.csv"
foreach ($Log in $AuditLogs) {
    echo $Log.FullName

# Export the file with ASCII encoding because bcp has trouble with format files and Unicode data files
#
    get-content $Log.FullName | out-file $TempFile -encoding ASCII

#   Run bcp to import the data
#
  $BcpResults = invoke-command -computername SqlSvr -scriptblock { Param($Bat) & cmd.exe /C $Bat } -ArgumentList $BatchName

# Move the file to the dated archive folder
#
  $MoveResults = Move-Item $Log.FullName $Target -passthru

# Delete the ASCII file we just imported
#
  if (test-path $TempFile) { del $TempFile }

# And save the results to the script log
#
  (' * * * ') | out-file $ScriptLog -append
  $Log.FullName | out-file $ScriptLog -append
  (' ') | out-file $ScriptLog -append
  $BcpResults | out-file $ScriptLog -append
  (' ') | out-file $ScriptLog -append
  $MoveResults | out-file $ScriptLog -append
}

This script is scheduled to run daily, which keeps the DataAudit table in the HFM database lean and efficient.  The archive table becomes our source of truth unless the event happened today, in which case we use the Data Audit tab in Workspace.
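
For reference, registering the nightly run with the Task Scheduler cmdlets looks something like this sketch; the script path, run time, and service account are placeholders you would replace with your own:

# Sketch: register the archive script as a daily scheduled task
$Action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
           -Argument '-NoProfile -ExecutionPolicy Bypass -File D:\Scripts\ArchiveDataAudit.ps1'
$Trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'HFM Data Audit Archive' -Action $Action -Trigger $Trigger `
    -User 'DOMAIN\svc_epm' -Password 'Password123' -RunLevel Highest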

May 8, 2017

HFM Data Audit (part 1)

Don't you hate it when you're watching a TV show and it builds to a crescendo only to leave you with a "To be continued..." screen at the end?  You invest all that time just to get teased into watching next week.  I'm letting you know up front that I'm not covering everything I intend to show about HFM data auditing.  This post will deal with the mechanics of configuring and exporting the data audits.  The next post will address what we can do with the exports to make them useful for reference.


We used to get reports from users in the field that somebody moved their cheese.  Fingers were pointed, accusations made, email storms erupted, tsunamis demolished fragile shorelines, meteors collided in the stratosphere, all cascading into a cacophony of dissonance that gradually decrescendoed until the next crisis, wherein the entire process repeated.  The decision was made to enable data auditing in HFM to prevent such catastrophes.

The good thing about data auditing in HFM is that it records every transaction including data entry, data loads, data clears and data copies.  The bad thing about data auditing in HFM is that it records every transaction, so unless you are surgical about what gets audited you can generate an enormous log.


Data auditing is configured at the intersection of Scenario and Account.  For Scenarios the field Enable Data Audit has to be set to Y (yes) or O (override) instead of the default of N.  Y indicates that everything in that scenario gets audited; O indicates that auditing is optional and will defer to the configuration of the Account dimension.  Unless you have a small Account dimension you should configure the audited scenarios with O.  Only audit scenarios where users make changes.


For the Accounts dimension, change the Enable Data Audit field to Y for the accounts to be audited in Scenarios configured with O.  Again, only audit accounts where users make changes.  There is no value in auditing accounts that are calculated.  FDQM has its own audit trail, so if you use it to load data you should try to omit accounts that FDQM loads and just audit accounts where users manually submit or clear data.  In our case we knew there were specific areas that generated problems, so we focused on accounts in those areas.


The audit data can be viewed in Workspace by selecting Navigate > Administer > Consolidation Administration > Audit > Data Audit.  The resulting data can be filtered by any of the POV fields.  The data grid will show who did it, what they did, when they did it, where they did it, the POV of the intersection touched, and the value assigned.

The data audit details are in the table <AppName>_DataAudit.  The HFM administrator's guide advises that the number of records in the table be kept under 500,000 to avoid degrading performance.  For a large company, half a million records can be added in an afternoon of data loads during month-end close.

The audit data can be exported and truncated from the Consolidation Administration tab.  But a better solution is to find an automated way of exporting and truncating the table.  In the EPMSystem11R1\products\FinancialManagement\Utilities folder is the command line utility HFMAuditExtractCmdLine.exe.  This utility has switches to truncate and/or extract the data and task audit logs for a consolidation application.  Data can be extracted or truncated within a specified date range and extracted to any available location.  Note that the utility requires a Universal Data Link (.udl) file which must be configured with the connection string to the HFM database along with credentials that have at least db_datawriter access to the database.


The command line to extract the data is pretty straightforward.  Assign the destination folder and the full path to the .udl file, then specify the application, delimiter, start and end dates, and the switches for which operations to perform.  One quirk I found, at least with the 64-bit version of the utility for 11.1.2.2.500, is that the -k and -r switches both export and truncate the data instead of just truncating.

HFMAuditExtractCmdLine_x64 -d c:\Extracts -u C:\MyHFM.udl -a Comma -s 2017/01/01 -e 2017/02/28 -r

This example command will extract the data to the C:\Extracts folder, use the C:\MyHFM.udl file to connect to the HFM database and the Comma application, then extract and truncate all of the data audit records recorded between January and February of 2017.  Note that in the example I'm using the default delimiter, which is the semicolon.  If you specify a delimiter it must be a printable character, since there is a known issue with the command line utility which can't process [Tab] as a delimiter.

The command line utility is available up to version 11.1.2.3, but it is not bundled with version 11.1.2.4 and above.  However, the format of the table has not changed between versions, so the utility still works.  Oracle patch 9976978 can be downloaded to get an older version of the utility.  You can also get the latest PSU for a previous version, which will have a copy of the utility in the files\products\FinancialManagement\Utilities folder.

The utility can be configured in a batch file or run directly from the Task Scheduler.  But we still need to adjust the end date.  In the next post I will detail sample code that exports the data to a .csv file, adjusts for the date, and imports that .csv file into a database.  I use this as a nightly scheduled process to keep the HFM database lean and preserve the audit data for forensic examination.

February 14, 2017

More Trouble with Temp Files

The title of this post kind of gives away the plot before I even start the story.  But it does give me a chance to use this picture in my blog.


We update our HFM application metadata monthly.  We first deploy the changes in our development environment and test any required changes to member lists, rules, etc.  After the metadata is loaded into EPMA the HFM application has to be redeployed.  In this case the redeploy failed.  The message displayed was almost as useless as a tribble:

The custom error module does not recognize this error.
This is like something you would find on the DailyWTF, an error that tells you nothing more than it's an error.

In the Consolidation Administration tab was an error message that was a little more descriptive, but equally unenlightening.

Could not determine wsdl ports.
Searching the Oracle knowledge base and bingoogling turned up nothing useful.  In cases dealing with Workspace we sometimes resolve unusual issues by clearing the browser cache and reopening the browser, but that didn't help.

The deployment failed at 6%, which normally indicates a problem with EPMA.  Everything on the EPMA server looked normal and there were no obvious errors in the logs.  We tried restarting just the EPMA services, which did not help.  We then restarted all the EPM services to close all connections, flush the Java caches, and try to clear up whatever was causing the error, but the redeploy still failed at the same step.

Then I checked the HFM server, which in our case is a different host than EPMA.  The C: drive was almost full.  Because this is a dev environment I don't have any alerts configured, so I didn't have advance warning of the problem.

Analysis

A handy tool for finding disk hogs is the Sysinternals du utility.  It runs from the command line, but the syntax is easy and you can use keyboard shortcuts to quickly drill down to find the problem folders.

While du can scan a whole drive and find the hogging subfolders, this can take a bit of time.  My strategy is to check just one level using the -l 1 parameter, then drill down from there one level at a time.  This is usually a quicker way of finding the offending folder.

Here I look in the C:\Users folder because I suspect the problem is one of the profiles.  Clearly the problem is the profile taking over 5 GB with an account that starts with r and ends with v.

Using DOS shortcuts, I hit the up arrow key to repeat the last command, add the backslash, type an [r] to start the username, use the [Tab] key to auto-fill the rest of the name, and press [Enter] to see the results for the next level down.  Repeat this technique until you find the problem folder.
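
The drill-down ends up looking something like this, with a placeholder standing in for the profile name:

du -l 1 C:\Users
du -l 1 C:\Users\<suspect profile>
du -l 1 C:\Users\<suspect profile>\AppData\Local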

In this case it was the user profile of the service account that runs the HFM and other EPM services.  In the AppData\Local\Temp folder there were a bunch of temp files, many of them tens of megabytes in size.  After deleting all the temp files from previous years we freed up 5 GB of drive space and the deployment succeeded.


Even though this processing is handled by EPMA, there is still data being written to the HFM server.  Presumably this is so HFM has something to process once its turn in the deployment comes.

Conclusion

This is another episode where temp files don't get cleaned up after execution.  Also note that the .tmp files use the convention of four hex digits as part of the name.  Had we not run out of drive space, it is conceivable we would have run into a name collision like we did with the Permanent Temporary Files.

This has become another task during the monthly maintenance, where logs and other items get purged or truncated to keep the logs manageable and the drive space clear.

What is curious in all of this is why these temp files aren't cleaned up as a matter of course.  While it makes sense to keep temp files around for troubleshooting if a process fails, surely whatever process generates these things could have a final step of cleaning up its droppings after it receives a success notice.  Even children know enough to clean up after their dog.