Load Data Warehouse job fails with error ORA-06502: PL/SQL: numeric or value error: Bulk bind: Error in define on DWH_INV_ASSIGN_SUM_FACTS_LOAD or DWH_INV_TASK_SUM_FACTS_LOAD

Document ID : KB000101868
Last Modified Date : 15/06/2018
Issue:
On running the Load Data Warehouse job (Full or Incremental), the following error is thrown every time:
ClarityDB - isOracle? - An error occurred executing this job entry :
Couldn't execute SQL: BEGIN
 DWH_INV_ASSIGN_SUM_FACTS_LOAD
  (
  P_DBLINK => 'CLARITYDBLINK',
  P_LAST_LOAD_DATE => to_date('1910/01/01 00:00:00', 'yyyy/MM/dd HH24:mi:ss'),
  P_CURRENT_DIM_LOAD_DATE => to_date('2018/06/04 13:31:43', 'yyyy/mm/dd HH24:MI:SS'),
  P_ARRAY_SIZE => 50000
  );
END;
 
[CA Clarity][Oracle JDBC Driver][Oracle]ORA-20100: ENCOUNTERED EXCEPTION WHILE INSERTING INTO DWH_INV_ASSIGN_SUMMARY_FACTS. SQLERRM : ORA-06502: PL/SQL: numeric or value error: Bulk bind: Error in define
ORA-06512: at "CLARITY154DWH.DWH_INV_ASSIGN_SUM_FACTS_LOAD", line 66
ORA-06512: at line 2
Cause:
A user entered an invalid value (over 1,000,000,000,000,000) for ETC, Pending ETC, Pending Actuals, or another numeric field on the assignment. Entering impossible values like this is not how the product is meant to be used; for example, an ETC of billions of hours could never be worked in a single lifetime, yet the application accepts the entry and the Data Warehouse load then fails.
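For context, ORA-06502 is Oracle's generic numeric/value conversion error. The standalone anonymous block below is a minimal sketch (the NUMBER(15) precision is an assumption chosen purely for illustration, not the actual fact table definition) showing how a value with more digits than the declared precision raises it; inside the DWH load procedure the same overflow happens during a bulk fetch, which is why the message carries the "Bulk bind: Error in define" wording.
 DECLARE
   l_etc NUMBER(15);  -- illustrative precision; not the real DWH column definition
 BEGIN
   -- 1,000,000,000,000,000 has 16 digits and cannot fit into NUMBER(15),
   -- so this assignment raises ORA-06502: PL/SQL: numeric or value error
   l_etc := 1000000000000000;
 END;
 /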
 
Resolution:
DE41741: Entering a very high value on an assignment in PPM is allowed, which then fails the Load Data Warehouse job with ORA-06502: PL/SQL: numeric or value error: Bulk bind: Error in define.
Fixed in CA PPM 15.5.
The fix ensures that any invalid entry that goes over the limit is set to 0 in the Data Warehouse, so the job no longer fails when a user enters an invalid value.
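As an illustration only, and not the actual 15.5 code, the behaviour described above can be expressed as a CASE guard that loads any out-of-range value as 0 while passing normal values through; the sample value and alias names below are made up for the sketch:
 -- Hypothetical sketch of the clamp described above (not the shipped fix):
 -- values at or beyond the limit load into the DWH as 0, everything else passes through.
 SELECT CASE
          WHEN ABS(source_etc) >= 1000000000000000 THEN 0
          ELSE source_etc
        END AS etc_for_dwh
   FROM (SELECT 2500000000000000 AS source_etc FROM dual);  -- sample out-of-range value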
 
 
Workaround:
  1. Query the PRASSIGNMENT table and look for any values that exceed the limit of 1,000,000,000,000,000 (a sample query is shown after this list).
  2. Once you find them, correct the value for each affected assignment from the CA PPM UI.
  3. Run the Load Data Warehouse job - Full.
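The query below is a minimal sketch of step 1. The key columns (PRID, PRTASKID, PRRESOURCEID) and the value columns standing in for ETC, Pending ETC and Pending Actuals (PRESTSUM, PRPENDESTSUM, PENDACTSUM) are assumptions and may differ in your release, so adjust them to the columns that hold the suspect values before running it:
 -- Column names below are assumptions; adjust to the columns that actually
 -- hold ETC / Pending ETC / Pending Actuals in your PRASSIGNMENT table.
 SELECT a.PRID,
        a.PRTASKID,
        a.PRRESOURCEID,
        a.PRESTSUM,       -- ETC (assumed column name)
        a.PRPENDESTSUM,   -- Pending ETC (assumed column name)
        a.PENDACTSUM      -- Pending Actuals (assumed column name)
   FROM PRASSIGNMENT a
  WHERE ABS(a.PRESTSUM)     >= 1000000000000000
     OR ABS(a.PRPENDESTSUM) >= 1000000000000000
     OR ABS(a.PENDACTSUM)   >= 1000000000000000;
Use the returned PRTASKID and PRRESOURCEID to locate the task and resource in the CA PPM UI, then correct the value as described in step 2.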