Unable to transfer DB successfully to AWS RDS Instance due to Oracle/AWS bug

Document ID : KB000117119
Last Modified Date : 08/10/2018
Question:
All attempts to transfer the Harvest DB to AWS RDS are failing. The database import with multiple dump files fails with: ORA-02367: read ended but had expected more data.

We are looking for assistance in migrating our Oracle DB instance to RDS, specifically a WHERE clause to split the harversiondata table in half. All prior trial transfers were successful, but the actual production migration had to be rolled back due to this issue:

Database import with multiple dump files failed with ORA-02367: read ended but had expected more data
Environment:
CA Harvest SCM v12.5 and up
Answer:
Here is the structure of the table:
 
SQL> desc harversiondata
Name                                      Null?    Type
----------------------------------------- -------- ----------------------------
VERSIONDATAOBJID                          NOT NULL NUMBER
REFCOUNTER                                NOT NULL NUMBER
DATASIZE                                  NOT NULL NUMBER
COMPRESSED                                NOT NULL CHAR(1)
COMPDATASIZE                              NOT NULL NUMBER
FILEACCESS                                         CHAR(9)
MODIFYTIME                                         DATE
CREATETIME                                         DATE
DCB                                                VARCHAR2(256)
TEXTFILE                                  NOT NULL NUMBER
ITEMOBJID                                 NOT NULL NUMBER
ISDATAINCHUNKS                            NOT NULL CHAR(1)
VERSIONDATA                                        BLOB
 
SQL>
 
The “VERSIONDATAOBJID” field is the unique identifier for this table, so the easiest way to select just a portion of it is to add a WHERE clause similar to this example to your SQL statement:
 
WHERE VERSIONDATAOBJID BETWEEN 1 AND 1000
 
You would select the range of values that best meets your needs.  The size of the “VERSIONDATA” BLOB field will differ from record to record, since each record stores the data for a different file (or version of a file) in your database.  You can use the DATASIZE field to determine the size of the BLOB for each record.
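If the goal is to split the table into two roughly equal halves, a query along these lines can suggest a split point.  This is only a sketch based on the table structure shown above: the MEDIAN query splits by row count, while the cumulative-DATASIZE variant splits by stored data volume, which is usually what matters for dump-file size.

```sql
-- Split point by row count: about half the rows fall at or below this value
SELECT MEDIAN(versiondataobjid) AS split_by_rows
  FROM harversiondata;

-- Split point by data volume: the largest VERSIONDATAOBJID whose running
-- total of DATASIZE stays within half of the table's overall total
SELECT MAX(versiondataobjid) AS split_by_size
  FROM (SELECT versiondataobjid,
               SUM(datasize) OVER (ORDER BY versiondataobjid) AS running_size,
               SUM(datasize) OVER ()                          AS total_size
          FROM harversiondata)
 WHERE running_size <= total_size / 2;
```

With a split point in hand, each half can be exported separately.  If you are using Oracle Data Pump, for example, the WHERE clause is supplied through the QUERY parameter, along the lines of QUERY=HARVERSIONDATA:"WHERE VERSIONDATAOBJID BETWEEN 1 AND 1000", with the range adjusted to your chosen split point.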
 
 
Additional Information:
You might also consider dealing with any projects that are obsolete, or that can be archived, to reduce the total size of the production database before moving it.  This could make the migration faster and less error-prone.
 
Harvest has a utility called “HMVPROJ” (for Move Project) that lets you move or copy a Harvest project from one Harvest database to another, or delete a project and all its contents.  It might be worth looking at this utility as you prepare to move your database.  You can find more information here: https://docops.ca.com/ca-harvest-scm/13-0/en/administrating/copy-or-move-a-project-from-one-database-to-other