Tuesday 8 June 2021

OCI : ORA-20003: Reject limit reached, query table "ADMIN"."COPY$8_LOG" for error

 

We were importing data from OCI Object Storage into an Autonomous Database schema and hit the following error while loading a CSV file with DBMS_CLOUD.COPY_DATA.

Error : ORA-20003: Reject limit reached, query table "ADMIN"."COPY$8_LOG" for error

Solution : Remove the header line (line 1) from the CSV file and reload the data.

C:\Users\varun>sqlplus admin/Cloud$123456@orcladb_medium

SQL*Plus: Release 19.0.0.0.0 - Production on Mon Jun 7 17:53:16 2021
Version 19.3.0.0.0

Copyright (c) 1982, 2019, Oracle.  All rights reserved.

Last Successful login time: Mon Jun 07 2021 17:37:32 +05:30

Connected to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Version 19.5.0.0.0

SQL> BEGIN
  2  DBMS_CLOUD.COPY_DATA(
  3   TABLE_NAME =>'SALES',
  4  SCHEMA_NAME =>'SOE',
  5  CREDENTIAL_NAME =>'ADMIN_CRED_NAME',
  6  FILE_URI_LIST =>'https://objectstorage.ap-mumbai-1.oraclecloud.com/n/bmdsoh48o0kf/b/bucket-20210415-1806/o/sales_csvsales.csv',
  7  FORMAT => JSON_OBJECT('delimiter' value ',', 'quote' value '"', 'ignoremissingcolumns'
  8  value 'true', 'dateformat' value 'DD-MM-YYYY HH24:MI:SS', 'blankasnull' value 'true')
  9  );
 10  END;
 11  /
BEGIN
*
ERROR at line 1:
ORA-20003: Reject limit reached, query table "ADMIN"."COPY$8_LOG" for error
details
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 1229
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 3584
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 3607
ORA-06512: at line 2

SQL>
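The table named in the ORA-20003 message holds the per-row error details for this particular load. A minimal sketch of how to inspect it (COPY$8_LOG is the name generated for this run; each load gets its own sequence number, and the rejected input records normally land in a matching COPY$8_BAD table):

-- Why were rows rejected?
SELECT * FROM ADMIN.COPY$8_LOG;

-- The raw rejected records (assumed companion bad-record table for the same run)
SELECT * FROM ADMIN.COPY$8_BAD;

Given the fix that follows, the rejected row here was evidently the CSV header line being read as data.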



  • Remove the header line (line 1) from the top of the CSV file in Object Storage (a skipheaders-based alternative is sketched after the reload example below).



  • Run the load again as the ADMIN user:

BEGIN
  DBMS_CLOUD.COPY_DATA(
    TABLE_NAME      => 'SALES',
    SCHEMA_NAME     => 'SOE',
    CREDENTIAL_NAME => 'ADMIN_CRED_NAME',
    FILE_URI_LIST   => 'https://objectstorage.ap-mumbai-1.oraclecloud.com/n/bmdsoh48o0kf/b/soe_expdp_bucket/o/soe_salessales.csv',
    FORMAT          => JSON_OBJECT('delimiter' value ',', 'quote' value '"',
                                   'ignoremissingcolumns' value 'true',
                                   'dateformat' value 'DD-MM-YYYY HH24:MI:SS',
                                   'blankasnull' value 'true')
  );
END;
/
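Alternatively, the header does not have to be stripped from the file at all: the DBMS_CLOUD format options include skipheaders, which tells the loader to discard the first N lines. A sketch of the same call with that option added (assuming the CSV still carries its header row):

BEGIN
  DBMS_CLOUD.COPY_DATA(
    TABLE_NAME      => 'SALES',
    SCHEMA_NAME     => 'SOE',
    CREDENTIAL_NAME => 'ADMIN_CRED_NAME',
    FILE_URI_LIST   => 'https://objectstorage.ap-mumbai-1.oraclecloud.com/n/bmdsoh48o0kf/b/soe_expdp_bucket/o/soe_salessales.csv',
    FORMAT          => JSON_OBJECT('delimiter' value ',', 'quote' value '"',
                                   'skipheaders' value '1',   -- skip the first (header) line
                                   'ignoremissingcolumns' value 'true',
                                   'dateformat' value 'DD-MM-YYYY HH24:MI:SS',
                                   'blankasnull' value 'true')
  );
END;
/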



  • Query USER_LOAD_OPERATIONS to review the load history; its LOGFILE_TABLE and BADFILE_TABLE columns contain the names of the external tables for accessing the log file and bad file of each load operation.

SELECT * FROM USER_LOAD_OPERATIONS WHERE TYPE = 'COPY' ORDER BY ID;
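A slightly narrowed variant of that query, pulling just the columns needed to locate the log and bad tables of the most recent loads (column names as referenced above; the row limit is purely illustrative):

SELECT id, logfile_table, badfile_table
  FROM user_load_operations
 WHERE type = 'COPY'
 ORDER BY id DESC
 FETCH FIRST 5 ROWS ONLY;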



  • Delete the load operation logs once the errors have been reviewed:

exec DBMS_CLOUD.DELETE_ALL_OPERATIONS(type => 'COPY');
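After the cleanup, the history view should no longer list any COPY operations. A quick verification query (illustrative only):

SELECT COUNT(*) FROM user_load_operations WHERE type = 'COPY';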
