Comments on Database Administration Tips: How To Import A Schema on Amazon RDS
Feed: tag:blogger.com,1999:blog-3890970057024318255 (last updated 2024-03-15), Mahmmoud ADEL, http://www.blogger.com/profile/15299387537990081025

[2023-04-14] Anonymous:
Would you mind sharing the complete command you are using along with the error message?

[2023-04-13] Anonymous:
Adding new files as you mentioned would not work.
[2022-09-11] Mahmmoud ADEL:
Look, I'm having the same issue as you. AWS support wasn't able to help; they just suggested running impdp from another EC2 instance using the NETWORK_LINK parameter to pump the data into the remote RDS instance through a database link. Oracle support said that if you use DBMS_DATAPUMP.METADATA_FILTER then you cannot use the schema remap "DBMS_DATAPUMP.METADATA_REMAP" along with it.

It's obvious that this package starts to go mad when the METADATA_FILTER parameter gets called. My SR with Oracle support has been open for weeks without a solution, and I guess you went through the same process as me ;-)

[2022-09-10 17:42] Anonymous:
While importing data into AWS RDS we are getting the below error. Please help us.

Error starting at line : 1 in command -
DECLARE
  v_hdnl NUMBER;
BEGIN
  v_hdnl := DBMS_DATAPUMP.OPEN(
    operation => 'IMPORT',
    job_mode  => 'TABLE',
    version   => 'COMPATIBLE',
    job_name  => null);
  DBMS_DATAPUMP.ADD_FILE(
    handle    => v_hdnl,
    filename  => 'OBIEEDP_MIG_TABLES_090922.dmp',
    directory => 'DATA_PUMP_DIR',
    filetype  => dbms_datapump.ku$_file_type_dump_file);
  DBMS_DATAPUMP.SET_PARALLEL(
    handle => v_hdnl,
    degree => 3);
  DBMS_DATAPUMP.ADD_FILE(
    handle    => v_hdnl,
    filename  => 'import_OBIEEDP_MIG_TABLES_090922.log',
    directory => 'DATA_PUMP_DIR',
    filetype  => dbms_datapump.ku$_file_type_log_file);
  DBMS_DATAPUMP.METADATA_FILTER(v_hdnl,'NAME_LIST','''PARTY_BASIC_INFO''');
  DBMS_DATAPUMP.METADATA_REMAP(v_hdnl,'REMAP_SCHEMA','NEW_CORE_DM','DATAMIG');
  DBMS_DATAPUMP.METADATA_REMAP(v_hdnl,'REMAP_TABLESPACE','NEW_CORE_DM_TBS','TS_DATAMIG_01');
  DBMS_DATAPUMP.SET_PARAMETER(v_hdnl,'TABLE_EXISTS_ACTION','REPLACE');
  DBMS_DATAPUMP.START_JOB(v_hdnl);
END;
Error report -
ORA-31627: API call succeeded but more information is available
ORA-06512: at "SYS.DBMS_DATAPUMP", line 7333
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4929
ORA-06512: at "SYS.DBMS_DATAPUMP", line 7327
ORA-06512: at line 26
31627. 00000 - "API call succeeded but more information is available"
*Cause:  The user specified job parameters that yielded informational messages.
*Action: Call DBMS_DATAPUMP.GET_STATUS to retrieve additional information.

[2022-08-17] parttimer:
What would I need to do if I have to remap multiple tablespaces into one tablespace?
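For the multiple-tablespace question above, DBMS_DATAPUMP.METADATA_REMAP can simply be called once per source tablespace, with the same target every time. A minimal sketch, assuming the dump file sits in DATA_PUMP_DIR; the tablespace and file names are hypothetical:

```sql
-- Sketch: remap several source tablespaces into one target tablespace.
-- 'TBS_DATA_01', 'TBS_INDEX_01', 'TBS_TARGET' and 'EXPORT.dmp' are hypothetical names.
DECLARE
  hdnl NUMBER;
BEGIN
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);
  DBMS_DATAPUMP.ADD_FILE(hdnl, 'EXPORT.dmp', 'DATA_PUMP_DIR');
  -- One METADATA_REMAP call per source tablespace, all pointing at the same target:
  DBMS_DATAPUMP.METADATA_REMAP(hdnl, 'REMAP_TABLESPACE', 'TBS_DATA_01',  'TBS_TARGET');
  DBMS_DATAPUMP.METADATA_REMAP(hdnl, 'REMAP_TABLESPACE', 'TBS_INDEX_01', 'TBS_TARGET');
  DBMS_DATAPUMP.START_JOB(hdnl);
END;
/
```

This mirrors the command-line behavior where REMAP_TABLESPACE may be specified multiple times in one impdp run.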
[2020-11-10 18:51] Mahmmoud ADEL:
Simply add more lines for dbms_datapump.add_file; each line represents one dump file, like this:

....
  dbms_datapump.add_file(hdnl,'EXPORT_01.dmp','DATA_PUMP_DIR');
  dbms_datapump.add_file(hdnl,'EXPORT_02.dmp','DATA_PUMP_DIR');
....

[2020-11-10 17:59] Anonymous:
I need a hand: how to do this with more than one dump file?

[2020-10-25] phani:
DBMS_DATAPUMP.METADATA_TRANSFORM(handle => hdnl, name => 'OID', 0, null);
*
ERROR at line 7:
ORA-06550: line 7, column 1:
PLS-00306: wrong number or types of arguments in call to 'METADATA_TRANSFORM'
ORA-06550: line 7, column 1:
PL/SQL: Statement ignored

[2020-08-05] vishal:
Hello, I had a quick question. If I run this from SQL Developer it runs OK:

SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3( p_bucket_name => '', p_prefix => '', prefix => '', p_directory_name => 'DATA_PUMP_DIR') AS TASK_ID FROM DUAL;

But if I put that statement as part of a stored procedure I get the below error:

"rdsadmin"."rdsadmin_s3_tasks"."upload_to_s3": invalid identifier. Would you know why?

[2020-02-24] Mahmmoud ADEL:
Hi Saud,

The error is not clear enough for me, but can you remove this piece "handle =>" from all the lines and try again?
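The multi-file advice above can be put into one complete block. A sketch assuming the dump set consists of two files already present in DATA_PUMP_DIR; all file names are hypothetical:

```sql
-- Sketch: import from a dump set made of more than one file.
-- 'EXPORT_01.dmp', 'EXPORT_02.dmp' and 'import_multi.log' are hypothetical names.
DECLARE
  hdnl NUMBER;
BEGIN
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);
  -- One ADD_FILE call per dump file of the set:
  DBMS_DATAPUMP.ADD_FILE(hdnl, 'EXPORT_01.dmp', 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.ADD_FILE(hdnl, 'EXPORT_02.dmp', 'DATA_PUMP_DIR');
  -- Plus one log file to read after completion:
  DBMS_DATAPUMP.ADD_FILE(hdnl, 'import_multi.log', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.START_JOB(hdnl);
  DBMS_DATAPUMP.DETACH(hdnl);
END;
/
```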
[2020-02-22] Saud Rahman:
Getting the below error. Any idea?

Error starting at line : 1 in command -
DECLARE
  ind NUMBER;              -- Loop index
  h1 NUMBER;               -- Data Pump job handle
  percent_done NUMBER;     -- Percentage of job complete
  job_state VARCHAR2(30);  -- To keep track of job state
  le ku$_LogEntry;         -- For WIP and error messages
  js ku$_JobStatus;        -- The job status from get_status
  jd ku$_JobDesc;          -- The job description from get_status
  sts ku$_Status;          -- The status object returned by get_status
BEGIN
  h1 := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name=>null);
-- Specify the single dump file and its directory
  DBMS_DATAPUMP.ADD_FILE(handle => h1, directory => 'DATA_PUMP_DIR', filename => 'RMS_TEST.dmp');
-- Specify the logfile for the import process: [Very important to read it later after the completion of the import]
  DBMS_DATAPUMP.ADD_FILE(handle => h1, directory => 'DATA_PUMP_DIR', filename => 'RMS_TEST.LOG', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
-- Disable Archivelog for the import: [12c new feature] DBMS_DATAPUMP.metadata_transform ( handle => h1, name => 'DISABLE_ARCHIVE_LOGGING', value => 1);
-- REMAP SCHEMA:
-- DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_SCHEMA','EPS_OWNER','EPS');
-- If a table already exists: [SKIP, REPLACE, TRUNCATE]
  DBMS_DATAPUMP.SET_PARAMETER(h1,'TABLE_EXISTS_ACTION','REPLACE');
-- REMAP TABLESPACE: DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_TABLESPACE','EPS','TBS_EPS');
-- Start the job. An exception is returned if something is not set up properly.
  DBMS_DATAPUMP.START_JOB(h1);
-- The following loop will monitor the job until it gets complete; meantime the progress information will be displayed:
  percent_done := 0;
  job_state := 'UNDEFINED';
  while (job_state != 'COMPLETED') and (job_state != 'STOPPED') loop
    dbms_datapump.get_status(h1,
      dbms_datapump.ku$_status_job_error +
      dbms_datapump.ku$_status_job_status +
      dbms_datapump.ku$_status_wip,-1,job_state,sts);
    js := sts.job_status;
-- If the percentage done changed, display the new value.
    if js.percent_done != percent_done
    then
      dbms_output.put_line('*** Job percent done = ' || to_char(js.percent_done));
      percent_done := js.percent_done;
    end if;
-- If any work-in-progress (WIP) or error messages were received for the job, display them.
    if (bitand(sts.mask,dbms_datapump.ku$_status_wip) != 0)
    then
      le := sts.wip;
    else
      if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
      then
        le := sts.error;
      else
        le := null;
      end if;
    end if;
    if le is not null
    then
      ind := le.FIRST;
      while ind is not null loop
        dbms_output.put_line(le(ind).LogText);
        ind := le.NEXT(ind);
      end loop;
    end if;
  end loop;
-- Indicate that the job finished and gracefully detach from it.
  dbms_output.put_line('Job has completed');
  dbms_output.put_line('Final job state = ' || job_state);
  dbms_datapump.detach(h1);
END;
Error report -
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3507
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3756
ORA-06512: at line 15
39001. 00000 - "invalid argument value"
*Cause:  The user specified API parameters were of the wrong type or value range. Subsequent messages supplied by DBMS_DATAPUMP.GET_STATUS will further describe the error.
*Action: Correct the bad argument and retry the API.

[2020-02-17 12:42] Mahmmoud ADEL:
I'm familiar with the same error as I faced it many times. Just remove "handle =>" from all the lines of the import command so it will look like this:

DECLARE
  ind NUMBER;              -- Loop index
  h1 NUMBER;               -- Data Pump job handle
  percent_done NUMBER;     -- Percentage of job complete
  job_state VARCHAR2(30);  -- To keep track of job state
  le ku$_LogEntry;         -- For WIP and error messages
  js ku$_JobStatus;        -- The job status from get_status
  jd ku$_JobDesc;          -- The job description from get_status
  sts ku$_Status;          -- The status object returned by get_status
BEGIN
  h1 := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name=>null);
-- Specify the single dump file and its directory
  DBMS_DATAPUMP.ADD_FILE(h1, directory => 'DATA_PUMP_DIR', filename => 'EXPORT_eps_owner_STG_04-03-19.dmp');
-- Specify the logfile for the import process: [Very important to read it later after the completion of the import]
  DBMS_DATAPUMP.ADD_FILE(h1, directory => 'DATA_PUMP_DIR', filename => 'import_eps_owner_STG_04-03-19.LOG', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
-- Disable Archivelog for the import: [12c new feature]
  DBMS_DATAPUMP.metadata_transform (h1, name => 'DISABLE_ARCHIVE_LOGGING', value => 1);
-- REMAP SCHEMA:
-- DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_SCHEMA','EPS_OWNER','EPS');
-- If a table already exists: [SKIP, REPLACE, TRUNCATE]
  DBMS_DATAPUMP.SET_PARAMETER(h1,'TABLE_EXISTS_ACTION','SKIP');
-- REMAP TABLESPACE:
  DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_TABLESPACE','EPS','TBS_EPS');
-- Start the job. An exception is returned if something is not set up properly.
  DBMS_DATAPUMP.START_JOB(h1);
-- The following loop will monitor the job until it gets complete; meantime the progress information will be displayed:
  percent_done := 0;
  job_state := 'UNDEFINED';
  while (job_state != 'COMPLETED') and (job_state != 'STOPPED') loop
    dbms_datapump.get_status(h1,
      dbms_datapump.ku$_status_job_error +
      dbms_datapump.ku$_status_job_status +
      dbms_datapump.ku$_status_wip,-1,job_state,sts);
    js := sts.job_status;
-- If the percentage done changed, display the new value.
    if js.percent_done != percent_done
    then
      dbms_output.put_line('*** Job percent done = ' || to_char(js.percent_done));
      percent_done := js.percent_done;
    end if;
-- If any work-in-progress (WIP) or error messages were received for the job, display them.
    if (bitand(sts.mask,dbms_datapump.ku$_status_wip) != 0)
    then
      le := sts.wip;
    else
      if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
      then
        le := sts.error;
      else
        le := null;
      end if;
    end if;
    if le is not null
    then
      ind := le.FIRST;
      while ind is not null loop
        dbms_output.put_line(le(ind).LogText);
        ind := le.NEXT(ind);
      end loop;
    end if;
  end loop;
-- Indicate that the job finished and gracefully detach from it.
  dbms_output.put_line('Job has completed');
  dbms_output.put_line('Final job state = ' || job_state);
  dbms_datapump.detach(h1);
END;
/

[2020-02-17 10:47] Anonymous:
For one table it worked and exported successfully. For another table, while I'm using exactly the same import block, I keep getting this error:

ORA-06550: line 22, column 46:
PLS-00312: a positional parameter association may not follow a named association
ORA-06550: line 22, column 3:
PL/SQL: Statement ignored
ORA-06550: line 25, column 3:
PLS-00306: wrong number or types of arguments in call to 'SET_PARAMETER'
ORA-06550: line 25, column 3:
PL/SQL: Statement ignored
ORA-06550: line 28, column 46:
PLS-00312: a positional parameter association may not follow a named association
ORA-06550: line 28, column 3:
PL/SQL: Statement ignored
06550. 00000 - "line %s, column %s:\n%s"
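The PLS-00312 errors above come from a general PL/SQL rule: once an argument is passed by name, the arguments after it must also be passed by name. The earlier METADATA_TRANSFORM call in this thread hit exactly that (named handle and name followed by positional 0, null). A sketch of the invalid form and two valid rewrites; hdnl stands for an already-opened job handle:

```sql
-- Invalid: positional arguments after named ones raise PLS-00312:
--   DBMS_DATAPUMP.METADATA_TRANSFORM(handle => hdnl, name => 'OID', 0, null);

-- Valid: all arguments named (the fourth, object_type, defaults to NULL):
DBMS_DATAPUMP.METADATA_TRANSFORM(handle => hdnl, name => 'OID', value => 0);

-- Also valid: all arguments positional:
DBMS_DATAPUMP.METADATA_TRANSFORM(hdnl, 'OID', 0);
```

Positional-then-named (as in ADD_FILE(h1, directory => ..., filename => ...) above) is fine; it is only the reverse order that is rejected.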
[2020-02-06 20:14] Mahmmoud ADEL:
This format will work for you:

SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
  p_bucket_name    => '',
  p_s3_prefix      => 'oracle/files/exports/',
  p_directory_name => 'DATA_PUMP_DIR')
  AS TASK_ID FROM DUAL;

Don't forget the "/" after the last directory in the path :-)

[2020-02-06 18:45] Anonymous:
I've got the export files stored under the following tree:
oracle/files/exports

Any idea how to download the files under that tree without downloading the whole bucket? My bucket has TBs of files, but my export files are just 12GB.
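The download_from_s3 call above returns a task ID, and on RDS for Oracle the transfer runs asynchronously; its progress can be followed by reading the task log from the BDUMP directory. A sketch, assuming the documented rdsadmin.rds_file_util.read_text_file helper; the task ID shown is a placeholder for the value the SELECT actually returned:

```sql
-- Sketch: monitor an rdsadmin_s3_tasks transfer via its log file.
-- Replace '1234567890123-456' with the TASK_ID returned by upload_to_s3/download_from_s3.
SELECT text
  FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP',
             'dbtask-1234567890123-456.log'));
```

Reading this log also answers whether only the prefixed objects were transferred, since each copied file is listed there.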