June 4, 2023

Top Oracle DBA Export/Import - Data Pump Interview Questions and Answers - 2023

Essential Oracle Export/Import (exp/imp) - Data Pump (expdp/impdp) Interview Questions/FAQs for experienced DBAs


1. What is the use of CONSISTENT option in exp?
Answer:
Cross-table consistency. With CONSISTENT=Y, export issues SET TRANSACTION READ ONLY so that all tables are exported as of the same point in time. Default value is N.
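
For example, a consistent schema-level export might look like this (the connect string, schema and file names are only placeholders):

exp system/password OWNER=scott FILE=scott.dmp LOG=scott_exp.log CONSISTENT=Y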

2. What is the use of DIRECT=Y option in exp?
Answer:
Setting DIRECT=Y makes export read the data directly, bypassing the SQL command-processing layer (the evaluating buffer), so it is usually faster than a conventional-path export. Default value is N.
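
For example, a direct-path table export might look like this (connect string, table and file names are placeholders):

exp system/password TABLES=scott.emp FILE=emp.dmp LOG=emp_exp.log DIRECT=Y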

3. What is the use of COMPRESS option in exp?
Answer:
Imports to one extent. COMPRESS specifies how export manages the initial extent for the table data, and it is helpful during database re-organization. With COMPRESS=Y, the generated DDL consolidates the existing extents into a single initial extent. For example, if a table is spanning 20 extents of 1 MB each (undesirable from a performance point of view), exporting it with COMPRESS=Y produces DDL with an initial extent of 20 MB, so the extents are coalesced when the table is imported. Sometimes COMPRESS=N is preferable, for instance when the target tablespace does not have that much contiguous free space and you do not want the import to fail.
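
For example, exporting a fragmented table so that its extents are consolidated on import (connect string, table and file names are placeholders):

exp scott/tiger TABLES=emp FILE=emp.dmp COMPRESS=Y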

4. How to improve exp performance?
Answer:
1. Set the BUFFER parameter to a high value. Default is 256KB.
2. Stop unnecessary applications to free the resources.
3. If you are running multiple sessions, make sure they write to different disks.
4. Do not export to NFS (Network File System); exporting to a local disk is faster.
5. Set the RECORDLENGTH parameter to a high value.
6. Use DIRECT=Y (direct path export); a sample tuned command follows this list.
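
Putting these together, a tuned full export might look like this (connect string and file names are illustrative only; note that BUFFER applies to conventional-path exports, while with DIRECT=Y the RECORDLENGTH setting is what matters):

exp system/password FULL=Y FILE=/u02/exp/full.dmp LOG=full_exp.log DIRECT=Y RECORDLENGTH=65535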

5. How to improve imp performance?
Answer:
1. Place the file to be imported in a separate disk from datafiles.
2. Increase the DB_CACHE_SIZE.
3. Set LOG_BUFFER to a large value.
4. Stop redolog archiving, if possible.
5. Use COMMIT=n, if possible.
6. Set the BUFFER parameter to a high value. Default is 256KB.
7. Drop indexes before the import (or set INDEXES=N) and rebuild them afterwards to speed up the import; indexes can easily be recreated once the data has been imported successfully.
8. Use STATISTICS=NONE.
9. Disable INSERT triggers, as they fire during import.
10. Set COMMIT_WRITE=NOWAIT (in Oracle 10g) or COMMIT_WAIT=NOWAIT (in Oracle 11g) during the import. A sample tuned import command follows this list.
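
Putting these together, a tuned schema import might look like this (connect string, schema names, file names and buffer size are illustrative only):

imp system/password FILE=scott.dmp LOG=scott_imp.log FROMUSER=scott TOUSER=scott COMMIT=N BUFFER=10485760 INDEXES=N STATISTICS=NONE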

6. What is the use of INDEXFILE option in imp?
Answer:
Writes the DDL of the objects in the dump file to the specified file instead of performing the import. This is useful for obtaining the index-creation statements so indexes can be rebuilt after the data load.
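
For example, to extract the DDL without importing any rows (connect string and file names are placeholders):

imp system/password FILE=scott.dmp FULL=Y INDEXFILE=scott_ddl.sql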

7. What is the use of IGNORE option in imp?
Answer:
With IGNORE=Y, import ignores object-creation errors (for example, when a table already exists) and continues loading the data instead of skipping that object.
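
For example, loading rows into tables that already exist in the target schema (connect string, table and file names are placeholders):

imp scott/tiger FILE=scott.dmp TABLES=emp IGNORE=Y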

8. What are the differences between expdp and exp (Data Pump or normal exp/imp)?
Answer:
Data Pump is server-centric (the dump files are written on the database server, via directory objects), whereas exp/imp files are created on the client.
Data Pump has a PL/SQL API (DBMS_DATAPUMP), so jobs can be run from procedures.
Data Pump jobs can be stopped and restarted (see the example after this list).
Data Pump can do parallel execution.
Tapes and pipes are not supported in Data Pump.
Data Pump consumes more undo tablespace.
Data Pump import will create the user, if the user doesn't exist.
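
For example, stopping and later resuming a Data Pump job through the interactive interface (the job name shown is the default for a schema export and is only illustrative):

expdp system/password ATTACH=SYS_EXPORT_SCHEMA_01
Export> STOP_JOB=IMMEDIATE
(later, to resume)
expdp system/password ATTACH=SYS_EXPORT_SCHEMA_01
Export> START_JOB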

9. Why expdp is faster than exp (or) why Data Pump is faster than conventional export/import?
Answer:
Data Pump is block mode, exp is byte mode. 
Data Pump will do parallel execution.
Data Pump uses direct path API.

10. How to improve expdp performance?
Answer:
Use the PARALLEL option, which increases the number of worker processes; set it based on the number of CPUs. Give DUMPFILE a %U wildcard so that each worker can write to its own file.
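
For example (directory object, file names and parallel degree are placeholders):

expdp system/password SCHEMAS=scott DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott_%U.dmp LOGFILE=scott_exp.log PARALLEL=4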

11. How to improve impdp performance?
Answer:
Use the PARALLEL option, which increases the number of worker processes; set it based on the number of CPUs on the target server.
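
For example (directory object, file names and parallel degree are placeholders):

impdp system/password SCHEMAS=scott DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott_%U.dmp LOGFILE=scott_imp.log PARALLEL=4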

12. In Data Pump, where the jobs info will be stored (or) if you restart a job in Data Pump, how it will know from where to resume?
Answer:
Whenever a Data Pump export or import is running, Oracle creates a master table named after the JOB_NAME in the schema of the user running the job; it is dropped once the job completes. From this table, Oracle finds out how much of the job has been completed and from where to continue.
Default export job name will be SYS_EXPORT_XXXX_01, where XXXX can be FULL or SCHEMA or TABLE.
Default import job name will be SYS_IMPORT_XXXX_01, where XXXX can be FULL or SCHEMA or TABLE.
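
The jobs (and hence the master table names) can be checked from the data dictionary:

SELECT owner_name, job_name, operation, job_mode, state FROM dba_datapump_jobs;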

13. What is the order of importing objects in impdp?
Answer:
 Tablespaces
 Users
 Roles
 Database links
 Sequences
 Directories
 Synonyms
 Types
 Tables/Partitions
 Views
 Comments
 Packages/Procedures/Functions
 Materialized views

14. How to import only metadata?
Answer:
CONTENT=METADATA_ONLY
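
For example (directory object and file name are placeholders):

impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott.dmp CONTENT=METADATA_ONLY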

15. How to import into different user/tablespace/datafile/table?
Answer:
REMAP_SCHEMA
REMAP_TABLESPACE
REMAP_DATAFILE
REMAP_TABLE
REMAP_DATA
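
For example, importing the SCOTT schema as HR and relocating its segments to another tablespace (schema, tablespace, directory and file names are placeholders):

impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott.dmp REMAP_SCHEMA=scott:hr REMAP_TABLESPACE=users:hr_data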

16. How to export/import without using an external directory?
Answer:
With Data Pump, use the NETWORK_LINK parameter: impdp pulls the data directly from the source database over a database link, so no dump file (and therefore no directory object for it) is needed. Conventional exp/imp never required a directory object; the dump file path is simply given in the FILE parameter.
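
For example, pulling the SCOTT schema straight from the source database over a database link (the link name, schema and connect string are placeholders):

impdp system/password SCHEMAS=scott NETWORK_LINK=srcdb_link NOLOGFILE=Y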

17. Using Data Pump, how to export in higher version (11g) and import into lower version (10g), can we import to 9i?
Answer:
Use the VERSION parameter of expdp (for example, VERSION=10.2) so that the dump file is written in a format the lower release can read, then import it with the 10g impdp. Data Pump cannot be used to import into 9i, because Data Pump was introduced in Oracle 10g; for 9i the original exp/imp utilities must be used.
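
A sketch of the export step on the 11g side (schema, directory and file names are placeholders):

expdp system/password SCHEMAS=scott DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott_v102.dmp LOGFILE=scott_v102.log VERSION=10.2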

18. Using normal exp/imp, how to export in higher version (11g) and import into lower version (10g/9i)?
Answer:
Always use the export utility of the lower (target) release: run the 10g/9i exp client against the 11g database over the network, then load the resulting dump file with the imp of the target release. A dump file created by a higher-version exp cannot be read by a lower-version imp.
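
For example, running the 10g client utilities against both databases (TNS aliases and credentials are placeholders):

exp system/password@orcl11g OWNER=scott FILE=scott.dmp LOG=scott_exp.log
imp system/password@orcl10g FROMUSER=scott TOUSER=scott FILE=scott.dmp LOG=scott_imp.log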

19. How to do transport tablespaces (and across platforms) using exp/imp or expdp/impdp?
Answer:
Check that the tablespace set is self-contained (DBMS_TTS.TRANSPORT_SET_CHECK), make the tablespaces READ ONLY, export only their metadata (exp with TRANSPORT_TABLESPACE=Y TABLESPACES=... or expdp with TRANSPORT_TABLESPACES=...), copy the datafiles and the dump file to the target server (using RMAN CONVERT if the platforms have different endian formats), plug them in with imp TRANSPORT_TABLESPACE=Y DATAFILES=... or impdp TRANSPORT_DATAFILES=..., and finally make the tablespaces READ WRITE again.
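
A sketch of the Data Pump flow (tablespace, directory, file and path names are placeholders):

ALTER TABLESPACE users_data READ ONLY;
expdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=tts.dmp TRANSPORT_TABLESPACES=users_data
(copy the dump file and datafiles to the target, converting with RMAN CONVERT if the endian format differs, then:)
impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=tts.dmp TRANSPORT_DATAFILES='/u02/oradata/trgdb/users_data01.dbf'
ALTER TABLESPACE users_data READ WRITE;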
