Teach Yourself Oracle 8 In 21 Days
- Day 9 -
Managing Data
An important aspect of Oracle8 administration is getting data into and out of
the database. There are several different ways of performing both of these tasks:
- Export--Put a database's content and structure into a binary export file. Export
files can be read only by the Oracle Import utility.
- Import--Use data from an import file to re-create the database's content and
structure.
- SQL*Loader--This is a very flexible tool that is used to load ASCII or flat-file
data into an Oracle database. The SQL*Loader utility offers many options, as you
will see later today.
- Backup/recovery--The backup and recovery features have their own way of loading
and restoring data. Because this is covered on Days 17, "Recovering the Database,"
and 18, "Administering Oracle Replication," I do not go into it here.
Using Export and Import
New Term: The Oracle Export utility
is designed to write Oracle object definitions and data to an Oracle-specific binary
file. This file is known as the export file. An export file is Oracle specific
and can be read only by the Oracle Import utility, which you will learn about today.
The Export utility can be used for several different purposes, including
- To back up the database--Export can be used to create a backup of the Oracle
database, but is not the most efficient backup mechanism. Backup and recovery are
covered on Days 16-18.
- To move data between databases--You can export data to an export file and then
import this data back into a different database. This is a great way to transfer
tables from one database to another.
- To rebuild a database--If you have a database whose tablespaces are fragmented,
you can use Export and Import to defragment the database. This could improve performance
if fragmentation is slowing you down.
- To reorganize a database--If you want to reorganize the locations of datafiles
and so on, you can use Export and Import. In this manner, you can create new tablespaces
and use Import to reload this data.
As you can see, there are several different reasons for using Export and Import.
Export
The Oracle Export utility provides a straightforward and simple service. Export
writes object definitions and table data to an Oracle-format binary file. This information
can be used to transfer data between databases on different machines or to supplement
the normal backup process.
The export file contains each object's data followed by its related objects. For example,
if a table has indexes, the table data is written to the export file first, followed
by the definitions of those indexes.
NOTE: The binary files written by the
Oracle Export utility can be read only by the Oracle Import utility. If you want
to transfer data to another RDBMS or read it with another utility, do not use Export.
The export file is also used to reorganize data within the database. The normal
backup process copies an image of the datafile; if recovery is needed, it can only
write back the same image. Because Export organizes the data as it is written to
the export file, importing that data does not necessarily place the data in the exact
same place on disk. This provides a great advantage because it can reduce fragmentation
and row chaining.
NOTE: Before you can run Export and Import,
the administrative SQL script CATEXP.SQL must have been run once on the
database. The administrative SQL script CATALOG.SQL, which is typically
run at database-creation time, automatically runs CATEXP.SQL.
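If you ever need to run the script by hand, a minimal sketch looks like the following; the connection method, the Server Manager prompt, and the installation path are assumptions (adjust the path for your ORACLE_HOME), and the script must be run by a user with DBA privileges:
SVRMGR> CONNECT internal
SVRMGR> @C:\orant\rdbms80\admin\catexp.sql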
As with almost every Oracle utility, the export function can be accomplished graphically
via Data Manager or by using the EXP80 program.
Export Using Data Manager
To perform an export graphically, you must first invoke Data Manager. For Data
Manager to work, the Oracle agent and Enterprise Manager must be running.
NOTE: Data Manager must be run by a user
with DBA privileges if a full export is to be performed.
- 1. When you invoke Data Manager, you will see the initial screen (shown
in Figure 9.1).
Figure 9.1.
Data Manager.
- 2. From Data Manager, you can invoke the export operation by selecting
Data | Export. This invokes the Data Manager Export wizard. This wizard asks a number
of questions concerning the export operation. As shown in Figure 9.2, the first screen
involves the location of the export file.
Figure 9.2.
Screen one of the Data Manager Export wizard.
TIP: Instead of creating the export file
in the default directory, I prefer to create a directory specifically for export
files. You can easily change the location where the export file will be created in
the Data Manager.
- 3. Screen two of the Data Manager Export wizard lets you choose the objects
to be exported. By default, the entire database is selected, as shown in Figure 9.3.
Figure 9.3.
Screen two of the Data Manager Export wizard.
4. In screen three, you can select the objects you want to export, including
grants, indexes, table rows, and constraints. To perform an export in full mode,
leave all of the boxes checked. At this point you can also choose the Direct Export
option, which is not the default.
5. Screen four allows you to select the record length (this is necessary
if you are transferring an export file to another OS), buffer size, and log file.
I find that the default settings are usually sufficient.
6. Screen five allows you to select whether you are performing a complete
or incremental export. Statistics modes are also selected here. These indicate whether
estimated or calculated statistics will be gathered for the optimizer when this data
is imported. Finally, you can configure the consistency mode and the extent merging
options here. This screen is shown in Figure 9.4.
Figure 9.4.
Screen five of the Data Manager Export wizard.
7. Screen seven summarizes the export actions that will be taken when the
Finish button is clicked, as shown in Figure 9.5.
Figure 9.5.
Screen seven of the Data Manager Export wizard.
8. If you selected to schedule the export rather than perform this operation
immediately, you are presented with screen six, which is shown in Figure 9.6. If
you did not select the schedule option, the Data Manager Export wizard jumps from
screen five to screen seven.
Figure 9.6.
Screen six of the Data Manager Export wizard.
9. The status of the export is displayed in the Export Status screen. Errors
and successful completion messages are displayed here.
Export Using the EXP80 Utility
Exports can be performed via the graphical utilities, as well as with a command-line
option. When you run the Export utility via the NT command prompt, a program called
EXP80 is run.
NOTE: Under Windows NT, the export program
is called EXP80. Under other operating systems, it may be known simply as EXP.
Export has several options and modes of operation:
- Full--Operating Export in full mode causes all database objects not in the SYS
schema to be exported. Because the SYS schema and its objects are created
at database-creation time, they are excluded from the export.
NOTE: The fact that SYS schema objects are not exported in full mode is another
good reason not to create objects under the SYS schema.
- Table--Operating Export in table mode allows you to specify which schema objects
to export.
- User--Operating Export in user mode exports all objects belonging to the specified
user. These objects include tables, data, grants, and indexes.
Export can also be used to export partitions in a partitioned table. This is accomplished
by exporting in table mode and specifying the table partition.
Using Export
Export is invoked from the command line and supports a variety of options. Because
of the importance of the Export and Import commands, these options are described
here. For more detailed information, refer to the Oracle8 documentation.
The Syntax for Export
Export is invoked with the following syntax:
SYNTAX:
EXP80 username/password [ options ... ]
To simplify the use of Export and to standardize your options, you can store
the export parameters in a parameter file. I recommend this for
consistency, reusability, and error reduction. To invoke Export with the parameter
file, use this syntax:
EXP80 username/password PARFILE=filename [ options ... ]
If you use the parameter file, it is not necessary to use command-line options,
but you can if you want. The parameter file contains a list of parameters, one per
line. An example of a parameter file is included at the end of this section. The
available Export parameters use the following format:
PARAMETER=value
The Export parameters are
- BUFFER=number--This parameter specifies the size of the copy buffer
(in bytes) used by Export. If this is zero, one row at a time will be fetched.
- COMPRESS=[Y or N]--This parameter specifies how the initial extent is
treated. If this parameter is set to Y, Export consolidates all table data
into one extent. If this parameter is set to N, Export uses the current
storage parameters to create the extents. The default is Y.
- CONSISTENT=[Y or N]--This parameter specifies whether the export will
be performed in a manner that causes data to be consistent to a single point in time.
A value of Y causes the export to be consistent but can consume significant
rollback space and can fail if there is significant update activity. The default
is N.
- CONSTRAINTS=[Y or N]--This parameter specifies whether to export table
constraints. The default is Y.
- DIRECT=[Y or N]--This parameter specifies whether the direct path export
option is used, thus bypassing the SQL command-processing layer. Direct path export
cannot be used on some schema objects. The default is N.
- FEEDBACK=number--This parameter specifies that a dot be displayed for
every number exported rows. For example, if FEEDBACK=100, you see a dot
for every 100 exported rows to indicate the progress of the export operation. The
default is 0.
- FILE=filename--This parameter specifies the name of the export file.
The default is EXPDAT.DMP.
- FULL=[Y or N]--This parameter specifies whether a full database export
is performed. Specifying FULL=Y exports in full mode. The default is N.
- GRANTS=[Y or N]--This parameter specifies that grants are exported. The
default is Y.
- HELP=[Y or N]--If this parameter is set to Y, a list of export
parameters is displayed. The default is N.
- INCTYPE=type--This parameter specifies an incremental export. The values
available are COMPLETE, CUMULATIVE, and INCREMENTAL.
- INCTYPE=COMPLETE--This parameter serves as the baseline for CUMULATIVE
and INCREMENTAL exports. It is the same as FULL=Y with additional
information updated for incremental exports.
- INCTYPE=CUMULATIVE--This parameter exports tables that have changed since
the last COMPLETE or CUMULATIVE export.
- INCTYPE=INCREMENTAL--This parameter exports tables that have changed
since the last COMPLETE, CUMULATIVE, or INCREMENTAL export.
- INDEXES=[Y or N]--This parameter specifies that indexes should be exported.
The default is Y.
- LOG=logfile--This parameter specifies the name of a log file in which
to write error and status messages. Even when this parameter is set, messages are
still displayed on the screen.
- OWNER=owner_name(s)--This parameter specifies a list of users whose objects
will be exported in user mode.
- PARFILE=param_file--This parameter specifies the parameter filename.
- POINT_IN_TIME_RECOVER=[Y or N]--This parameter specifies whether a point-in-time
recovery will be performed on import.
- RECORD=[Y or N]--This flag specifies that the system tables SYS.INCVID,
SYS.INCFIL, and SYS.INCEXP record a cumulative or incremental export.
The default is Y.
- RECORDLENGTH=number--This parameter specifies the size in bytes of the
file record. This is used if you are going to transfer the export file to another
operating system.
- RECOVERY_TABLESPACE=ts_name--This parameter specifies the names of the
tablespaces to be used in the point-in-time recovery.
- ROWS=[Y or N]--This parameter specifies that the rows of table data should
be exported. The default is Y.
- STATISTICS=type--Available values are ESTIMATE, COMPUTE,
and NONE. The default is ESTIMATE.
- STATISTICS=ESTIMATE--This parameter specifies that statistics are generated
on the tables using the ESTIMATE method when the data is imported.
- STATISTICS=COMPUTE--This parameter specifies that statistics are generated
on the tables using the COMPUTE method when the data is imported.
- STATISTICS=NONE--This parameter specifies that statistics are not generated
on the tables when the data is imported.
- TABLES=tables--This parameter specifies a list of tables to export when
using Export in table mode.
- USERID=username/password--This parameter specifies the username
and password of the Oracle user performing the export.
Here is an example of a parameter file:
FULL=Y
BUFFER=8192
FILE=D:\database\export\EXPDAT.DMP
STATISTICS=COMPUTE
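With the parameters stored in a file, the entire export can be started with a single
command. The following sketches use assumed names (the export.par parameter file, the
system/manager and scott/tiger demo accounts, and a SALES table with a Q1_1997 partition);
the second line shows a hypothetical table-mode export of a single partition, specifying
the partition along with the table name as described earlier:
D:\> EXP80 system/manager PARFILE=D:\database\export\export.par
D:\> EXP80 scott/tiger TABLES=(sales:q1_1997) FILE=D:\database\export\sales_q1.dmp LOG=D:\database\export\sales_q1.log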
If no parameters are specified, the Export utility prompts you for values. Most
prompted items come with a default value. An example of using Export interactively
is shown in Listing 9.1.
Listing 9.1. Example of EXP80.
D:\>exp80
Export: Release 8.0.2.0.2 - Beta on Thu Jul 17 18:15:20 1997
Copyright (c) Oracle Corporation 1979, 1994, 1996. All rights reserved.
Username: internal
Password:
Connected to: Oracle8 Server Release 8.0.2.0.2 - Beta
With the distributed, heterogeneous, replication, objects
and parallel query options
PL/SQL Release 3.0.2.0.2 - Beta
Enter array fetch buffer size: 4096 >
Export file: EXPDAT.DMP >
(2)U(sers), or (3)T(ables): (2)U >
Export grants (yes/no): yes >
Export table data (yes/no): yes >
Compress extents (yes/no): yes >
Both methods export the data in the same way. Certain error conditions can arise
during an export; some cause Export to fail and others do not. These error conditions
are described in the Oracle documentation.
Import
The Oracle Import utility has one function: to load data that has been exported
into an Oracle database. The Import utility can read only exported data. If you want
to load other data into an Oracle database, you must use another utility such as
SQL*Loader, which is discussed later today.
As described earlier today, Export and Import can be used for several different
functions:
- For backup/recovery--Backup and recovery are covered on Days 16-18.
- To move data between databases on different systems.
- To rebuild a database--This is useful to eliminate fragmentation and chained
rows.
- To reorganize a database--If you want to reorganize the locations of datafiles
and so on, you can use Export and Import.
There are several different reasons for using Export and Import. As with the Export
utility, imports can be performed via Data Manager or via a command-line program.
Import Using Data Manager
Data Manager can be used to export, import, or load data. The import operation
is much less complicated than the export operation because the export file contains
information about the options used during the export process.
- 1. To invoke the import operation, select Data | Import in Data Manager.
The initial screen in the Data Manager Import wizard allows you to select the export
file from which you want to import. This filename can be typed, or it can be selected
via the Browse function (see Figure 9.7).
Figure 9.7.
Screen one of the Data Manager Import wizard.
2. Select the objects to be imported in screen two. By default, no objects
are selected. You must select at least one object for the import to work. This is
shown in Figure 9.8.
Figure 9.8.
Screen two of the Data Manager Import wizard.
3. In screen three, the objects that were exported are selected. If you
do not want to import a schema object (such as grants and the like), it can be deselected
at this point.
4. Screen four allows you to select the record length (this is necessary
if you are importing an export file from another OS), buffer size, and log file.
I find that the default settings are usually sufficient.
5. Screen five contains advanced options such as the import type and whether
to import all data or only recent data. From this screen you can also write index-creation
commands to a file to allow you to re-create the indexes rather than import them.
You'll also find the Commit after each array insert and the Overwrite existing data
files checkboxes here. This screen is shown in Figure 9.9.
Figure 9.9.
Screen five of the Data Manager Import wizard.
6. As with the Export wizard, the final screen shows a list of the selected
items for final approval before the import operation proceeds. Screen six is displayed
only if you have chosen to schedule the import operation to occur at a later time.
7. Like the Export utility, the Import utility shows a status screen and
logs status information to a log file. For example, Figure 9.10 contains numerous
error messages, which are due to the fact that I tried to import data over existing
tables and did not select to overwrite existing data in screen five. This offers
you some protection from accidentally destroying your own data by unintentionally
overwriting good data.
Figure 9.10.
The Data Manager Import wizard's Import Status screen.
Import Using the IMP80 Utility
As with the Export utility, the Import utility features a command-line utility
for performing imports. The Import utility's command-line utility is called IMP80
under Windows NT. Under certain other operating systems, it is known simply as IMP.
IMP80 supports a variety of options, many very similar to the export options.
Because of the importance of the Export and Import commands, these options are described
here. For more detailed information, refer to the Oracle8 documentation.
The Syntax for Import
Import is invoked with the following syntax:
IMP80 username/password [ options ... ]
As with Export, you can store the Import parameters in a parameter file.
I recommend this for consistency, reusability, and error reduction. To invoke Import
with the parameter file, use this syntax:
IMP80 username/password PARFILE=filename [ options ... ]
If you use the parameter file, it is not necessary to use command-line options,
but you can if you want. The parameter file contains a list of parameters, one per
line. An example of a parameter file is included at the end of this section. The
available Import parameters use the following format:
PARAMETER=value
The Import parameters are
- ANALYZE=[Y or N]--This parameter specifies that the SQL ANALYZE
command be run. The default is Y.
- BUFFER=number--This parameter specifies the size of the copy buffer (in
bytes) used by Import. If this is zero, one row at a time will be fetched.
- CHARSET=character_set--This parameter specifies the character set used
for the export if it was an Oracle6 export. From Oracle7 forward, the character-set
information is written to the export file. Import will use this value to verify that
the export file was written with this character set. It is usually not necessary
to use this option.
- COMMIT=[Y or N]--This parameter specifies that a commit should occur
after each array insert. By default, the commit occurs after each table is imported.
The default is N.
- DESTROY=[Y or N]--This parameter specifies that the original datafiles
should be reused. This essentially adds the reuse option to the CREATE TABLESPACE
operation. The default is N.
- FEEDBACK=number--This parameter specifies that a dot be displayed for
every number of imported rows. For example, if FEEDBACK=100, you will see
a dot for every 100 rows, indicating the progress of the import operation. The default
is 0.
- FILE=filename--This parameter specifies the name of the export file.
The default is EXPDAT.DMP.
- FROMUSER=username--This parameter specifies a schema to import the data
for. If no schema is identified, the entire export file will be imported.
- FULL=[Y or N]--This parameter specifies whether a full database import
is performed. Specifying FULL=Y imports in full mode. The default is N.
- GRANTS=[Y or N]--This parameter specifies that grants should be imported.
The default is Y.
- HELP=[Y or N]--If this parameter is set to Y, a list of Import
parameters will be displayed. The default is N.
- IGNORE=[Y or N]--This parameter describes how the system will operate
if table-creation errors occur. If IGNORE=Y, the import will continue without
reporting an error if an error occurs. If IGNORE=N, an error will be reported
before the import continues. The default is N.
- INCTYPE=type--This parameter specifies the type of import. Available
options are SYSTEM and RESTORE.
- INCTYPE=SYSTEM--This parameter restores the most recent copy of the SYSTEM
objects.
- INCTYPE=RESTORE--This parameter restores all user and database objects
that are in the export file.
- INDEXES=[Y or N]--This parameter specifies that indexes should be imported.
The default is Y.
- INDEXFILE=filename--This parameter specifies that index-creation syntax
should be written to filename. This allows you to easily re-create the indexes rather
than restore them.
- LOG=logfile--This parameter specifies the name of a log file in which
to write error and status messages. Even with this parameter set, messages are still
displayed on the screen.
- PARFILE=param_file--This parameter specifies the parameter filename.
- POINT_IN_TIME_RECOVER=[Y or N]--This parameter specifies that you want
to recover to a particular point in time. The default is N.
- RECORDLENGTH=number--This parameter specifies the size in bytes of the
file record.
- ROWS=[Y or N]--This parameter specifies that the rows of table data should
be imported. The default is Y.
- SHOW=[Y or N]--This parameter specifies that the contents of the export
file are displayed on the screen rather than imported into the database. This can
be useful in debugging situations. The default is N.
- SKIP_UNUSABLE_INDEXES=[Y or N]--If set to Y, this parameter
specifies that unusable indexes are not immediately rebuilt, thus allowing you to
rebuild them after the import has completed. The default is N.
- TABLES=tables--This parameter specifies a list of tables to import when
using Import in table mode.
- TOUSER=username--This parameter specifies a list of users whose schema
will be used on the import. If you want to import one schema's data to another schema,
use FROMUSER and TOUSER.
- USERID=username/passwd--This parameter specifies the username
and password of the Oracle user performing the import.
Here is an example of a parameter file:
FULL=Y
BUFFER=8192
FILE=D:\database\export\EXPDAT.DMP
ANALYZE=Y
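As with Export, the parameter file reduces the command line to a single reference.
The following sketches use assumed names (the import.par parameter file, the
system/manager account, and a target schema ETW2); the second line shows how
FROMUSER and TOUSER move the ETW schema's objects into another schema:
D:\> IMP80 system/manager PARFILE=D:\database\export\import.par
D:\> IMP80 system/manager FILE=D:\database\export\EXPDAT.DMP FROMUSER=ETW TOUSER=ETW2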
If no parameters are specified, the Import utility will prompt you for values.
Most prompted items come with a default value. An example of using Import interactively
is shown in Listing 9.2.
Listing 9.2. Example of IMP80.
D:\>imp80
Import: Release 8.0.3.0.0 - Production on Sat Jul 19 12:24:53 1997
(c) Copyright 1997 Oracle Corporation. All rights reserved.
Username: system
Password:
Connected to: Oracle8 Enterprise Edition Release 8.0.3.0.0 - Production
With the Partitioning and Objects options
PL/SQL Release 8.0.3.0.0 - Production
Import file: EXPDAT.DMP > d:\database\export\EXPDAT.DMP
Enter insert buffer size (minimum is 4096) 30720>
Export file created by EXPORT:V08.00.03 via conventional path
List contents of import file only (yes/no): no >
Ignore create error due to object existence (yes/no): no >
Import grants (yes/no): yes >
Import table data (yes/no): yes >
Import entire export file (yes/no): no >
Username: ETW
Enter table(T) or partition(T:P) names. Null list means all tables for user
Enter table(T) or partition(T:P) name or . if done:
Whether you use Import via Data Manager or with the IMP80 utility, the outcome
will be the same. As mentioned earlier, Import can be used only with data generated
from Export. If you want to load ASCII data or other data into a database, you must
use the SQL*Loader utility.
Using SQL*Loader
SQL*Loader is another Oracle utility that is used for loading data into an Oracle
database. Where the Import utility is designed to accept data in a specific format,
SQL*Loader is designed to be flexible and to accept data in a variety of formats.
SQL*Loader accepts two input file types: the actual input datafile and a loader
control file. The control file is used to specify the format of the datafile(s).
The control file is also used to specify such things as the column data types, field
delimiters, and various other data-specific information.
Like Export and Import, SQL*Loader can be invoked from within Data Manager or
from the command line. Regardless of which method you use, you are still required
to specify a control file.
The Control File
The control file specifies information about the data to be loaded. It contains
control statements and can also contain the data itself.
The control file can contain multiple lines for each statement and is not case
sensitive except for characters within single or double quotes. The control file
can also include comments that are indicated by double hyphens (--).
The control file has more than 90 keywords that can be used to specify the format
of the data and how it is to be loaded. Because of the large number of options, I
cover only the key topics here. The entire list of keywords can be found in the Oracle
documentation.
The basics of the control file involve control statements that tell SQL*Loader
the following:
- What operation to perform
- Where to find the input datafile
- Where to load the data to
- The format of the data
The data can be of fixed length or delimited.
Fixed Record Format
When loading data that has a fixed record format (each field is the same length),
you must specify the length of the fields in the control file. Here is an example
of a fixed record control file:
LOAD DATA
INFILE 'D:\database\load\dogs.dat'
INTO TABLE "ETW".dogs2 (
Id POSITION(01:02) INTEGER EXTERNAL,
Name POSITION(05:08) CHAR,
OWNER_ID POSITION(11:11) INTEGER EXTERNAL,
BREED_ID POSITION(13:13) INTEGER EXTERNAL,
RANK POSITION(15:16) INTEGER EXTERNAL,
NOTES POSITION(18:20) CHAR)
The various components consist of
- A load directive
- The input file specification
- A table definition
- A data format definition
This is enough information to define the load operation.
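For reference, here are a few hypothetical lines of D:\database\load\dogs.dat that
match the POSITION clauses above; each field is padded with spaces so that it falls
exactly on the byte offsets given in the control file:
10  SHEP  1 2 01 BIG
11  REX   2 1 02 TAN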
Variable Record Format
To load a variable record format datafile (all columns are not the same size),
you must specify a column delimiter. This column delimiter indicates to SQL*Loader
where one column finishes and another picks up. Here is an example of a variable
record control file:
LOAD DATA
INFILE 'D:\database\load\dogs2.dat'
INTO TABLE "ETW".dogs2
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
(id, name CHAR, owner_id, breed_id, rank, notes CHAR)
The various components consist of
- A load directive
- The input file specification
- A table definition
- A data format definition
This provides SQL*Loader with enough information to load the data. There are a
few more optional parameters, which you will see next.
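Before moving on to those options, here is a hypothetical D:\database\load\dogs2.dat
for this control file. It simply separates the six fields with commas; a value needs
the optional double quotes only if it contains a comma itself:
10,SHEP,1,2,1,BIG
11,"REX, JR",2,1,2,TAN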
Optional Parameters in the Control File
There are numerous options available to SQL*Loader via the control file or at
the command line. Here are some useful options available to the loader:
- BAD=filename--The bad file filename. This is where bad data is logged.
- CONTROL=filename--The name of the control file.
- DATA=filename--The name of the input datafile.
- DIRECT=(TRUE or FALSE)--This specifies whether the direct path loader
is used. The direct path loader is described in the next section.
- PARALLEL=(TRUE or FALSE)--This specifies a parallel load.
- LOAD=n--The number of records to load. The default is all.
- LOG=filename--The log file filename.
- ROWS=n--The number of rows in each array insert.
- SKIP=n--The number of logical records to skip.
- UNRECOVERABLE=(TRUE or FALSE)--Available only with the direct path load.
This parameter specifies that the load data operation does not log to the redo log
files. This makes the loading operation faster but data cannot be recovered if the
load fails.
Using the Direct Path Loader
The conventional loading technique uses SQL INSERT statements to load
the data into the database. Each insert goes through all the logic and steps performed
in a standard INSERT statement. To improve performance, another option is
available: the direct path loader.
When you use the direct path loader, data is inserted directly into the datafiles,
thus bypassing much of the logic involved in the conventional path load. The direct
path loader is faster and more efficient, but there are a few restrictions on it:
- You cannot direct path load a clustered table.
- You cannot direct path load a table when active transactions are pending.
The advantages of using the direct path loader include the capacity to bypass
many of the steps taken by the conventional loader and the capability to load in
parallel. Another advantage is the capability to use the UNRECOVERABLE option.
With the UNRECOVERABLE option set, the load is not logged; thus, performance
is enhanced. Nonetheless, there is an inherent danger in using the UNRECOVERABLE
option: Because it is not logged, it is unrecoverable. Be sure to perform a backup
soon after using the UNRECOVERABLE option.
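As a sketch, a direct path load of the dogs2 table from the earlier fixed-format
example might be started as shown below. The control filename (dogs.ctl) and the
password are assumptions:
D:\> SQLLDR80 USERID=etw/etw_password CONTROL=D:\database\load\dogs.ctl LOG=D:\database\load\dogs.log DIRECT=TRUE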
Loading Using Data Manager
To use the loader from Data Manager, select Data | Load. This invokes the Data
Manager Load wizard. The first screen, shown in Figure 9.11, prompts you to select
a control file, and you cannot proceed until you do so. Either type the name of the
control file or use the Browse function.
Figure 9.11.
Screen one of the Data Manager Load wizard.
After you select the control file, proceed to screen two (shown in Figure 9.12).
Here you can fill out the following file descriptors:
- Data File--Enter the name of the datafile.
- Log File--Specify where the logging information is to be stored.
- Bad File--Specify where records that have errors should be stored.
- Discard File--Specify where rejected and uninserted records should be stored.
- Parallel File--Specify whether direct loads can perform concurrent sessions.
These files will be used in the load if not already specified in the control file.
Figure 9.12.
Screen two of the Data Manager Load wizard.
Screen three, shown in Figure 9.13, allows you to select advanced options such
as direct path loading, skipped records, records to load, and so on.
Figure 9.13.
Screen three of the Data Manager Load wizard.
After you select from these options, you jump to screen five, the summary page,
which allows you to review your selections before you proceed. Screen four is displayed
only if you have chosen to schedule the load.
The Data Manager wizard makes it quite convenient to perform operations such as
exports, imports, and loads. But as with the other utilities, SQL*Loader can also
be run from the command line, as shown in the next section.
Loading Using the SQLLDR80 Utility
Invoke SQL*Loader on the command line by using the program SQLLDR80. This is the
Windows NT filename; under other operating systems, it may be known simply as SQLLDR.
All the options mentioned in the section about control files are available to SQL*Loader
from the command line.
Because most of the options can be placed in the control file, only a few command-line
options are necessary:
- USERID=username/password--This parameter specifies the username
and password under which to run the loader.
- PARFILE=parameter_file--This parameter optionally specifies the name
of a parameter file that contains additional parameters.
- BAD=filename--This parameter specifies the bad file filename. This is
where bad data is logged.
- CONTROL=filename--This parameter specifies the name of the control file.
- DATA=filename--This parameter specifies the name of the input datafile.
- DIRECT=(TRUE or FALSE)--This parameter specifies whether the direct path
loader is used.
- PARALLEL=(TRUE or FALSE)--This parameter specifies whether a parallel
load can occur.
- LOAD=n--This parameter specifies the number of records to load. The default
is all.
- LOG=filename--This parameter specifies the log file filename.
- ROWS=n--This parameter specifies the number of rows in each array insert.
- SKIP=n--This parameter specifies the number of logical records to skip.
These parameters are also available for use in the control file. If you are specifying
multiple parameters and running the loader frequently, I recommend putting the parameters
in the control file or in a parameter file.
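A minimal sketch of that approach follows; the filenames (dogs2.par, dogs2.ctl) and
the password are assumptions. The parameter file contains one parameter per line:
USERID=etw/etw_password
CONTROL=D:\database\load\dogs2.ctl
LOG=D:\database\load\dogs2.log
BAD=D:\database\load\dogs2.bad
The loader is then invoked with a single reference to the parameter file:
D:\> SQLLDR80 PARFILE=D:\database\load\dogs2.par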
Summary
Today you learned the various methods for moving data in and out of the Oracle
database. The utilities described here are very useful and are frequently used.
I have included the Export and Import utilities here rather than in the backup
and recovery lessons because they have significantly more uses than just for backup
and recovery. The Export and Import utilities can be used to move data between systems
or to help reorganize a database, whereas the backup and recovery process has only
one use.
SQL*Loader can be used to load both fixed and variable length records into the
database. You saw both methods today.
Both a graphical and a command-line option are available with all these utilities.
The graphical option is available through the Enterprise Manager's Data Manager utility.
Command-line options are available through various utilities.
What's Next?
On Day 10, "Administering User Accounts," you will learn how to administer
user accounts in the Oracle RDBMS. You will see how users are defined and modified,
as well as profiles and roles. Administering user accounts is probably the most common
job of the Oracle DBA because new users are constantly being added or modified. You
will learn tomorrow how this task is simplified by using roles and profiles.
Q&A
- Q Why would I use the Export/Import utilities?
A The Export/Import utilities can be used for several purposes, including
backups, movement of data between systems, database rebuilds, and database reorganization.
Q How is the loader different from Import?
A SQL*Loader is designed to load formatted datafiles of various formats,
whereas the Import utility can only load export files.
Q What kind of data can be loaded with SQL*Loader?
A Any data can be loaded with SQL*Loader.
Q What is the difference between the conventional path loader and the direct
path loader?
A The conventional path loader essentially loads the data by using INSERT
statements, whereas the direct path loader bypasses much of the logic involved with
that and loads directly into the datafiles.
Workshop
The workshop provides quiz questions to help you solidify your understanding of
the material covered and exercises to provide you with experience in using what you've
learned. Find answers to the quiz questions in Appendix A, "Answers."
Quiz
- 1. What is Export used for?
2. What is Import used for?
3. What is SQL*Loader used for?
4. What Oracle utilities can be used to load export files?
5. Name two uses of Export/Import.
6. What is the name of the Windows NT version of the Export utility?
7. What is the name of the Windows NT version of the Import utility?
8. What is the name of the Windows NT version of SQL*Loader?
9. What is the difference between a fixed and a variable record load file?
10. What is the parameter file?
Exercises
- 1. Perform an export using Data Manager.
2. Perform an export using the EXP80 utility.
3. Import the data using Data Manager.
4. Import the data using the IMP80 utility.
5. Load a small dummy table using the load option in Data Manager.
© Copyright, Macmillan Computer Publishing. All
rights reserved.