Monitoring Data Pump

posted Aug 5, 2011, 6:09 AM by Sachchida Ojha   [ updated Nov 14, 2014, 3:01 PM ]

A simple way to gain insight into the status of a Data Pump job is to query a few views maintained within the Oracle instance where the Data Pump job is running: DBA_DATAPUMP_JOBS, DBA_DATAPUMP_SESSIONS, and V$SESSION_LONGOPS. These views are critical to monitoring your export jobs so that, as we will see in a later article, you can attach to a Data Pump job and modify its execution.

DBA_DATAPUMP_JOBS
This view will show the active Data Pump jobs, their state, degree of parallelism, and the number of sessions attached.

SQL> select * from dba_datapump_jobs

OWNER_NAME JOB_NAME               OPERATION  JOB_MODE   STATE         DEGREE    ATTACHED_SESSIONS
---------- ---------------------- ---------- ---------- ------------- --------- -----------------
JKOOP      SYS_EXPORT_FULL_01     EXPORT     FULL       EXECUTING     1          1
JKOOP      SYS_EXPORT_SCHEMA_01   EXPORT     SCHEMA     EXECUTING     1          1
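
The JOB_NAME shown in this view is what you pass to the export client to attach to a running job from another terminal. A minimal sketch (the credentials and job name are placeholders; substitute your own):

```shell
expdp jkoop/password ATTACH=SYS_EXPORT_FULL_01
```

Once attached, you are placed in interactive-command mode, where commands such as STATUS and STOP_JOB are available.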

DBA_DATAPUMP_SESSIONS
This view gives the SADDR (session address) that assists in determining why a Data Pump session may be having problems. Join to the V$SESSION view for further information.

SQL> SELECT * FROM DBA_DATAPUMP_SESSIONS
OWNER_NAME JOB_NAME                       SADDR
---------- ------------------------------ --------
JKOOPMANN  SYS_EXPORT_FULL_01             225BDEDC
JKOOPMANN  SYS_EXPORT_SCHEMA_01           225B2B7C
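
To see what each attached session is actually doing, join this view to V$SESSION on SADDR. A sketch (the column list is illustrative; pick whatever V$SESSION columns you need):

```sql
SELECT d.job_name, s.sid, s.serial#, s.username, s.status, s.program
  FROM dba_datapump_sessions d, v$session s
 WHERE s.saddr = d.saddr;
```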

V$SESSION_LONGOPS
This view helps you track how well a Data Pump export is progressing. The MESSAGE column provides a running progress indicator.

SQL> select username,opname,target_desc,sofar,totalwork,message from V$SESSION_LONGOPS

USERNAME OPNAME               TARGET_DES SOFAR TOTALWORK  MESSAGE
-------- -------------------- ---------- ----- ---------- ------------------------------------------------
JKOOP    SYS_EXPORT_FULL_01   EXPORT       132        132 SYS_EXPORT_FULL_01:EXPORT:132 out of 132 MB done
JKOOP    SYS_EXPORT_FULL_01   EXPORT        90        132 SYS_EXPORT_FULL_01:EXPORT:90 out of 132 MB done
JKOOP    SYS_EXPORT_SCHEMA_01 EXPORT        17         17 SYS_EXPORT_SCHEMA_01:EXPORT:17 out of 17 MB done
JKOOP    SYS_EXPORT_SCHEMA_01 EXPORT        19         19 SYS_EXPORT_SCHEMA_01:EXPORT:19 out of 19 MB done

SQL> select sid, serial#, sofar, totalwork, dp.owner_name, dp.state, dp.job_mode
from gv$session_longops sl, dba_datapump_jobs dp
where sl.opname = dp.job_name and sofar != totalwork;

       SID    SERIAL#      SOFAR  TOTALWORK OWNER_NAME STATE     JOB_MODE
---------- ---------- ---------- ---------- ---------- --------- ---------
       122      64151       1703       2574 SYSTEM     EXECUTING FULL

You can monitor an Oracle import in several ways:

Monitor at the OS - Do a "ps -ef" on the Data Pump processes and watch them consume CPU. You can also monitor the Data Pump log file with the "tail -f" command, watching the progress of the import in real time. (Note that the feedback=1000 parameter, which directs import to display a dot for every 1,000 rows inserted, belongs to the original import utility; Data Pump writes its own progress messages to the log.)
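
At the operating system level, the checks above might look like the following sketch (the process-name pattern and log path are hypothetical and will vary by platform, instance name, and DIRECTORY object):

```shell
# Watch the Data Pump worker processes (ora_dw*) consume CPU
ps -ef | grep -i ora_dw

# Follow the export/import log file in real time
tail -f /u01/app/oracle/export_dir/hr_schema.explog
```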

Monitor with the data pump views - The main views for monitoring import jobs are dba_datapump_jobs and dba_datapump_sessions.

Monitor with longops - You can query the v$session_longops to see the progress of data pump, querying the sofar and totalwork columns.

select sid, serial#
  from v$session s, dba_datapump_sessions d
 where s.saddr = d.saddr;

select sid, serial#, sofar, totalwork
  from v$session_longops;

select x.job_name,b.state,b.job_mode,b.degree
, x.owner_name,z.sql_text, p.message
, p.totalwork, p.sofar
, round((p.sofar/p.totalwork)*100,2) done
, p.time_remaining
from dba_datapump_jobs b
left join dba_datapump_sessions x on (x.job_name = b.job_name)
left join v$session y on (y.saddr = x.saddr)
left join v$sql z on (y.sql_id = z.sql_id)
left join v$session_longops p ON (p.sql_id = y.sql_id)
WHERE y.module='Data Pump Worker'
AND p.time_remaining > 0;


The following are the major new Data Pump features that provide increased performance, as well as enhanced ease of use:

  • The ability to specify the maximum number of threads of active execution operating on behalf of the Data Pump job. This enables you to adjust resource consumption versus elapsed time. See PARALLEL for information about using this parameter in export. See PARALLEL for information about using this parameter in import. (This feature is available only in the Enterprise Edition of Oracle Database 10g.)

  • The ability to restart Data Pump jobs. See START_JOB for information about restarting export jobs. See START_JOB for information about restarting import jobs.

  • The ability to detach from and reattach to long-running jobs without affecting the job itself. This allows DBAs and other operations personnel to monitor jobs from multiple locations. The Data Pump Export and Import utilities can be attached to only one job at a time; however, you can have multiple clients or jobs running at one time. (If you are using the Data Pump API, the restriction on attaching to only one job at a time does not apply.) You can also have multiple clients attached to the same job. See ATTACH for information about using this parameter in export. See ATTACH for information about using this parameter in import.

  • Support for export and import operations over the network, in which the source of each operation is a remote instance. See NETWORK_LINK for information about using this parameter in export. See NETWORK_LINK for information about using this parameter in import.

  • The ability, in an import job, to change the name of the source datafile to a different name in all DDL statements where the source datafile is referenced. See REMAP_DATAFILE.

  • Enhanced support for remapping tablespaces during an import operation. See REMAP_TABLESPACE.

  • Support for filtering the metadata that is exported and imported, based upon objects and object types. For information about filtering metadata during an export operation, see INCLUDE and EXCLUDE. For information about filtering metadata during an import operation, see INCLUDE and EXCLUDE.

  • Support for an interactive-command mode that allows monitoring of and interaction with ongoing jobs. See Commands Available in Export's Interactive-Command Mode and Commands Available in Import's Interactive-Command Mode.

  • The ability to estimate how much space an export job would consume, without actually performing the export. See ESTIMATE_ONLY.

  • The ability to specify the version of database objects to be moved. In export jobs, VERSION applies to the version of the database objects to be exported. See VERSION for more information about using this parameter in export.

    In import jobs, VERSION applies only to operations over the network. This means that VERSION applies to the version of database objects to be extracted from the source database. See VERSION for more information about using this parameter in import.

  • Most Data Pump export and import operations occur on the Oracle database server. (This contrasts with original export and import, which were primarily client-based.) See Default Locations for Dump, Log, and SQL Files for information about some of the implications of server-based operations.
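
Several of these features are driven by simple parameters. A hypothetical export parameter file exercising a few of them might look like this (the directory object, file names, and job name are illustrative):

```
DIRECTORY=export_dir
DUMPFILE=full%U.dmp
LOGFILE=full_export.explog
FULL=y
PARALLEL=4
EXCLUDE=STATISTICS
JOB_NAME=full_export
```

The %U substitution variable in DUMPFILE generates one numbered dump file per parallel worker, which is generally needed when PARALLEL is greater than 1.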


The DBA_DATAPUMP_JOBS and USER_DATAPUMP_JOBS Views

The DBA_DATAPUMP_JOBS and USER_DATAPUMP_JOBS views identify all active Data Pump jobs, regardless of their state, on an instance (or on all instances for Real Application Clusters). They also show all Data Pump master tables not currently associated with an active job. You can use the job information to attach to an active job. Once you are attached to the job, you can stop it, change its parallelism, or monitor its progress. You can use the master table information to restart a stopped job or to remove any master tables that are no longer needed.
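
As a sketch of that workflow, attaching to a stopped job and restarting it from interactive-command mode might look like this (credentials and job name are placeholders):

```
expdp system/******** ATTACH=SYS_EXPORT_FULL_01

Export> STATUS
Export> STOP_JOB=IMMEDIATE

-- later, from any client:
expdp system/******** ATTACH=SYS_EXPORT_FULL_01

Export> START_JOB
Export> CONTINUE_CLIENT
```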

Table 1-1 describes the columns in the DBA_DATAPUMP_JOBS view and the USER_DATAPUMP_JOBS view.

Table 1-1 DBA_DATAPUMP_JOBS View and USER_DATAPUMP_JOBS View

Column             Datatype      Description
-----------------  ------------  --------------------------------------------------------------
OWNER_NAME         VARCHAR2(30)  User who initiated the job (valid only for DBA_DATAPUMP_JOBS)
JOB_NAME           VARCHAR2(30)  User-supplied name for the job (or the default name generated
                                 by the server)
OPERATION          VARCHAR2(30)  Type of job
JOB_MODE           VARCHAR2(30)  Mode of job
STATE              VARCHAR2(30)  State of the job
DEGREE             NUMBER        Number of worker processes performing the operation
ATTACHED_SESSIONS  NUMBER        Number of sessions attached to the job


Note:

The information returned is obtained from dynamic performance views associated with the executing jobs and from the database schema information concerning the master tables. A query on these views can return multiple rows for a single Data Pump job (same owner and job name) if the query is executed while the job is transitioning between an Executing state and the Not Running state.
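
To spot jobs whose master tables may be lingering after a stop or abort, a query along these lines can help (a sketch; verify a job really is defunct before cleaning up, since dropping the master table of a stopped job makes it unrestartable):

```sql
-- Jobs in the NOT RUNNING state may be stopped (restartable) or defunct
SELECT owner_name, job_name, state
  FROM dba_datapump_jobs
 WHERE state = 'NOT RUNNING';
```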

The DBA_DATAPUMP_SESSIONS View

The DBA_DATAPUMP_SESSIONS view identifies the user sessions that are attached to a job. The information in this view is useful for determining why a stopped operation has not gone away.

Table 1-2 describes the columns in the DBA_DATAPUMP_SESSIONS view.

Table 1-2 The DBA_DATAPUMP_SESSIONS View

Column      Datatype                           Description
----------  ---------------------------------  ---------------------------------------------
OWNER_NAME  VARCHAR2(30)                       User who initiated the job
JOB_NAME    VARCHAR2(30)                       User-supplied name for the job (or the default
                                               name generated by the server)
SADDR       RAW(4) (RAW(8) on 64-bit systems)  Address of the session attached to the job.
                                               Can be used with the V$SESSION view.

Monitoring the Progress of Executing Jobs

Data Pump operations that transfer table data (export and import) maintain an entry in the V$SESSION_LONGOPS dynamic performance view indicating the job progress (in megabytes of table data transferred). The entry contains the estimated transfer size and is periodically updated to reflect the actual amount of data transferred.


Note:

The usefulness of the estimate value for export operations depends on the type of estimation requested when the operation was initiated, and it is updated as required if exceeded by the actual transfer amount. The estimate value for import operations is exact.
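
To obtain just the estimate without writing any table data, the export client can be invoked with ESTIMATE_ONLY, for example (credentials are placeholders):

```shell
expdp system/******** FULL=y ESTIMATE_ONLY=y ESTIMATE=BLOCKS
```

ESTIMATE=STATISTICS may be substituted for BLOCKS; it is usually faster but depends on the accuracy of current optimizer statistics.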

The V$SESSION_LONGOPS columns that are relevant to a Data Pump job are as follows:

  • USERNAME - job owner

  • OPNAME - job name

  • TARGET_DESC - job operation

  • SOFAR - megabytes (MB) transferred thus far during the job

  • TOTALWORK - estimated number of megabytes (MB) in the job

  • UNITS - 'MB'

  • MESSAGE - a formatted status message of the form:

    '<job_name>: <operation_name> : nnn out of mmm MB done'
    
/*
|| Usage Notes:
|| This script is provided to demonstrate various features of Oracle 10g's
|| new DataPump and should be carefully proofread before executing it against
|| any existing Oracle database to ensure that no potential damage can occur.
*/

-----
-- Listing 1.1: Setting up a DIRECTORY object for DataPump use.
-- Note that the directory folder need not exist for this command
-- to succeed, but any subsequent attempt to utilize the DIRECTORY
-- object will fail until the folder is created on the server.
-- This should be run from SYSTEM for best results.
-----
DROP DIRECTORY export_dir;
CREATE DIRECTORY export_dir AS 'c:\oracle\export_dir';
GRANT READ, WRITE ON DIRECTORY export_dir TO hr, sh;

-----
-- Listing 1.2: Determining what object types can be exported/imported
-- and filtering levels available
-----
COL object_path FORMAT A25 HEADING 'Object Path Name'
COL comments    FORMAT A50 HEADING 'Object Description'
COL named       FORMAT A3  HEADING 'Nmd|Objs'

TTITLE 'Database-Level Exportable Objects'
SELECT object_path, named, comments FROM database_export_objects;

TTITLE 'Schema-Level Exportable Objects'
SELECT object_path, named, comments FROM schema_export_objects;

TTITLE 'Table-Level Exportable Objects'
SELECT object_path, named, comments FROM table_export_objects;

-----
-- Listing 1.3: A simple DataPump Export operation. Note that if the export
-- dump file already exists when this is executed, Oracle will
-- return an ORA-39000 error and terminate the operation.
-----
EXPDP hr/hr DUMPFILE=export_dir:hr_schema.dmp LOGFILE=export_dir:hr_schema.explog

>> DataPump Export command issued:
SET ORACLE_SID=zdcdb
EXPDP system/******** PARFILE=c:\rmancmd\dpe_1.expctl

>> DataPump Export parameters file (dpe_1.expctl):
DIRECTORY=export_dir
SCHEMAS=HR,OE
JOB_NAME=hr_oe_schema
DUMPFILE=export_dir:hr_oe_schemas.dmp
LOGFILE=export_dir:hr_oe_schemas.explog

>> Results of Export Operation:
Export: Release 10.1.0.2.0 - Production on Thursday, 10 March, 2005 17:52
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
FLASHBACK automatically enabled to preserve database integrity.
Starting "SYSTEM"."HR_OE_SCHEMA": system/******** parfile=c:\rmancmd\dpe_1.expctl
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 2.562 MB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/SE_PRE_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TYPE/TYPE_SPEC
Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/TABLE/AUDIT_OBJ
Processing object type SCHEMA_EXPORT/PACKAGE/PACKAGE_SPEC
Processing object type SCHEMA_EXPORT/FUNCTION/FUNCTION
Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/PACKAGE/COMPILE_PACKAGE/PACKAGE_SPEC/ALTER_PACKAGE_SPEC
Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
Processing object type SCHEMA_EXPORT/VIEW/VIEW
Processing object type SCHEMA_EXPORT/VIEW/GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/PACKAGE/PACKAGE_BODY
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
Processing object type SCHEMA_EXPORT/TABLE/INDEX/SE_TBL_FBM_INDEX_INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/SE_TBL_FBM_IND_STATS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/SE_POST_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
. . exported "HR"."LOBSEP"                        9.195 KB       1 rows
. . exported "HR"."BIGATFIRST"                    277.7 KB   10000 rows
. . exported "HR"."APPLICANTS"                    10.46 KB      30 rows
. . exported "HR"."APPLICANTS_1"                   11.5 KB      45 rows
. . exported "HR"."APPROLES"                      6.078 KB       7 rows
. . exported "HR"."APPS"                          5.632 KB       3 rows
. . exported "HR"."COST_CENTERS"                  6.328 KB      29 rows
. . exported "HR"."COST_CENTER_ASSIGNMENTS"       6.312 KB      20 rows
. . exported "HR"."COUNTRIES"                     6.093 KB      25 rows
. . exported "HR"."DATEMATH"                      6.984 KB       1 rows
. . exported "HR"."DEPARTMENTS"                   7.101 KB      28 rows
. . exported "HR"."DIVISIONS"                     5.335 KB       3 rows
. . exported "HR"."EMPLOYEES"                     16.67 KB     118 rows
. . exported "HR"."EMPLOYEE_HIERARCHY"            6.414 KB       5 rows
. . exported "HR"."JOBS"                          7.296 KB      27 rows
. . exported "HR"."JOB_HISTORY"                   6.765 KB      15 rows
. . exported "HR"."LOCATIONS"                     7.710 KB      23 rows
. . exported "HR"."MY_USER_ROLES"                 6.453 KB      10 rows
. . exported "HR"."PAYROLL_CHECKS"                7.609 KB       6 rows
. . exported "HR"."PAYROLL_HOURLY"                6.039 KB       3 rows
. . exported "HR"."PAYROLL_SALARIED"              5.687 KB       3 rows
. . exported "HR"."PAYROLL_TRANSACTIONS"          7.195 KB       6 rows
. . exported "HR"."REGIONS"                       5.296 KB       4 rows
. . exported "HR"."TIMECLOCK_PUNCHES"             5.718 KB       6 rows
. . exported "HR"."USERS"                         5.968 KB       3 rows
. . exported "HR"."USER_ROLES"                    6.453 KB      10 rows
. . exported "HR"."IOT_TAB"                           0 KB       0 rows
. . exported "HR"."NO_UPDATES"                        0 KB       0 rows
. . exported "HR"."PLAN_TABLE"                        0 KB       0 rows
Master table "SYSTEM"."HR_OE_SCHEMA" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.HR_OE_SCHEMA is:
C:\ORACLE\EXPORT_DIR\HR_OE_SCHEMAS.DMP
Job "SYSTEM"."HR_OE_SCHEMA" successfully completed at 17:53

-----
-- Listing 1.4: A simple DataPump Import. Note that only database objects from
-- the HR schema will be used to populate a new schema (HR_OLTP),
-- and all objects other than tables and their dependent objects
-- will be excluded from the import.
-----
>> SQL to create new HR_OLTP schema:
DROP USER hr_oltp CASCADE;
CREATE USER hr_oltp IDENTIFIED BY misdev
  DEFAULT TABLESPACE example
  TEMPORARY TABLESPACE temp02
  QUOTA 50M ON example
  PROFILE default;
GRANT CONNECT TO hr_oltp;
GRANT RESOURCE TO hr_oltp;

>> DataPump Import command issued:
SET ORACLE_SID=dbaref
IMPDP system/****** PARFILE=export_dir:dpi_1.impctl

>> DataPump Import parameters file (dpi_1.impctl):
DIRECTORY=export_dir
JOB_NAME=hr_oltp_import
DUMPFILE=export_dir:hr_oe_schemas.dmp
LOGFILE=export_dir:hr_oltp_import.implog
REMAP_SCHEMA=hr:hr_oltp
STATUS=5

>> Results of Import operation:
Import: Release 10.1.0.2.0 - Production on Thursday, 10 March, 2005 18:02
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "SYSTEM"."HR_OLTP_IMPORT" successfully loaded/unloaded
Starting "SYSTEM"."HR_OLTP_IMPORT": system/******** parfile=c:\rmancmd\dpi_1.impctl
Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "HR_OLTP"."LOBSEP"                   9.195 KB       1 rows
. . imported "HR_OLTP"."BIGATFIRST"               277.7 KB   10000 rows
. . imported "HR_OLTP"."APPLICANTS"               10.46 KB      30 rows
. . imported "HR_OLTP"."APPLICANTS_1"              11.5 KB      45 rows
. . imported "HR_OLTP"."APPROLES"                 6.078 KB       7 rows
. . imported "HR_OLTP"."APPS"                     5.632 KB       3 rows
. . imported "HR_OLTP"."COST_CENTERS"             6.328 KB      29 rows
. . imported "HR_OLTP"."COST_CENTER_ASSIGNMENTS"  6.312 KB      20 rows
. . imported "HR_OLTP"."COUNTRIES"                6.093 KB      25 rows
. . imported "HR_OLTP"."DATEMATH"                 6.984 KB       1 rows
. . imported "HR_OLTP"."DEPARTMENTS"              7.101 KB      28 rows
. . imported "HR_OLTP"."DIVISIONS"                5.335 KB       3 rows
. . imported "HR_OLTP"."EMPLOYEES"                16.67 KB     118 rows
. . imported "HR_OLTP"."EMPLOYEE_HIERARCHY"       6.414 KB       5 rows
. . imported "HR_OLTP"."JOBS"                     7.296 KB      27 rows
. . imported "HR_OLTP"."JOB_HISTORY"              6.765 KB      15 rows
. . imported "HR_OLTP"."LOCATIONS"                7.710 KB      23 rows
. . imported "HR_OLTP"."MY_USER_ROLES"            6.453 KB      10 rows
. . imported "HR_OLTP"."PAYROLL_CHECKS"           7.609 KB       6 rows
. . imported "HR_OLTP"."PAYROLL_HOURLY"           6.039 KB       3 rows
. . imported "HR_OLTP"."PAYROLL_SALARIED"         5.687 KB       3 rows
. . imported "HR_OLTP"."PAYROLL_TRANSACTIONS"     7.195 KB       6 rows
. . imported "HR_OLTP"."REGIONS"                  5.296 KB       4 rows
. . imported "HR_OLTP"."TIMECLOCK_PUNCHES"        5.718 KB       6 rows
. . imported "HR_OLTP"."USERS"                    5.968 KB       3 rows
. . imported "HR_OLTP"."USER_ROLES"               6.453 KB      10 rows
. . imported "HR_OLTP"."IOT_TAB"                      0 KB       0 rows
. . imported "HR_OLTP"."NO_UPDATES"                   0 KB       0 rows
. . imported "HR_OLTP"."PLAN_TABLE"                   0 KB       0 rows
Processing object type SCHEMA_EXPORT/TABLE/GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/TABLE/AUDIT_OBJ
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
ORA-39082: Object type TRIGGER:"HR_OLTP"."BIN$55fGDdubQL6YVYB0dGS/nw==$1" created with compilation warnings
ORA-39082: Object type TRIGGER:"HR_OLTP"."BIN$55fGDdubQL6YVYB0dGS/nw==$1" created with compilation warnings
ORA-39082: Object type TRIGGER:"HR_OLTP"."SECURE_EMPLOYEES" created with compilation warnings
ORA-39082: Object type TRIGGER:"HR_OLTP"."SECURE_EMPLOYEES" created with compilation warnings
ORA-39082: Object type TRIGGER:"HR_OLTP"."TR_BRIU_APPLICANTS" created with compilation warnings
ORA-39082: Object type TRIGGER:"HR_OLTP"."TR_BRIU_APPLICANTS" created with compilation warnings
ORA-39082: Object type TRIGGER:"HR_OLTP"."UPDATE_JOB_HISTORY" created with compilation warnings
ORA-39082: Object type TRIGGER:"HR_OLTP"."UPDATE_JOB_HISTORY" created with compilation warnings
Processing object type SCHEMA_EXPORT/TABLE/INDEX/SE_TBL_FBM_INDEX_INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/SE_TBL_FBM_IND_STATS/INDEX_STATISTICS
Job "SYSTEM"."HR_OLTP_IMPORT" completed with 8 error(s) at 18:02

-----
-- Listing 1.5: Querying status of DataPump operations
-----
TTITLE 'Currently Active DataPump Operations'
COL owner_name        FORMAT A06  HEADING 'Owner'
COL job_name          FORMAT A20  HEADING 'JobName'
COL operation         FORMAT A12  HEADING 'Operation'
COL job_mode          FORMAT A12  HEADING 'JobMode'
COL state             FORMAT A12  HEADING 'State'
COL degree            FORMAT 9999 HEADING 'Degr'
COL attached_sessions FORMAT 9999 HEADING 'Sess'

SELECT owner_name, job_name, operation, job_mode, state, degree, attached_sessions
  FROM dba_datapump_jobs;

TTITLE 'Currently Active DataPump Sessions'
COL owner_name FORMAT A06 HEADING 'Owner'
COL job_name   FORMAT A06 HEADING 'Job'
COL osuser     FORMAT A12 HEADING 'UserID'

SELECT DPS.owner_name, DPS.job_name, S.osuser
  FROM dba_datapump_sessions DPS, v$session S
 WHERE S.saddr = DPS.saddr;
