BI Data Extract
This page describes the configuration of the MIS/BI extracts from a technical perspective.
The process has already been documented from a user perspective in ctms:BI_Data_Extract_Guide.
Configuration
The files that will be created are defined in the table mis_extract_header.
Only active files are generated.
| COLUMN_NAME | DATA_LENGTH | NOTES |
|---|---|---|
| EXTRACT_FILENAME | 120 | Less than 25 characters only |
| EXTRACT_FILENAME_EXTENSION | 40 | csv |
| TABLE_NAME | 120 | The table being extracted. This must be unique. For more complex queries with multiple tables, consider using a view name instead. Where the extract requires another table linked in to select the data (for example, SCH_ORD_REFERENCE), these may be listed in the table name and joined in the where clause, as long as the value does not exceed 120 characters. Alternatively, use a sub-query for the select. |
| DELIMITER | 40 | The delimiter character. This must be enclosed in PL/SQL syntax to concatenate the values, for example '||'®'||' for a unique separator character. Not relevant if using tab delimiting, as the package will instead use CHR(9). |
| EXTRACT_OUTPUT_DIRECTORY | 200 | Usually /webint/dbname/export or /webint/dbname/MIS |
| DATABASE_SOURCE | 120 | dbname |
| WHERE_CLAUSE | 4000 | Where clause. See note below. |
| RUN_FREQUENCY | 80 | EVERY_DAY |
| ACTIVE_FLAG | 4 | Y |
| TABLE_TYPE | 4 | N/A |
- For the where clause (a hypothetical header insert illustrating these rules follows this list):
- Standing data tables should always be exported, so the clause should be "WHERE 1=1".
- Transactional data should export any data created or updated in the last calendar day or days (depending on customer requirements), for example "WHERE ((created_date BETWEEN SYSDATE-4 AND SYSDATE) OR (updated_date BETWEEN SYSDATE-4 AND SYSDATE))". Date column names depend on the table (or tables) being selected.
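As an illustration of the header configuration, a minimal insert for a hypothetical daily transactional extract of SCH_TRIP might look like the following sketch; the filename, path and database source are placeholders, and the delimiter value follows the DELIMITER note above:
-- Hypothetical example only: a daily transactional extract of SCH_TRIP
-- (the delimiter literal stores ||'®'|| as described in the DELIMITER column above)
INSERT INTO mis_extract_header
       (extract_filename, extract_filename_extension, table_name, delimiter,
        extract_output_directory, database_source, where_clause, run_frequency, active_flag)
VALUES ('Z_SCH_TRIP', 'csv', 'SCH_TRIP', '||''®''||',
        '/webint/dbname/export', 'dbname',
        ' WHERE ((created_date BETWEEN SYSDATE-4 AND SYSDATE) OR (updated_date BETWEEN SYSDATE-4 AND SYSDATE))',
        'EVERY_DAY', 'Y');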
The columns (up to a maximum of 50) are defined in table mis_extract_detail.
| COLUMN_NAME | DATA_LENGTH | NOTES |
|---|---|---|
| TABLE_NAME | 120 | Parent key TABLE_NAME from above. |
| COLUMN_NAME | 120 | The column from the table |
| DATA_TYPE_SIZE | 120 | The data type and length, e.g. VARCHAR2(40), NUMBER(5), DATE |
| COLUMN_POSITION | 22 | The position of this column in the extract file |
| FORMATING_REQUIREMENT | 240 | Usually used for dates, e.g. TO_CHAR(START_TIME,'RRRR-MM-DD HH24:MI') |
- Up to 50 columns only.
- If more columns are required, define another extract for the additional columns.
- Note that the table name must be unique, so if multiple extracts are required from the same table, it will need to be aliased, e.g. "SCH_TRIP as st2". Note that this will need to be less than 120 characters in length.
Warning: This may not work.
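For the detail rows, a matching hypothetical insert (one row per column, values illustrative only) could look like this:
-- Hypothetical example only: first column of the SCH_TRIP extract, formatted as a date string
INSERT INTO mis_extract_detail
       (table_name, column_name, data_type_size, column_position, formating_requirement)
VALUES ('SCH_TRIP', 'START_TIME', 'DATE', 1,
        'TO_CHAR(START_TIME,''RRRR-MM-DD HH24:MI'')');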
Sample SQL
-- What denotes the files to be created
select * from mis_extract_header WHERE active_flag = 'Y' FOR UPDATE

select mih.extract_filename || '.' || mih.extract_filename_extension "File_Name"
     , Table_Name
     , Case WHERE_CLAUSE WHEN ' WHERE 1=1' THEN 'All Data' ELSE 'Daily Data' END "Contains"
from mis_extract_header mih
WHERE active_flag = 'Y'

select * from mis_extract_header FOR UPDATE

select * from mis_extract_header WHERE WHERE_CLAUSE NOT LIKE '%1=1%' AND ACTIVE_FLAG = 'Y'

-- Set active the ones that you want
UPDATE mis_extract_header SET active_flag = 'N' WHERE extract_filename <> 'Z_RES_CARRIER_TYPE'

-- Set the output directory and database source
UPDATE mis_extract_header SET EXTRACT_OUTPUT_DIRECTORY = '/webint/ststtst/export', DATABASE_SOURCE = 'ststtst';

-- Set up the fields being exported
select * from mis_extract_detail where table_name like 'SCH_ORD_ITEMS%' order by table_name, column_position

select table_name, column_name, data_type_size, column_position
from mis_extract_detail
where table_name like 'SCH_ORD_ITEMS%'
order by table_name, column_position
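Where it helps to review a complete extract definition (the header row plus its columns) in one pass, a query along the following lines can be used; it is a sketch that relies only on the columns documented above:
-- Review each active extract together with its column definitions
select mih.extract_filename, mih.table_name, med.column_position, med.column_name, med.data_type_size
from mis_extract_header mih
join mis_extract_detail med on med.table_name = mih.table_name
where mih.active_flag = 'Y'
order by mih.extract_filename, med.column_position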
Parameters
System parameters control the extract.
- MIS_CHAR_SET - WE8ISO8859P1
- MIS_DELETED_DAYS - 30
- MIS_DELETE_REQUIRED - Y
- MIS_FTP_DESTINATION_DIRECTORY - if a subdirectory is required, define it here.
- MIS_FTP_DESTINATION_IP_ADDRESS - an IP address or URL for the FTP or SFTP server
- MIS_FTP_DESTINATION_PASSWORD - the password for the FTP/SFTP server
- MIS_FTP_DESTINATION_PORT - the port, typically 22 for SFTP, 21 for FTP
- MIS_FTP_DESTINATION_USERNAME - the username for the FTP/SFTP server
- MIS_FTP_PROTOCOL - FTP or SFTP
- MIS_TAB_DELIMITER - Y if you want the files generated to be tab delimited.
If using SFTP, also define the level of logging:
- SFTP_LOG_LEVEL - 0 to 3.
Sample SQL:
-- Set up MIS parameters
SELECT * FROM ADM_SYSTEM_PARAM WHERE PARAM_NAME LIKE 'MIS%' FOR UPDATE

SELECT * FROM ADM_SYSTEM_PARAM WHERE PARAM_NAME LIKE 'SFTP%' FOR UPDATE
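If you would rather set a value with an UPDATE than edit the FOR UPDATE result set, a sketch is below; it assumes the value column is called PARAM_VALUE, so confirm the actual column name in ADM_SYSTEM_PARAM before running it:
-- Assumes a PARAM_VALUE column; verify against the real table definition first
UPDATE adm_system_param
SET    param_value = 'SFTP'
WHERE  param_name  = 'MIS_FTP_PROTOCOL';
COMMIT;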
Directory access
First, create the directory from PL/SQL Developer (Directories object > Create New).
Usually /webint/dbname/export or /webint/dbname/MIS
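The directory object can also be created directly in SQL; this sketch assumes the object is named EXPORT (matching the grant below) and uses one of the typical paths above:
-- Create the directory object for the export path (name and path are examples)
CREATE OR REPLACE DIRECTORY EXPORT AS '/webint/dbname/export';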
Then grant permissions for MTS_USER (the EDI user) to access it.
For FTP/SFTP destinations you will also need to set up the fingerprint.
Sample SQL:
-- Grant permission for MTS_USER to access (EDI user)
GRANT READ, WRITE ON DIRECTORY EXPORT TO MTS_USER;

-- Open and close an SFTP connection, trusting the server, to record its fingerprint
BEGIN
  dp_sftp.open_connection( i_host => 'the host', i_trust_server => true );
  dp_sftp.close_connection;
END;

-- See the fingerprint
select * from sftp_known_hosts
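To confirm the database can actually write to the directory before the extracts are scheduled, a quick test with the standard UTL_FILE package can be run (the file name is arbitrary):
-- Write test against the EXPORT directory object using standard UTL_FILE
DECLARE
  l_file utl_file.file_type;
BEGIN
  l_file := utl_file.fopen('EXPORT', 'mis_write_test.txt', 'w');
  utl_file.put_line(l_file, 'MIS directory write test');
  utl_file.fclose(l_file);
END;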
Packages
The package used to send MIS/BI data is DP_MIS.
Sample SQL:
select DP_MIS.GET_PACKAGE_VERSION from dual

select SYSDATE FROM DUAL
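If the version call fails, check that the package and its body exist and are valid; this sketch uses only the standard data dictionary:
-- Check that the DP_MIS package and package body are present and valid
select object_name, object_type, status
from   all_objects
where  object_name = 'DP_MIS';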
Auditing and Troubleshooting
Sample SQL:
-- Audit for SFTP
select * from adm_log
--WHERE PROG_NAME IN ('DP_SFTP')
where date_created <= TO_DATE('2025-10-31 07:20:00', 'YYYY-MM-DD HH24:MI:SS')
AND PROG_NAME NOT IN ('TRM', 'PAR')
-- where err_type <> 'DEBUG' AND STMT < 2255004935 -- PROG_NAME IN ('DP_SFTP')
order by stmt desc
-- or: order by adm_log.date_created desc
-- Audit log from the MIS package
select * from Mis_extract_run_detail
order by last_ran desc
-- Checking job
Select * from DBA_JOBS
where UPPER(what) like '%DAILY_EXTRACTS%'
-- When did it last run?
-- Compare the last run date/time on the jobs table with sysdate
select sysdate from dual;
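-- (Sketch) A fuller check of the job's last/next run, using standard DBA_JOBS columns
select job, what, last_date, next_date, broken
from   dba_jobs
where  upper(what) like '%DAILY_EXTRACTS%';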
-- Run the job now
DECLARE
g_process_name edi_process_header.process_name%TYPE;
BEGIN
g_process_name := 'DAILY_EXTRACTS';
DP_REPORTS.P_RUN_PROCESS(g_process_name);
COMMIT;
END;
-- Or find the job number, then run the job from the Jobs list on the left in PL/SQL Developer.
-- Jobs not running? Check the following:
select logins from v$instance; -- if not allowed, won't run jobs
select value from dba_scheduler_global_attribute where attribute_name='SCHEDULER_DISABLED'; -- if TRUE, jobs won't run
select value from v$parameter where name='job_queue_processes'; -- if 0, jobs won't run (most common cause)
alter system set job_queue_processes=20; -- Fix to above issue
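-- (Sketch) Confirm whether any jobs are currently executing, using the standard view
select * from dba_jobs_running;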