|
|
| <div class="noprint"> | | <div class="noprint"> |
| {{#vardefine:Client|DHLV}} | | {{#vardefine:Client|PROD}} |
| {{#vardefine:ClientName|DHL}} | | {{#vardefine:ClientName|PROD}} |
| {{#vardefine:System|''CALIDUS'' Vision}} | | {{#vardefine:System|''CALIDUS'' ePOD}} |
| {{#vardefine:Doc_Title|Mothercare/ELC Requirements}} | | {{#vardefine:Doc_Title|Load Start Metrics}} |
| {{#vardefine:Version|0.2}} | | {{#vardefine:Version|0.1}} |
| {{#vardefine:Date|27th March 2012}} | | {{#vardefine:Date|25th February 2014}} |
| {{#vardefine:Reference|297616}} | | {{#vardefine:Reference|XXXXXX}} |
| | {{#vardefine:Year|2014}} |
| </div> | | </div> |
| {{Doc_Title | | {{Doc_Title |
| |Version={{#var:Version}} | | |Version={{#var:Version}} |
| |Date={{#var:Date}} | | |Date={{#var:Date}} |
| | |Year={{#var:Year}} |
| }} | | }} |
|
| |
|
| <!-- TOC --> | | <!-- TOC --> |
| = Introduction = | | <div class="noprint"> |
| <!-- The introduction will detail the initial requirements supplied by the client -->
| | = Functional Overview = |
| This document is the {{#var:Doc_Title}} for {{#var:System}}.
| | == Client Requirement == |
| | Currently, 'stop' debrief from {{#var:System}} to C-TMS is not possible where a trip has stops with no orders (e.g. a start-up (SU) stop or a closedown (CL) stop, where no order activity occurs). Such trips are therefore never set to COMPLETED status. |
|
| |
|
| == Objective == | | == Solution Overview == |
| The primary purpose of this document is to record the requirements gathered from {{#var:ClientName}}, at the Mothercare NDC in Daventry, on 21-22 March 2012.
| | A message needs to be triggered from {{#var:System}} to cover the debrief of the stop times for start-up (SU) and/or closedown (CL) stops. C-TMS needs to handle these new messages and update the stop times on the appropriate trip stops. |
|
| |
|
| The document will highlight the premise of the system being created, the modifications required to achieve this and the development effort. Furthermore, all project-related costs will be estimated where they differ or are extended from the standard estimates.
| | A new trigger point is required in {{#var:System}} to generate the {{#var:System}}_EXPORT_LOAD message exactly as above, but as soon as possible after the start ODO is captured: |
|
| |
|
| This document has been written in a manner such that it can be approved by non-technical representatives of {{#var:ClientName}} whilst also being of sufficient detail to allow the Functional or Technical Specification phase for this area to begin. | | This will be utilised to send the stop level debrief actual times for the SU stop and also the start ODO value. |
|
| |
|
| == Scope and Limitations ==
| | Analysis is required to find a logical point where this can be triggered, ensuring that it is only sent once at the start. |
| This document is based on the documentation provided by {{#var:ClientName}}, as referred to in the appendices, as well as information gleaned from site visits and workshops with {{#var:ClientName}}.
| |
|
| |
|
| <!-- ANY scope or limitations, bulleted. -->
| | When the load is downloaded to the device for the first time the device date and time are stored in the following fields: |
| * The changes will be made in the latest version of the {{#var:System}} system.
| | * EPL_LOAD_START_ACTUAL_DATE |
| | * EPL_LOAD_START_ACTUAL_TIME |
|
| |
|
| <!-- NEW PAGE -->
| | The ODO reading is also stored in the following field: |
| | * EPL_MILEAGE_START |
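The capture described above can be sketched as follows. This is an illustrative sketch only: the dict representation of the load record and the date/time storage formats (YYYYMMDD and HHMMSS) are assumptions, not confirmed {{#var:System}} code.

```python
from datetime import datetime

def capture_load_start(load, odo_reading, now=None):
    """On first download of the load to the device, store the device date,
    time and start ODO in the fields named above."""
    now = now or datetime.now()
    load["EPL_LOAD_START_ACTUAL_DATE"] = now.strftime("%Y%m%d")  # assumed format
    load["EPL_LOAD_START_ACTUAL_TIME"] = now.strftime("%H%M%S")  # assumed format
    load["EPL_MILEAGE_START"] = odo_reading
    return load
```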
|
| |
|
| = Client Requirements =
| |
| The following is an extraction of the BRD created by the client, listing the critical requirements of the system (referenced as item 1 in Appendix B).
| |
|
| |
|
| The Mothercare contract requires an LMS that can give them the following:
| | == Scope == |
| | * This functionality will be developed in the latest version of {{#var:System}} only. |
|
| |
|
| '''CRITICAL:'''
| |
| *Productivity information, in the form of PI rate, at employee level and aggregated by shift, activity, site, made available to the site management team on a live* basis and with minimal intervention (* 15 minute update intervals)
| |
| *Operator level PI rate with enough data integrity that it can be used for performance management purposes and ability to look at rate over a rolling 6 week period which includes number of hours spent on each activity to reflect the site's PIP policy as agreed with the Union
| |
| *Ability to print off or export to Excel spreadsheet to keep a record
| |
| *A detailed reporting suite that will allow both sites to look at performance at any level of hierarchy within the site (by activity, shift, sites, date ranges etc)
| |
| *Enquiries to enable a user to select range of activities / dates or shifts / users or groups
| |
| *Graphs of PI rate by employee to be displayed in each area (where each employee is identified by a unique confidential code or number)
| |
|
| |
|
| '''NON CRITICAL: '''
| | <!-- NEW PAGE --> |
| *Dashboard to display RAG by activity for current shift
| | = Set-up = |
| *Under-performers and over-performers report to list everyone who performed under 75% PI and over 100% PI for current shift and/or week-to-date
| |
| **Effectiveness report to give a picture of indirect tasks and hours (vs. direct)
| |
| *Monitoring tools to include:
| |
| **PkMS / Kronos mismatch alerts
| |
| **User swipes report
| |
| **User log-on & log-off report
| |
| **Exceptions alerts (forced moves, skipped picks, task change, nils…)
| |
|
| |
|
| '''Scope'''
| | == Pre-requisites == |
| | * A working {{#var:System}} system. |
|
| |
|
| '''In Scope'''
| |
| *All direct activities currently reported on the MIS for EDC and NDC
| |
| *All indirect activities currently reported on the MIS for EDC and NDC
| |
| *Interface with Kronos and both versions of PkMS
| |
|
| |
|
| '''Out of Scope'''
| | == Menu Structure == |
| *PKMS and Kronos system changes
| | None |
| *RCS grade jobs are excluded (e.g. admin, planning, superusers)
| |
|
| |
|
| '''Measurement Areas'''
| |
|
| |
|
| The following areas will be measured:
| | == Data == |
| | Modify or create an EPOD_XF_CONFIG record for EPL_XF_ID "LOAD" and ensure this has an EPL_XF_MSG_TYPE set to "BOTH". Assign this to a Site. |
|
| |
|
| '''NDC'''
| |
| {| class="wikitable"
| |
| |-
| |
| !
| |
| ! Area
| |
| ! Activity
| |
| |-
| |
| | 1
| |
| | Intake
| |
| | NDC - Goods In Small Box
| |
| |-
| |
| |
| |
| |
| |
| | NDC - Goods In Large Box
| |
| |-
| |
| |
| |
| |
| |
| | NDC - Goods In Hanging
| |
| |-
| |
| | 2
| |
| | Putaway
| |
| | NDC - PD03
| |
| |-
| |
| |
| |
| |
| |
| | NDC - Direct To Active
| |
| |-
| |
| |
| |
| |
| |
| | NDC - Putaway boxed
| |
| |-
| |
| |
| |
| |
| |
| | NDC - Putaway hanging
| |
| |-
| |
| | 3
| |
| | Replen
| |
| | NDC - Replenishment large
| |
| |-
| |
| |
| |
| |
| |
| | NDC - Replenishment small
| |
| |-
| |
| | 4
| |
| | UK pick
| |
| | NDC - UK Picking BP20 - Large
| |
| |-
| |
| |
| |
| |
| |
| | NDC - UK Picking BP20 - Small
| |
| |-
| |
| |
| |
| |
| |
| | NDC - UK Picking DCR - Large
| |
| |-
| |
| |
| |
| |
| |
| | NDC - UK Picking DCR - Small
| |
| |-
| |
| |
| |
| |
| |
| | NDC - UK Picking DCR - Hanging
| |
| |-
| |
| | 5
| |
| | INT pick
| |
| | NDC - INT Picking BP20 - Large
| |
| |-
| |
| |
| |
| |
| |
| | NDC - INT Picking BP20 - Small
| |
| |-
| |
| |
| |
| |
| |
| | NDC - INT Picking DCR - Large
| |
| |-
| |
| |
| |
| |
| |
| | NDC - INT Picking DCR - Small
| |
| |-
| |
| |
| |
| |
| |
| | NDC - INT Picking DCR - Hanging
| |
| |-
| |
| | 6
| |
| | Mini Club inbound
| |
| | NDC - Mini Club PD06
| |
| |-
| |
| |
| |
| |
| |
| | NDC - Mini Club DTA
| |
| |-
| |
| |
| |
| |
| |
| | NDC - Mini Club Replen
| |
| |-
| |
| | 7
| |
| | Mini Club Pick
| |
| | NDC - Mini Club Picking - Ratio
| |
| |-
| |
| |
| |
| |
| |
| | NDC - Mini Club Picking - Single
| |
| |-
| |
| | 8
| |
| | Mini Club outbound
| |
| | NDC - Mini Club Nesting
| |
| |-
| |
| |
| |
| |
| |
| | NDC - Mini Club Despatch
| |
| |-
| |
| | 9
| |
| | UK marshalling
| |
| | NDC - UK Marshalling large
| |
| |-
| |
| |
| |
| |
| |
| | NDC - UK Marshalling small
| |
| |-
| |
| | 10
| |
| | INT marshalling
| |
| | NDC - INT Marshalling
| |
| |-
| |
| | 11
| |
| | UK despatch
| |
| | NDC - UK Despatch
| |
| |-
| |
| | 12
| |
| | INT nesting
| |
| | NDC - INT Nesting
| |
| |-
| |
| | 13
| |
| | INT despatch
| |
| | NDC - INT Despatch
| |
| |}
| |
|
| |
|
| '''EDC'''
| | <!-- NEW PAGE --> |
| {| class="wikitable"
| | = Functional Description = |
| |-
| | == Admin Changes == |
| !
| | Add the following new fields to the XF Config Maintenance screen XF_Config.aspx: |
| ! Area
| | * EPL_XF_MSG_TYPE - A DDL field with allowed text values as follows: |
| ! Activity
| | ** "START" - description "When Load Started" |
| | ** "END" - description "When Load Completed" (default value) |
| | ** "BOTH" - description "When Load Started and Completed" |
| | This should be enabled for entry for export ID "LOAD" only. Note that the existing blank field value should default to "END". |
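The allowed values above, including the blank-defaults-to-"END" rule, can be represented as a simple lookup. This is a sketch for illustration only (Python used for brevity), not the actual screen code.

```python
# DDL values for EPL_XF_MSG_TYPE as listed above; a blank stored value
# is treated as "END" ("When Load Completed").
MSG_TYPE_DESCRIPTIONS = {
    "START": "When Load Started",
    "END": "When Load Completed",
    "BOTH": "When Load Started and Completed",
}

def msg_type_description(stored_value):
    # Blank (or missing) values default to "END".
    return MSG_TYPE_DESCRIPTIONS[stored_value or "END"]
```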
|
| |
|
| |-
| |
| | 1
| |
| | Intake
| |
| | EDC - Goods In
| |
|
| |
|
| |-
| | == Import/Export Changes == |
| |2
| | The current functionality to send LOAD_UPDATE messages (EPL_XF_ID = "LOAD") will be extended. The existing field EPL_XF_MSG_TYPE (referred to as EPL_XF_MSG_TYPES in other specifications) will be extended to define whether the process is sending messages at the start of a load (value "START"), end of a load (values "END" or blank) or both (value "BOTH"). |
| | Putaway
| |
| | EDC - Putaway
| |
|
| |
|
| |-
| | This will be achieved by changing the existing database procedure EPOD_XFER_SELECT_LOAD to: |
| |3
| | * Optionally link to the EPOD_XF_CONFIG of the Load's EPL_SITE_ID. |
| | Replen
| | * Select records based on the value of EPOD_XF_CONFIG.EPL_XF_MSG_TYPE, as follows: |
| | EDC - Replenishment (Average)
| | ** "START" - EPL_XFER_FLAG = "" AND (EPL_MILEAGE_START <> 0 OR EPL_UDF_LOAD_START <> "") |
| | ** "END" or "" (blank) - (EPL_XFER_FLAG = "" OR EPL_XFER_FLAG = "P") |
| | ** "BOTH" - The two above combined i.e. (EPL_XFER_FLAG = "" AND (EPL_MILEAGE_START <> 0 OR EPL_UDF_LOAD_START <> "") ) OR EPL_XFER_FLAG = "" OR EPL_XFER_FLAG = "P" |
|
| |
|
| |-
| {{Note}} The existing load creation DAL and database definition of EPOD_LOAD, and the database procedure EPOD_LOAD_INSERT, must be checked to ensure that new loads are created with a blank value in EPL_XFER_FLAG. |
| |4
| |
| UK pick
|
| EDC - UK Picking Bulk
| |
|
| |
|
| |-
| |
| |
| |
| |
| |
| | EDC - UK Picking VNA/Wide
| |
|
| |
|
| |-
| | The existing .NET procedure EPOD_SYS_EXPORT.ExportAllOutstandingLoads will take the returned records and interface them as per the settings. Note that any without a valid EPOD_XF_CONFIG against the site will be set as transferred (value "Y"). This procedure will be modified to check the value of EPL_XFER_FLAG before it is changed, as follows: |
| |
| | * "" (Blank) - Set to "N". |
| |
| | * "N" or "P" - Set to "Y". |
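The selection criteria and flag transitions above can be sketched together as follows. This is an assumption-level illustration with the load record shown as a dict of the EPOD_LOAD fields named in this section; it is not the actual EPOD_XFER_SELECT_LOAD or ExportAllOutstandingLoads code.

```python
def load_qualifies(load, msg_type):
    """Mirror of the EPOD_XFER_SELECT_LOAD criteria per EPL_XF_MSG_TYPE."""
    started = (load["EPL_XFER_FLAG"] == ""
               and (load["EPL_MILEAGE_START"] != 0
                    or load["EPL_UDF_LOAD_START"] != ""))
    ended = load["EPL_XFER_FLAG"] in ("", "P")
    if msg_type == "START":
        return started
    if msg_type in ("END", ""):
        return ended
    return started or ended  # "BOTH"

def next_xfer_flag(current):
    """Mirror of the flag update in ExportAllOutstandingLoads: blank goes to
    "N" (start message sent), "N" or "P" goes to "Y" (fully transferred)."""
    return "N" if current == "" else "Y"
```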
| | EDC - UK Picking Tote
| |
|
| |
|
| |-
| |
| |5
| |
| | INT pick
| |
| | EDC - INT Pick & stick
| |
|
| |
|
| |-
| | The actual export XML sent for load updates need not be changed, as it already contains all the details required by external systems to update correctly. |
| |
| |
| |
| |
| | EDC - INT Picking small box
| |
|
| |
|
| |-
| | <!-- |
| |
| | MEDIA LANDSCAPE YES |
| |
| | = Appendix A: TEST PLAN = |
| | EDC - INT Picking large box
| |
|
| |
|
| |-
| | {{TestPlan_Header |
| |
| | |Title={{#var:Doc_Title}} |
| |
| | |Log={{#var:Reference}} |
| | EDC - INT Picking FPP
| | |Description=Test the ... |
| | |MenuAccess=None |
| |-
| | |Prerequisites=None |
| |6
| | |Objective=To test that: ... |
| | UK consolidation
| | }} |
| | EDC - UK Consolidation
| | {{ #vardefine: Cycle | 0 }}{{ #vardefine: SubCycle | 0 }} |
| | {{TestPlan_CycleHeader |
| |-
| | |Cycle={{ #vardefineecho: Cycle | {{ #expr: {{ #var: Cycle }} + 1 }} }}{{ #vardefine: SubCycle | {{ #var: Cycle }} }} |
| |7
| | |Title=Test Title |
| | INT consolidation
| |
| | EDC - INT Consolidation
| |
| | |
| |-
| |
| |8
| |
| | UK marshalling
| |
| | EDC - UK Marshalling
| |
| | |
| |-
| |
| |9
| |
| | INT marshalling
| |
| | EDC - INT Marshalling
| |
| | |
| |-
| |
| |10
| |
| | UK despatch
| |
| | EDC - UK Despatch (Average)
| |
| | |
| |-
| |
| |11
| |
| | INT despatch
| |
| | EDC - INT Despatch (Average)
| |
| |}
| |
| | |
| =Overview of Solution= | |
| ''CALIDUS'' Vision is a productivity and visibility tool, designed to mine and analyse system data.
| |
| | |
| Data Mining works by connecting to a known system and pulling data from the host database, analysing it and storing the result in the database for viewing within the ''CALIDUS'' Vision system.
| |
| | |
| The host systems in use at the Mothercare and ELC NDCs are PkMS (for WMS activity and system data) and Kronos (for activity and time information).
| |
| | |
| These 3 databases (2 separate instances of PkMS and 1 of Kronos) will be mined for data.
| |
| | |
| The PkMS data mines will be completed by connecting directly to the databases using an OLEDB connector and mining the data direct from the tables.
| |
| | |
| The Kronos data mine will be completed by importing a flat file from Kronos.
| |
| | |
| Each of these data mining processes will be created within ''CALIDUS'' Vision.
| |
| | |
| Additionally, several new screens will be created and several existing screens amended, taken from the Critical list of changes above.
| |
| | |
| Each item will be discussed in detail in the following sections, along with any risks and technical assumptions.
| |
| | |
| == Detailed Notes on Data Mining ==
| |
| This section focuses on the detailed mapping that occurred at the site. Where possible, areas have been mapped and queries created to extract the data as required. Note that not all areas have yet been mapped - further analysis work is required to finalise this.
| |
| | |
| The connection to the AS/400 databases will be through a tool used by the existing MIS system - HiT OLEDB AS/400 (website: HITSW.COM).
| |
| | |
| This is required to be installed as part of the ''CALIDUS'' Vision implementation on the server being provided by DHL. This will also be required by the development team within OBS. There is a license cost for this product, which will be part of the DHL project costs. There will be a purchase, support and potentially a developer cost per product - this cost information must be confirmed by the DHL IT team.
| |
| * license/support per year: $500
| |
| * Purchase cost: Unknown.
| |
| * Developer cost: Unknown.
| |
| | |
| The server running the data mine will also require:
| |
| * Microsoft .NET framework.
| |
| * Microsoft IIS.
| |
| * Oracle MySQL enterprise database.
| |
| The costs for the enterprise database have been covered in detail within previous estimates and are not covered again here.
| |
| | |
| The data mining programs will be written in Microsoft Visual Studio .NET 2010, as the driver is proven to work in this programming environment. Furthermore, the existing mechanism of writing datamining programs (through Windows Scripting Host) may not be fast enough, given the quantity of data to be mined and the potential slowness of some of the queries (see later for details). OBS have extensive experience of coding within the .NET language, but have no ''CALIDUS'' Vision data mining programs written in this language.
| |
| The development estimates must therefore be extended to account for .NET development; they cannot be based on the existing VBScript developments.
| |
| | |
| The PkMS databases are not accessible from outside the DHL Mothercare network, so OBS will not be able to connect to a test instance of the DB to test the data mining code directly from the development environment.
| |
| | |
| In order to develop this solution effectively, OBS must either:
| |
| * Create a DB2 database within the OBS domain, copying the structure and sample data of the existing system
| |
| * Connect to a DB2 database as above, made accessible within the wider DHL network
| |
| * Create a copy of the structure and sample data within another database type (e.g. MySQL).
| |
| The third approach will be followed, as the other two are impractical.
| |
| | |
| This will allow OBS to test functionality against similar tables and data within the development environment. However, the final testing must be completed on the destination machine itself, with a version of the product built specifically for the OLEDB driver installed on that machine.
| |
| | |
| '''Note:'''
| |
| | |
| ''CALIDUS'' Vision is not extracting data by the method agreed in previous discussions with the operation, nor in the way that ''CALIDUS'' Vision expects.
| |
| | |
| For example, the transaction table is the core of the records used to calculate the productivity. The existing MIS system extracts data by date from the Task Header and Detail records based on the date completed, not the transaction table. Each activity is extracted from different tables in separate queries.
| |
| | |
| Therefore, multiple table reads and SQL statements must be done for each extract based on the individual activity and area.
| |
| | |
| This affects the speed of analysis of requirements (i.e. analysing 8 streams rather than 1) and the speed at which these will be processed.
| |
| | |
| '''Note:'''
| |
| | |
| Furthermore, the existing tables are not optimised for reading by Date, resulting in slower extraction.
| |
| The data mining may not run as fast as was initially expected. This can only be evaluated once the system is built. A schedule (i.e. the rate at which the data mine will run) will be set when this speed can be evaluated.
| |
| | |
| '''Note:'''
| |
| | |
| As the data mine must now pull data on a timed basis from several tables separately, the existing mechanism of checking from the last data mine date and time to the current time must be changed to a stricter schedule.
| |
| | |
| For example, the first task extract may cover the period from 08:00:00 to the current time (potentially 08:15:00). By the time the second table is mined, the current time will have moved on (for example, to 08:15:20). This would result in data being mined across several areas with different parameters and would show unexpected results. The data mine must therefore hold to a schedule, i.e. from the last data mine (e.g. 08:00:00) to the next scheduled break point (e.g. 08:15:00), regardless of the current time.
| |
| | |
| '''Note:'''
| |
| | |
| The core systems will be unavailable at certain times in the day, for backup purposes. The data mining process must take this into account, when building the schedule for extraction.
| |
| | |
| If the core database is unavailable when the process attempts to connect, the extraction will be abandoned for that run. Only when the database is available will the interval be calculated, based on the number of 15-minute periods that can be mined from the last successful data mine, with the schedule end time not exceeding the current time.
| |
| | |
| For example, if the last successful data mine was for data up until 08:00:00, and the current time is now 08:20:00, if the database is unavailable, the mining will be abandoned. The next run at current time 08:35:00 successfully connects. The number of intervals in 15 minute periods will be calculated, ensuring that the calculated end time does not exceed the current time. In this example, the resulting interval will be 30 minutes (2 15 minute periods), giving an end time of 08:30:00. The next interval of 08:45:00 is not allowed, as this exceeds the current time.
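The interval calculation in the example above can be sketched as follows; the function and parameter names are illustrative, not the actual implementation.

```python
from datetime import datetime, timedelta

def next_mine_end_time(last_mined, now, period_minutes=15):
    """Return the latest schedule break point reachable from last_mined
    without exceeding now, or None if no full period has elapsed."""
    elapsed = (now - last_mined).total_seconds()
    periods = int(elapsed // (period_minutes * 60))
    if periods == 0:
        return None  # abandon this run; nothing to mine yet
    return last_mined + timedelta(minutes=periods * period_minutes)
```

With the worked example above, a last successful mine to 08:00:00 and a current time of 08:35:00 give an end time of 08:30:00 (two 15-minute periods), as 08:45:00 would exceed the current time.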
| |
| | |
| The system will keep 48 hours of raw data (or more) so that figures can be recalculated as required. The system will ensure that the calculations only recalculate the last 48 hours (more accurately, today and yesterday), to speed up the process.
| |
| | |
| In all the data extracts below:
| |
| * Warehouse - will be defaulted to "MOT" (for Mothercare NDC) or "EDC" (for the ELC DC).
| |
| * Owner - will be mapped to "UK", "INT" (International), "MINI" (Mini Club) or Space-filled for tasks that are not related to these others (e.g. combined UK and International Receipt and Putaway).
| |
| | |
| === System Data === | |
| System Data is data that shows the number of outstanding tasks of each type in the system at the time that the data mine occurred (e.g. 34 pallets awaiting putaway, 50 orders sent to pick, 350 individual pick tasks awaiting picking, etc).
| |
| | |
| Base activity figures (i.e. not broken down to large/small/hanging, UK/International etc) have little meaning to the operation. These figures will not be mined from the system, but will instead be calculated by ''CALIDUS'' Vision from the Extended Activity information.
| |
| | |
| The base task types that will require supporting are:
| |
| *''Intake = RC''
| |
| *''Putaway = PU''
| |
| *''Replen = RP''
| |
| *''Pick = PP''
| |
| *Marshalling = MA - this is the move at the end of pick to the assigned marshalling bay. This is done by a separate user, not the picker.
| |
| *Despatch = DE - Loading and Despatch
| |
| *Nesting = NE - consolidation of pallets in marshalling across orders (INT and MINI only)
| |
| *Consol = CO - consolidation of pallets in marshalling
| |
| Note that ''italicised'' tasks already exist within the system. Those not italicised will be added.
| |
| | |
| Inbound Shipments figures (e.g. preadvices in the system for today, and a count of SKUs and Total Qty) are not applicable to this system, as the data held by PkMS is not scheduled. These figures, if produced, would be meaningless to the operation.
| |
| | |
| Putaway and Replenishment figures will be mined with one query.
| |
| | |
| Pick tasks will be mined with another.
| |
| | |
| Marshalling, Nesting, Despatch and Consolidation system data does not exist within PkMS - these are ad-hoc activities and the number or quantity of outstanding tasks cannot be found.
| |
| | |
| The queries to extract this information have not yet been created - the DHL IT team will write the SQL for this system data extraction and provide this to OBS.
| |
| | |
| '''Note on picks'''
| |
| Picks can be partially picked, then the user goes back to complete the task at a later time. For example:
| |
| * The original number of pick tasks for an order is 4
| |
| * The user picks all of one of the pick tasks and only part of the second.
| |
| * The count of the number of pick tasks at that point will be 3, as there is still some of task 2 left, which still counts as a task.
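The counting rule in this note can be illustrated as follows; the (required, picked) pair representation is an assumption for illustration only.

```python
def outstanding_pick_tasks(tasks):
    """tasks: list of (qty_required, qty_picked) pairs, one per pick task.
    A partially picked task still counts as an outstanding task."""
    return sum(1 for required, picked in tasks if picked < required)
```

For the example above: four tasks, one fully picked and one partially picked, leaves a count of 3.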
| |
| | |
| '''Note:'''
| |
| The Total Number of Order displayed on the System Information screen will be calculated from the Order Status Detail feed seen later in this document.
| |
| | |
| The Picking Containers information displayed on this screen is not applicable - this has been replaced with new tasks (Marshalling, Loading, Despatch, Nesting and Consolidation).
| |
| | |
| === Standing Data - Users ===
| |
| The users from the two PkMS systems will be mined for the User ID and Name. The data was completely mapped.
| |
| | |
| === WCS Alerts (Exceptions) ===
| |
| This file was mapped to 'Skip Pick' functionality within PkMS. This functionality exists within Mothercare and ELC and can be extracted in an identical way.
| |
| | |
| All fields required for this functionality were mapped with no issues.
| |
| | |
| It was noted that the latest version of the WCS Alerts screen (as yet unreleased to production) would be used, to allow the user to click on an exception to see the details.
| |
| | |
| === WMS Order Status ===
| |
| These ''CALIDUS'' Vision screens are used to see an overview of the orders available in the core systems, and to see a 'drill-down' of the data, showing the individual orders at each status.
| |
| | |
| These flows were mapped in detail.
| |
| | |
| Slight changes will be required to the screens to support the statuses within the core systems, as follows:
| |
| {| class="wikitable"
| |
| |-
| |
| ! Status
| |
| ! Description
| |
| ! Mapped to
| |
| |-
| |
| | 10
| |
| | Unselected/Avail
| |
| | Available
| |
| |-
| |
| | 15
| |
| | Prewaved
| |
| | (New Status)
| |
| |-
| |
| | 16
| |
| | Awaiting Replen
| |
| | (New Status)
| |
| |-
| |
| | 20
| |
| | Released for Pick
| |
| | Pick Pending
| |
| |-
| |
| | 35
| |
| | In Picking
| |
| | Pick Pending
| |
| |-
| |
| | 40
| |
| | Pick Pack Complete
| |
| | Picked
| |
| |-
| |
| | 55
| |
| | Marshalled
| |
| | (New Status)
| |
| |-
| |
| | 58
| |
| | Marshalling in progress
| |
| | (New Status)
| |
| |-
| |
| | 70
| |
| | Loaded
| |
| | (New Status)
| |
| |}
| |
| | |
| No Despatched figures will be sent across.
| |
| | |
| The dates against the figures will be the Order Created date only - no booking date exists within PkMS for this purpose. Note that this could be modified on reading the data, by adding a number of days (e.g. 3) to the order creation date, to make visibility easier within ''CALIDUS'' Vision.
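The optional adjustment mentioned above could be sketched as follows; the function name and default offset are illustrative only.

```python
from datetime import date, timedelta

def effective_order_date(created, offset_days=3):
    """Add a configurable number of days to the order creation date, as no
    booking date exists within PkMS for this purpose."""
    return created + timedelta(days=offset_days)
```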
| |
| | |
| Approximately 150,000 orders will be pulled (based off the Mothercare live system on the day).
| |
| | |
| A new field 'Pick Wave' will be added to the existing Order Status Details table and screen, to allow the user to see the Wave under which the order was sent to pick.
| |
| | |
| The existing Order Type (Priority) and Haulier fields will be used to map the Order Type and Size respectively. The Owner will be used to map the general type (UK, INT or MINI).
| |
| | |
| === Warehouse Map===
| |
| The Warehouse Map is used to define the total of locations within areas and aisles, and is used to show the usage of the areas and aisles within the warehouse (i.e. percentage full).
| |
| | |
| It is possible to map the Pick Faces, Marshalling Lanes, Marshall Locations and Bulk Storage areas with several small queries, all of which have been created and mapped.
| |
| | |
| The number of cases in locations cannot be easily mapped, but the locations can be seen to be full or empty, allowing the existing screens to work without modification.
| |
| | |
| The following zones were identified:
| |
| *Zone H/F - Bulk Locations
| |
| *Zone I/P - Pack and Hold (Labelled as Marshalling Locations within ''CALIDUS'' Vision)
| |
| *Zone D - Marshalling Locations
| |
| *Zone H/F/X/Z/K/L (Excluding specific H/F Bulk locations) - Pick Faces.
| |
| | |
| ===Extended Activities Data===
| |
| This activity feed forms the bulk of the information required by ''CALIDUS'' Vision to calculate the productivity of the users.
| |
| | |
| Each of the base task types will be extracted as follows:
| |
| *Putaways/Replens: From the Tasks tables
| |
| *Receipts: From the Transaction Table
| |
| *Picks: From the Transaction tables linked to Order Header and Detail
| |
| *Nesting: As yet unknown - see notes below.
| |
| *Consolidation: As yet unmapped, as this is a function in EDC only.
| |
| *Marshalling: Split into two extracts, one for UK and one for INT, linking from Transactions to the Carton tables.
| |
| *Despatch: As yet unmapped.
| |
| | |
| '''Notes:'''
| |
| * The flows for the Mothercare NDC were extensively mapped, but were still not complete at the time of writing.
| |
| * The ELC DC flows have not been mapped at all.
| |
| * It has been shown that extracting Nesting information without checking for INT or MINICLUB is fast, whereas finding this information through the database is extremely slow. It is recommended that this is either excluded from the extraction entirely, or the extraction does not attempt to split into UK or MINICLUB. If this is required, this extract will have to be evaluated when the datamine is built, to ensure that the schedule of runs is sufficiently long to allow this extract to complete.
| |
| | |
| ===Kronos Data Feeds===
| |
| | |
| '''Note:''' As yet, examples of the two Kronos data feeds have not been provided and have therefore not been mapped. An example of each of the data feeds needs to be provided by the DHL IT team so that analysis can be completed. For estimation purposes, this document assumes that the feeds will be text and fixed format. If this is different, further costs may be accrued (for example, if the data feed is in MS Excel format, MS Office would need to be installed on the server to aid in extracting the file).
| |
| | |
| Examination of a similar file used within the existing MIS system shows that the data comes across in a Raw format, so shows only the scan codes that have been created to identify the specific activities within the warehouses. These codes will be entered into a cross-reference table within ''CALIDUS'' Vision.
| |
| | |
| The ''CALIDUS'' Vision table will be extended to include the new Kronos ID and indexed for cross-reference.
| |
| | |
| This lookup table will be used to stamp the Owner and User Activity (the task on which the current user is working).
| |
| | |
| The Date and Time stamps against the Kronos data will be used to provide the start and end time of tasks mined from PkMS from the Extended Activity feed above.
| |
| | |
| '''Note:''' It is core to the ''CALIDUS'' Vision system that the timestamps against the core systems (PkMS and Kronos) are identical. As the systems are not currently checked for this, ''CALIDUS'' Vision will include a simple '+/-Number of Seconds' parameter that will be user-maintainable. This will identify the number of seconds that the PkMS systems' time differs from the Kronos system time (one for each version of PkMS). If the core systems are synchronised with each other, this will not be necessary.
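A sketch of how the '+/- Number of Seconds' parameter might be applied; the function and parameter names are illustrative, not the actual implementation.

```python
from datetime import datetime, timedelta

def align_to_kronos(pkms_timestamp, offset_seconds):
    """Shift a PkMS timestamp by the per-system configured offset so it can
    be compared directly with Kronos timestamps."""
    return pkms_timestamp + timedelta(seconds=offset_seconds)
```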
| |
| | |
| == Data Mining Process ==
| |
| The new datamining processes will be written in Visual Studio .NET 2010, utilising the HiT OLEDB AS/400 connector to access the core PkMS databases.
| |
| | |
| It is not yet known whether the 32- or 64-bit version will be used (both exist), as this depends on the configuration of the server on which the system will reside.
| |
| | |
| The configuration of the AS/400 connecter will be provided by the DHL IT team, as this is already in use within the operation for the existing MIS system.
| |
| | |
| Note that this program will be written with extensive logging and error checking built into the code. This is because the final test build will be:
| |
| * Untested against the AS/400 database
| |
| * Timed to ascertain the respective speeds of each area of a data mine (the data extract, the loading of the data, the analysis of the data), to ensure that the fastest processing is being used in all areas.
| |
| | |
| To aid in the process of debugging, the process will:
| |
| * Write a text log file for each run, recreated each time.
| |
| * Also write all logging information to the Vision database, for persistent logging.
| |
| In order to prevent the log files from building up over time, whilst retaining the maximum logging information for debugging purposes, the process will write to these log tables cyclically, ensuring that there are never more than X,000 records stored at any time. This limit will be controlled by a parameter within Vision.
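The cyclic cap could behave as sketched below. This is an assumption-level illustration of the behaviour only; the real process writes to Vision database tables, not an in-memory structure.

```python
from collections import deque

class CyclicLog:
    """Keep at most max_records log entries, discarding the oldest first."""

    def __init__(self, max_records):
        self._records = deque(maxlen=max_records)

    def write(self, message):
        self._records.append(message)  # oldest entry drops off when full

    def records(self):
        return list(self._records)
```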
| |
| | |
| ===Main Process===
| |
| The main process flow for the Data mining process will be as follows:
| |
| *Connect to Vision database and get parameters for extracts.
| |
| *Calculate interval parameter.
| |
| *Get list of extracts to attempt.
| |
| *Complete the Data Extracts for:
| |
| ** Mothercare NDC
| |
| ** ELC DC
| |
| ** Kronos 48-hour feed.
| |
| ** Kronos Incremental Times Feed.
| |
| *Analyse the data and produce the productivity figures.
| |
| | |
| ===Mothercare NDC Extract===
| |
| The Mothercare data extract will be as follows:
| |
| *Connect to Core Database (through AS/400 OLEDB connector).
| |
| *If not connected, log issue and move to next extract.
| |
| *If connected:
| |
| **Extract Standing Data:
| |
| ***Users
| |
| **Extract System Data (Tasks Outstanding at this time):
| |
| *** Putaways (by detailed activity)
| |
| *** Replens (by detailed activity)
| |
| *** Picks (by detailed activity)
| |
| *** Order Status Details
| |
| *** Warehouse Map details (Area/Aisle Usage information)
| |
| **Summarise Base Task information.
| |
| **Summarise Order Status information for the Order Status Screens and the System Overview (total of orders available for pick)
| |
| **Extract Detailed Activity Information:
| |
| ***Receipts
| |
| ***Putaways/Replens
| |
| ***Picks
| |
| ***Nesting
| |
| ***Marshalling (UK)
| |
| ***Marshalling (INT)
| |
| ***Despatch
| |
| **All data will be stamped when extracting, with:
| |
| *** Shift (based on the task time)
| |
| *** Base Activity (Receipt, Putaway, Replen, etc)
| |
| *** Extended Activity (Small/Large/Hanging/Ratio, BP20/DCR/PD06)
| |
| *** Owner (UK, INT, MINICLUB, None)
| |
| *** Warehouse (MOT)
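| |
| The stamping step can be sketched as follows — every extracted task row is tagged with Shift, Base Activity, Extended Activity, Owner and Warehouse. The shift boundaries used here are invented for illustration; the real shift definitions would come from Vision configuration:

```python
# Illustrative sketch of record "stamping". Shift times are assumptions.
from datetime import time

SHIFTS = [("EARLY", time(6), time(14)), ("LATE", time(14), time(22))]

def shift_for(task_time):
    """Derive the shift from the task time."""
    for name, start, end in SHIFTS:
        if start <= task_time < end:
            return name
    return "NIGHT"  # anything outside the day shifts

def stamp(row, base, extended, owner, warehouse="MOT"):
    """Tag one extracted row with the five stamp fields."""
    row.update(shift=shift_for(row["task_time"]), base_activity=base,
               extended_activity=extended, owner=owner, warehouse=warehouse)
    return row

row = stamp({"task_time": time(15, 30)}, "Putaway", "BP20", "UK")
print(row["shift"], row["warehouse"])  # LATE MOT
```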
| |
| | |
| ===ELC DC Extract===
| |
| The ELC data extract will be as follows:
| |
| *Connect to Core Database (through AS/400 OLEDB connector).
| |
| *If not connected, log issue and move to next extract.
| |
| *If connected:
| |
| **Extract Standing Data:
| |
| ***Users
| |
| **Extract System Data (Tasks Outstanding at this time):
| |
| *** Putaways (by detailed activity)
| |
| *** Replens (by detailed activity)
| |
| *** Picks (by detailed activity)
| |
| *** Order Status Details
| |
| *** Warehouse Map details (Area/Aisle Usage information)
| |
| **Summarise Base Task information.
| |
| **Summarise Order Status information for the Order Status Screens and the System Overview (total of orders available for pick).
| |
| **Extract Detailed Activity Information:
| |
| ***Receipts
| |
| ***Putaways/Replens
| |
| ***Picks
| |
| ***Consolidation
| |
| ***Marshalling (UK)
| |
| ***Marshalling (INT)
| |
| ***Despatch
| |
| **All data will be stamped when extracting, with:
| |
| *** Shift (based on the task time)
| |
| *** Base Activity (Receipt, Putaway, Replen, etc)
| |
| *** Extended Activity (Bulk, VNA, Tote, Pick & Stick, Small/Large, FPP)
| |
| *** Owner (UK, INT, None)
| |
| *** Warehouse (EDC)
| |
| **When all complete, store new interval to database.
| |
| | |
| ===Kronos 48-hour Feed===
| The Kronos 48-hour data extract will be as follows:
| |
| *Check inbound folder for new extract file
| |
| *If no file exists, log issue and move to next extract.
| |
| *If a file exists, move it to the working folder.
| |
| *Pre-process file, to remove any unnecessary information (repeating headers, etc)
| |
| *Remove all stored Kronos data from Vision table.
| |
| *Import data to table, calculating any time difference from EDC/NDC to the Dates/Times and storing this in additional fields.
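| |
| The pre-processing and time-difference steps can be sketched as below. The file layout and the one-hour offset are assumptions for illustration only; the real offset would come from the Kronos Time Difference rules:

```python
# Hedged sketch of the Kronos feed pre-processing: drop repeating header
# lines, then store each clock time plus a site time-difference offset.
from datetime import datetime, timedelta

RAW = """EMP,CLOCK_TIME
1001,2012-03-21 06:02
EMP,CLOCK_TIME
1002,2012-03-21 06:05
"""

def preprocess(raw, offset_hours=1):
    rows = []
    for line in raw.splitlines():
        if not line or line.startswith("EMP,"):  # repeating header
            continue
        emp, stamp = line.split(",")
        clocked = datetime.strptime(stamp, "%Y-%m-%d %H:%M")
        # keep both the raw time and the site-adjusted time
        rows.append((emp, clocked, clocked + timedelta(hours=offset_hours)))
    return rows

rows = preprocess(RAW)
print(len(rows), rows[0][2].hour)  # 2 records, adjusted hour 7
```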
| |
| | |
| ===Kronos Incremental Times Feed===
| |
| The Kronos Incremental Times data extract will be as follows:
| |
| *Check inbound folder for new extract file
| |
| *If no file exists, log issue and move to next extract.
| |
| *If a file exists, move it to the working folder.
| |
| *Pre-process file, to remove any unnecessary information (repeating headers, etc)
| |
| *Import data to table (keeping 48 hours).
| |
| | |
| The users table will be updated with the Kronos User ID and the current swiped activity.
| |
| | |
| ===Analysis of Data===
| |
| At this point:
| |
| * All Kronos data for the last 48 hours will be present
| |
| * All transactional information will be present for at least the last 48 hours.
| |
| * All data will be stamped with:
| |
| ** Shift
| |
| ** Base Activity
| |
| ** Extended Activity
| |
| ** Owner
| |
| ** Warehouse
| |
| | |
| The process will merge the two core data tables (Base System and Kronos), ordering the resulting data by Date and Time. The data will be totalled for each:
| |
| * Warehouse
| |
| * Employee
| |
| * Owner
| |
| * Shift
| |
| * Extended Activity
| |
| calculating:
| |
| * Number of Tasks
| |
| * Sum of Time Taken
| |
| * Sum of Quantity on Tasks
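| |
| The merge-and-total step can be sketched as a grouped aggregation over the combined rows. The row fields and sample values below are invented for illustration:

```python
# Hedged sketch of the analysis step: merge system and Kronos rows,
# order by timestamp, then total tasks/time/quantity per
# (warehouse, employee, owner, shift, extended activity) key.
from collections import defaultdict

system_rows = [
    {"warehouse": "MOT", "employee": "E1", "owner": "UK", "shift": "EARLY",
     "extended": "BP20", "ts": 1, "minutes": 10, "qty": 4},
]
kronos_rows = [
    {"warehouse": "MOT", "employee": "E1", "owner": "UK", "shift": "EARLY",
     "extended": "BP20", "ts": 2, "minutes": 5, "qty": 2},
]

merged = sorted(system_rows + kronos_rows, key=lambda r: r["ts"])
totals = defaultdict(lambda: {"tasks": 0, "minutes": 0, "qty": 0})
for r in merged:
    key = (r["warehouse"], r["employee"], r["owner"],
           r["shift"], r["extended"])
    totals[key]["tasks"] += 1        # number of tasks
    totals[key]["minutes"] += r["minutes"]  # sum of time taken
    totals[key]["qty"] += r["qty"]   # sum of quantity on tasks

summary = totals[("MOT", "E1", "UK", "EARLY", "BP20")]
print(summary)  # {'tasks': 2, 'minutes': 15, 'qty': 6}
```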
| |
| | |
| When complete, these figures will then be totalled into the Base Task Details.
| |
| | |
| The Analysis process will then create any cross-reference lookups or productivity settings required if new extended tasks have been added to the system.
| |
| | |
| == Database Modifications ==
| |
| New data tables are required to store:
| |
| * Kronos Data
| |
| * Mothercare/ELC data
| |
| * Kronos Activity Code Cross References
| |
| | |
| The existing Activity tables (emp_details and emp_details_ext) will be modified to store:
| |
| * Shift
| |
| | |
| The existing Users table will be modified to store the Kronos User ID.
| |
| | |
| New rules are required to control:
| |
| * Kronos Time Difference (NDC)
| |
| * Kronos Time Difference (MOT)
| |
| | |
| New database procedures and views are required to calculate:
| |
| * Activity summary information
| |
| | |
| == Screens ==
| |
| The screens available to the operations, based on the data extracted above, will be:
| |
| *Order Status 1 and 2
| |
| *Time to Completion for all extractions of the following task types:
| |
| ** Putaway
| |
| ** Replen
| |
| ** Pick
| |
| *Exceptions (WCS Alerts)
| |
| *Extended Task Enquiries and Extractions
| |
| *Base Task Enquiries and Extractions
| |
| *Aisles Usage
| |
| *Areas Usage
| |
| *System Overview
| |
| * Single and Summary screens for base tasks, showing:
| |
| ** Lowest 10 by Productivity
| |
| ** Highest 10 by Productivity
| |
| ** Current by Productivity vs Target
| |
| ** Daily by Productivity vs Target
| |
| *Users and Users Details screens
| |
| | |
| It was noted that the latest version of the WCS Alerts screen (as yet unreleased to production) would be used, to allow the user to click on an exception to see the details of the exception, not just the summary.
| |
| | |
| The existing Admin and Settings screens will be used for all maintenance and will require no modifications.
| |
| | |
| === Additional Menu Items ===
| |
| Menu items will be created for each of the Extended Detail Extractions, as shown in the [[#Client Requirements|Client Requirements]] section.
| |
| One of the following screens will be added to the Extended Extractions menu for each extended type:
| |
| * A Single Summary screen showing:
| |
| ** Lowest 10 by Productivity
| |
| ** Highest 10 by Productivity
| |
| ** Current by Productivity vs Target
| |
| ** Daily by Productivity vs Target
| |
| * Time to Completion
| |
| | |
| === Warehouse and Shift Summary Screens ===
| |
| New versions of the existing Warehouse Summary, Warehouse Weekly Summary and Shift Summary screens will be created to show Extended activity tasks rather than the existing base activities. These will also display the Owner Code on the screens.
| |
| | |
| === Extended Detail Enquiry / Extended Productivity Enquiry ===
| |
| These screens will be modified to allow the user to select 'Shift' as a selection type from the existing 'Level' drop-down list.
| |
| | |
| If selected, a sub-list will be shown below this, allowing the user to select from the available shifts. The shift will be defaulted to the User's default Shift value, although this can be changed.
| |
| | |
| When selected, all productivity information for users that exist within the shift start and end times will be displayed.
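| |
| The shift filter can be sketched as a simple time-window test, including the overnight case where a shift crosses midnight. The shift times and sample rows are invented for illustration:

```python
# Hedged sketch of the 'Shift' selection: report only rows whose task
# time falls within the chosen shift's start/end times.
from datetime import time

def in_shift(task_time, start, end):
    if start <= end:
        return start <= task_time < end
    # shift crossing midnight, e.g. 22:00-06:00
    return task_time >= start or task_time < end

rows = [("E1", time(7, 0)), ("E2", time(15, 0)), ("E3", time(23, 30))]
night = [(emp, t) for emp, t in rows if in_shift(t, time(22), time(6))]
print(night)  # only E3 falls in the 22:00-06:00 shift
```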
| |
| | |
| These screens will also be modified to include the Owner Code in the grid display. Owner will be added to the selection criteria if the user is allowed to view multiple owners, allowing the user to select from a list of available owners.
| |
| | |
| The existing functionality of the screen, allowing the users to:
| |
| *select a date range of data to be reported
| |
| *export the list to CSV or Formatted Output
| |
| *select each Activity (extended extraction) through the Task Type drop-down list
| |
| will fulfil the following criteria:
| |
| *Productivity information, in the form of PI rate, at employee level and aggregated by shift, activity, site, made available to the site management team on a live* basis and with minimal intervention (15 minute update intervals)
| |
| *Operator-level PI rate with enough data integrity that it can be used for performance management purposes, and the ability to look at the rate over a rolling 6-week period, including the number of hours spent on each activity, to reflect the site's PIP policy as agreed with the Union.
| |
| *Ability to print off or export to an Excel spreadsheet to keep a record
| |
| *A detailed reporting suite that will allow both sites to look at performance at any level of hierarchy within the site (by activity, shift, site, date range, etc.)
| |
| *Enquiries to enable a user to select range of activities / dates or shifts / users or groups
| |
| | |
| {{Note}} The live basis of reporting is based entirely around the data mining refresh rate.
| |
| | |
| === Area User PI Graph ===
| |
| A new graph will be created, allowing the following:
| |
| | |
| The user will be allowed to select an activity from the drop-down list. This will be populated with the available activities, as shown in the [[#Client Requirements|Client Requirements]] section. The user will also be able to select an Owner (from UK, INT, MiniClub or Blank), if they have not been set up with a default owner. The user will also be able to select a Shift, defaulting to the user's default shift.
| |
| | |
| Once selected, a bar chart of the average rate of the Activity within the warehouse will be shown. Two lines on the graph will indicate the Target and Minimum rates required for the activity and owner. The bar will be vertical, the y-axis showing percentage of Target. The bar will be RAG-coloured, red for below minimum, amber for between minimum and target and green for exceeding target.
| |
| | |
| A similar bar chart will be shown below, broken down to each employee within the selected shift. Each bar will be marked with an Employee 'Code', which is representative of the user; this is expected to be the Kronos ID. All other functionality of the bar chart above will be maintained (Min and Target lines, RAG-colouring, etc.).
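| |
| The RAG rule and the percentage-of-target y-axis value can be sketched as below; the sample rates are invented for illustration:

```python
# Hedged sketch of the RAG colouring rule: red below the minimum rate,
# amber between minimum and target, green at or above target.
def rag_colour(rate, minimum, target):
    if rate < minimum:
        return "red"
    if rate < target:
        return "amber"
    return "green"

def pct_of_target(rate, target):
    """Bar height on the percentage-of-target y-axis."""
    return round(100.0 * rate / target, 1)

print(rag_colour(45, 50, 100),
      rag_colour(75, 50, 100),
      rag_colour(100, 50, 100))  # red amber green
print(pct_of_target(75, 100))   # 75.0
```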
| |
| | |
| This will fulfil the following criteria:
| |
| *Graphs of PI rate by employee to be displayed in each area (where each employee is identified by a unique confidential code or number)
| |
| | |
| <!-- MEDIA LANDSCAPE YES -->
| |
| | |
| = Appendix A: Table of SCRs and Ballpark Estimates =
| |
| == Developments Cost ==
| |
| {{ #vardefine: SCR | 0 }}
| {{REQ_SCR_Header}}{{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=
| |
| |Area=Analysis
| |
| |Description=Further Datamapping of EDC and Kronos feeds
| |
| |Estimate=1575
| |
| |Notes=2
| |
| }}
| {{REQ_SCR_Line
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |System=
| |
| |Area=Dev Environment
| |
| |Description=Build copy database in OBS and load data
| |
| |Estimate=1000
| |
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=
| |Area=Dev Environment
| |
| |Description=Create dev environment
| |
| |Estimate=500
| |
| |Notes=
| |
| }}
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=
| |
| |Area=Analysis
| |
| |Description=Functional Specification
| |
| |Estimate=1575
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=Vision Mining
| |
| |Area=Development
| |
| |Description=Structure of .NET application, procedures and logging
| |
| |Estimate=1000
| |
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |System=Vision Mining
| |
| |Area=Development
| |
| |Description=Kronos Feed
| |
| |Estimate=2000
| |
| |Notes=3
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=Vision Mining
| |
| |Area=Development
| |
| |Description=Mothercare NDC PkMS Feed
| |
| |Estimate=5250
| |
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=Vision Mining
| |
| |Area=Development
| |
| |Description=ELC DC PkMS Feed
| |
| |Estimate=2500
| |
| |Notes=3
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=Data Analysis
| |
| |Area=Development
| |
| |Description=Add new base task types to views.
| |
| |Estimate=787.5
| |
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=Data Analysis
| |
| |Area=Development
| |
| |Description=Merge Kronos data into view
| |
| |Estimate=525.0
| |
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=Data Analysis
| |
| |Area=Development
| |
| |Description=New packages to analyse data
| |
| |Estimate=1050
| |
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=Data Analysis
| |
| |Area=Development
| |
| |Description=New tables
| |
| |Estimate=525.0
| |
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=Screens
| |
| |Area=Development
| |
| |Description=Summary changes * 3
| |
| |Estimate=787.5
| |
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=Screens
| |
| |Area=Development
| |
| |Description=Detail screens * 2
| |
| |Estimate=1050
| |
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=Screens
| |Area=Development
| |Description=New graph
| |Estimate=525
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=
| |
| |Area=Testing
| |
| |Description=System Testing
| |
| |Estimate=5000
| |
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=
| |
| |Area=Testing
| |
| |Description=Integration Testing
| |
| |Estimate=5000
| |
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR={{ #vardefineecho: SCR | {{ #expr: {{ #var: SCR }} + 1 }} }}
| |
| |System=
| |
| |Area=Implementation
| |
| |Description=Create new screens in database and menus
| |
| |Estimate=525
| |
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Line
| |
| |SCR=
| |
| |System=
| |
| |Area=
| |
| |Description='''Total'''
| |
| |Estimate=31175
| |
| |Notes=
| |
| }}
| |
| {{REQ_SCR_Footer}}
| |
| '''Notes:'''
| #Any high level ballpark estimates for development are based on the basic information provided and are subject to detailed design and creation of an SCR.
| # 2 days further analysis required to map ELC and Kronos, plus 1 day documentation.
| # Analysis on these areas is not complete - the estimates here are ballpark only.
| |
| <!-- MEDIA LANDSCAPE NO -->
| {{Doc_Appendix
| |Appendix=B
| |Estimate=N
| |Glossary=WCS
| |Ref1=DE05 - Business Requirements Specification CV Mothercare V 0.03.doc
| |RefV1=0.03
| |RefDate1=15/03/2012
| |REQ=0
| |EST=0
| |FS=0
| |TS=0
| |DEV=0
| |ST=0
| |IMP=0
| |Client=
| |Year=
| |FSEST=Y
| |Rev1=Rev1
| |Rev1Title=Rev1 Title
| }}