Channel: SCN : Document List - SAP Advanced Planning & Optimization (SAP APO)

Leverage Business Process Mapping for Faster SAP APO Adoption


Companies invest a lot of time and money implementing advanced planning systems like SAP APO. Because of the inherent complexity of these systems, it is important to have clear, detailed linkages to the business processes they enable. Business process maps that start at the business-domain level and end at a specific transaction in SAP APO help planners understand the system better and aid faster adoption.

Done correctly, business process maps can be used not only for training planners but also during business transformation programs. They represent processes visually and provide a common understanding of the entire process, the specific roles of the process participants, and the systems that enable those processes.

Here are my top five tips to ensure your business process mapping exercise is fruitful, yields the expected results, and enables planners to use and adopt SAP APO with ease.

Tip 1: Finalize the scope of business process mapping exercise

It is very important to clearly define which business processes should be mapped so that the time and effort involved are used optimally. During an SAP APO implementation, focus your process mapping effort on the planning processes being enabled (such as demand planning and supply planning) rather than spending time discussing and mapping execution- and transaction-oriented processes.

The resources required for this exercise need to be clearly identified, with their roles defined. Business process owners, the IT team, and the consultants implementing SAP APO all need to be involved so that processes are mapped correctly and completely and are agreed upon by everybody.

Tip 2: Select the right software to create and deliver the process maps

Process mapping will not be effective if the software used to create the process maps is not right for your organization. PowerPoint and Visio are commonly used to develop process maps and are available in almost all organizations. Advanced tools like ARIS offer many features and are very popular for large-scale process mapping exercises.

There are also products like ProcessesNOW that not only provide a prebuilt library of process maps based on the Supply Chain Operations Reference (SCOR) model but also deliver these maps via a portal that can be installed directly on the intranet. All process maps open directly in the portal, and the user can drill up and down from within it, making navigation much easier. It also provides options to link documents such as training presentations, the business blueprint, and functional specifications directly to the process map, creating a knowledge repository that serves as a one-stop shop for all training needs.

Tip 3: Keep process maps simple, readable and consistent

Process maps should be as simple as possible so that readers have no difficulty understanding them. There should be enough detail for the user to comprehend the process easily, but not so much density that the user is overwhelmed.

The golden rule of business process mapping is “Less is More”.

It is good practice to use swimlanes to organize activities into groups based on who is responsible for the different steps within a process flow. With swimlanes, it is easy to map out the complete process, roles, responsibilities, and the interdependencies between groups.

Another important aspect is to select and standardize the symbols used in the process maps. It is better to limit the number of symbols so that the maps are easily read and understood. To refer to processes clearly, establish a standard naming convention so that processes, activities, and tasks are named according to the level at which they sit.

Tip 4: Jump Start your process mapping initiative by using prebuilt process maps

A process mapping initiative can be very time consuming, depending on the number of processes selected and the process participants involved. This acts as a deterrent for top management, and advanced planning solutions are often implemented without any process mapping exercise at all. The result is poor adoption: users are lost and have no way to understand how a process is enabled by the new system.

One way to jumpstart your process mapping initiative is to buy prebuilt process maps, which can drastically reduce mapping time and cost. Products like ProcessesNOW offer a prebuilt library of process maps for SAP APO supply chain planning processes, which can reduce the time to develop the maps by up to 70%. This also ensures that all process maps are consistent, follow the same naming convention, and look similar.

Tip 5: Clearly map the linkage between the business processes and SAP APO

The main objective of a process mapping initiative should be to document the process in a way that captures all the important details yet can still be viewed at a higher level. Business processes can be mapped across several levels of abstraction with increasing detail, and the last level should link the business process to SAP APO.

  • Level 1 could represent a business domain for which the processes will be mapped, for example Planning, Sourcing, or Manufacturing.
  • Level 2 could represent the group processes within a business domain. For example, under the Planning domain, demand planning, supply planning, and replenishment planning could be level 2 processes.
  • Level 3 could represent a specific process within a group process. For example, under supply planning, capturing all supply planning requirements could be a level 3 process.
  • Level 4 and Level 5 could represent sub-processes and the activities/tasks of a specific process.
  • Level 6 could represent how a particular activity or task is performed in SAP APO.

Having these six levels gives users clear visibility into a business process and how it is enabled by the SAP APO implementation.
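The six-level drill-down above can be sketched as a simple nested structure. This is an illustrative sketch only: the node names below level 3 (the heuristics run, the order review, and the /SAPAPO/RRP3 product view) are hypothetical examples, not prescribed content.

```python
# Illustrative sketch of the 6-level process hierarchy described above.
# Names below level 3 are hypothetical placeholders.

PROCESS_MAP = {
    "Planning": {                                        # Level 1: business domain
        "Supply Planning": {                             # Level 2: group process
            "Capture Supply Planning Requirements": {    # Level 3: specific process
                "Run Heuristics": {                      # Level 4: sub-process (hypothetical)
                    "Review Planned Orders": {           # Level 5: activity/task (hypothetical)
                        # Level 6: how the task is done in SAP APO (hypothetical example)
                        "SAP APO: Product View (/SAPAPO/RRP3)": {}
                    }
                }
            }
        }
    }
}

def paths(tree, prefix=()):
    """Yield every root-to-leaf path through the process map."""
    if not tree:
        yield prefix
        return
    for name, children in tree.items():
        yield from paths(children, prefix + (name,))

for p in paths(PROCESS_MAP):
    print(" > ".join(p))
```

Walking the structure from level 1 down to level 6 reproduces exactly the drill-down a planner would follow in a process-map portal.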

I hope you benefit from these guidelines, which were developed across the various SAP APO projects I have been part of.


Debug a CIF


This document explains how to debug the CIF in APO when data is sent from ECC to APO.

In this example, we debug the CIF for materials. The same process can be applied to all integration model objects.

 

1- SMQ2: Inbound Queue at APO

Check that there are no entries using SMQ2 at APO side:

CIF0.jpg

CIF01.jpg

 


2- SMQR: Deregister queues at APO side

In APO, go to transaction SMQR. Select Queue name CF* and press Deregistration:

CIF1.jpg

CIF2.jpg

 

Note that Type U is displayed. This means that the queue is deregistered:

CIF3.jpg

 

 

 

3- CFM1: Create Integration Model

 

Go to transaction CFM1 on the ECC side and create an integration model for materials. It is recommended to work with a single material to make debugging easier:

CIF4.jpg

 

Generate the integration model:

CIF7.jpg

 


4- CFM2: Activate or Deactivate Integration Model

 

Use CFM2 on the ECC side to activate the integration model:

CIF8.jpg

 

Select the entry, press Activate and Start:

CIF9.jpg

 

A pop-up will appear confirming that the queue is deregistered. Press Ignore:

CIF0.jpg

 

 

5- SMQ2: Inbound Queue at APO

 

Now we are ready to debug the CIF.

Open a new APO session and go to transaction SMQ2 at APO:

CIF11.jpg

 

Double-click the queue name as many times as needed until the debug option is displayed:

CIF12.jpg

CIF13.jpg

 

Press the button with the red arrow to start debugging:

CIF14.jpg

 

Done... the ABAP debugger appears:

CIF15.jpg

 

 

 

6- SMQR: Register queues at APO side

 

Once debugging is done, remember to register the APO queues again:

CIF1.jpg

CIF17.jpg

CIF18.jpg

Adding/Modifying MRP Controller in APO


The MRP controller is the person responsible for MRP (Material Requirements Planning) who ensures that all materials required for building the finished goods or sub-assemblies are available on time.


Based on business requirements, there can be multiple scenarios in which either a new planner needs to be added to the system or existing planner data changes. In such scenarios, we need to make corresponding changes in both ECC and APO so that the two systems stay in sync and planning results are as expected. This document explains the basic steps needed to carry out this activity.


Basic requirements for setting planner in APO:


The following information needs to be collected from the business:

1) MRP controller number

2) Planner name / Description

3) Authorizations in the APO system (from those listed below)


  • Prod Planner
  • SNP Planner
  • Demand Planner
  • Transp. Planner
  • Purch. Planner
  • ICH Planner

 

  1. Check the MRP controller number in R/3 using transaction SPRO. Ensure that the planner exists in the R/3 system.

     1.JPG

SPRO -> Materials management -> Consumption-Based Planning -> Master Data -> Define MRP Controllers


     2.JPG

2. Check that the MRP controller number to be added in APO is present in R/3. Select any entry and click on Position.

 

     3.JPG

In the pop-up window, enter the plant no. and MRP controller no. to be checked.


     4.JPG

     5.JPG

The above steps help us identify which MRP controller needs to be changed or created in APO. For changes in APO, the steps below can be followed:


Step 1: Execute transaction SPRO in APO (the screen below will appear). Click on ‘SAP Reference IMG’.


     1.JPG

Step 2: Follow the navigation shown below and execute the highlighted link.

SPRO -> Advanced Planning and Optimization -> Supply Chain Planning -> Specify Person Responsible (Planner)


     7.JPG


Step 3: Depending on the case, we may have to either

     a)     create a new planner by copying from an existing planner with similar authorizations

OR

     b)     change the authorizations/description of an existing planner

 

Step 3A: Create a new controller by copying from an existing controller

     1.      Click on the planner number that has authorizations similar to those required

     2.      Click on ‘Copy As…’


     8.JPG

Step 4: Enter the details as required, then press the ‘Enter’ key.


     9.JPG

Step 5. Click on ‘Save’.


     10.JPG

When we make any configuration-related changes in the system, it is advisable to do so through a transport request.


The following steps create the transport request once we are done with the creation or change of the planner. Click the Save button shown in the screenshot above. We are prompted with the screen below:


     11.JPG

Step 6: Here, click on ‘Create Request’ or press F8. A new window will pop up.


Enter a short description for the request.


E.g. SNP planner code change.


Click on Save.


     12.JPG

Step 7: The transport request will be generated automatically. Note the request number.


Step 8: Now that you have created the TR (transport request) successfully, you can view it through transaction SE01 by entering the request number that was generated.


     13.JPG

     14.JPG

Conclusion: The planner plays a very important role in APO, so we need to make sure that the MRP controller in R/3 and the planner in APO are in sync.

Mass Creation of Product-Specific Transportation Lanes


The objective of this document is to highlight the options available for mass creation of product-specific transportation lanes.

 

The transportation lane is the external procurement relationship master data object used in the APO planning process.

 

The ways to create this master data object are listed below:

 

     1. Create through CIF from ECC. A special procurement key in the material master and a purchasing info record can trigger the creation of the transportation lane in APO.

 

     2. Create in APO manually through transaction /N/SAPAPO/SCC_TL1 or through a custom program.

 

This document presents the options available in transaction /N/SAPAPO/SCC_TL1 to create transportation lanes for multiple product locations.

 

Step 1: Access transaction /N/SAPAPO/SCC_TL1, enter the model, source, and destination location, and click Create.

 

Capture.PNG

 

Step 2: Click on ‘Creation of New Entry’ and select ‘Mass Selection’, as highlighted in the screenshots below.

Capture3.PNG

Step 3: Click on ‘Mass Selection’ and you will get options to enter multiple products or location products, or to select based on SNP planners, as shown in the screenshot below.

 

Capture4.PNG

Enter multiple products and execute the transaction by clicking the execute (clock) icon.

 

Finally, save the data to create the product-specific transportation lanes.

SCM Core Interface- Handbook (PART-1)


1. Introduction

The APO Core Interface (CIF) is a standard interface that connects an SAP APO system with a standard R/3 system. It enables data exchange between an SAP ERP system (SAP R/3, SAP R/3 Enterprise, or SAP ECC) and a connected SAP SCM system. CIF supports the integration of master data from SAP ECC to SCM APO (one way only) and the integration of transactional data in both directions, from ECC to APO and vice versa. The basic idea of the integration is to write an event for each planning-relevant change (e.g. the creation of an order) and use these events as trigger points for the data transfer. Technically, the transfer is performed via qRFC. Which objects (e.g. planned orders, stock) are transferred from SAP ECC to SCM APO is controlled by integration models, which can be regarded as something like the master data of the CIF.
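The trigger-and-filter idea described above can be illustrated with a minimal sketch: a planning-relevant change raises an event, and the change is queued for transfer only if its object type is covered by an active integration model. All names and structures here are illustrative stand-ins, not real CIF APIs.

```python
# Minimal sketch of the CIF idea: events for planning-relevant changes
# are filtered by an "integration model" and queued for qRFC transfer.
# Names are illustrative only; this is not a real CIF API.

from collections import deque

transfer_queue = deque()  # stands in for the qRFC outbound queue

# "Integration model": which object types are in scope for transfer
active_integration_model = {"planned_order", "stock"}

def on_planning_change(obj_type: str, payload: dict) -> None:
    """Event hook: enqueue the change only if its type is in the model."""
    if obj_type in active_integration_model:
        transfer_queue.append((obj_type, payload))

on_planning_change("planned_order", {"order": "PL-1"})  # in scope: transferred
on_planning_change("sales_order", {"order": "SO-9"})    # not in scope: filtered out
print(len(transfer_queue))  # -> 1
```

The key design point mirrored here is that the sending application never transfers data directly; it only raises events, and the integration model decides what actually enters the queue.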

 

2. Business Objective

The main objective of configuring the ECC Core Interface is to integrate the centralized master data from ECC into APO and to publish planning results from the SCM planning system (APO) back to the SAP OLTP system (ECC) and vice versa. Maintaining a consistent data flow between the planning and execution systems is essential for better planning results and optimum inventory levels. This configuration also covers real-time transactional data transfer between the planning and execution systems and delta transfer of master data changes.

CIF provides the following central functions:

  1. Supplying the planning-relevant master and transaction data to the SCM APO system
  2. Techniques for initial, incremental, and real-time data transfers from ECC to SCM APO
  3. Techniques for publishing planning results back from SCM APO to ECC

 

3. Key Design Areas

  • Technical Integration between ECC and APO (CIF Configuration)
  • Integration of Master data and Transactional data (Design of Integration Models)
  • CIF Error Handling and Queue Management

 

4. Technical Integration between ECC & APO (CIF Configuration)

  • As of SAP ECC 6.0, CIF is an integrated part of the system. (If you use SAP SCM 5.0 with ERP systems up to SAP ECC 5.0, you receive CIF via the relevant SAP R/3 Plug-In, a combined plug-in used to link SAP components such as SAP SCM to ERP systems up to and including SAP ECC 5.0.)
  • SAP ERP Central Component 6.0 (SAP ECC 6.00) and later releases will directly contain all the interfaces necessary for the technical integration with other SAP components. These interfaces were previously part of the SAP R/3 Plug-In.

1.jpg

 

5. CIF Set-up & Related Configuration Task

5.1.      Configuration in R/3

    • Define logical system (Transaction code- BD54)

We define a logical system for both the ECC and APO systems. To enable data transfer via the APO Core Interface (CIF), you need to name both the ERP system in which you are working and the SAP APO system to which you want to transfer data as logical systems.

 

2.jpg

Note: The above activity depends on the target system concerned; for this reason, there is no transport to the production system, and you need to make the settings for the relevant target system in the production system manually. Alternatively, you can maintain all ECC and APO logical system names (as per the SAP system landscape and client strategy) in the development client and include them in a transport. This makes all logical systems available across the ECC system landscape and minimizes manual activity.

 

  • Assign logical system to client (Transaction code- SCC4)

In this step, assign the logical system to a client. All fields such as City, Std. currency, and Logical system should be maintained; otherwise there will be spurious CIF errors during data transfer from ECC to APO.

          3.jpg

Note: These settings cannot be transported. When a new system is being set up, they must be made after the system installation has been completed.

 

  • Specify APO release (Transaction code- NDV2)

In this step, you specify the release level of the SAP APO system that is defined as the target system. The release level activates the compatible functionality for the data transfer.

4.jpg

 

Note: The above activity depends on the target system concerned; for this reason, there is no transport to the production system, and you need to make the settings for the relevant target system in the production system manually. Alternatively, you can maintain all logical system names (as per the SAP system landscape) in the development client and include them in a transport. This makes all logical systems available across the ECC system landscape and minimizes manual activity.

 

 

  • Setup RFC destination (Transaction code- SM59)

In this step, create an RFC destination with the same name as the target logical system. This RFC destination enables the connection to the SCM APO system.

5.jpg

 

Performance tip: By activating load balancing and defining a logon group, you can use the load balancing procedure when connecting to the APO system. The logon group is assigned to the various servers in transaction RZ12.

6.jpg

When ECC connects to the APO system, it always uses the ALEREMOTE user to process CIF queues.

 

  • Assign RFC destinations to various application cases (Transaction code- CFC7)

You assign the various application areas to the logical system and RFC destination. This configuration is optional and depends purely on the business requirements and guidelines for the type of RFC user to be used, for security reasons. Some applications only work with an RFC dialog user (e.g. the GATP availability check), but from a security point of view many clients (e.g. in life sciences or FMCG) do not allow the RFC user to be created as a dialog user, which could be misused.

To meet this requirement, create a new RFC destination with a different RFC user of type Dialog or Service with very limited, restricted authorization. The authorization has to be provided only for the type of application to be used.

7.jpg

The applications below can be triggered from ECC into APO:

8.jpg

 

  • Set target system and queue type (Transaction code- CFC1)

In this step, you set the queue type for the specified target system. It is recommended to use inbound queues in the target system to control the processing of CIF queues under heavy data load.

The queue type (inbound or outbound) determines whether queue processing is controlled by the sending or the receiving system.

9.jpg

 

  • Set user parameters (Transaction code- CFC2)

    In this IMG activity, you can make user-specific entries for the following parameters:

    • You can set whether and how data records are logged in the application log for the specified user.
    • A user can use this functionality for debugging when queues are created for that user: any master data or transactional data created by that user is held in the CIF queue, where the user can debug and analyze the issue.
    • Maintain an entry with user name “*” for normal or detailed logging of the application log.
    • The RFC user should be maintained with normal logging.
    • The CIF administrator ID should be maintained for detailed logging, with debugging mode set to “ON”.

          10.jpg

 

  • Determine Number Ranges for Parallelization (Transaction code- CFC8)

          11.jpg

 

  • Define filter and selection block size (Transaction code- CFC3)
    • You determine the number of filter objects that are processed in one block in the APO Core Interface.
    • You also determine the number of data objects that are transferred to SAP APO in one remote function call (RFC) during the initial data transfer.
    • In normal cases, it is recommended not to change the default filter and block sizes.
    • You can use these settings to improve system performance during the initial data transfer. The optimum values vary from case to case and depend largely on the client's data situation, so you are recommended to experiment with the settings in your own system.

              12.jpg

   

                13.jpg

         

                14.jpg

                    Note: - Refer to OSS note # 436527 for the block size recommendation.

 

  • Configure change transfer for master data (Transaction code- CFC9)
    • All master data changes (e.g. material, customer, vendor, and resources) are configured for periodic transfer to APO.
    • During the initial transfer of a resource, external capacity is created for the resource for 30 days in the past and 730 days in the future.
    • It is recommended to create a single-mixed resource in APO for resources that have a single capacity and for which the indicator “Can be used by several operations” is not set.
    • It is recommended to create a multi-mixed resource in APO for resources that have multiple capacities, or capacities for which the indicator “Can be used by several operations” is set.

          15.jpg

              Note: -

      • Master data change transfer from ECC to APO can be configured as immediate or periodic, depending entirely on the client requirement.
      • If both SNP and PP/DS are in scope, it is always recommended to use the resource types mentioned above. This is a very critical setting and cannot be altered later in either ECC or APO; to change the resource type you must delete the resource in APO, and that cleanup can be a mini-project.


  • Activate ALE Change Pointers Generally (Transaction code- BD61)

          This configuration is a prerequisite for transferring master data changes with change pointers.

            16.jpg


  • Activate ALE Change Pointers per Message Type

          The following change pointers are important to activate: vendor master, customer master, set-up group, source of supply, material MRP area, and subcontracting PDS.

17.jpg


Note: Activation of change pointers is only needed if the corresponding master data is used in the system.


  • Activate online Transfer Using BTE (Transaction code- BF11)
    • We activate BTE for the integration with SAP APO in order to enable the online transfer of transaction data changes and some master data changes, such as material master and resources. We set the ND-APO (New Dimension Plug-In APO) and NDI (New Dimension Integration) application indicators to active.

18.jpg

              Note: This is not the default setting, so make sure both of the above indicators are set.


  • QIN Scheduler (Transaction code- SMQR)
    • When APO sends planning results to the connected ECC system, it generates CIF queues in the inbound queue of the ECC system, as per the configuration. Inbound queues that are to be processed automatically by the target system must be registered in the QIN scheduler.
    • The QIN scheduler settings below have worked for most clients; changes can be made based on the queue processing.

19.jpg


  • QOUT Scheduler (Transaction code- SMQS)
    • Outbound qRFCs (outbound queues) are processed by the QOUT scheduler, which is configured using transaction SMQS. The target systems to which outbound qRFCs are sent are registered in the QOUT scheduler.
    • The QOUT scheduler settings below have worked for most clients; changes can be made based on the queue processing.

20.jpg



5.2.    Configuration in SCM APO System

      • Define Logical System (Transaction code- BD54)

We define a logical system for both the ECC and APO systems. To enable data transfer via the APO Core Interface (CIF), you need to name both the ERP system in which you are working and the SAP APO system to which you want to transfer data as logical systems.

21.jpg


Note: The above activity depends on the target system concerned; for this reason, there is no transport to the production system, and you need to make the settings for the relevant target system in the production system manually. Alternatively, you can maintain all logical system names (as per the SAP system landscape) in the development client and include them in a transport. This makes all logical systems available across the system landscape and minimizes manual activity.

 

  • Assign logical system to the Client (Transaction code- SCC4)

In this step, assign the logical system to a client. All fields such as City, Std. currency, and Logical system should be maintained; otherwise there will be spurious CIF errors during data transfer from ECC to APO.

22.jpg

Note: These settings cannot be transported. When a new system is being set up, they must be made after the system installation has been completed.

 

  • Setup RFC Destination (Transaction code-SM59)

In this step, create an RFC destination with the same name as the target logical system. This RFC destination enables the connection to the ECC system.

23.jpg

Performance tip: By activating load balancing and defining a logon group, you can use the load balancing procedure when connecting to the ECC system. The logon group is assigned to the various servers in transaction RZ12.

 

24.jpg

When APO connects to the ECC system, it always uses the ALEREMOTE user to process CIF queues.

 

  • Assign RFC Destinations to Various Application Cases (Transaction code- SPRO)
    • You assign the various application areas to the logical system and RFC destination. This configuration is optional and depends purely on the business requirements and guidelines for the type of RFC user to be used, for security reasons. Some applications only work with an RFC dialog user (e.g. displaying the application log from APO in the ECC system), but from a security point of view many clients (e.g. in life sciences or FMCG) do not allow the RFC user to be created as a dialog user, which could be misused.
    • To meet this requirement, create a new RFC destination with a different RFC user of type Dialog or Service with very limited, restricted authorization, provided only for the type of application to be used. When accessing data from the target system through a remote function call, the system then uses the RFC destination specified in the configuration “Assign RFC destinations to various application cases”.

              25.jpg


  • Maintain Business System Group (Transaction code- /SAPAPO/C1)
    • This configuration determines the assignment of this system, and of the respective ECC systems to be connected, to a business system group.
    • If this APO system is connected to multiple ECC systems that use the same number ranges for material master, plant, vendor, and customer master data, multiple BSGs need to be defined to bring in master data from each system.

26.jpg

 

  • Assign Logical system to Business System Group (Transaction code- /SAPAPO/C2)

In this step, to enable error-free communication, every source system (ERP system) must be assigned to a BSG. We assign the logical system to the BSG and set the queue type.

Here, we have also activated the CIF post-processing functionality for CIF error handling of transaction data.

27.jpg

Note: It is recommended to use inbound queues when transferring a large amount of data to the ERP system, to ensure an even load on the ERP system. Ensure that settings for the Queue-In (QIN) scheduler are maintained in the qRFC monitor on the ERP side.

 

  • Set User Parameter (Transaction code- /SAPAPO/C4)

In this IMG activity, you can make user-specific entries for the following parameters:

    • You can set whether and how data records are logged in the application log for the specified user.
    • A user can use this functionality for debugging when queues are created for that user: any master data or transactional data created by that user is held in the CIF queue, where the user can debug and analyze the issue.
    • Maintain an entry with user name “*” for normal or detailed logging of the application log.
    • The RFC user should be maintained with normal logging.
    • The CIF administrator ID should be maintained for detailed logging, with debugging mode set to “ON”.

              28.jpg


  • QIN Scheduler (Transaction code- SMQR)
    • When ECC sends master and transaction data to the connected APO system, it generates CIF queues in the inbound queue of the APO system, as per the configuration. Inbound queues that are to be processed automatically by the target system must be registered in the QIN scheduler.
    • The QIN scheduler settings below have worked for most clients; changes can be made based on the queue processing.

              29.jpg

 

  • QOUT Scheduler (Transaction code- SMQS)
    • Outbound qRFCs (outbound queues) are processed by the QOUT scheduler, which is configured using transaction SMQS. The target systems to which outbound qRFCs are sent are registered in the QOUT scheduler.
    • The QOUT scheduler settings below have worked for most clients; changes can be made based on the queue processing.

          30.jpg

 

  • Maintain Distribution Definition (Publication) (Transaction code- /SAPAPO/CP1)

To publish the planning results from the APO system to the ECC system, we maintain all the publication types for the locations whose orders we want to publish to ECC. This is maintained for both in-house production and external procurement.

              31.jpg

                  Note: If the distribution definitions are not maintained, the planning results will not transfer back to the connected ERP system, and the resulting data inconsistency between the ECC and APO systems will not even be captured by the CCR report.


5.3.  qRFC queue names for CIF

        qRFC queue names for the CIF are always set up according to the following rule:

        CF<CIF object ID><serialization character string>

        The CIF objects that are transferred from an ERP system to the APO system are listed below:

CIF object            ID     Serialization character string
Batch                 BTC    CHARG+(10) + MATNR+(9)
Resources             CAPA   NAME+(18)
Characteristic        CHR    ATNAM+(19)
Class                 CLA    KLART+(3) + CLASS+(16)
Inspection lot        LOT    PRUEFLOS+(17)
Material master       MAT    WERKS+(4) + MATNR+(14)
Planned order         PLO    ORDNR+(12)
Confirmation          PPC    ORDERNR
Reservation           RSV    ORDNR+(12)
Purchase order        PO     DOC+(10)
Purchase requisition  PO     DOC+(10)

 

 

The CIF objects that are transferred to an ERP system from SAP APO are listed below:

CIF object               | ID  | Serialization character string
-------------------------|-----|-------------------------------
Delivery confirmation    | CD  | ORDERNO
Confirmation             | CF  | GUID or ORDERNO
VMI sales order          | CO  | ORDERNO
Delivery                 | DLV | 001
Purchase requisition     | EP  | GUID or ORDERNO
Purchase order           | EP  | GUID or ORDERNO
Planned order            | IP  | GUID or ORDERNO
Process/production order | IP  | GUID or ORDERNO
Production campaign      | PC  | GUID or ORDERNO
Manual reservation       | RV  | GUID or ORDERNO
Shipment                 | SHP | 001
Stock transport order    | TO  | GUID or ORDERNO

 

 
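The naming rule above (CF + object ID + truncated key fields) can be sketched as a small helper. This is a hypothetical illustration, not SAP code; it assumes each key field is simply cut to the stated width, mirroring the WERKS+(4) + MATNR+(14) notation in the tables.

```python
def cif_queue_name(object_id, *fields):
    """Build a CIF qRFC queue name: CF<object ID><serialization string>.

    Each (value, width) pair is truncated to the stated width, e.g.
    WERKS+(4) + MATNR+(14) for the material master object (ID "MAT").
    Hypothetical helper for illustration only.
    """
    serialization = "".join(value[:width] for value, width in fields)
    return "CF" + object_id + serialization

# Material master queue for plant 1000, material MAT-000042:
print(cif_queue_name("MAT", ("1000", 4), ("MAT-000042", 14)))
# CFMAT1000MAT-000042
```

The same helper covers the other rows of the table, e.g. `cif_queue_name("BTC", (batch, 10), (material, 9))` for batches.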

5.5.  CIF Interface Configuration Checklist

          Below is a CIF configuration checklist (tracker) that can be maintained to monitor the CIF configuration in the ECC and APO systems simultaneously. This helps ensure that no configuration step is missed.


S.No | Configuration node                             | APO                  | APO status  | ECC                  | ECC status
-----|------------------------------------------------|----------------------|-------------|----------------------|------------
1    | Name Logical System                            | Basis                | Completed   | Basis                | Completed
2    | Assign Logical System to a Client              | Basis                | Completed   | Basis                | Completed
3    | Specify SAP APO Release                        | NA                   | NA          | Basis/Functional     | Completed
4    | Create RFC User                                | Basis                | Completed   | Basis                | Completed
5    | Set Up RFC Destination                         | Basis                | Completed   | Basis                | Completed
6    | Set Target System and Queue Type               | CIF Functional       | Completed   | CIF Functional       | Completed
7    | Set User Parameters                            | Basis/CIF Functional | Completed   | Basis/CIF Functional | Completed
8    | Configure Application Log                      | CIF Functional       | Not Started | CIF Functional       | No Change
9    | Determine Number Ranges for Parallelization    | NA                   | NA          | Basis                | No Change
10   | Set Filter and Selection Block Size            | No Change            | No Change   | No Change            | No Change
11   | Configure Change Transfer for Master Data      | NA                   | NA          | CIF Functional       | Completed
12   | Activate ALE Change Pointers Generally         | NA                   | NA          | CIF Functional       | Not Started
13   | Activate ALE Change Pointers for Message Types | NA                   | NA          | CIF Functional       | Not Started
14   | Activate Online Transfer Using BTE             | NA                   | NA          | CIF Functional       | Completed
15   | Activate Cross-System Update Logic             | NA                   | NA          | CIF Functional       | Completed
16   | Maintain Business System Group                 | CIF Functional       | Completed   | NA                   | NA
17   | Assign Logical System and Queue Type           | CIF Functional       | Completed   | NA                   | NA
18   | Maintain Object-Specific Settings              | CIF Functional       | Not Started | NA                   | NA
19   | Maintain Publication Settings                  | CIF Functional       | Not Started | NA                   | NA

 

S.No | Activity                                                    | APO        | APO status | ECC        | ECC status
-----|-------------------------------------------------------------|------------|------------|------------|------------
1    | Set up outbound scheduler                                   | Basis      | Completed  | Basis      | Completed
2    | Set up inbound scheduler                                    | Basis      | NA         | Basis      | NA
3    | Activate outbound scheduler for CIF                         | Functional | Completed  | Functional | Completed
4    | Activate inbound scheduler for CIF                          | Functional | NA         | Functional | NA
5    | Create integration model                                    | NA         | NA         | Functional | Completed
6    | Activate integration model                                  | NA         | NA         | Functional | Completed
7    | Maintain publication settings (after master data transfer)  | Functional | Completed  | NA         | NA

 

 

6. Integration of Master data and Transactional data (Design of Integration Models)

The integration model controls the transfer of master data and transaction data. It is generated in the ERP system and contains all data that is to be transferred to the SCM system. It is uniquely identified by its name and application. Working with integration models involves two steps:

  • Create & Generate Integration models (Transaction code- CFM1)

          When you generate an integration model, you specify which data objects are to be selected from the total dataset in the ERP system for the transfer. To create the integration model, follow the steps below:

    • First, select the object types (for example, material masters) on the Create Integration Model selection screen.
    • Next, enter specific selection criteria (in most cases, a material/plant combination) that further restrict the object types you have already selected. If you have selected material masters, for example, you could now enter an MRP controller. In this way, you define filter objects.
    • Filter objects are used to select which data objects are transferred to a specified SCM system. In the example, all material masters for a particular MRP controller are selected.

 

  • Activate Integration model (Transaction code- CFM2)
    • The activation of integration models results in an initial transfer. If you work with SAP APO, the online transfer of transaction data is released.
    • As standard, the integration models to be activated are compared with the integration models that are already active. You can generate multiple integration models. However, only one version can be activated for each model at a time.
    • You can activate and deactivate several integration models simultaneously.
    • Integration models must remain active to enable online transfer.
    • Activation of master and transaction data integration models should follow a logical sequence; the recommended grouping is as follows:

 

      • ATP Customizing, Setup groups (group/key) and product allocation
      • Plants & distribution centers
      • Change master records
      • Class & characteristics
      • Material master (+ classes, characteristic)
      • Batches
      • MRP area and material master for MRP areas
      • Planning Products
      • Materials for GATP check
      • Product allocation
      • Vendors
      • Customers
      • Work Centers
      • Production data structure
      • Purchase Info Record, scheduling agreement, contracts

 

Note: There are normally some challenges in generating and activating material-dependent integration models for master and transaction data during the initial transfer when the number of master data objects to be transferred is huge. In that case, the integration models need to be designed so that each IM contains an optimum number of data objects.

The critical data objects are:

  • Material master
  • Purchase Info record/source list
  • Sales Orders
  • Stocks
  • Batches

 

The following recommendations can help complete the initial data transfer successfully:

  • Split the material master integration model by material type, MRP controller, material number range, or a combination of these.
  • If a data split is not feasible, add data objects sequentially: first create and activate the IM for, e.g., material type FERT; the next time, add further material types to the same variant and activate it. This way there is no heavy load on the system at any one time, and the load is distributed over multiple IM transfers.
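The first recommendation, splitting a large material list across several integration models, amounts to simple batching. The sketch below is hypothetical (it splits by a fixed batch size; in practice you would split by material type, MRP controller, or number range as described above):

```python
def split_into_models(materials, batch_size):
    """Split a large material list into batches, one batch per integration
    model variant, so no single initial transfer overloads the system.
    Illustrative only; real IM splits use material type / MRP controller."""
    return [materials[i:i + batch_size]
            for i in range(0, len(materials), batch_size)]

materials = ["MAT-%06d" % n for n in range(10)]
for idx, batch in enumerate(split_into_models(materials, 4), start=1):
    print("Integration model variant %d: %d materials" % (idx, len(batch)))
```

Each resulting batch would become the selection for one IM variant, created and activated one after another.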


6.1. Data Flow between ECC to APO & its Frequency

    • Master Data flow from SAP (ECC) --> APO

Master data      | Daily batch job (once a day)     | Manually
-----------------|----------------------------------|---------------------
Plant            | No                               | Periodic, need basis
Vendor, Customer | Yes; create new, change existing | Need basis
Material         | Yes; create new, change existing | Need basis
Info record      | Yes; create new, change existing | Need basis
PDS              | Yes; create new, change existing | Need basis

 

    • Transaction Data Flow SAP (ECC) --> APO

Transactional data    | Daily batch job (once a day)                    | Real time
----------------------|-------------------------------------------------|---------------
Purchase requisitions | Yes; activate IM for new APO-relevant materials | Create, change
Inspection lots       | Yes; activate IM for new APO-relevant materials | Create, change
Purchase orders       | Yes; activate IM for new APO-relevant materials | Create, change
Stocks                | Yes; activate IM for new APO-relevant materials | Create, change
Stock transfer orders | Yes; activate IM for new APO-relevant materials | Create, change
Sales orders          | Yes; activate IM for new APO-relevant materials | Create, change

 

          Note: The data transfer frequencies from ECC to APO listed above are recommendations based on various project experiences; they can vary from project to project depending on business requirements.


6.2. Publication of procurement proposal from APO to ECC (Transaction code- /SAPAPO/C5)

    Transactional data can be transferred from APO to ECC in two ways:

  • Periodically – recommended for transferring SNP planning results to ECC. It improves system performance, reduces locking issues, and leaves time to review and correct the planning results before publishing them to ECC.
  • Immediately/real time – ensures real-time data transfer and better consistency between APO and ECC. Recommended for short-term planning methods, e.g. PP/DS and CTM.

 

---------- Contd. in PART-2 ----------

SCM Core Interface- Handbook (PART-2)


In continuation of PART-1

 

7.  CIF Error Handling and Queue Management

7.1.  Activate CIF Error Handling in SCM APO System

    • Transaction code- /SAPAPO/C2

              32.jpg

                  Here we choose the error handling method "Postprocessing of errors, no splitting of the LUW" in order to activate post-processing for the logical systems.

If you are transferring a large amount of data from SAP APO to SAP R/3 and want to ensure that an even system load is placed on SAP R/3, choose Inbound Queues.

Since we chose Inbound Queues in SAP APO, we made the necessary settings for the QIN scheduler (queue-in scheduler) in the qRFC monitor in SAP R/3, as explained above. Queues that are to be processed automatically by SAP R/3 must be registered in the QIN scheduler.


    7.2.  CIF Error Handling (Post Processing)

    • Transaction code- /SAPAPO/CPP1

This functionality in APO allows the CIF user to see all logged CIF error messages centrally in APO. The process is independent of the queue type (inbound or outbound), and CIF errors in both systems (ECC and APO) can be handled from this one transaction in APO.

CIF error handling ensures that all CIF queue entries are processed during the data transfer. Faulty queues no longer lead to queue blocks; instead, they are logged as post-processing records in the relevant target system for the data transfer. You can call these post-processing records at a later point in time in CIF post-processing. Once the error has been corrected, you can send the objects to the relevant target system again.

If a change to transaction data cannot be posted in the target system due to an error, the system creates a post processing record with the error status Faulty for the object concerned.

CIF error handling is not available in the following situations, which means that CIF queues hang when errors occur:

      • During the initial data transfer (master data and transaction data)
      • During the transfer of master data (initial and change transfer)
      • When short dumps occur or liveCache is unavailable
      • When the target system is unavailable
      • When an object is locked in the target system (as before, for the repetition of the transfer)
      • If errors occur in customer exits or BAdIs that run in CIF inbound function modules during integration

You can find information about restrictions to CIF error handling when using certain applications and functions in SAP Note 602484.

Since not all errors are included in CIF error handling, faulty queue entries may continue to exist once CIF error handling has been activated. Faulty queue entries can also block objects that are resent by CIF post processing. Therefore, you still need to monitor CIF queues by using the qRFC monitors for inbound or outbound queues, or by using the SCM Queue Manager.

 

  • Steps for CIF Post Processing
    • Invoke transaction /SAPAPO/CPP1 in APO. Choose the target system; this should be the connected ECC system. Make sure that the indicator "Select Data from R3" is switched on. The processing status should be "1" to select the data that needs to be processed; however, you can select other statuses, such as processed or obsolete, depending on the requirement.

              33.jpg

    • The navigation tree on the left side shows the system connection and the transfer directions; under each transfer direction there are different order categories.
    • Select and process the records for each group of order categories, e.g. "In-House Production", "External Procurement", etc.
    • The messages appear in detail on the right side of the screen when you double-click the rows for External Procurement or In-House Production.
    • The right side of the screen has two sections: the worklist of the inbound system (top) and the objects processed in this session (bottom).

              34.jpg

    • Select the records to be processed. If there are too many records, you can sort them into different groups, for example by product and location, and process one group at a time.

              35.jpg

    • Depending on the object type and the direction of the record, select either Send to R/3 or Send to APO.

         36.jpg

 

    • Once the records are in process, they are moved from the "Worklist" window to the "Objects Processed" window.

                 37.jpg


    • Refresh the "Objects Processed" window by clicking the refresh button. If the errors persist, this is reflected in the status. Sometimes, however, the status does not show the actual error that occurred.

                38.jpg


    • To find the actual error, select the desired entry and click the application log button.

               39.jpg


    • The application log will be displayed

                40.jpg

      • Once the issue has been resolved, move the records back to the "Worklist" window to be processed again by selecting the record in the "Objects Processed" window and clicking the corresponding icon. Then you can process the object once again.
      • Resolve all the issues until the "Objects Processed" window is clear, if possible.
      • Use the Set Entry as Obsolete indicator to set the processing status of a post-processing record to Obsolete (Set Manually), for example if you do not want an object to be sent again. The object itself is unaffected by this action.
      • Use the Remove Obsolescence Indicator to reset the processing status to Still for Processing. As a result, the post-processing record is displayed in yellow.


7.3.  Queue Management

As mentioned in the restrictions of CIF post-processing, not all errors are captured there. The CIF queues therefore still need to be monitored.

    • CIF Queue Monitoring Steps and Procedure

Transactions to check for Errors (R/3 and APO)

SMQ1 - qRFC Monitor (Outbound Queue)

SMQ2 - qRFC Monitor (Inbound Queue)

41.jpg

42.jpg

      • Click on Change View.
      • This shows only the error (SYSFAIL) queues with the error message.


  • /SAPAPO/CQ - CIF queue manager

43.jpg

                    Note: The expand mode indicator is performance-intensive. The transaction takes slightly longer if you set this indicator.

 

                    44.jpg


      • Double-click the error message in the left-hand box so that the details of the error appear in the right-hand box:

                    52.jpg

Note:

      • Do not delete any queue from this screen, as this deletes the error queue as well as the queues that are waiting behind it.
      • Instead, select the queue and click the "engine" icon. This takes you to the familiar SMQ2/SMQ1 screen, where you can delete the single error queue and then reprocess the waiting queues.

 

  • Evaluate CIF Application Log

          Sometimes the error message does not provide detailed information such as product and location. The additional details can be seen in the CIF application log. Proceed as follows:

         

               > In APO: Double-click the CIF error message within the CIF entry (transactions SMQ2 and SMQ1), or execute transaction /n/SAPAPO/C3


               > In R/3: To get the details of the error message, use transaction CFG1 - Display CIF Application Log

      • In both cases, copy the External or Transaction ID to get the details of the error message.

               45.jpg

    • Enter the External ID, the From date (yesterday) and the To date (today), and execute.

     This gives the details of the error message displayed in transaction SMQ2 or SMQ1.

 

  • CIF Error Resolution Process
    • Each CIF error should be evaluated against the list of known errors and actions below. Any new error should be investigated and resolved where possible, and the list of known errors and actions updated.
    • Once the product, location and order number (if relevant) have been recorded along with the error message and the necessary action, the CIF entry can be deleted.
    • If a new error comes up that needs time to investigate, the queue can be saved using the corresponding option in the Edit menu.
    • Once the saved queue has been investigated and the root cause analysis completed, the saved queue can be restored, again via the Edit menu.
    • An email should be sent to the key user clearly detailing the product, location and order (if relevant) and the action(s) to be taken.
    • When the key user confirms that the action has been completed, an ad-hoc CIF reconciliation report should be run for the relevant product(s) to correct any differences between ECC and APO.
    • If no confirmation from the key user is received, the differences will be picked up in the weekly CIF reconciliation report.

 

8. Practical Challenges

There are various challenges that a CIF administrator or functional consultant will encounter while integrating the ECC and APO systems. Based on various project experiences, the following are some of the issues encountered.

 

8.1. Huge CIF queue build-up in the APO inbound

If you are using inbound queues in both the ECC and APO systems, you might encounter this issue.

 

When data is transferred from ECC to APO, the data is not visible in APO. Similarly, when the planning results from APO are transferred to ECC, the results are not visible in ECC and are not returned to APO with the ECC order number range. This applies to both master and transaction data.

All the transferred data is blocked in the CIF inbound queue (SMQ2) with "Ready" status, and each LUW has to be activated manually, which can take a long time.

These queues are not displayed in the CIF queue manager (/SAPAPO/CQ). If you run the reconciliation report (/SAPAPO/CCR), the differences between the ECC and APO changes are shown. But if you then transfer the differences from the CCR report, one more record is added to the APO inbound queue. It is therefore always recommended to check the CIF blocks, clear them, and only then use CCR for reconciliation.

Root cause: This issue mainly occurs when the inbound schedulers are either not registered in SMQR, or registered but inactive (Type "U"; it should be "R") in the ECC and APO systems.

 

           Example: - Inbound scheduler is registered but inactive (Type-U) in APO.

           46.jpg

               You have made a change to an existing purchase requisition in ECC and saved it. This change should be transferred to APO immediately, but when you look in APO, the change is not reflected.

     The changes are blocked in the APO inbound queue with "Ready" status and need manual intervention to clear them.

     47.jpg


8.2.  Master & Transaction Data change transfer to APO is not real time

This issue applies to all types of transaction data, and only to those master data objects that are configured for immediate transfer, e.g. material master, vendor and customer master, and resources.

 

When there are changes to existing master data objects or transaction data that are integrated with APO and part of an active integration model, the changes are still not reflected in APO. You will not find any queue blocks for these changes. If you run the reconciliation report in APO for the transaction data consistency check (/SAPAPO/CCR), you will not find any records with differences. The changes are only transferred to APO when the CIF job runs for the respective integration model.

 

Root cause: This issue mainly occurs when the online transfer using business transaction events (BTE) has not been activated for the applications "ND-APO" and "NDI".

For master data, additionally check whether the change pointers are active for the message types of the objects mentioned above.

       48.jpg


8.3  Change pointers for master data changes are not recorded

This issue applies to all types of master data objects that are relevant to APO planning.

 

When there are changes to existing master data objects that are integrated with APO and part of an active integration model, the changes are still not reflected in APO. You will not find any CIF queue blocks for these changes, and the changes are not transferred to APO even when the CIF job runs for the respective integration model.

If you check tables BDCP and BDCPS (change pointers), you will not find the change pointer you are looking for.

 

Root cause: This issue mainly occurs when the ALE change pointers have not been activated globally in transaction BD61, or the change pointer for the specific message type of the master data object has not been activated in transaction BD50, shown below.


        49.jpg


8.4  Master Data Changes are not getting transferred to APO

This issue applies to all types of master data objects that are relevant to APO planning.

       

When there are changes to existing master data objects that are integrated with APO and part of an active integration model, the changes are still not reflected in APO. You will not find any CIF queue blocks for these changes, and the changes are not transferred to APO even when the CIF job runs for the respective integration model.

You have checked that the change pointers are active globally and for all required master data message types.

 

Root cause: This issue mainly occurs when multiple integration models are active for the same object. Check the active integration models for the master data object using transaction CFM5 in ECC.

Conclusion: There should be only one active integration model for any unique master data object.



8.5  Custom field value change in material master is not triggering change transfer to APO

This issue applies to all types of master data objects that are relevant to APO planning and have been enhanced with custom Z fields.

   

When the value of a custom Z field changes in an existing master data object (an enhanced master data object with Z fields relevant to APO planning), the change is not recorded and not transferred to APO. The objects are integrated with APO and part of an active integration model, and there are no CIF queue blocks for these changes. The changes are only transferred to APO during an initial transfer of the respective master data objects.

 

You have checked that the change pointers are active globally (transaction BD61) and for all required master data message types (transaction BD50).

 

Root cause: This issue mainly occurs when custom Z fields have been added to the APO-relevant master data object, but the corresponding tables and fields have not been added in transaction BD62 for the specific message type.

 

Conclusion: All relevant fields of the required master data objects should be maintained in transaction BD62, so that any change in ECC triggers the change transfer.



8.6  APO orders are not getting transferred from APO to ECC

This issue applies only to transaction data that is planned in APO and has to be sent back to the ECC system.

   

The planning run has created procurement proposals in APO that have to be sent back to ECC, but the new or changed proposals are not reflected in the ECC system. The transaction data is integrated with APO and part of an active integration model, there are no CIF queue blocks for these changes, and the changes are not reflected even in the reconciliation report (/SAPAPO/CCR).

 

You have checked that all the basic CIF configuration is maintained.

 

Root cause: This issue mainly occurs when the publication definitions have not been maintained in APO for the combination of plant and ECC logical system (transaction /SAPAPO/CP1).



8.7   Poor performance of CIF background job

 

  • First, check the settings in transaction CFC2. It is recommended to use "Normal" logging to keep system performance good. Administrators responsible for resolving interface issues can be given detailed logging access with "debugging on".
  • Second, schedule the performance-related CIF batch jobs regularly, at least once a week:
    • Archive logs using transaction CFGD
    • Configure the application log using transaction CFC6
  • Third, check the block size settings of the filter objects using transaction CFC3. The block size of the filter objects affects system performance.
  • The block size to use in each individual case depends largely on the current data situation.
  • There is no rule of thumb for defining the block size; finding the optimum filter size is a matter of trial and error. It is recommended to use the default settings as a starting point.
  • Fourth, use parallelized processing wherever it is available.



8.8  Common configuration objects are not in sync between ECC & APO

There are various common configuration objects in ECC and APO that should be in sync. This is one of the prerequisites before you initiate the master data transfer from ECC to APO. If the configuration is not in sync (an object is missing in APO), there will be errors during the initial CIF data transfer.

Some of the common configuration objects are:

    • Insert Regions
    • Currencies
    • Calendar
    • Unit of Measure
    • Time Zones

 

Root cause: Standard configuration has been changed in ECC for units of measure or calendars, or new configuration objects have been added. These changes must be updated in APO.

 

Solution: You can update the units of measure, currencies and calendars using transaction RSA1, provided the connected ECC system has been created and is active under the SAP source systems. Use "Transfer Global Settings" from the context menu.

          50.jpg

              

               For other configuration objects, either compare and maintain them manually if they are missing, or use the compare tool under the Utility menu to compare the objects with the connected ECC system and update all differences in one go.

          51.jpg    


9.  CIF Housekeeping Job

  • Report – RAPOKZFX

     Detect and correct inconsistencies between the material master and integration models with report RAPOKZFX. In rare cases, inconsistencies can occur between the data in integration models and the field APOKZ in table MARC. They may occur if you activate a model that refers to a material master that is being changed at the same time. In this case, the activation finishes successfully but APOKZ is not set correctly, and an error message is displayed. The inconsistency can result in errors during the ATP check and when transferring production and planned orders.

 

  • Report – RCIFIMAX

     As of R/3 Plug-In 2002.1, report RCIFIMAX should be scheduled regularly to find inconsistencies between the integration model sources and their runtime versions. This report must not be run in parallel with activations of integration models.

 

  • Report – RSQOWKEX & RSQIWKEX – (Exceptional use only)

You can activate qRFC queues using the reports RSQOWKEX (outbound queues) and RSQIWKEX (inbound queues). In normal operation, however, it is not necessary to run these programs regularly because almost all queue entries are processed without errors. In case of queue errors, these should be detected by the procedures described below, and analyzed and corrected accordingly. The error analysis should suggest preventive measures to reduce the number of future exceptions. In exceptional cases, or, for example, on test systems, you can use reports RSQOWKEX and RSQIWKEX. If you start these reports at an inappropriate time or with too many queues selected, they may cause an excessive additional system load.

 

  • Report - /SAPAPO/CIF_DELTAREPORT3

Detect and correct external inconsistencies between APO and R/3 with report /SAPAPO/CIF_DELTAREPORT3 (transaction /SAPAPO/CCR). To ensure that all relevant transaction data objects (such as purchase, production or sales orders, and stocks) for which there are active integration models exist in both APO and R/3, this report should be scheduled to run:

    • Periodically, and preferably daily, to detect and reconcile possible inconsistencies as soon as possible. This is important because otherwise further inconsistencies can be generated and cause subsequent planning to be based on incorrect data.
    • In case a recovery of your liveCache or your APO database had to be executed but was incomplete (point-in-time recovery, loss of data)
    • In case you have evidence of inconsistencies between your APO and your R/3 OLTP system
    • In case queue entries have been deleted

 

  • Report - /SAPAPO/OM17 -  ( In case of Recovery only)

Internal consistency between the APO database and liveCache is checked by transaction /SAPAPO/OM17. If it is necessary to reconcile internal consistency, for example in case of a recovery, we recommend doing this first, before checking and reconciling external consistency.

 

 

10.  Performance Optimization Job

To optimize the performance of the data transfer between the APO and the connected R/3 OLTP system(s) and to prevent accumulation of useless data in the systems, several reorganization jobs must be scheduled to run regularly.

  • Administration Jobs Related to Data Transfer (R/3)
    1. Delete the application log with report RDELALOG. If writing of application logs is enabled (in R/3 transaction CFC2, or APO transactions /SAPAPO/C4 or /SAPAPO/C41) – and in a production system this should be done only for certain users and for problem analysis – old logs must be deleted regularly. It is recommended to run this job daily and delete logs older than 7 days.
    2. Delete ALE change pointers with report RBDCPCLR. If changes to master data are transferred periodically via ALE (as recommended), processed change pointers must be deleted regularly. After completing this, if your database system on the R/3 side is Oracle, run report RBDCPIDXRE to reorganize the Oracle indexes on tables BDCP and BDCPS. See SAP Note 328355.
    3. Delete old inactive integration model versions with report RIMODDEL. Every time an integration model is generated, a new version is created, distinguished by a timestamp. The old version is deactivated and the new one is activated. Old versions must be deleted regularly.

 

  • Administration Jobs Related to Data Transfer (APO)
    1. Delete application log with report /SAPAPO/RDELLOG. Same as RDELALOG in R/3 (see above).
    2. Check processing of APO change pointers with report /SAPAPO/RDMCPPROCESS. To verify that all change pointers created are processed, after publishing of planning results to R/3 run report /SAPAPO/RDMCPPROCESS without restricting the selection of orders and confirm that message “No change pointers were selected” is displayed. If change pointers remain unprocessed, contact the application support team to clarify whether these change pointers are necessary and why they are not processed.
    3. Deletion of R/3 data those are no longer required in APO with report /SAPAPO/SDORDER_DEL.
    4. Delete old results of CIF delta report using report /SAPAPO/CIF_DELTAREPORT3_REORG. As it is now possible to save the results of a Delta report run, it is necessary to delete outdated results from the database. The spool list from this report contains the number of records deleted.
    5. Delete post-processing records with report /SAPAPO/CIF_POSTPROC_REORG. Processed and obsolete post-processing records are no longer required and should be deleted with this report. If these records are not deleted, CIF performance degrades increasingly over time. The deletion is a two-step process: in the first run, outdated records matching the selection criteria that still have the status "to be processed" are set to "obsolete"; in the second run, all processed and all obsolete records are deleted.
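The two-step deletion in point 5 can be automated in the same way — again a sketch with assumed variant names (one variant per step, maintained in SE38 beforehand):

```abap
REPORT z_cif_postproc_reorg_apo.

* Step 1: set outdated 'to be processed' records to 'obsolete'.
* Step 2: delete all processed and obsolete records.
* The variant names SET_OBSOLETE and DELETE_ALL are assumptions.
SUBMIT /sapapo/cif_postproc_reorg USING SELECTION-SET 'SET_OBSOLETE' AND RETURN.
SUBMIT /sapapo/cif_postproc_reorg USING SELECTION-SET 'DELETE_ALL'   AND RETURN.
```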


  • Note:

          a)     Deleting change pointers may cause inconsistencies, as the corresponding order changes are not transferred to R/3.

 

In SAP APO, certain database tables grow with data from SAP R/3 documents even though this data is no longer required and no corresponding information exists in liveCache. In addition, the performance of the initial data supply, or of other transfer processes with a high data volume, is affected negatively. The obsolete records need to be deleted regularly to control the size of certain tables (e.g. /SAPAPO/SDFIELD and /SAPAPO/POSMAPN) and to improve the performance of sales order updates on the SAP APO side. For details, see SAP Note 504620.

 


11.  Real CIF Errors Encountered in Various Projects


  • Error: CFEP000000043548  ED1CLNT120            @OJ@            Internal number assignment not defined

Cause: No internal number range is maintained for purchase requisition document type “NB”.

Solution: Create a purchase requisition number range in ECC and enter it as the internal number range for document type NB (SPRO → Materials Management → Purchasing → Purchase Requisition → Define Document Types).

 

  • Error: CFEP000000043548  ED1CLNT120            @OJ@            Enter Purch. group

Cause: The purchasing group is not maintained in the ECC system for the given material.

Solution: Maintain the purchasing group in the material master and reactivate the queue.

 

  • Error: CFIP009000001374   ED1CLNT120            @OJ@            21.12.2015 date comes after end of valid factory calendar

Cause: The planning calendar's validity ends in 2010, while the order dates lie in 2013.

Solution: Extend the planning calendar.

 

  • Error: CFIP000100000896   ED1CLNT120            @OJ@            Messages for WM staging: Check order 100000896

Cause: The production supply area has not been maintained in the resource or in the BOM. The system first reads the production supply area from the resource master; if it is not maintained there, it looks at the BOM component.

This check is relevant only if Warehouse Management is active.

 

        Note: The production supply area is maintained in SPRO (Logistics Execution → Warehouse Management → Interfaces → Define Production) or via transaction OMK0.

 

Solution: Maintain the production supply area and assign it either to the resource (for the master recipe to which the component is assigned) or directly to the BOM component. Ensure that the storage location assigned to the production supply area is also extended for the component in transaction MMSC.

 

 

  • Error: CFIP0000072359       EQ1CLNT210            @OJ@            Backflushing only possible for materials kept in stock

Cause: The backflush indicator in the material master is “1” (always backflush), while the valuation class in the material master is WIPS – WIP inventory stock.

Hence, while converting the planned order into a process order, the system reads the accounting view data to calculate the cost and determine the accounts using the valuation class; at this point the CIF error is raised.

Note: The quantity and value updating fields in the material type configuration are not relevant to this error.

Solution: Either change the valuation class in the accounting view of the material master, or change the backflush indicator from “1” to blank.

 

  • Error: CFPLO000100000382           MQ1CLNT210           @OJ@            Two constraints that exclude each other or create unrealizable situation

Cause:

A process order was created directly in the ECC system and sent to APO. The master recipe exists in ECC, but the production version was neither created nor CIFed to APO; hence the error occurred.

 

Note – The following was observed:

  • Delete the queue.
  • Create the production version in ECC, but do not CIF it to APO.
  • If you now create the process order in ECC, it transfers to APO without error, but the order in APO has no source of supply, because the PPM has still not been CIFed to APO.
  • All operations also appear in APO, but without operation descriptions.
  • You can also schedule the order in APO without the PPM, but the dates are then calculated from ECC master data, not from APO master data.

 

     Resolution: Create the production version in ECC and send it to APO.

 

  • Source of supply () not in source list (Material/Plant) despite source list requirement
    • The vendor is not included in the source list (if the source of supply is a purchase info record).
    • The vendor/scheduling agreement is not included in the source list for the material/plant.
    • The validity dates are not in the range of the purchase requisition.
    • The validity dates maintained in the purchase info record differ from those valid in the source list.
    • The validity dates of the source of supply in APO are inconsistent with the validity dates in the ECC system.
    • The source list indicator is set in the material master “Purchasing” view.
    • Source list maintenance is mandatory at plant level irrespective of the procurement type (in-house or external).

     Solution:-

    • Maintain the vendor or scheduling agreement in the source list with consistent validity dates in ECC and APO.
    • The validity dates of the outline agreement and the purchase info record should be the same in ECC and APO.
    • If the same error persists after removing the source list indicator from the material master, then the source list indicator is set as mandatory for that location in the source list configuration.


  • Not possible to determine shipping data for material MAT1  at YYYY
    • Check the sales organization data of the material master; it should be maintained with the correct sales area as maintained in the customizing “Set Up Stock Transport Order”.
    • Make sure that the customizing for “Set Up Stock Transport Order” is correctly maintained. Ask the user to check this and, if necessary, involve the R/3 support team.
    • To check this customizing, follow the path: SPRO → Materials Management → Purchasing → Purchase Order → Set Up Stock Transport Order.

               Here please make sure that

      1. The receiving plant is assigned to the correct customer number created for the sales area of the supplying plant.
      2. The correct delivery document type (NL for intra-company, NLCC for inter-company) has been assigned to the supplying plant.
      3. The correct purchasing document type is assigned to the combination of receiving plant and supplying plant.

 

  • No sales area is assigned to sold-to party XXXXXXXXX and plant YYYY
    • Check in R/3 customizing whether a sales area is assigned to this VMI customer and plant combination. The path is: SPRO → Integration with Other SAP Components → Advanced Planning and Optimization → Application-Specific Settings and Enhancements → Settings and Enhancements for Sales Orders → Settings for Vendor-Managed Inventory → Assign Sales Area and Order Type to Ordering Party/Plant.
    • This setting is required if the customer is a VMI customer for one plant for some products; without it, the sales orders will not get any numbers from R/3.



12.  CIF- Important T-codes


R/3:

Transaction code   Description
CFM1               Create integration model
CFM2               Activate/deactivate integration models
CFG1               View CIF application log
CFC2               User parameters for CIF
CFC3               Block sizes for initial transfer
CFM5               Filter object search in integration models
CFC1               Define logical systems as APO systems
NDV2               Setting of release level of APO systems
SMQ1/SMQ2          qRFC monitor incl. functions start, stop, execute
SM59               Definition of RFC destinations
SALE               Definition of logical systems

 

 

APO:

Transaction code   Description
/SAPAPO/C3         View CIF application log
/SAPAPO/C4         Setting of user parameters for CIF
/SAPAPO/C5         Send planning results to R/3
/SAPAPO/C1         Create business system group
/SAPAPO/C2         Assign logical systems to a business system group
/SAPAPO/CQ         SCM Queue Manager
/SAPAPO/CCR        Comparison/reconciliation tool
SMQ1/SMQ2          qRFC monitor incl. functions start, stop, execute
SM59               Definition of RFC destinations
SALE               Definition of logical systems
/SAPAPO/CPP        CIF post-processing

 

References

  • SAP Note 563806
  • SAP Note 369007
  • SAP Note 786446
  • SDN: www.sdn.sap.com
  • SAP Help: www.help.sap.com
  • For more information, visit the Supply Chain Management homepage



gATP/Rule based/ Enhanced Customizing


Rules Based ATP – Enhanced Customizing Settings

 

One of the powerful features of GATP is rules-based ATP, where a location/product can be substituted based on maintained rules. Check instructions are maintained to activate rules-based ATP, and rules are modeled to achieve various business objectives. A few common scenarios become complex due to the existing check instruction options.


For example, in scenarios where RBATP is not active for all materials, we need to define different check instructions, and it is another big task to assign the corresponding check mode to those materials, either by material master update or by enhancement.


Designing the solution becomes easier with the advanced check instruction options for RBATP. The advanced options are enabled by implementing a BAdI enhancement for customizing (see the Implementation section).

 

In newer versions, SAP provides advanced options related to RBATP. The following three areas receive extra options that help achieve better business results without enhancements:

 

  1. Start Rules
  2. Create Subitem
  3. Copy Segments

1.png

Screenshot: the RBA frame of the check instructions after implementing the advanced-settings BAdI

 

Start Rules:

In business use cases where RBATP is not needed for every scenario for the same material-location combination, it becomes difficult to maintain different check modes in order to use different check instructions. Previously this was achieved by having a separate check instruction for each check mode; check mode determination can be done through configuration (e.g. item-category-based determination) or custom enhancements.

 

With option Y, this situation becomes easy to handle: a single check mode and check instruction works for all material-location combinations, whether RBATP is required or not. You only need to model rule determination as per the requirement.

 

Option Y essentially eliminates the need for multiple check instructions when RBATP is not required in all scenarios, and it avoids errors when a rule is missing.

2.png

X – Evaluate rules immediately (similar to the old option “start immediately”)

Y – Evaluate rules immediately; if no rules are found, check the original requirement (new option, very useful when a rule is not maintained for every combination)

Blank – Evaluate rules only if the requirement is not fully confirmed (similar to the old option)
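The three start-rule options can be summarized in pseudocode-style ABAP. This is an illustrative sketch only — the FORM names and variables are invented, not SAP's actual implementation:

```abap
DATA: start_rules_option TYPE c LENGTH 1,
      rules_found        TYPE abap_bool,
      fully_confirmed    TYPE abap_bool.

CASE start_rules_option.
  WHEN 'X'.        " evaluate rules immediately
    PERFORM evaluate_rules CHANGING rules_found.
  WHEN 'Y'.        " evaluate rules; fall back to the original requirement
    PERFORM evaluate_rules CHANGING rules_found.
    IF rules_found = abap_false.
      PERFORM check_original_requirement CHANGING fully_confirmed.
    ENDIF.
  WHEN space.      " rules only if the requirement is not fully confirmed
    PERFORM check_original_requirement CHANGING fully_confirmed.
    IF fully_confirmed = abap_false.
      PERFORM evaluate_rules CHANGING rules_found.
    ENDIF.
ENDCASE.
```

Option Y is thus the only variant where a missing rule silently falls back to checking the original requirement instead of raising an error.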

 

Create Subitem:

New line items have been a pain point in most rules-based ATP substitution cases. SAP now allows suppressing subitems to avoid the TAPA line item in the case of location substitution, and in the enhanced customizing we see two more options for merging subitems.

Specifies whether subitems of the input location product are to be merged into their main item after the execution of rules-based ATP (RBA).

 

M – Merge subitems into main item (new)

As of SCM 7.0, you use this value if you do not want subitems to be created when the main item and the confirmed substitutions have the location product and other data in common. Under certain conditions, RBA can include the confirmations of the confirmed substitutions in the main item and does not create subitems for them (this procedure is called “merge subitems”). Note that the system is not allowed to perform a merge if both a partial confirmation and a remaining requirement exist for an item.

 

As of SCM 7.01, the merge option is available after activating the corresponding business function. If you have selected the Merge Subitems checkbox, the system is allowed to perform a merge if a remaining requirement exists for an item.


N – Never Create Subitems (New)

You can use the value N in RBA scenarios in which you want to prevent the creation of subitems.


X – Always create Subitems (Old)

 

Blank  - Create Subitems (Old)
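The merge condition described for option M can be sketched as follows. This is illustrative only — the types and field names are invented for the example, not SAP's actual code:

```abap
TYPES: BEGIN OF ty_item,
         locprod             TYPE string,
         confirmed_qty       TYPE i,
         remaining_qty       TYPE i,
         partially_confirmed TYPE abap_bool,
       END OF ty_item.

DATA: main_item TYPE ty_item,
      subitem   TYPE ty_item,
      lt_items  TYPE STANDARD TABLE OF ty_item.

" Option M (SCM 7.0 behaviour): merge a confirmed substitution into the
" main item only when the location product matches and the item does not
" have both a partial confirmation and a remaining requirement.
IF subitem-locprod = main_item-locprod
   AND NOT ( main_item-partially_confirmed = abap_true
             AND main_item-remaining_qty > 0 ).
  main_item-confirmed_qty = main_item-confirmed_qty + subitem-confirmed_qty.
ELSE.
  APPEND subitem TO lt_items.   " keep as a separate TAPA subitem
ENDIF.
```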

3.png
Copy Segments:


For the data that should be copied over when performing the ATP check in the substituted plants, SAP provides the following options:

 

A – Check mode cross-plant

B – Check mode and sub-location cross-plant

C – Check mode, version and sub-location cross-plant

X – Check mode and version cross-plant

Blank – Do not copy segment (old)

4.png

Implementation

To enable the enhanced customizing, implement the BAdI via the following IMG path in SPRO.


SPRO --> Advanced Planning and Optimization --> Global Available-to-Promise (Global ATP) --> Enhancements --> Rules --> Implementation: Enhancement of RBA Customizing

5.png

Deactivation of settings

To deactivate, follow the same path you used for activation; this deactivates the BAdI implementation.

 

Use report /SAPAPO/CHECK_CUST_V_ATP07 to revert the configuration values. The valid standard values are:


Start Rules: Blank, X

Create Subitem: Blank, X

Copy ATP Segment Data: Blank

 

6.png

The report /SAPAPO/CHECK_CUST_V_ATP07 can also be executed in display mode.

 

Prerequisites:

  • SAP Supply Chain Management (SAP SCM) 7.0, SP14
  • SAP enhancement package 1 for SAP SCM 7.0, SP11
  • SAP enhancement package 2 for SAP SCM 7.0, SP08

 

Alternatively, implement SAP Note 1832840

 

References: SAP Note 1832840, help.sap.com/scm703

Insight of Planning Service Manager in SAP APO SPP


This document helps you understand the Planning Service Manager (PSM) and the steps you have to follow in order to use PSM functionality in SAP APO SPP (Service Parts Planning).

 

About SPP – Service Parts Planning is a dedicated module provided by SAP to support business solutions for the automotive industry, where the sales behaviour of spare parts is too volatile and unpredictable. SPP is a hybrid product that incorporates advanced features compared to standard APO planning modules such as SNP.

 

Refer to the link for more information – Service Parts Planning (SPP) - SAP Library

 

About PSM – The whole SPP planning process revolves around PSM; you could call PSM the heart of the SPP module. SAP delivers a group of predefined services for each function (for example, you do not have to run mass processing for forecasting, SNP planning runs or deployment runs as you do in the APO DP/SNP modules). All you have to do is use the appropriate SAP-provided service for your planning run.

 

Core areas where we use PSM in SPP

  • Managing Demand and Realignments (BOD, Stocking, Supersession)
  • Forecast
  • Product Interchangeability
  • Stock Planning / Inventory Planning
  • Distribution Requirements Planning (DRP)
  • Deployment
  • Inventory Balancing
  • Surplus and Obsolescence Planning
  • Shortage Analysis
  • Inbound Delivery Monitor
  • SDPR
  • Service Fill Monitor

 

Requirements for PSM (valid authorization is required to configure the steps below):

  • Create a Selection for PSM
  • Create a Process Profile
  • Create a Planning Profile
  • Create a Trigger Group

 

This document will walk you through how to do the forecast using PSM services.

 

NOTE: Assumption – all required settings for forecasting are already maintained in the system (service profile, demand loaded in APO BW, general forecasting configuration, master data, trigger activations, Easy Access settings, etc.).

 

SPP completely differs from the standard APO method of forecasting used in Demand Planning, i.e. SPP does not use a master forecast profile, planning area, planning book, storage bucket profile, planning bucket profile, etc.

 

Some of the available PSM services related to forecasting

 

  • SPP: Leading indicator based forecasting  (SPP_LFI_SERVICE)
  • SPP: Forecast Service (SPP_FCS_SERVICE), Statistical, Composite, Automatic Model Selection etc.
  • SPP: Forecasting for TPOP parts (SPP_FCS_SERVICE_TPOP)
  • SPP: Forecast Disaggregation Service (SPP_FCST_DISAGGREGATION)
  • SPP: Forecast Service (StdDev, Regular) (SPP_FCS_SERVICE_MSE)
  • SPP: Forecast Approval Service (Regular) (SPP_FCST_RELEASE)

 

NOTE – The scope of this document uses only one service, in order to explain how to use a PSM service for forecasting.

 

Step 1

 

Create a selection profile.

 

TXN - /N/SAPAPO/PE_SEL

Report - /SAPAPO/PE_SELECTION.

 

Update the input parameters such as location, product, product group, etc., which will serve as the selection criteria for your PSM job.

Selection report.png

selection profile.png

In the above example we are going to do the forecasting for a single part across all locations.

 

1 – Define the type of selection for your planning services. As we are going to run forecasting, we choose the location product selection type.

2 – Define the name of the selection. You can create a new selection or edit an existing one.

3 – Define the planning objects for which planning should be done; in our case just one product “XXXXX” across all locations (“*”).

4 & 5 – You can define further selection parameters to refine your planning objects to a more detailed level.

further selection.png

6 – Once you have filled in your details, click on "Save Selection" Capture.JPG

 

Step 2 (Not needed if you want PSM to run across all the triggers available for a planning object with respect to corresponding Service)

 

Follow the Menu in Customizing screen.

 

Create a Trigger Group. IMG --> SCM Basis --> Data Management Layer --> Triggers --> Define Trigger Group and Attributes

Trg group.png

You define which triggers your PSM job should take into account during the execution of services (the forecast service in our case). By creating a trigger group you can further restrict the selection so that, within the selected location products, only those products that have at least one of the triggers defined in your trigger group are selected for further processing.

Click on New Entries (new entries.png) and give a name and description.

Trig Grp name n desc.png

I have created a trigger group called TEST and assigned the below three standard (activated) triggers to it.

Trigger group 1.png

I have assigned the standard SPP service SPP_FCS_SERVICE to the trigger group TEST.

Trigger group services.png

You can choose which selection type (product / location / location product) is to be used (useful when selections with the same name but different types exist).

Trg grp sel types.png

You can include or exclude a trigger of a specific service if needed. It is better to leave these blank unless conditionally needed.

trg grp include.png

Step 3

 

Create a Process Profile.

 

TXN - /SAPAPO/PE_CFG3

IMG --> SCM BASIS --> PLANNING SERVICE MANAGER --> DEFINE PROCESS PROFILE

 

The process profile serves as a container for all the planning parameters of the PSM job; it tells the system how to execute the services technically and what header wrap-up should be performed before the PSM services are executed.

 

The system creates packages of planning objects (location products) that are processed together according to these settings. You can specify the package creation method and maximum and minimum package sizes. You can also specify whether the system processes the packages in parallel.

 

At the end of a process block or a package, the system always calls up the STORAGE SERVICE with the DATA MANAGER.

 

  • &INIT: Clears all data manager buffers (deletes all data in the buffer)
  • &SAVE_ALL: Saves all new and modified data to the database
  • &SAVE+INIT: First saves all new and modified data to the database and then clears all data manager buffers

 

The default is &SAVE+INIT.

 

IMG --> SCM BASIS --> PLANNING SERVICE MANAGER --> DEFINE STORAGE SERVICE PROFILE

storage service.png

STORAGE SERVICE & DATA MANAGER will tell your system when and how to save the processed results.


Creating a Process Profile.

Process Profile.png

1 – Define Name and Description.

2 – Define the package creation method: how your planning objects should be grouped before a package is created, plus the minimum and maximum package sizes.

3 – Package Reprocessing (optional): whether you want the system to reprocess failed packages during execution.

Read Trigger First (optional): whether you want the system to read the triggers of your trigger group before processing the planning objects in your selection.

PSM Selection Result: how you want the PSM engine to deal with your selected planning objects.

Buffer Selection (optional): relevant when you use the example (dummy) service in PSM.

4 – Parallel-processing settings; it is advisable to use these efficiently.

5 – How you want to store and use your application logs.

 

Step  4

 

Create Planning Profile

 

TXN - /SAPAPO/PE_CFG

IMG --> SCM BASIS -->PLANNING SERVICE MANAGER --> DEFINE PLANNING PROFILE

 

Define name and description

Plann prof.png

Define Process Block

process block.png

You can create more than one process block within a single planning profile. It is like a step within a batch job. I have filled the planning profile with the selection, process profile and trigger group we created earlier.

 

Adding PSM services to the Planning Profile --> Process Block

service list.png

Sequence number – You can add multiple PSM services within a process block (for example, you can add the forecast disaggregation service with sequence number 2).

Planning Service – I have added the relevant PSM service that calculates the general forecast (define the suitable service as per your need).

Service Profile – The profile that has complete control over the PSM service (its parameters tell the service how to do the calculations).

 

All set. Let's run the PSM job.

 

Two ways to do that

  • Dialog processing.
  • Background processing (scheduled batch job, or directly from the screen).

 

TXN - /N/SAPAPO/PE_RUN

SE38 - /SAPAPO/PE_EXEC

psm run.png

  Go to transaction /N/SAPAPO/PE_RUN and click on the execute button (exe button.png) when you want to run PSM in the foreground.

 

When you run it in the background, you can see the job log (SM37) as below. Our product from the selection has at least one trigger of our trigger group in 46 locations.

job log.png

You can see that PSM ran for one product and 46 planning objects (location products) were created.

 

You can view the logs of the PSM run with transaction /N/SAPAPO/PE_LOG_DISP.


NOTE – The topics below are not covered in detail in this document. Although SAP always provides standard ones, you can still create user-defined PSM services, package creation methods, data managers, etc.

Capture1.JPG

Check the below link for more information – SAP Library

 

Use of the Planning Service Manager for SPP - SAP Library


APO SPP Manual Adjustment of Demand History - Do's & Dont's


I request SCN to allow me to dedicate this document to my good friend Vijay Varadharajan.


Overview


This document helps forecasters and supply chain analysts follow precautions while manipulating data in the Manual Adjustment of Demand History screen in SAP, so that inconsistencies within the system can be prevented.

 

Transaction - /n/sapapo/sppdmdh.

 

Do’s

 

When updating history in any of the highlighted (i.e. changeable) key figures below, please adhere to the steps explained below.

1.png

Step  1

2.png

  • Select the key figure line and then click on the “Change” button.
  • Enter the value into the respective bucket where you want to change the figures.
  • Once done, click on the “Save” button, in the sequence explained in the above screenshot.
  • The confirmation below pops up on successfully saving the entries.

3.png

Step 2.

  • If you want to work on a different part, you can stay in the transaction and continue without leaving it, following step 1.
  • If you want to work on the same part, you must leave the transaction and wait for a minute so that the previously edited data is updated in the system (the daemon needs a minute to update the data into DEMCRT).
  • If you still want to update the same product-location-bucket combination without leaving the transaction, make sure you wait a minute and refresh the screen until the changed numbers are reflected in “Demand: Final History”; once you see the values updated there, proceed with further changes.


CAUTION- Please leave the transaction and come back again after a minute if you want to edit the historical values for the same Part, same Location and same time bucket combination.

In our example, the user tried to edit the history from 25 (refer above screen shot) to some other value for the same Part, same Location and same time bucket.

4.png

The system warns the user that the data for the particular location product is locked (i.e. the daemon is still updating the data from DEMCRT to the MultiProviders), since the user edited and saved a value for the same combination less than a minute ago and the system is still propagating the new value.

When the system shows the above warning message, it strongly suggests holding further changes for that particular part and trying again after some time (about one minute); it is advisable to leave the transaction and come back for the next change after a minute.

 

Don'ts


Here is an example where the user violated step 2, which led to inconsistencies in the system for a particular part's historical values.

 

In the time bucket M.06.2011, the original history was 143. The user first changed the history to 15 and then tried to change the value again. The system warned the user with the warning message below, as the previous change had not yet been completely updated in the system.

 

The user ignored the system warning and changed the history to 10 in time bucket M.06.2011 within a minute, without leaving the transaction.

5.png

Nevertheless, the system shows a message that the data was saved successfully, as given in the screenshot below.

6.png

7.png

As a result of the above violation, negative values were created in the demand history.

143 – Initial demand History.

 

First change – 15: the delta 15 − 143 = −128 against the original demand history for the product-location-M.06.2011 combination was captured by the daemon but not yet updated in the system.

Second change – 10: the delta 10 − 143 = −133 was accumulated by the daemon on top of the previous delta, also not yet updated in the system.


Here is how the system will update the Histories finally.

143 (original Dem History) – 128 (1st Change) – 133 (2nd change)

143 – 261 = -118
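The delta accumulation above can be written out as a small sketch (illustrative only; the variable names are invented, not the actual daemon code):

```abap
DATA(original_history) = 143.

" Each save records a delta against the value read at edit time (still 143,
" because the daemon had not yet posted the first delta back).
DATA(first_delta)  = 15 - original_history.   " -128
DATA(second_delta) = 10 - original_history.   " -133

" The daemon later applies BOTH deltas, yielding the inconsistent value:
DATA(final_history) = original_history + first_delta + second_delta.   " -118
```

Had the user waited for the daemon run, the second edit would have been a delta against 15, not 143, and the final value would have stayed consistent.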


As a result, when you try to view the part's forecasting information in the Forecast UI for the product-location combination, the system throws the error “Inconsistent Historical Data”, as given below.

8.png

CAUSE – This is a consequence of the daemon updating 9ADEMCRT records to 9ADEMMUL with a frequency of one minute. The location product is locked only while data is being saved to 9ADEMCRT (technically it is not possible to lock it longer). If you then change demand in the interval between daemon runs, each change is calculated as a delta to be adjusted against the original value.


Solution: Running a stocking realignment may resolve this kind of inconsistency.

 

NOTE – SAP may have released newer notes to overcome this issue; please check.


Check these Notes as well.

1836600 - Inconsistent historical data of orders with rejection reason

1673288 – Negative forecast is calculated

Heuristics for fixed pegging relationships


SCM APO PPDS

   SCM APO PP/DS has two heuristics: Delete Fixed Pegging Relationships and Fix Pegging Relationships.


Pegging relationships

       The fixed pegging relationships in SAP APO are retained even after a document change in the SAP R/3 system. For example, if you have created fixed pegging relationships between a sales order and a planned order, then the pegging relationship to the sales order still exists even after the planned order has been converted into a production order.

 

Fix pegging relationships heuristics - SAP_PP_019

       Upon executing the 'Fix Pegging Relationships' heuristic, fixed pegging quantities are generated.

This can be verified in the transaction /sapapo/peg1 under 'Fix Pegged Quantity'.

 

Delete Fixed Pegging Relationships - SAP_PP_011

      Upon executing the 'Delete Fixed Pegging Relationships' heuristic, previously created fixed pegged quantities are deleted.

This can be verified in the transaction /sapapo/peg1 under 'Fix Pegged Quantity'.

 

 

For more information on fixed pegging relationships, you can go through the following link.

Heuristics - Production Planning and Detailed Scheduling (PP/DS) - SAP Library

Hello, macro(2)


(Re-publish)

Previous posts: Hello Macro (1) http://scn.sap.com/docs/DOC-44540

     

    c.    Change mode and Data source

    

When you create a target row (or column) in a macro, you need to pay attention to its 'Change Mode' setting. In most cases we use the default 'Value Change' setting, which means we want to set the quantity value in the cell. In some cases, however, we need 'Attribute Change' for functions such as setting the background/foreground color, visibility or change mode of a cell/row/column. So when you try to set 'attributes' instead of values of the planning book with the provided macro functions and it does not work, first check whether the change mode is set correctly.

 

1.png

Besides 'Value Change' and 'Attribute Change', there are also some other options for this field (some of them are only available in releases higher than SCM 7.0).

The last four of them are only for the key figures that are set as 'Fixable' instead of 'Simple' in the planning area settings.

 

- Redisaggregation

    This setting forces the re-disaggregation to detailed level (CVC level) after data is set to this key figure.

    If you only use 'Value Change' here, the data won't be disaggregated until it is saved to liveCache.

- Initialization

    In case you have the "differentiate between zero and initial" functionality in the system (higher than SCM 5.0), you may want to set a key figure to 'initial' instead of 'zero'.

    To achieve this, you use this setting. SAP Notes 1223998 and 1068603 explain the usage in detail.

- Value Change with Preceding Defixing

- Value Change with Following Fixing

- Adjustment of the level Fixing

- Adjustment of the level Fixing with Subsequent Fixing

   These four options are used to control the fixing functionality of a key figure. They either fix or de-fix the value, or make adjustments at different levels according to the fixing situation.

 

Compared to 'Change Mode', 'Data Source' is a setting for the source rows.

For example, if you have a macro that calculates KF1 = KF2, then 'Change Mode' is set for KF1, while 'Data Source' is set for KF2.

Since the default setting is 'Value', you need to set it explicitly if you need the 'Attribute' of a row or column. Generally, 'Row Attribute' returns the row number while 'Column Attribute' returns the column number.

 

2.png

 

2.  Which macros are executed during my process on the planning book?

 

There is a command, 'MSDP_DBO', which you can enter in the command field at any time before the process you want to check.

You'll need to open a selection (load some data) before you activate this command.

Enter the command while you are in /sapapo/sdp94.

 

3.png

 

Then press Enter; you'll get a confirmation message in the status bar.

 

4.png

 

Now perform the process you want to check, for example, load some data, and you'll get a pop-up window telling you which macro is about to be executed.

 

5.png

 

 

3.  How to debug my macro to identify a problem?

 

1) Entry points to debug a macro

    - Set a breakpoint in function module /SAPAPO/ADVX_MACRO_CALCULATION. Every macro is called through this function module.

      There is a statement at around line 500:

         LOOP AT l_t_macro_sequence ASSIGNING <fs_s_macro_sequence>.

      The coding before this line collects the macros to be executed into table l_t_macro_sequence[]. If you set a breakpoint here, you'll see which macros are about to be executed.

 

      Then at around line 600, the macro program is called with the statement:

        PERFORM (<fs_s_macro_sequence>-gen_formname) IN PROGRAM (<fs_s_macro_sequence>-gen_prgname)

      Here <FS_S_MACRO_SEQUENCE>-GEN_PRGNAME is the generated program name of the macro book, which you can review in transaction SE38, while <FS_S_MACRO_SEQUENCE>-GEN_FORMNAME is the routine that corresponds to the macro. It is the entry point of the macro.

 

     If you step into the routine and check the comment lines above it, you'll find:

     *&---------------------------------------------------------------------*

     *&      Form  4LD09IE0HIWENBKIVQ6JN97VG                        ->This is  <FS_S_MACRO_SEQUENCE>-GEN_FORMNAME

     *&---------------------------------------------------------------------*

     *       #TR# Generated Form Routine for Macro

     *       Macro ID  : 4LD09I6BYKAP4P12PW47D795O

     *       Macro Name: Forecast = Forecast * AddKF1                ->This is your macro name defined in macro builder

    *----------------------------------------------------------------------*

 

      Then scroll down to the line with the comments below:

      *&---------------------------------------------------------------------*

      *& #TR# (M.5.8) Macro execution: Call Step Form Routine

     *&---------------------------------------------------------------------*

      Below it you can see a series of PERFORM statements, each of which corresponds to a step in the macro.

 

      If you step into one of these step routines and check the comment lines above the routine, you'll find:

     *&---------------------------------------------------------------------*

     *&      Form  4LD09L5VC0QMDGLCZL0Z9YR6K                     ->This is  routine name

     *&---------------------------------------------------------------------*

     *       #TR# Generated Form Routine for Macro Step

     *       Macro ID  : 4LD09I6BYKAP4P12PW47D795O

     *       Macro Name: Forecast = Forecast * AddKF1            ->This is your macro name defined in macro builder

     *       Step ID   : 4LD09LDJUZCBW34T5F3BK0PWC

     *       Step Name : Calculate                                                  -> This is your step name defined in macro builder

     *----------------------------------------------------------------------*

 

     ** Note that the macro name and step name appear in the routine's comments only in higher versions (SCM 7.0 or higher)

 

     - Use the MSDP_DBO command, and drag a 'Debug' shortcut or 'Debug' script onto the pop-up window to start the debugger

       Push the 'Generate a shortcut' button on the general toolbar of the SAP GUI, enter the parameters below, and push 'Finish' to create a debug shortcut.

        

6.png

 

                    Or enter the text below into a text file to create a 'Debug' script.

                         

                [FUNCTION]

                Command=/H

 

                Title=Debugger

                Type=SystemCommand

 

         Then, when you get the pop-up window shown below, drag the shortcut or script directly onto the window and press the green tick to continue; you'll then enter the debugger.

         Press F5 once, and you'll be taken to the macro-relevant coding.

 

   - Find the program of your macro and step directly.

 

     In the Macro Builder, if you select the menu Edit -> Book Information, you'll get a screen that shows some useful information about your macro book.

     There is a field called 'Generated Macro Program'. Here you can get the program name that corresponds to your macro book (the <FS_S_MACRO_SEQUENCE>-GEN_PRGNAME in 1) ).

     Open this program (it can be quite long) in SE38 and search for your macro name or step name with the 'Find' function; you'll jump directly to the macro or step you want to debug.

     ** Notice that displaying macro name and step name in the routine's comment is only for higher versions (SCM 7.0 or higher)

 

2) Input/output parameters of the form routines (both for the macro and the macro step)

    

      - I_T_LINES (input)

         This table contains all rows in the data view. From it you can determine the row number (field 'LINE') that corresponds to each key figure (field 'FELDH').

      - I_T_COLS (input)

         Corresponding to I_T_LINES, this table contains all columns in the data view. From it you can determine the periodicity (daily, weekly, monthly, etc.) for which the macro is executed, as well as the column number (field 'COLUMN') corresponding to each bucket (field 'PERDY').

      - I_T_ADV_PLOB_VALUES (input)

         As introduced in the 'Macro execution level' part, this table contains the planning objects on which the macro is executed.

      - C_T_TAB (input and output)

          This table contains the value/data of each cell. The routine changes this table when data in a cell is changed.

          Each cell is identified by its row number (field 'Z') and column number (field 'C'), which correspond to the values in tables I_T_LINES and I_T_COLS. The value of the cell is stored in field 'V'.

      - C_T_TAB_OLD (input and output)

          This table saves the old values of the changed cells. Whenever a change is made in table C_T_TAB (a value change or a new value), a corresponding entry is written to C_T_TAB_OLD with field 'V' holding the old value.

          This table is important because when data is saved to liveCache, a delta save is performed, which means only the changed data is saved. If a cell is not logged in C_T_TAB_OLD, the system will not consider it 'changed' and it will not be saved to liveCache.

          This is especially important to understand when you create a BAdI macro. It is a common pitfall that C_T_TAB is filled by the BAdI macro but C_T_TAB_OLD is not, with the result that the data change in C_T_TAB cannot be saved to liveCache.
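     The delta-save behavior described above can be sketched in a BAdI macro implementation. The fragment below is only an illustration (the target row number and the 10% uplift are invented for the example; C_T_TAB, C_T_TAB_OLD and their fields Z, C and V are as described above):

```abap
* Illustrative fragment of a BAdI macro step (e.g. an implementation of
* the macro BAdI /SAPAPO/ADVX). Assumption: c_t_tab / c_t_tab_old are the
* changing parameters of the method, with fields Z (row), C (column), V (value).
FIELD-SYMBOLS: <ls_cell> LIKE LINE OF c_t_tab.
DATA: ls_old LIKE LINE OF c_t_tab_old.

LOOP AT c_t_tab ASSIGNING <ls_cell> WHERE z = 3. "row 3 is an invented target
* Log the old value FIRST - without this entry the cell does not count as
* 'changed' and the delta save will skip it when writing to liveCache.
  MOVE-CORRESPONDING <ls_cell> TO ls_old.        "copies Z, C and the old V
  APPEND ls_old TO c_t_tab_old.
* Only now overwrite the value in the planning table.
  <ls_cell>-v = <ls_cell>-v * '1.1'.             "invented change: +10%
ENDLOOP.
```

     The order of the two writes is the whole point: log to C_T_TAB_OLD first, then change C_T_TAB.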

 

Follow up posts: Hello Macro (3) http://scn.sap.com/docs/DOC-45093

How to read order details from the liveCache with FM /SAPAPO/OM_ORDER_GET_DATA


Very often when analyzing issues in APO I find it useful to read order details from the liveCache.


From the order data stored in the liveCache we can see when the order was created, whether it has already been transferred to ECC, and what the net processing times (in seconds) of the modes of the activities are, and we can, of course, compare all that information to the interactive screens we use all the time, be it master data (like the PDS display) or application-specific screens (like the Product View or a Planning Book).


One way to do this is by executing Function Module /SAPAPO/OM_ORDER_GET_DATA, as described below:


1) Get the OrderID number of the order.

 

At the Product View, execute ok-code GT_IO and then copy the value of the field Internal Order corresponding to the order you want to get the details on:

1_orderid.jpg

2_orderid.jpg

 

Another way is to open the Order View (/SAPAPO/RRP2) for the order, then enable the debugger with ok-code /h and press Enter twice to open the debugger. Then copy the OrderID from field GT_ORDERS[1]-ORDERID

          3_orderid.jpg

          4_orderid.jpg

 

2) Get the GUID of the Planning Version
    (you can skip this step if you are reading order data from the Active Version / Planning Version 000).  

 

          This can be done in transaction /SAPAPO/OM16. There, specify the name of the Model and the name of the Planning Version and execute. Then, copy the value of field Guid of Planning Vers.

          5_plngversionguid.jpg

          6_guidofplanningversion.jpg

 

3) Execute in transaction SE37 the Function Module /SAPAPO/OM_ORDER_GET_DATA.


     At the selection screen of the transaction, specify the function module and click the Test/Execute button (F8)

          7_functionmodule.jpg

 

     On Test Function Module screen, set the indicator Uppercase/Lowercase

          8_uppercaselowercase.jpg

 

     In the Details View of Import Parameter IS_GEN_PARAMS, specify the GUID of the Planning Version (always 000 for the Active version)

          9_is_gen_params.jpg

          10_simversion.jpg

 

     Press Back button. Then, in the Details View of IT_ORDER, specify the OrderID

          11_it_orders.jpg

          12_it_orders.jpg

 

     Press the Back button and then execute the function module. Then click the Save button in the popup and check the results:

          13_execute.jpg

          14_save.jpg

          15_results.jpg

 

4) Analyze the data

 

  • You can check, for example, the date and time this order was created in APO in export parameter ET_ORDERS, field CREATION_TIME (all dates are displayed in UTC, exactly as stored in the liveCache).
  • If you are in a scenario with alternative sequences, you can see in ET_MODES, field NET_PROC_TIME, the net processing time of that mode in seconds: the number of working seconds needed to execute the activity, as calculated during PDS/PPM explosion.
  • For dynamic setup activities, REM_PROC_TIME and/or NET_PROC_TIME may be 0. That is because the net duration of sequence-dependent setup is calculated dynamically from the setup matrix.
  • You can see the time-continuous and the bucket capacity requirements for each of the modes in ET_CAP_REQS.
  • In the example above, you can see this order has not been sent over to ECC yet, as ET_ORDMAPS is empty.
  • If there are activity relationships in the order, they are shown in ET_INTERN_CONSTRAINTS.
  • Much other information can be checked, if required, and then compared to the values displayed in interactive transactions (like /SAPAPO/RRP3 or /SAPAPO/SDP94).
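If you need this check regularly, the same function module can also be called from a small test report instead of SE37. The fragment below is a sketch only: the ls_*/lt_* variable names are invented, and you should take the exact parameter list and DDIC types from the FM interface in SE37 for your release.

```abap
* Sketch: read order details from the liveCache in a report.
* ls_gen_params carries the planning version GUID (always 000 for the
* active version); lt_order holds the OrderID obtained in step 1.
* Declare all variables with the types shown in SE37 for this FM.
CALL FUNCTION '/SAPAPO/OM_ORDER_GET_DATA'
  EXPORTING
    is_gen_params         = ls_gen_params   "planning version GUID
    it_order              = lt_order        "OrderIDs to read
  IMPORTING
    et_orders             = lt_orders       "header data, e.g. CREATION_TIME
    et_modes              = lt_modes        "modes, e.g. NET_PROC_TIME
    et_cap_reqs           = lt_cap_reqs     "capacity requirements
    et_ordmaps            = lt_ordmaps      "mapping to ECC documents
    et_intern_constraints = lt_constraints. "activity relationships
```

The exported tables can then be evaluated exactly as described in step 4 above.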

Multi resource scheduling heuristics



Multi resource scheduling heuristics for SCM APO PPDS



The multi-resource scheduling heuristic (SAP_MULT_SCH) switches modes and sources of supply according to defined time blocks, so as to keep the receipts as close to the required date as possible.

 

Multi-resource scheduling provides the following advantages:

a)      Complies better with due dates

b)      Supports finite production planning scenarios with all alternative resources

c)       Allows alternative resources to be modeled using not only modes but also alternative sources of supply



            When the heuristic is executed, the orders are scheduled in alternative modes, and different sources of supply are chosen after the switch-over time. The user can run the heuristic on a selected resource or product and review the results. Customers mainly use the DS planning board when they run this multi-resource scheduling heuristic, since it makes it easier to view the orders at operation level. Any orders that need to be rescheduled can also be rescheduled using the DS planning board.



APO Useful Documents on SCN


I was recently going through the blogs and documents posted on SCN. There are so many useful documents and blogs that could benefit everyone, countless documents in different areas that are always handy for brushing up our knowledge. In this document I have tried to collect some of them that can be very useful in daily work. Feel free to suggest new documents in the comments and I will add them here. I hope this is of great benefit to experts and newbies alike.

 

 

Demand Planning

 

Forcast Consumption and Reduction in APO System

Options To Delete CVC from DP and Infocube

Step by Step Process to Release Forecast from APO DP to ECC

Improve Performance of DP related processes

Data Extraction from Planning Area to Backup Info Cube

APO DP - Forecast Model Parameters: First-Order Exponential Smoothing

APO DP - Forecast Model Parameters: Second-Order Exponential Smoothing (Holt and Winter's)

APO DP - Forecast Model Parameters: Second-Order Exponential Smoothing (Holt Model)

Demand Planning Add-In for Microsoft Excel

Forecast error calculations in APO DP

How to Load Data into Planning Book from Flat File

Fiscal Year Variant and Posting Periods Configuration in APO

Collaborative 'Demand Planning +' with Business Suite 7.0

Duet with Demand Planning Process

A Practical Guide to MLR Forecasting in APO Demand Planning

Demand Planning Specific mySAP Supply Chain Management (SCM) 5.0 Release Notes

The Role of Functional Teams in Optimizing SAP APO Demand Planning System Performance

SCM 7.0x - Demand Planning (DP) forecasting using KISS

Real time Characteristic Value Combination (APO DP) Validation with ECC

 

Macros

 

Macro Workbench Version Management in SAP APO

How to Trigger an Event Using APO Planning Book Macro

Hello, macro (1)

Hello, macro(2)

Hello, macro (3)

Macro Examples SAP SCM APO

Documenting and commenting APO macros

Interactive Solution for Complex Calculations in DP Planning Book Using Macro BAdIs

 

CIF

 

SCM Core Interface- Handbook (PART-1)

SCM Core Interface- Handbook (PART-2)

Automating CIF Postprocessing Record Analysis

Automated CIF Postprocessing Record Analysis - Part 2

Automating CIF Delta Report Reprocessing

APO EEWB - Version Dependent Extra Fields

 

 

SNP

 

SNP Configuration Steps for Planning Area and Planning Book/Data View

Third Party Order Planning with SNP

Fix ‘Initial’ Period Column in SNP Planning Book

Resource Conversion

APO SNP - Master Data Automation

Tailor-making of SCM SNP Deployment Heuristics – Technical Insights

Document:SNP Deployment Heuristics - Consideration of Demands at the Source Location

Product Interchangeability in APO SNP

Supply Network Planning (SNP): Deployment and Transportation Load Builder Scenario 1 - Fair Share Rule by Quota Arrangem…

Supply Network Planning (SNP): Deployment and Transportation Load Builder Scenario 2 - Push Rule by Quota

Supply Network Planning (SNP): Deployment and Transportation Load Builder Scenario 3 - Fair Share Rule - a "Proportional Distribution Based on Demands"

How to force 100% Resource Utilization in SAP APO Supply Network Planning Optimization-Based Planning

How to Plan With Constrained Storage in APO Supply Network Planning with Optimization-Based Planning

How to Plan Campaigns in SAP APO Supply Network Planning for a Steel Plant

Rules-based Product Substitution (Down Binning) with Capable To Match (CTM)

Supply Network Planning Optimizer Feasible Solution for Different Production Costs in Network

CTM – Concepts, Demand Prioritization and Supply Categorization

Extended Safety Stock Planning in Capable-to-Match (CTM)

 

PPDS

 

MRP-based Detailed Scheduling and Subcontracting

3rd party subcontracting process in PPDS

Co-Product Planning in SAP APO

Adding custom columns via /SAPAPO/RRP_IO_COL as of note 1709784

PP/DS - Search Terms for effective SAP Note and KBA Search

Improvements in Production Planning Heuristics Logs in SAP APO (Part- 1/3)

Improvements in Production Planning Heuristics Logs in SAP APO (Part- 2/3)

Improvements in Production Planning Heuristics Logs in SAP APO (Part- 3/3)

Alternative Resources Functionality in APO PPM/PDS without having Alternative Sequences in R/3

How to Debug the Background Planning Task in SAP APO Production Planning/Detailed Scheduling

SCM7.0 PPDS Cross Plant Deployment (New Functionality)

Determine Setup ID in APO by Characteristic Values

Contract Manufacturing Procurement  in Supply Chain Management

 

 

GATP

 

Global Available To Promise (GATP) Overview

Product Availability Check

Rule Based ATP with Exclusive Rule Strategy

Rules-Based Availability Check Overview and Steps to Configure

GATP Product Allocation Settings

Enhancing the BOP Filter with custom fields so as to run the BOP in a flexible way

gATP/Rule based/ Enhanced Customizing

Backorder Processing (BOP) run on orders with custom ATP categories

Configurable Process Scheduling (CPS) Used in Transportation and Shipment Scheduling of GATP - Part 1

Multilevel Available To Promise (MLATP) with Rules-Based Component Substitution   

GATP - An Adaptive Feedback Supply Chain Control System potential - Part 1

Understanding the technical stuff in SAP APO gATP module

 

 

Interfaces

 

Mass Uploads in SAP APO using Function_Loader_Inlay

Guide to APO Interfaces

Automating Planning Run Batch Results

APO Alert Monitor

 

 

LiveCache

 

How to trace Livecache in APO

How to read order details from the liveCache with FM /SAPAPO/OM_ORDER_GET_DATA

Performance Optimization in SAP APO

 

 

SPP

 

SAP Spare Parts Planning ( SPP ) in a Nutshell

Service parts Planning - Inventory Balancing

APO SPP Manual Adjustment of Demand History - Do's & Dont's

 

 

Misc.

How to close a discussion and why

Finding Transport request

Tip: how to find out the correct component when raising OSS messages

How to find the suitable notes/KBAs

Deactivate, delete integration models - process


Hi,

 

If you would like to deactivate integration models, please check that the process below has been carried out.

 

In transaction CFC1, maintain the connected APO system before you deactivate and delete the models. Once all the data has been deleted, you can then delete these settings. Also check Note 886103 and make sure you have followed all the steps carefully; step 6.7 is easily missed.

Step 6.7 Convert logical system names

Transaction BDLS must be run for the following logical systems

 

1) In the SAP R/3 backend system:

     a) Logical system for SCM

     b) Logical system for OLTP

2) In the SAP SCM system:

     a) Logical system for SCM

     b) Logical system for OLTP

Transaction BDLS must be started for every client on the backend system. It is also recommended to deactivate all integration models during a client copy. Please also check Note 1708883.

 

Thanks

Gomathi



Mass Creation of Product-Specific Transportation Lanes


The objective of this document is to highlight the options available for the mass creation of product-specific transportation lanes.

 

The transportation lane is the external procurement relationship master data object used in the APO planning process.

 

The ways to create this master data object are listed below:

 

     1. Creation through CIF from ECC. A special procurement key in the material master and a purchase info record can trigger the creation of the transportation lane in APO.

 

     2. Manual creation in APO through transaction /N/SAPAPO/SCC_TL1 or through a custom program.

 

This document brings out the options available in transaction /N/SAPAPO/SCC_TL1 to create transportation lanes for multiple location products.

 

Step 1: Access transaction /N/SAPAPO/SCC_TL1, enter the model, source and destination location, and hit Create.

 

Capture.PNG

 

Step 2: Click on 'Creation of New Entry' and select 'Mass Selection', as highlighted in the screenshots below.

Capture3.PNG

Step 3: Hit 'Mass Selection' and you will get options to enter multiple products, location products, selections based on SNP planners, etc., as shown in the screenshot below.

 

Capture4.PNG

Enter multiple products and execute the transaction by clicking the clock icon.

 

Finally, save the data to create the product-specific transportation lanes.

SCM Core Interface- Handbook (PART-1)


1. Introduction

The APO Core Interface (CIF) is a standard interface that connects an APO system with a standard R/3 system. CIF enables data exchange between an SAP ERP system (SAP R/3, SAP R/3 Enterprise, or SAP ECC) and a connected SAP SCM system. CIF supports the integration of master data from SAP ECC to SCM APO (one way only) and the integration of transactional data in both directions, from ECC to APO and vice versa. The basic idea of the integration is to write events for each planning-relevant change (e.g. the creation of an order) and use these as trigger points for the data transfer. Technically, the transfer is performed via qRFC. Which objects (e.g. planned orders, stock) are transferred from SAP ECC to SCM APO is controlled by the integration models, which can be regarded as something like the master data of the CIF.

 

2. Business Objective

The main objective of configuring the ECC Core Interface is to integrate the centralized source of master data from ECC into APO and to publish planning results from the SCM planning system (APO) to the SAP OLTP system (ECC) and vice versa. It is essential to maintain consistent data and data flow between the planning and execution systems for better planning results and optimum inventory levels. This configuration also covers real-time transactional data transfer between the planning and execution systems and delta transfer of master data changes.

CIF provides the following central functions:

  1. Supplying the master and transaction data relevant for planning to the SCM APO system
  2. Techniques for initial, incremental, and real-time data transfers from ECC to SCM APO
  3. Techniques for publishing planning results back from SCM APO to the ECC system

 

3. Key Design Areas

  • Technical Integration between ECC and APO (CIF Configuration)
  • Integration of Master data and Transactional data (Design of Integration Models)
  • CIF Error Handling and Queue Management

 

4. Technical Integration between ECC & APO (CIF Configuration)

  • As of SAP ECC 6.0, CIF is an integrated part of the system. (If you use SAP SCM 5.0 with ERP systems up to SAP ECC 5.0, you receive CIF via the relevant SAP R/3 Plug-In. The SAP R/3 Plug-In is a combined Plug-In used to link SAP components such as SAP SCM to the ERP system up to and including SAP ECC 5.0) .
  • SAP ERP Central Component 6.0 (SAP ECC 6.00) and later releases will directly contain all the interfaces necessary for the technical integration with other SAP components. These interfaces were previously part of the SAP R/3 Plug-In.

1.jpg

 

5. CIF Set-up & Related Configuration Task

5.1.      Configuration in R/3

    • Define logical system (Transaction code- BD54)

We define logical systems for both the ECC and the APO system. To enable the transfer of data via the APO Core Interface (CIF), you need to define both the ERP system in which you are working and the SAP APO system to which you want to transfer data as logical systems.

 

2.jpg

Note: The above activity depends on the target system concerned. For this reason, there is no transport to the production system; as a result, you need to make the settings for the relevant target system in the production system manually. Alternatively, you can maintain all of the ECC and APO logical system names (as per the SAP system landscape and client strategy) in the development client and include them in a transport. This makes all logical systems available across the ECC system landscape and minimizes the manual activities.

 

  • Assign logical system to client (Transaction code- SCC4)

          In this step, assign the logical system to a client. All the fields such as city, standard currency, and logical system should be maintained; otherwise there will be avoidable CIF errors during the data transfer from ECC to APO.

          3.jpg

Note: - These settings cannot be transported. When a new system is being set up, these settings must be made after the system installation has been completed.

 

  • Specify APO release (Transaction code- NDV2)

In this step, you specify the release level of the SAP APO system that is defined as the target system. The release level activates the compatible functionality for the data transfer.

4.jpg

 

Note: The above activity depends on the target system concerned. For this reason, there is no transport to the production system; as a result, you need to make the settings for the relevant target system in the production system manually. Alternatively, you can maintain all of the logical system names (as per the SAP system landscape) in the development client and include them in a transport. This makes all logical systems available across the ECC system landscape and minimizes the manual activities.

 

 

  • Setup RFC destination (Transaction code- SM59)

In this step, create the RFC destination which has the same name as the target logical system. This RFC enables the connection to the SCM APO system.

5.jpg

 

Performance Tip: - By activating the load balancing & defining the logon group, you can use the load balancing procedure while connecting to the APO system. This logon group needs to be assigned to the various servers in transaction code RZ12.

6.jpg

When ECC is connected to the APO system, it always uses the ALEREMOTE user to process CIF queues.

 

  • Assign RFC destinations to various application cases (Transaction code- CFC7)

You assign various application areas to the logical system and the RFC destination. This configuration is optional and depends purely on the business requirements and the security guidelines for the type of RFC user to be used. Some applications only work with a dialog RFC user (e.g. the GATP availability check), but from a security point of view many clients (e.g. life sciences, FMCG) do not allow the RFC user to be created as a dialog user, since it could be misused.

To meet this requirement, create a new RFC destination with a different RFC user of type dialog or service with very limited and restricted authorizations. Authorization has to be provided only for the applications to be used.

7.jpg

The below applications can be triggered from ECC into APO

8.jpg

 

  • Set target system and queue type (Transaction code- CFC1)

In this step, you set the queue type for the specified target system. It is recommended to use inbound queues in the target system to control the processing of CIF queues under heavy data load.

The queue type (inbound or outbound) determines whether queue processing is controlled by the sending or the receiving system.

9.jpg

 

  • Set user parameter (Transaction code- CFC1)

    In this IMG activity, you can make user-specific entries for the following parameters:

    • You can set whether and how data records should be logged in the application log for the specified user.
    • A user can use this functionality for debugging when queues are created for that user: any master data or transactional data created by that user will be held in the CIF queue, where the user can debug and analyze the issue.
    • Maintain an entry with user name "*" for normal or detailed logging of the application log.
    • The RFC user should be maintained with normal logging.
    • The CIF administrator ID should be maintained with detailed logging and debugging mode "ON".

          10.jpg

 

  • Determine Number Ranges for Parallelization (Transaction code- CFC8)

          11.jpg

 

  • Define filter and selection block size (Transaction code- CFC3)
    • You determine the number of filter objects that are processed in one block by the APO Core Interface.
    • You also determine the number of data objects that are transferred to SAP APO in one remote function call (RFC) during the initial data transfer.
    • In normal cases, it is recommended not to change the above filter and block sizes.
    • You can use these settings to improve system performance during the initial data transfer. The optimum values vary from case to case and depend largely on the client's data situation; you are therefore recommended to experiment with the settings in your own system.

              12.jpg

   

                13.jpg

         

                14.jpg

                    Note: Refer to SAP Note 436527 for the block size recommendations.

 

  • Configure Change transfer for master data (Transaction code- CFC9)
    • All master data changes (e.g. material, customer, vendor, and resources) are configured for periodic transfer to APO.
    • During the initial transfer of a resource, external capacity is created for the resource for 30 days in the past and 730 days in the future.
    • It is recommended to create a single-mixed resource in APO for resources that have a single capacity and for which the indicator "Can be used by several operations" is not set.
    • It is recommended to create a multi-mixed resource in APO for resources that have multiple capacities, or for capacities for which the indicator "Can be used by several operations" is set.

          15.jpg

              Note: -

      • Master data change transfer from ECC to APO can be configured as immediate or periodic, depending entirely on the client's requirements.
      • If both SNP and PP/DS are in scope, it is always recommended to use the resource types mentioned above. This is a very critical setting and cannot be altered later, neither in ECC nor in APO; you would need to delete the resource in APO to change the resource type, and that cleanup can become a mini-project.


  • Activate ALE Change Pointers Generally (Transaction code- BD61)

          This configuration is a prerequisite for transferring master data changes with change pointers.

            16.jpg


  • Activate ALE Change Pointers per Message Type

          The following change pointers are important to activate, e.g. vendor master, customer master, setup group and source of supply, material MRP area, and subcontracting PDS.

17.jpg


Note: - Activation of change pointer is only needed if the corresponding master data is used in system.


  • Activate online Transfer Using BTE (Transaction code- BF11)
    • We activate BTEs for the integration with SAP APO in order to enable the online transfer of both transactional data changes and some master data changes, such as the material master and resources. Set the ND-APO (New Dimension Plug-In APO) and NDI (New Dimension Integration) application indicators to active.

18.jpg

              Note: - This is not the default setting; hence make sure both the above mentioned indicators are set.


  • QIN Scheduler (Transaction code- SMQR)
    • When APO sends planning results to the connected ECC system, it generates CIF queues in the inbound queue of the ECC system, as per the configuration. Inbound queues that are to be processed automatically by the target system must be registered in the QIN scheduler.
    • The settings below for the QIN scheduler have worked for most clients; changes can be made based on queue-processing behavior.

19.jpg


  • QOUT Scheduler (Transaction code- SMQS)
    • Outbound qRFCs (outbound queues) are processed by the QOUT scheduler, which is configured using transaction SMQS. The target systems to which the outbound qRFCs are to be sent are registered in the QOUT scheduler.
    • The settings below for the QOUT scheduler have worked for most clients; changes can be made based on queue-processing behavior.

20.jpg



5.2.    Configuration in SCM APO System

      • Define Logical System (Transaction code- BD54)

We define logical systems for both the ECC and the APO system. To enable the transfer of data via the APO Core Interface (CIF), you need to name both the ERP system in which you are working and the SAP APO system to which you want to transfer data as logical systems.

21.jpg


              Note: - This activity depends on the target system concerned; for this reason, it is not transported to the production system. You therefore need to make the settings for the relevant target system manually in the production system, or

alternatively, you can maintain all of the logical system names (as per the SAP system landscape) in the development client and include them in a transport. This makes all logical systems available across the ECC system landscape and minimizes the manual activities.

 

  • Assign logical system to the Client (Transaction code- SCC4)

In this step, assign the logical system to a client. All fields such as city, standard currency, and logical system should be maintained; otherwise, spurious CIF errors can occur during data transfer from ECC to APO.

22.jpg

Note: - These settings cannot be transported. When a new system is being set up, they must be made after the system installation has been completed.

 

  • Setup RFC Destination (Transaction code-SM59)

In this step, create an RFC destination with the same name as the target logical system. This RFC destination enables the connection to the ECC system.

23.jpg

Performance tip: - By activating load balancing and defining a logon group, you can use the load balancing procedure when connecting to the ECC system. The logon group needs to be assigned to the various servers in transaction code RZ12.

 

24.jpg

When APO is connected to the ECC system, it always uses the ALEREMOTE user to process CIF queues.

 

  • Assign RFC Destinations to Various Application Cases (Transaction code- SPRO)
    • You assign various application areas to the logical system and the RFC destination. This configuration is optional and depends purely on the business requirements and the security guidelines for the type of RFC user to be used. Some applications only work with a dialog RFC user (e.g. displaying the application log from APO in the ECC system), but from a security point of view many clients (e.g. life sciences, FMCG) do not allow the RFC user to be created as a dialog user, which could be misused.
    • To meet such requirements, create a new RFC destination with a different RFC user of type dialog or service with very limited and restricted authorization, granted only for the type of application to be used. When accessing data from the target system through a remote function call, the system then uses the RFC destination specified in the configuration "Assign RFC Destinations to Various Application Cases".

              25.jpg


  • Maintain Business System Group (Transaction code- /SAPAPO/C1)
    • This configuration determines the assignment of this system, and of the respective ECC systems that are to be connected, to a business system group (BSG).
    • If this APO system is connected to multiple ECC systems that use the same number ranges for material master, plant, vendor, and customer master, you need to define multiple BSGs to bring in master data from each system.

26.jpg

 

  • Assign Logical system to Business System Group (Transaction code- /SAPAPO/C2)

In this step, to enable error-free communication, every source system (ERP system) must be assigned to a BSG. We assign the logical system to the BSG and set the queue type.

Here, we have also activated the CIF post processing functionality for CIF error handling of transaction data.

27.jpg

Note: - It is recommended to use inbound queues when transferring a large amount of data to the ERP system, to ensure an even load on the ERP system. Make sure to maintain the settings for the Queue-In (QIN) scheduler in the qRFC monitor on the ERP side.

 

  • Set User Parameter (Transaction code- /SAPAPO/C4)

In this IMG activity, you can make user-specific entries for the following parameters:

    • You can set whether and how data records should be logged in the application log for the specified user.
    • This functionality can also be used for debugging: any master data or transaction data created by that user that gets stuck in the CIF queue can then be debugged to analyze the issue.
    • Maintain an entry with user name "*" for normal or detailed logging of the application log.
    • The RFC user should be maintained with normal logging.
    • The CIF administrator ID should be maintained with detailed logging and debugging mode set to "ON".

              28.jpg


  • QIN Scheduler (Transaction code- SMQR)
    • When ECC sends master and transaction data to the connected APO system, it generates CIF queues in the inbound of the APO system as per the configuration. Inbound queues that are to be processed automatically by the target system must be registered in the QIN scheduler.
    • The following QIN scheduler settings have worked for most clients; adjust them based on your queue processing requirements.

              29.jpg

 

  • QOUT Scheduler (Transaction code- SMQS)
    • Outbound qRFCs (outbound queues) are processed by the QOUT scheduler, which is configured using transaction code SMQS. The target systems to which the outbound qRFCs are to be sent are registered in the QOUT scheduler.
    • The following QOUT scheduler settings have worked for most clients; adjust them based on your queue processing requirements.

          30.jpg

 

  • Maintain Distribution Definition (Publication) (Transaction code- /SAPAPO/CP1)

To publish planning results from the APO system to the ECC system, maintain all publication types for the locations whose orders are to be published to ECC. This is maintained for both in-house production and external procurement.

              31.jpg

                  Note: - If the distribution definitions are not maintained, the planning results will not be transferred back to the connected ERP system, and the resulting data inconsistency between ECC and APO will not even be captured in the CCR report.


5.3.  qRFC queue names for CIF

        qRFC queue names for the CIF are always set up according to the following rule:

        CF<CIF object ID><serialization character string>

        The CIF objects that are transferred from an ERP system to the APO system are listed below:

CIF object             ID      Serialization character string
Batch                  BTC     CHARG+(10) + MATNR+(9)
Resources              CAPA    NAME+(18)
Characteristic         CHR     ATNAM+(19)
Class                  CLA     KLART+(3) + CLASS+(16)
Inspection lot         LOT     PRUEFLOS+(17)
Material master        MAT     WERKS+(4) + MATNR+(14)
Planned order          PLO     ORDNR+(12)
Confirmation           PPC     ORDERNR
Reservation            RSV     ORDNR+(12)
Purchase order         PO      DOC+(10)
Purchase requisition   PO      DOC+(10)

 

 

The CIF objects that are transferred to an ERP system from SAP APO are listed below:

CIF object                 ID     Serialization character string
Delivery confirmation      CD     ORDERNO
Confirmation               CF     GUID or ORDERNO
VMI sales order            CO     ORDERNO
Delivery                   DLV    001
Purchase requisition       EP     GUID or ORDERNO
Purchase order             EP     GUID or ORDERNO
Planned order              IP     GUID or ORDERNO
Process/production order   IP     GUID or ORDERNO
Production campaign        PC     GUID or ORDERNO
Manual reservation         RV     GUID or ORDERNO
Shipment                   SHP    001
Stock transport order      TO     GUID or ORDERNO
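The naming rule above can be illustrated with a short sketch (plain Python, not SAP code; the helper name and the sample plant/material values are hypothetical):

```python
# Sketch of the CIF queue naming rule: CF<CIF object ID><serialization string>.
# Each serialization component is a field value truncated to a maximum length,
# e.g. the material master uses WERKS+(4) + MATNR+(14).
def cif_queue_name(object_id, *parts):
    """parts are (field_value, max_length) pairs forming the serialization string."""
    serialization = "".join(value[:length] for value, length in parts)
    return "CF" + object_id + serialization

# Material master (ID MAT): plant truncated to 4 chars, material number to 14
name = cif_queue_name("MAT", ("1000", 4), ("FINISHED_PUMP_01", 14))
# name == "CFMAT1000FINISHED_PUMP_"
```

Note how the material number is cut off at 14 characters; real CIF queue names are generated by the system, so this only serves to make the truncation rule concrete.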

 

 

5.4.  CIF Interface Configuration Checklist

          Below is a CIF configuration checklist (tracker) that can be maintained to monitor the CIF configuration in the ECC and APO systems simultaneously. This helps ensure that no configuration step is missed.


S.no  Configuration node                               APO                   APO status   ECC                   ECC status
1     Name Logical System                              Basis                 Completed    Basis                 Completed
2     Assign Logical System to a Client                Basis                 Completed    Basis                 Completed
3     Specify SAP APO Release                          NA                    NA           Basis/Functional      Completed
4     Create RFC User                                  Basis                 Completed    Basis                 Completed
5     Set Up RFC Destination                           Basis                 Completed    Basis                 Completed
6     Set Target System and Queue Type                 CIF Functional        Completed    CIF Functional        Completed
7     Set User Parameters                              Basis/CIF Functional  Completed    Basis/CIF Functional  Completed
8     Configure Application Log                        CIF Functional        Not Started  CIF Functional        No Change
9     Determine Number Ranges for Parallelization      NA                    NA           Basis                 No Change
10    Set Filter and Selection Block Size              No Change             No Change    No Change             No Change
11    Configure Change Transfer for Master Data        NA                    NA           CIF Functional        Completed
12    Activate ALE Change Pointers Generally           NA                    NA           CIF Functional        Not Started
13    Activate ALE Change Pointers for Message Types   NA                    NA           CIF Functional        Not Started
14    Activate Online Transfer Using BTE               NA                    NA           CIF Functional        Completed
16    Activate Cross-System Update Logic               NA                    NA           CIF Functional        Completed
17    Maintain Business System Group                   CIF Functional        Completed    NA                    NA
18    Assign Logical System and Queue Type             CIF Functional        Completed    NA                    NA
19    Maintain Object-Specific Settings                CIF Functional        Not Started  NA                    NA
20    Maintain Publication Settings                    CIF Functional        Not Started  NA                    NA

 

S.no  Activity                                                     APO         APO status  ECC         ECC status
1     Setup outbound scheduler                                     Basis       Completed   Basis       Completed
2     Setup inbound scheduler                                      Basis       NA          Basis       NA
3     Activate outbound scheduler for CIF                          Functional  Completed   Functional  Completed
4     Activate inbound scheduler for CIF                           Functional  NA          Functional  NA
5     Create integration model                                     NA          NA          Functional  Completed
6     Activate integration model                                   NA          NA          Functional  Completed
7     Maintain publication settings (after master data transfer)   Functional  Completed   -           NA

 

 

6. Integration of Master data and Transactional data (Design of Integration Models)

The integration model controls the transfer of master data and transaction data. It is generated in the ERP system and contains all data that is to be transferred to the SCM system. It is uniquely identified by name and application. Working with integration models involves two steps:

  • Create & Generate Integration models (Transaction code- CFM1)

          When you generate an integration model, you specify which data objects are to be selected from the total dataset in the ERP system for the transfer. To create the integration model, follow these steps:

    • First, select the object types (for example, material masters) on the Create Integration Model selection screen.
    • Next, select specific selection criteria (in most cases, a material/plant combination) that further restrict the object types you have already selected. If you have selected material masters, for example, you could now enter an MRP controller. In this way, you define filter objects.
    • Filter objects are used to select which data objects are transferred to a specified SCM system. In the example, all material masters for a particular MRP controller are selected.

 

  • Activate Integration model (Transaction code- CFM2)
    • The activation of an integration model triggers an initial transfer. If you work with SAP APO, it also releases the online transfer of transaction data.
    • As standard, the integration models to be activated are compared with the integration models that are already active. You can generate multiple integration models; however, only one version can be active for each model at a time.
    • You can activate and deactivate several integration models simultaneously.
    • Integration models must remain active to enable online transfer.
    • Activation of master data or transaction data integration models should follow a logical sequence; a recommended grouping is shown below.

 

      • ATP Customizing, Setup groups (group/key) and product allocation
      • Plants & distribution centers
      • Change master records
      • Class & characteristics
      • Material master (+ classes, characteristic)
      • Batches
      • MRP area and material master for MRP areas
      • Planning Products
      • Materials for GATP check
      • Product allocation
      • Vendors
      • Customers
      • Work Centers
      • Production data structure
      • Purchase Info Record, scheduling agreement, contracts

 

Note: - When the number of master data objects to be transferred is huge, there are typically challenges in generating and activating the material-dependent integration models for master and transaction data during the initial transfer. In that case, the integration models need to be designed to carry an optimum number of data objects per integration model (IM).

The critical data objects are:

  • Material master
  • Purchase Info record/source list
  • Sales Orders
  • Stocks
  • Batches

 

The following recommendations can help make the initial data transfer successful.

  • Split the material master integration model by material type, MRP controller, material master number range, or a combination of these.
  • If a data split is not feasible, add data objects sequentially: first create and activate the IM for, e.g., material type FERT; next, add further material types to the same variant and activate it again. This way, the load is not placed on the system all at once but is distributed over multiple IM transfers.
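The splitting idea can be sketched as follows (illustrative Python only, not SAP code; the material names and the batch size are assumptions):

```python
# Split a large material list into consecutive batches, so that each
# integration model (IM) variant carries a manageable number of objects
# instead of loading everything in a single initial transfer.
def split_into_im_batches(materials, batch_size):
    return [materials[i:i + batch_size]
            for i in range(0, len(materials), batch_size)]

materials = [f"MAT{n:03d}" for n in range(10)]
batches = split_into_im_batches(materials, 4)
# three IM variants/transfers of sizes 4, 4 and 2
```

The same chunking logic applies whether the split criterion is material type, MRP controller, or number range; the point is simply to bound the object count per activation.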


6.1. Data Flow between ECC and APO & its Frequency

    • Master Data flow from SAP (ECC) --> APO

Master data        Daily batch job (once a day)        Manual
Plant              No                                  Periodic, need basis
Vendor, Customer   Yes; create new, change existing    Need basis
Material           Yes; create new, change existing    Need basis
Info record        Yes; create new, change existing    Need basis
PDS                Yes; create new, change existing    Need basis

 

    • Transaction Data Flow SAP (ECC) --> APO

Transactional data     Daily batch job (once a day)                       Real time
Purchase requisitions  Yes; activate IM for new APO-relevant materials    Create, change
Inspection lot         Yes; activate IM for new APO-relevant materials    Create, change
Purchase orders        Yes; activate IM for new APO-relevant materials    Create, change
Stocks                 Yes; activate IM for new APO-relevant materials    Create, change
Stock transfer orders  Yes; activate IM for new APO-relevant materials    Create, change
Sales orders           Yes; activate IM for new APO-relevant materials    Create, change

 

          Note: - The data flow frequencies from ECC to APO mentioned above are recommendations based on various project experiences; they can vary from project to project depending on business requirements.


6.2. Publication of procurement proposal from APO to ECC (Transaction code- /SAPAPO/C5)

    Transactional data can be transferred from APO to ECC in two ways:

  • Periodically – It is recommended to use periodic transfer for SNP planning results. It improves system performance, reduces locking issues, and leaves time to review and correct the planning results before publishing them to ECC.
  • Immediately/Real time – This ensures real-time data transfer and better consistency between APO and ECC. It is recommended for short-term planning methods, e.g. PP/DS and CTM.

 

--------- Contd. in PART 2 ---------

SCM Core Interface- Handbook (PART-2)


In continuation of PART-1

 

7.  CIF Error Handling and Queue Management

7.1.  Activate CIF Error Handling in SCM APO System

    • Transaction code- /SAPAPO/C2

              32.jpg

                  Here we choose the error handling method "Post-processing of errors, no splitting of LUWs" in order to activate post-processing for the logical systems.

If you are transferring a large amount of data from SAP APO to SAP R/3 and want to ensure an even system load on SAP R/3, choose inbound queues.

Since we chose inbound queues in SAP APO, we made the necessary settings for the QIN (Queue-In) scheduler in the qRFC monitor in SAP R/3 (as explained above). Queues that are to be processed automatically by SAP R/3 must be registered in the QIN scheduler.


    7.2.  CIF Error Handling (Post Processing)

    • Transaction code- /SAPAPO/CPP1

This functionality in APO allows the CIF user to see all logged CIF error messages centrally in APO. The process is independent of the queue type (inbound or outbound), and CIF errors in both systems (ECC and APO) can be handled from this one transaction in APO.

CIF error handling ensures that all CIF queue entries are processed during the data transfer. Faulty queues no longer lead to queue blocks; instead, they are logged as post-processing records in the relevant target system for the data transfer. You can then call these post-processing records later in CIF post-processing. Once the error has been corrected, you can send the objects to the relevant target system again.

If a change to transaction data cannot be posted in the target system due to an error, the system creates a post processing record with the error status Faulty for the object concerned.

CIF error handling is not available in the following situations, which means that CIF queues hang when errors occur:

      • During the initial data transfer (master data and transaction data)
      • During the transfer of master data (initial and change transfer)
      • When short dumps occur or liveCache is unavailable
      • When the target system is unavailable
      • When an object is locked in the target system (as before, for the repetition of the transfer)
      • When errors occur in customer exits or BAdIs that run in CIF inbound function modules during integration

You can find information about restrictions to CIF error handling when using certain applications and functions in SAP Note 602484.

Since not all errors are included in CIF error handling, faulty queue entries may continue to exist once CIF error handling has been activated. Faulty queue entries can also block objects that are resent by CIF post processing. Therefore, you still need to monitor CIF queues by using the qRFC monitors for inbound or outbound queues, or by using the SCM Queue Manager.

 

  • Steps for CIF Post Processing
    • Invoke transaction /SAPAPO/CPP1 in APO. Choose the target system; this should be the connected ECC system. Make sure that the indicator "Select Data from R3" is turned on. The processing status should be "1" to select the data that needs to be processed; however, you can select other statuses, such as processed or obsolete, depending on the requirement.

              33.jpg

    • The navigation tree on the left side shows the system connection and the transfer directions; under each transfer direction, there are different order categories.
    • Select and process the records for each group of the “order categories”, e.g. “In-House Production”, “External Procurement”, etc.
    • The messages appear in detail on the right side of the screen when you double-click the rows for External Procurement or In-House Production.
    • The right side of the screen has two sections: the worklist of the inbound system (top) and the objects processed in this session (bottom).

              34.jpg

    • Select the records to be processed. If there are too many records, you may sort them into different groups (for example by product and location) and process one group at a time.

              35.jpg

    • Depending on the object type and the direction of the record, select either Send to R/3 or Send to APO.

         36.jpg

 

    • Once the records are in process, they move from the "Worklist" window to the "Objects Processed" window.

                 37.jpg


    • Refresh the "Objects Processed" window by clicking the refresh button. If errors persist, this is reflected in the status. Sometimes this does not show the actual error that occurred.

                38.jpg


    • To resolve the issue, select the desired entry and click the application log button.

               39.jpg


    • The application log will be displayed

                40.jpg

      • Once the issue has been resolved, move the records back to the "Worklist" window to be processed again by selecting the record in the "Objects Processed" window and clicking the corresponding icon. Then you can process the object once again.
      • Resolve all issues until the "Objects Processed" window is clear, where possible.
      • Use the Set Entry as Obsolete indicator to set the processing status of a post-processing record to Obsolete (Set Manually), for example if you do not want an object to be sent again. The object itself is unaffected by this action.
      • Use the Remove Obsolescence Indicator to reset the processing status to Still for Processing; the post-processing record is then displayed in yellow.


7.3.  Queue Management

As mentioned in the restrictions above, not all errors are captured in CIF post-processing, so the CIF queues still need to be monitored.

    • CIF Queue Monitoring Steps and Procedure

Transactions to check for Errors (R/3 and APO)

SMQ1 - qRFC Monitor (Outbound Queue)

SMQ2 - qRFC Monitor (Inbound Queue)

41.jpg

42.jpg

      • Click on Change View.
      • This shows only the error (SYSFAIL) queues with their error messages.


  • /SAPAPO/CQ - CIF queue manager

43.jpg

                    Note: The expand-mode indicator is performance-intensive; the transaction will take a little longer if you set it.

 

                    44.jpg


      • Double-click the error message in the left-hand box; the details of the error appear in the right-hand box:

                    52.jpg

Note:

      • Do not delete any queue from this screen, as this deletes the error queue as well as the queues waiting behind it.
      • Instead, select the queue and click the engine icon. This takes you to the familiar SMQ2/SMQ1 screen, where you can delete the single error queue and then reprocess the waiting queues.

 

  • Evaluate CIF Application Log

          Sometimes the error message does not provide detailed information such as product or location. Additional details can be found in the CIF application log. Proceed as follows:

         

               > In APO: Double-click the CIF error message within the CIF entry (transaction SMQ2 or SMQ1), or execute transaction /n/SAPAPO/C3.


               > In R/3:

To get the details of the error message:

Transaction in R/3: CFG1 - Display CIF Application Log

      • In both cases, copy the external or transaction ID to get the details of the error message.

               45.jpg

    • Enter the external ID, the From date (yesterday) and the To date (today), and execute.

     This gives the details of the error message displayed in the SMQ2 or SMQ1 transaction.

 

  • CIF Error Resolution Process
    • Each CIF error should be evaluated against the list of known errors and actions below. Any new error should be investigated and resolved if possible, with the list of known errors and actions updated accordingly.
    • Once the product, location, and order number (if relevant) have been recorded along with the error message and the necessary action, the CIF entry can be deleted.
    • If a new error comes up and needs time to investigate, the queue can be saved via the Edit menu.
    • Once the saved queue has been investigated and the root cause analysis completed, the saved queue can be restored via the Edit menu.
    • An email should be sent to the key user clearly detailing the product, location, and order (if relevant) and the action(s) to be taken.
    • When the key user confirms that the action has been completed, an ad-hoc CIF reconciliation report should be run for the relevant product(s) to correct any differences between ECC and APO.
    • If no confirmation from the key user is received, the differences will be picked up in the weekly CIF reconciliation report.

 

8. Practical Challenges

There are various challenges that a CIF administrator or functional consultant will encounter while integrating the ECC and APO systems. Based on various project experiences, the following are some of the commonly encountered issues.

 

8.1. Huge CIF queue build-up in the APO inbound

If you are using inbound queues in both the ECC and APO systems, you might run into this issue.

 

When data is transferred from ECC to APO, it is not visible in APO. Similarly, when planning results are transferred from APO to ECC, they are not visible in ECC and are not returned to APO with the ECC order number range. This applies to both master and transaction data.

All the transferred data is blocked in the CIF inbound queue (SMQ2) with "Ready" status, and you have to activate each LUW manually, which can take a long time to clear all the queues.

These queues are not displayed in the CIF queue manager (/SAPAPO/CQ). If you run the reconciliation report (/SAPAPO/CCR), the differences between ECC and APO will show; but if you transfer the differences from the CCR report, this simply adds another record to the APO inbound. It is therefore always recommended to check the CIF blocks, clear them, and only then use CCR for reconciliation.

Root cause: - This issue mainly occurs when the inbound schedulers are either not registered in SMQR, or registered but inactive (Type = "U" instead of "R") in the ECC and APO systems.

 

           Example: - The inbound scheduler is registered but inactive (Type U) in APO.

           46.jpg

              You have made changes to an existing purchase requisition in ECC and saved. The change should transfer to APO immediately, but when you look in APO, the changes are not reflected.

     The changes are blocked in the APO inbound with "Ready" status and need manual intervention to clear them.

47.jpg


8.2.  Master & Transaction Data change transfer to APO is not real time

This issue applies to all types of transaction data, and only to those master data objects that are configured for immediate transfer, e.g. material master, vendor and customer master, and resources.

 

When changes are made to existing master data objects or transaction data that are integrated with APO and part of an active integration model, the changes are still not reflected in APO. You will not find any queue blocks for these changes, and the reconciliation report for transaction data consistency (/SAPAPO/CCR) will not show any records with differences. The changes only transfer to APO when the CIF job runs for the respective integration model.

 

Root cause: - This issue mainly occurs when the online transfer using business transaction events has not been activated for the applications ND-APO and NDI.

For master data, additionally check whether the change pointers are active for the message types of the objects mentioned above.

       48.jpg


8.3  Change pointers for master data changes are not recorded

This issue applies to all types of master data objects that are relevant to APO planning.

 

When changes are made to existing master data objects that are integrated with APO and part of an active integration model, the changes are still not reflected in APO. You will not find any CIF queue blocks for these changes, and the changes do not transfer to APO even when the CIF job runs for the respective integration model.

If you check the tables BDCP and BDCPS (change pointers), you will not find the change pointer you are looking for.

 

Root cause: - This issue mainly occurs when the ALE change pointers have not been activated globally in transaction code BD61, or the change pointer for the specific message type of the master data object has not been activated in transaction code BD50, shown below.


        49.jpg


8.4  Master Data Changes are not getting transferred to APO

This issue applies to all types of master data objects that are relevant to APO planning.

       

When changes are made to existing master data objects that are integrated with APO and part of an active integration model, the changes are still not reflected in APO. You will not find any CIF queue blocks for these changes, and the changes do not transfer to APO even when the CIF job runs for the respective integration model.

You have checked that the change pointers are active globally and for all the required master data message types.

 

Root cause: - This issue mainly occurs when multiple integration models are active for the same object. Check the active integration models for the master data object using transaction code CFM5 in ECC.

Conclusion: - There should be only one active integration model for any unique master data object.



8.5  Custom field value change in material master is not triggering change transfer to APO

This issue applies to all types of master data objects that are relevant to APO planning and have been enhanced with custom Z fields.

   

When the value of a custom Z field changes in an existing master data object (enhanced with Z fields relevant to APO planning), the change is not recorded and transferred to APO, even though the object is integrated with APO and part of an active integration model. There are no CIF queue blocks for these changes. The changes only transfer to APO when there is an initial transfer for the respective master data objects.

 

You have checked that the change pointers are active globally (transaction code BD61) and for all the required master data message types (transaction code BD50).

 

Root cause: - This issue mainly occurs when custom Z fields have been added to an APO-relevant master data object, but the corresponding tables and fields have not been added in transaction code BD62 for the specific message type.

 

Conclusion: - All fields of the required master data objects should be maintained in transaction code BD62, so that any change in ECC triggers the change transfer.



8.6  APO orders are not getting transferred from APO to ECC

This issue applies only to transaction data that is planned in APO and needs to be sent back to the ECC system.

   

The planning run has created procurement proposals in APO that have to be sent back to ECC, but the new or changed proposals are not reflected in the ECC system. The transaction data is integrated with APO and part of an active integration model, there are no CIF queue blocks for these changes, and the changes are not reflected even in the reconciliation report (/SAPAPO/CCR).

 

You have checked that all the basic CIF configuration is maintained.

 

Root cause: - This issue mainly occurs when the publication definitions have not been maintained for the plant and ECC logical system combination in APO (transaction code /SAPAPO/CP1).



8.7   Poor performance of CIF background job

 

  • First, check the setting in transaction CFC2. It is recommended to use "Normal" logging to maintain system performance. Administrators responsible for resolving interface issues can be given detailed logging access with "debugging on".
  • Second, schedule the performance-related CIF batch jobs on a regular basis, at least once a week:
    • Archive logs using transaction CFGD
    • Configure the application log using transaction CFC6
  • Third, check the block size settings of the filter objects using transaction CFC3. An appropriate block size improves system performance.
  • The block size that should be used in each individual case depends largely on the current data situation.
  • There is no rule of thumb for defining the block size; finding the optimum value is a matter of judgment and trial and error. It is recommended to use the default settings as a starting point.
  • Fourth, use parallelized processing wherever available.
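The trade-off behind the block size setting can be sketched as follows (a toy Python illustration, not SAP code; the material names and counts are invented): the block size determines how many transfer blocks a given set of filter objects is split into, so small blocks mean many queue entries with more per-entry overhead, while large blocks mean few but heavy entries.

```python
# Illustrative sketch (not SAP code): how a CFC3-style block size determines
# the number of transfer blocks for a given set of filter objects.

def split_into_blocks(objects, block_size):
    """Split a list of filter objects into blocks of at most block_size."""
    return [objects[i:i + block_size] for i in range(0, len(objects), block_size)]

# 1000 hypothetical material numbers selected by an integration model.
materials = [f"MAT{i:04d}" for i in range(1000)]

# A small block size creates many small transfer blocks (more overhead) ...
assert len(split_into_blocks(materials, 50)) == 20
# ... a large block size creates few large blocks (more memory per block).
assert len(split_into_blocks(materials, 500)) == 2
```

This is why tuning is trial and error: the optimum balances queue overhead against the resource cost of processing each block, and that balance depends on the current data volume.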



8.8  Common configuration objects are not in sync between ECC & APO

There are various common configuration objects in ECC and APO that should be in sync. This is one of the prerequisites before you initiate the master data transfer from ECC to APO. If the configuration is not in sync (an object is missing in APO), the initial CIF data transfer will fail with errors.

Some of the common configuration objects are:

    • Regions
    • Currencies
    • Calendar
    • Unit of Measure
    • Time Zones

 

Root cause: Standard configuration has been changed in ECC for units of measure or calendars, or new configuration objects have been added. These changes must be updated in APO.

 

Solution: You can update units of measure, currencies, and calendars using transaction RSA1, provided the connected ECC system is created and active under the SAP source systems. Use "Transfer Global Settings" from the context menu.


              

For other configuration objects, either compare and maintain them manually if missing, or use the compare tool under the Utility menu to compare the objects with the concerned ECC system and update all differences in one step.



9.  CIF Housekeeping Job

  • Report – RAPOKZFX

     Detect and correct inconsistencies between the material master and integration models with report RAPOKZFX. In rare cases, inconsistencies can occur between data in integration models and field APOKZ in table MARC. They may occur if you activate a model that refers to a material master that is being changed at the same time. In this case, the activation finishes successfully, but APOKZ is not set correctly and an error message is displayed. The inconsistency can result in errors during the ATP check and when transferring production and planned orders.

 

  • Report – RCIFIMAX

     As of R/3 Plug-In 2002.1, report RCIFIMAX should be scheduled regularly to find inconsistencies between the integration model sources and their runtime versions. This report must not be run in parallel with activations of integration models.

 

  • Reports – RSQOWKEX & RSQIWKEX (exceptional use only)

You can activate qRFC queues using reports RSQOWKEX (outbound queues) and RSQIWKEX (inbound queues). In normal operation, however, it is not necessary to run these programs regularly, because almost all queue entries are processed without errors. Queue errors should instead be detected by the procedures described below, then analyzed and corrected; the error analysis should suggest preventive measures to reduce the number of future exceptions. In exceptional cases, or on test systems, you can use reports RSQOWKEX and RSQIWKEX. If you start these reports at an inappropriate time or with too many queues selected, they may cause an excessive additional system load.
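Why a single queue error matters so much can be shown with a toy model (illustrative Python only, not the qRFC implementation; the entry names are invented): qRFC entries in one queue are processed strictly in order, so a single failed entry holds up everything queued behind it, which is why the error itself must be analyzed and fixed.

```python
# Toy model of in-order queue processing (illustration only, not SAP qRFC).
# Entries are processed strictly in sequence, so the first failure blocks
# every later entry in the same queue.

def process_queue(entries):
    """Process entries in order; stop at the first failing entry."""
    done, blocked = [], []
    failed = False
    for entry in entries:
        if failed or entry["will_fail"]:
            failed = True
            blocked.append(entry["name"])  # stuck behind the error
        else:
            done.append(entry["name"])
    return done, blocked

done, blocked = process_queue([
    {"name": "order-1", "will_fail": False},
    {"name": "order-2", "will_fail": True},   # the erroneous entry
    {"name": "order-3", "will_fail": False},  # healthy, but blocked anyway
])
assert done == ["order-1"]
assert blocked == ["order-2", "order-3"]
```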

 

  • Report - /SAPAPO/CIF_DELTAREPORT3

Detect and correct external inconsistencies between APO and R/3 with report /SAPAPO/CIF_DELTAREPORT3 (transaction /SAPAPO/CCR). To ensure that all relevant transaction data objects (such as purchase, production or sales orders, and stocks) for which there are active integration models exist in both APO and R/3, this report should be scheduled to run:

    • Periodically, and preferably daily, to detect and reconcile possible inconsistencies as soon as possible. This is important because otherwise further inconsistencies can be generated and cause subsequent planning to be based on incorrect data.
    • In case a recovery of your liveCache or your APO database had to be executed but was incomplete (point-in-time recovery, loss of data)
    • In case you have evidence of inconsistencies between your APO and your R/3 OLTP system
    • In case queue entries have been deleted.
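Conceptually, a reconciliation run boils down to a set comparison between the orders known to each system. The sketch below is a simplified Python illustration, not the actual /SAPAPO/CCR logic; the order numbers are invented.

```python
# Simplified sketch of a reconciliation run (illustration only, not the
# /SAPAPO/CCR implementation): compare the order numbers known to each
# system and report what is missing on either side.

def reconcile(ecc_orders, apo_orders):
    """Return orders missing in APO and orders missing in ECC."""
    ecc, apo = set(ecc_orders), set(apo_orders)
    return {
        "missing_in_apo": ecc - apo,  # exist in ECC but not in APO
        "missing_in_ecc": apo - ecc,  # exist in APO but not in ECC
    }

result = reconcile(
    ecc_orders=["PO-1001", "PO-1002", "PO-1003"],
    apo_orders=["PO-1001", "PO-1003", "PO-1004"],
)
assert result["missing_in_apo"] == {"PO-1002"}
assert result["missing_in_ecc"] == {"PO-1004"}
```

Running such a comparison daily keeps the window small in which planning can be based on inconsistent data.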

 

  • Report - /SAPAPO/OM17 (in case of recovery only)

Internal consistency between the APO database and liveCache is checked by transaction /SAPAPO/OM17. If it is necessary to reconcile internal consistency, for example after a recovery, we recommend doing this before checking and reconciling external consistency.

 

 

10.  Performance Optimization Job

To optimize the performance of the data transfer between the APO and the connected R/3 OLTP system(s) and to prevent accumulation of useless data in the systems, several reorganization jobs must be scheduled to run regularly.

  • Administration Jobs Related to Data Transfer (R/3)
    1. Delete the application log with report RDELALOG. If writing of application logs is enabled (R/3 transaction CFC2, or APO transactions /SAPAPO/C4 or /SAPAPO/C41) – and in a production system this should be done only for certain users and for problem analysis – old logs must be deleted regularly. It is recommended to run this job daily and delete logs older than 7 days.
    2. Delete ALE change pointers with report RBDCPCLR. If changes to master data are transferred periodically via ALE (as recommended), processed change pointers must be deleted regularly. Afterwards, if your database system on the R/3 side is Oracle, run report RBDCPIDXRE to reorganize the Oracle indexes on tables BDCP and BDCPS. See SAP Note 328355.
    3. Delete old inactive integration model versions with report RIMODDEL. Every time an integration model is generated, a new version is created, distinguished by a timestamp. The old version is deactivated and the new one is activated. Old versions must be deleted regularly.
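The retention rule used by these housekeeping jobs (for example, "delete logs older than 7 days") amounts to a simple cutoff-date filter. The following Python sketch is illustrative only; the log record structure is invented.

```python
# Illustrative sketch of a retention-based cleanup (not SAP code):
# keep recent log entries, drop everything older than the cutoff.

from datetime import date, timedelta

def purge_old_logs(logs, today, retention_days=7):
    """Return only the log entries within the retention window."""
    cutoff = today - timedelta(days=retention_days)
    return [entry for entry in logs if entry["created"] >= cutoff]

today = date(2016, 3, 15)
logs = [
    {"id": 1, "created": date(2016, 3, 1)},   # older than 7 days: deleted
    {"id": 2, "created": date(2016, 3, 14)},  # recent: kept
]
kept = purge_old_logs(logs, today)
assert [entry["id"] for entry in kept] == [2]
```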

 

  • Administration Jobs Related to Data Transfer (APO)
    1. Delete the application log with report /SAPAPO/RDELLOG. Same as RDELALOG in R/3 (see above).
    2. Check processing of APO change pointers with report /SAPAPO/RDMCPPROCESS. To verify that all created change pointers are processed, run report /SAPAPO/RDMCPPROCESS after publishing planning results to R/3, without restricting the selection of orders, and confirm that the message "No change pointers were selected" is displayed. If change pointers remain unprocessed, contact the application support team to clarify whether these change pointers are necessary and why they are not processed.
    3. Delete R/3 data that is no longer required in APO with report /SAPAPO/SDORDER_DEL.
    4. Delete old results of the CIF delta report using report /SAPAPO/CIF_DELTAREPORT3_REORG. As it is now possible to save the results of a delta report run, outdated results must be deleted from the database. The spool list from this report contains the number of records deleted.
    5. Delete post-processing records with report /SAPAPO/CIF_POSTPROC_REORG. Processed and obsolete post-processing records are no longer required and should be deleted with this report; leaving them in place has an increasingly negative impact on CIF performance over time. The deletion is a two-step process: in the first run, outdated records that meet the selection criteria and still have the status "to be processed" are set to "obsolete"; in the second run, all processed and all obsolete records are deleted.
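The two-step deletion described for the post-processing reorganization can be modeled with a toy example (illustrative Python, not SAP code; the record IDs and status values are invented): the first run flags outdated unprocessed records as obsolete, and the second run deletes all processed and obsolete records.

```python
# Toy model (not SAP code) of the two-step post-processing cleanup.

def flag_obsolete(records, cutoff_id):
    """Run 1: outdated records still 'to_process' are set to 'obsolete'."""
    for rec in records:
        if rec["status"] == "to_process" and rec["id"] <= cutoff_id:
            rec["status"] = "obsolete"
    return records

def delete_finished(records):
    """Run 2: all 'processed' and 'obsolete' records are deleted."""
    return [r for r in records if r["status"] not in ("processed", "obsolete")]

records = [
    {"id": 1, "status": "processed"},    # finished: deleted in run 2
    {"id": 2, "status": "to_process"},   # outdated: obsolete in run 1
    {"id": 9, "status": "to_process"},   # recent: survives both runs
]
records = flag_obsolete(records, cutoff_id=5)
assert records[1]["status"] == "obsolete"
records = delete_finished(records)
assert [r["id"] for r in records] == [9]
```

Splitting the work into two runs gives administrators a chance to review what was flagged obsolete before anything is physically deleted.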


  • Note: Deleting change pointers may cause inconsistencies, as the corresponding order changes are not transferred to R/3.

 

Certain SAP APO database tables grow with data from SAP R/3 documents that is no longer required and has no corresponding information in liveCache. In addition, the performance of the initial data supply, and of other transfer processes with a high data volume, is affected negatively. The obsolete records need to be deleted regularly to control the size of certain tables (e.g. /SAPAPO/SDFIELD and /SAPAPO/POSMAPN) and to improve the performance of sales order updates on the SAP APO side. For details, see SAP Note 504620.

 


11.  Real CIF errors encountered in various projects


  • Error: CFEP000000043548  ED1CLNT120 – Internal number assignment not defined

Cause: No internal number range is maintained for purchase requisition document type "NB".

Solution: Create a purchase requisition number range in ECC and enter it for document type NB under Purchasing → Purchase Requisition → Define Document Types ("internal number range").

 

  • Error: CFEP000000043548  ED1CLNT120 – Enter Purch. group

Cause: No purchasing group is maintained in the ECC system for the given material.

Solution: Maintain the purchasing group in the material master and reactivate the queue.

 

  • Error: CFIP009000001374   ED1CLNT120 – 21.12.2015 date comes after end of valid factory calendar

Cause: The planning calendar's validity ends in 2010, while the order dates lie in 2013.

Solution: Extend the planning calendar.

 

  • Error: CFIP000100000896   ED1CLNT120 – Messages for WM staging: Check order 100000896

Cause: The production supply area has not been maintained in the resource or in the BOM. The system first takes the production supply area from the resource master; if it is not maintained there, it looks at the BOM component.

This check applies only if Warehouse Management is active.

 

        Note: The production supply area is maintained in the IMG (Logistics Execution → Warehouse Management → Interfaces → Define Production) or with transaction OMK0.

 

Solution: Maintain the production supply area and assign it to the resource (for the master recipe to which the component is assigned) or directly to the BOM component. Ensure that the storage location assigned in the production supply area for a component is also maintained in transaction MMSC.

 

 

  • Error: CFIP0000072359       EQ1CLNT210 – Backflushing only possible for materials kept in stock

Cause: The backflush indicator in the material master is "1" (always backflush), while the valuation class in the material master is WIPS (WIP inventory stock).

When converting the planned order into a process order, the system reads the accounting view data to calculate costs and determine accounts via the valuation class. At that point the system raises the CIF error.

Note: The quantity and value updating fields in the material type configuration are not relevant to this error.

Solution: Either change the valuation class in the accounting view of the material master, or change the backflush indicator from 1 to blank.

 

  • Error: CFPLO000100000382           MQ1CLNT210 – Two constraints that exclude each other or create unrealizable situation

Cause:

The process order was created directly in the ECC system and sent to APO. In ECC the master recipe exists, but the production version was neither created nor CIFed to APO; hence the error.

 

Note: The following behavior was observed:

  • Delete the queue.
  • Create the production version in ECC, but do not CIF it to APO.
  • If you now create the process order in ECC, it transfers to APO without any error, but the process order in APO has no source of supply, because the PPM has still not been CIFed to APO.
  • All operations also appear in APO, but without operation descriptions.
  • You can still schedule the order in APO without the PPM, but the dates are calculated from the ECC master data, not from the APO master data.

 

     Resolution: Create the production version in ECC and send it to APO.

 

  • Error: Source of supply () not in source list (Material/Plant) despite source list requirement
    • The vendor is not included in the source list (if the source is a purchase info record).
    • The vendor/scheduling agreement is not included in the source list for the material/plant.
    • The validity dates do not cover the date range of the purchase requisition.
    • The validity dates of the purchase info record differ from those in the source list.
    • The validity dates of the source of supply in APO are inconsistent with the validity dates in the ECC system.
    • The source list indicator is set in the purchasing view of the material master.
    • Source list maintenance is mandatory at plant level, irrespective of the procurement type (in-house or external).

     Solution:-

    • Maintain the vendor or scheduling agreement in the source list with consistent validity dates in ECC and APO.
    • The validity dates should be the same in ECC and APO for the outline agreement and the purchase info record.
    • If the same error persists after removing the source list indicator from the material master, the source list indicator is set as mandatory for that location in the source list configuration.


  • Error: Not possible to determine shipping data for material MAT1 at YYYY
    • Check the sales organization data of the material master; it should be maintained with the correct sales area as maintained in the customizing activity "Set Up Stock Transport Order".
    • Make sure that the customizing for "Set Up Stock Transport Order" is correctly maintained. Ask the user to check this and, if necessary, involve the R/3 support team.
    • To check the customizing for "Set Up Stock Transport Order", follow the path: SPRO → Materials Management → Purchasing → Purchase Order → Set Up Stock Transport Order.

               Here, make sure that:

      1. The receiving plant is assigned to the correct customer number created for the sales area of the supplying plant.
      2. The correct delivery document type (NL for intra-company, NLCC for inter-company) has been assigned to the supplying plant.
      3. The correct purchasing document type is assigned to the combination of receiving plant and supplying plant.

 

  • Error: No sales area is assigned to sold-to party XXXXXXXXX and plant YYYY
    • Check in R/3 customizing whether a sales area is assigned to this VMI customer and plant combination. The path is: SPRO → Integration with Other SAP Components → Advanced Planning and Optimization → Application-Specific Settings and Enhancements → Settings and Enhancements for Sales Orders → Settings for Vendor Managed Inventory → Assign Sales Area and Order Type to Ordering Party/Plant.
    • This setting is required if the customer is a VMI customer for one plant for some products; without it, the sales orders will not get numbers from R/3.



12.  CIF- Important T-codes


R/3:

CFM1 – Create integration model
CFM2 – Activate/deactivate integration models
CFG1 – View CIF application log
CFC2 – User parameters for CIF
CFC3 – Block sizes for initial transfer
CFM5 – Filter object search in integration models
CFC1 – Define logical systems as APO systems
NDV2 – Set release level of APO systems
SMQ1/SMQ2 – qRFC monitor, including start, stop, and execute functions
SM59 – Definition of RFC destinations
SALE – Definition of logical systems


APO:

/SAPAPO/C3 – View CIF application log
/SAPAPO/C4 – Set user parameters for CIF
/SAPAPO/C5 – Send planning results to R/3
/SAPAPO/C1 – Create business system group
/SAPAPO/C2 – Assign logical systems to a business system group
/SAPAPO/CQ – SCM Queue Manager
/SAPAPO/CCR – Comparison/reconciliation tool
SMQ1/SMQ2 – qRFC monitor, including start, stop, and execute functions
SM59 – Definition of RFC destinations
SALE – Definition of logical systems
/SAPAPO/CPP – CIF post-processing


References

  • SAP Note 563806
  • SAP Note 369007
  • SAP Note 786446
  • SDN: www.sdn.sap.com
  • SAP Help: www.help.sap.com
  • For more information, visit the Supply Chain Management homepage


