SCN : Blog List - SAP Master Data Governance

MDG 7.0: Click-through demo for MDG-C scenario with Customer Like UI


Hi,

 

Introduction

SAP delivers with SAP MDG a so-called "Customer Like UI", which hides the complexity of the Business Partner model from end users. In particular, customers who do not use or know the SAP Business Partner concept are used to the simple creation of a customer via transaction XD01. The Customer Like UI looks very similar and simplifies the creation of an ERP customer via SAP MDG.

For more information please visit http://help.sap.com/mdg70

 

Link to the click-through demo

http://demo.tdc.sap.com/SpeedDemo/93cb9d046af1e202


Storyboard

CustomerLikeUIStory.jpg

 

Key Points:

  • Create Customer using "Customer Like UI"
  • One Step Contact Person creation

 

Screenshots

In these screenshots you can see that the complexity of the BP model is hidden from the end user.

CustomerLikeUIScreenshot1.jpg

 

CustomerLikeUIScreenshot2.jpg

 

Best regards

Steffen


MDG 7.0: Click-through demo for MDG-S scenario with Vendor Like UI


Hi


Introduction

SAP delivers with SAP MDG a so-called "Vendor Like UI", which hides the complexity of the Business Partner model from end users. In particular, customers who do not use or know the SAP Business Partner concept are used to the simple creation of a vendor via transaction XK01. The Vendor Like UI looks very similar and simplifies the creation of an ERP vendor via SAP MDG. For more information, please visit http://help.sap.com/mdg70

 

Link to the click-through demo

http://demo.tdc.sap.com/SpeedDemo/841101508357468b



Storyboard

VendorLikeUIStory.jpg


Key Points:

  • Create Vendor using "Vendor Like UI"


Screenshots

In these screenshots you can see that the complexity of BP is hidden from the end user.

VendorLikeUIScreenshot1.jpg

VendorLikeUIScreenshot2.jpg


Best regards

Steffen

Configuring the Import of Data Using IDocs



The purpose of this post is to give the detailed steps for configuring the import scenario mentioned in the SAP documentation, and to help answer some frequently asked questions.

Data Transfer:

Setting up data transfer, i.e. the export and import configuration.

 

  1. The first step of a data transfer is the export of the master data to an IDoc-XML file, which you can save locally or on your application server.
  2. In the second step, you import the data into your central MDG system.

Data Export from the Source System

In the source system, configure the logical system for IDoc-XML export to the application server file system.

1. Create an XML-file port: Use transaction WE21 to create an XML-file port for IDoc processing. Ensure that you have network access from your local client to the directory configured in the XML-file port.

Untitled.png

Enter the function module EDI_PATH_CREATE_CLIENT_DOCNUM. On the Outbound Trigger tab, enter the RFC destination LOCAL_EXEC.

Untitled1.png

2. Create Logical System

Open transaction SALE and then go to Basic Settings -> Logical Systems -> Define Logical System to create a new logical system.

Untitled3.png

3. Maintain Distribution Model

Open transaction SALE and then go to Modeling and Implementing Business Processes -> Maintain Distribution Model and Distribute Views. You can also use transaction BD64 for this.

a. Switch to change mode and choose Create Model View to create a new entry. Enter a short text and a technical identifier.
b. Choose Add Message Type for the newly created model. Enter a logical source system name and a destination system name and choose the message types MATMAS and CLFMAS

Untitled4.png

4. Create Partner Profile

Run transaction SALE and then go to Partner Profiles -> Generate Partner Profiles. Alternatively, you can use transaction BD82.

a. Select the newly created model using the input help for the technical name and then select the logical destination system.

b. Enter the authorized user and the following values:

Version: 3

Pack. Size: 100

Output Mode: Immediate Transfer

Inbound Processing: Immediately

c. Choose Execute. You can ignore the port error that appears.

 

Untitled5.png

5. Call transaction WE20 and make the following settings:

a. Open the Partner Type LS folder and select the partner profile you created above.

b. Update the message types MATMAS and CLFMAS in the Outbound Parameters section. The Receiver Port is the XML-file port from the first step above. In the Basic Type field, enter MATMAS05 for MATMAS and CLFMAS02 for CLFMAS.

Untitled6.png

6. The Receiver Port is the XML-file port from the first step above. In the Basic Type field, enter MATMAS05 for MATMAS and CLFMAS02 for CLFMAS.

Untitled7.png

7. Test the creation of the IDoc-XML

a. Generate the IDoc-XML for material using transaction BD10.

Untitled8.png

Untitled9.png

 

 

8. Check the newly generated IDocs using transaction WE02 or BD87. You can use the receiver port as the filter criterion in the Partner Port field.

Untitled10.png
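If you prefer to double-check the generated IDocs programmatically instead of via WE02/BD87, a small ABAP report along the following lines reads the IDoc control records (table EDIDC) for your receiver port. This is only a convenience sketch; the port name 'XMLFILE' and the report name are examples, not part of the standard setup.

REPORT z_check_generated_idocs.

TYPES: BEGIN OF ty_idoc,
         docnum TYPE edidc-docnum,
         mestyp TYPE edidc-mestyp,
         status TYPE edidc-status,
         credat TYPE edidc-credat,
       END OF ty_idoc.

DATA: lt_idocs TYPE STANDARD TABLE OF ty_idoc,
      ls_idoc  TYPE ty_idoc.

" Select the control records of all MATMAS/CLFMAS IDocs sent to the XML-file port
SELECT docnum mestyp status credat
  FROM edidc
  INTO TABLE lt_idocs
  WHERE rcvpor = 'XMLFILE'
    AND mestyp IN ('MATMAS', 'CLFMAS').

LOOP AT lt_idocs INTO ls_idoc.
  " Status 03 means the IDoc data was passed to the port, i.e. the XML file was written
  WRITE: / ls_idoc-docnum, ls_idoc-mestyp, ls_idoc-status, ls_idoc-credat.
ENDLOOP.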

 

9. Use transaction AL11 to find the XML files in the directory of your XML-file port.

10. To download the file to a local directory for analysis purposes, use transaction CG3Y.

Untitled11.png

Data Import into Target System (MDG Hub)

1. To be able to import IDoc-XML files, the following setup activities need to be carried out:

a. Use transaction IDX1 to create two ports in the IDoc adapter, one for sending and the other for receiving. Enter the port, client, description, and RFC destination for each port. Both ports should have the RFC destination of the MDG hub. Check that the port names match the names in your IDoc-XML file for SNDPOR and RCVPOR.

Note: The sender and receiver port names must match the incoming XML file, otherwise the file will not be picked up (shown in the XML file below).

Untitled12.png

Untitled13.png

 

 

2. In transaction WE21, enter the receiver XML port using the same name as in step 1 above. Enter the port name under the folder XML File, and enter a description and a physical directory. In the Function Module field, enter EDI_PATH_CREATE_CLIENT_DOCNUM. On the Outbound: Trigger tab, in the RFC Destination field, enter LOCAL_EXEC.

Untitled14.png

3. In transaction FILE, create the logical file path name. Enter a logical file and a name. In the Physical File field, enter <PARAM_1>. In the Data Format field, enter BIN. In the Application Area field, enter CA. In the Logical Path field, enter the logical file path.

Note: If the Physical File field is not maintained correctly, the file will not be recognized and will not be picked up, so check it carefully.

Untitled15.png
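A quick way to verify that the logical file name and the <PARAM_1> substitution resolve to the directory you expect is to call the standard function module FILE_GET_NAME in a small test snippet. The logical file name 'ZMDG_IMPORT_FILE' below is just an example; use the name you created in transaction FILE.

DATA lv_physical TYPE filename-fileextern.

CALL FUNCTION 'FILE_GET_NAME'
  EXPORTING
    logical_filename = 'ZMDG_IMPORT_FILE'   " example - the logical file name from transaction FILE
    parameter_1      = 'MATMAS_IDOC.xml'    " value substituted for <PARAM_1>
  IMPORTING
    file_name        = lv_physical
  EXCEPTIONS
    file_not_found   = 1
    OTHERS           = 2.

IF sy-subrc = 0.
  WRITE: / 'Resolved physical file:', lv_physical.
ELSE.
  WRITE: / 'Logical file name could not be resolved - check transaction FILE.'.
ENDIF.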

4. Open the Configuration activity General Settings -> Data Transfer -> Define File Source and Archive Directories for Data Transfer and assign your new logical file path name as a directory for data transfer.

Untitled16.png

Untitled17.png

5. In transaction AL11, make sure that the IDoc-XML files are stored under the logical path and that there are no other files stored in that directory. Double-click on the path to view the existing IDoc-XML file. You can use transaction CG3Z to copy a local IDoc-XML file to the path.

Untitled11.png

6. Open the MDG work center and provide the details, with a CR if you want to govern the data, as shown below.

Untitled18.png

Select Show Directory Content to see that the file has arrived in the path.

Untitled19.png

Check the logs to see whether the files have been picked up.

Untitled19.png

Go back to the MDG work center; if the file has been picked up, you can see the CR.

Untitled20.png

Click on the CR and activate it.

Untitled21.png

Check the status of the CR; it should be Final Check Approved. Then go to MM03 to check the material.

Untitled22.png

 

Hope this will help.

Flexible Data Governance - Oxymoron or Practical Need?


Data governance, which essentially means establishing control over data from its source through to obsolescence, has become a necessity for many organizations. I would be surprised if any CIO is not mentioning establishing or enhancing data governance as one of his primary IT investment objectives, in the short term or the long term. Many organizations carry a baggage of multiple IT systems, each of which serves a specific business need and was designed to source the data it needs all by itself. Organizations constantly weigh the conflicting needs of rationalizing IT systems versus building more robustness into the interfaces, so that they can hold on to best-of-breed IT systems even though collectively these create a lot of weak links. If an organization is able to rationalize its IT systems, data governance is largely taken care of by the rationalization itself; but where best-of-breed IT systems need to be maintained, data governance becomes paramount, because the weak links can really break and exponentially increase the pain of maintaining such heterogeneity.

 

So the goal of data governance is to rein in the sources of proliferation of data elements and data values and bring in much-needed control; basically, to strengthen the weak links in the heterogeneous IT landscape I mentioned earlier. As anybody can guess, this is easier said than done. Additionally, data governance needs harmonization as a prerequisite, which can also turn out to be tough to crack. Achieving one source of truth and one value across all IT systems is a tall order, but is it not the fundamental requirement to be achieved through data governance? It is here that my topic, flexibility in data governance, becomes significant. I would say "Flexible Data Governance" is indeed an oxymoron, but it is a practical need too. Let me explain with an example.

 

In a recent data governance implementation project, we came across the field "Division" with available values 10, 20, 30, and 40 in one SAP system, while other SAP systems had additional values like 60 and 15, and even alphanumeric values of two-character length. Keep in mind that all the systems involved are SAP, so it should have been a piece of cake to harmonize this, and that is how it started. We standardized the values as 10, 20, 30, and 40 in the data governance system and mapped the additional values available across all systems to these four values. But then we found cases of hard coding in interface programs, middleware programs, and enhancement programs, and even values of this field being used in user exits to execute specific activities, which ruled out harmonization of one value after another. So what to do? Continuing with harmonization would mean costly program changes, elaborate testing efforts, the risk of introducing production issues, and so on. Here comes the concept of "Flexible Data Governance", wherein we introduced scalable data governance: within a master data object, we allowed values to be harmonized and controlled for certain fields, while other fields were allowed to have different values in different systems. So the data object is part of data governance, but not all fields in it are controlled by the data governance tool.

 

I am sure each of us has seen such conflicting requirements. In a data governance project, where the fundamental need is conformity, flexibility is a bad word; but then life thrives in the gray. Please share such examples from your project experience.

SAP Master Data Governance for Governing Retail Articles


The purpose of this blog is to give you a quick overview on the consulting solution which is available now to govern retail article data using MDG.

 

MDG as a solution has been validated to work with the SAP IS Retail system for more than a year now, but so far the content (data model, UI, and logic) available on the MDG framework was only for customers, suppliers, business partners, finance objects, and materials. Customers could of course build their own article data model and UI on top of the MDG framework, but were left with a lot of open questions around the architecture to follow, how the UI should be modelled, and so on, and had to deal with the complexities of bringing the retail business logic into MDG. On top of this, the effort and time involved in building a model as complex as the one in SAP IS Retail was too much, and not really suitable for customers who wanted to get it running quickly.

 

Now with the new consulting solution which is offered, we will be able to govern also retail articles using MDG’s powerful data governance and enrichment platform.

 

The solution, which is now in its second version, not only provides the article data model and UI for staging and governing retail articles, but also benefits from its tight integration with the IS Retail system to reuse the business logic and customizing settings already made in the retail system. This way, data quality is enforced exactly when data is captured in the governance process, ensuring that data governed in MDG is fully compliant with your retail business rules when it is replicated to the IS Retail system.


What is the architecture?

 

For those of you who are familiar with the MDG architecture: MDG for Articles uses a 'reuse' architecture, which means the solution is able to fully leverage the IS Retail customizing logic, including concepts like reference handling, which is an integral part of the IS Retail system, and data is automatically replicated into your retail tables after governance.


In short, the architecture is kept aligned with that of the other MDG domains, like MDG for Materials, MDG for Suppliers, and so on.

 

The solution uses the underlying MDG framework to provide governance functionality such as change documents, workflow, a customizable UI, and tight integration into the business rules framework for additional validations and defaulting. And of course, the architecture is extensible, so you are able to extend the pre-delivered content with your landscape-specific extensions to the data model and the UI.


Feature Highlights:

   

  • Data governance for structured and unstructured retail article types, including Single, Generic, Set, Prepack, and Display
  • Support for the reference handling logic of the IS Retail system
  • Support for full products, empties, and BoMs (for structured articles)
  • Characteristics handling and variant creation
  • Data views for Basic, Purchasing, Listing, and Logistics (more views to come)

 

Article Detailed Screen.png
Figure 1: The article change request screen with some of the fields and the article image on the side panel.

 

 

The overall idea of this consulting solution is to make it easier for customers to use MDG for their Retail master data and decrease the time to value.  The solution attempts to cover all the complexities around designing the architecture and core retail functionality on top of the MDG framework, so that you can focus on your business specific requirements and not worry about bringing in the core retail logic.


Hope this information was useful and if you have questions feel free to post them on the blog or write to me with your queries.

MDG-M: How To Send an Email Notification to the Requestor at Time of Material Activation


Greetings:


I recently added Outlook email notification to our MDG-M system and thought I would share our solution.


Our business requirements are as follows:


Requirement #1:

When a MDG Change Request (work item) is sent to an approver’s workflow inbox in MDG-M, send an email to the approver’s Outlook mailbox.

Our SOLUTION: Use Extended Notification.

 

Requirement #2:

When a work item receives final approval and the material is activated, send an email to the person that created the CR, i.e., the requestor.

Our SOLUTION: Create a sub-workflow using transaction SWDD, and call it from our BRFplus workflows.

 

Although I’ll give a very brief overview of Extended Notification, the main focus of this blog is to provide detail on how I accomplished requirement #2 – send an email to the CR requestor after final CR approval.


Req #1: Email Notification to Approvers using Extended Notification


The Extended Notification method is the latest and most flexible method for creating email notifications based on work items in a person’s workflow inbox.  The configuration is not specifically a part of the MDG module.  Rather, it is a general method of monitoring work items, including those created by MDG, and then generating an email notification based on your configured schedule.

 

To send an email notification to each Change Request approver in my MDG workflow, I am using Extended Notification.  Extended Notification will also send an email to the requestor if a Change Request is sent back to the requestor via the MDG workflow's “Send for Revision” action (because in this case, the work item is sent to the requestor’s workflow inbox).

 

For sending email to approvers, I believe Extended Notification is the best solution. In contrast, using the “SendMail” method (which I describe next in this blog for requestor notification), would require a lot of additional steps to be added to my BRFplus workflow configuration, i.e., a SendMail step would have to be added after every approval step of every BRFplus workflow.  I have many CR Types (each has their own workflow) and each has multiple approval steps.  Any change in business requirements would require tedious changes to BRFplus – no fun.  So that is why I took the Extended Notification approach for requirement #1.

 

Although I have provided a brief overview of Extended Notification below, there is already good documentation on how to set up Extended Notification, so I won't go into detail.  Extended Notification documentation I found useful:

 

 

I found another really good document on the web titled, E-Mail Extended Notifications v1.1.pdf, but for the life of me, I cannot remember where I got it and I am unable to attach a PDF here (I also don't know the author).  If anyone knows what document I am talking about, I'd love to add a link here.

 

Brief Overview of Extended Notification

 

The Extended Notification process is comprised of two phases:

 

  • Selection:  The relevant work items are selected and notifications are created and stored in database table SWN_NOTIF.
  • Delivery:  The notifications are selected from SWN_NOTIF and messages are created and sent to the users.

 

The selection and delivery schedules require customizing via transaction SWNCONFIG.  Alternatively, the configuration can be performed using the simplified SWNADMIN transaction but SWNADMIN does not support recording of changes in a transport (and SWNADMIN does not have all configuration options).

 

Program SWN_SELSEN then has to be scheduled periodically.  Program SWN_SELSEN processes both the selection and the delivery, as configured via SWNCONFIG.  Email notifications created by SWN_SELSEN can be monitored via transaction SOST (SAPconnect).

 

We also scheduled a batch job to run program RSWNNOTIFDEL to physically delete records from table SWN_NOTIF that were logically marked as deleted by program SWN_SELSEN.  Our variant uses 15 days as the “Minimum Age” for deleting records.

 

Req #2: Final Approval Notification to Requestor using SWDD Sub-workflow

 

Our second business requirement is to notify the requestor when the MDG Change Request receives final approval and the Material activation is successful.  Because the Change Request work item does not go back to the requestor’s inbox after final approval, Extended Notification can’t be used for this requirement.

 

For this requirement, I created my own sub-workflow using transaction SWDD.  I then added a step to the BRFplus workflow to call my sub-workflow.  I am by no means a workflow expert and this is the first time I have used SWDD to build a sub-workflow.  To get started, I referenced section 3.1 of the following “how-to” document (How To...MDG-M Send an E-Mail Notification during the governance process):  https://scn.sap.com/docs/DOC-49089

 

However, the problem with the above document is the example in section 3.1 only shows how to configure the Send Mail step with a single, hard-coded e-mail address.  In my case, the recipient is dynamic – I want to send the email to the person that created the change request, i.e., the requestor.  To make the recipient dynamic, a little ABAP code is required. I am not an ABAP developer so I like to accomplish as much as I can via configuration, without extra code. Section 3.2 of the above document (49089) does discuss how to accomplish the full email notification with a BADI system call.  However, wanting to avoid as much extra code as possible, I did not pursue the BADI approach.

 

There is also a MDG-F How To document for Mail Notification that is a good reference on how to create a workflow task that retrieves the email receiver’s ID:  http://scn.sap.com/docs/DOC-15088

 

My approach is a hybrid of those discussed in the above how-to documents.

 

Sub-workflow Overview

 

Below is what my completed sub-workflow definition looks like in transaction SWDD:

Pic 1.png

Note the following which I have highlighted above:

 

  • When creating a new workflow, SAP will give it a number.  Mine is WS99900002.  When I add this sub-workflow as a service call to my BRFplus workflow, WS99900002 is the service number that will be referenced.  This is discussed at the end of this blog.
  • This workflow has three steps:
    • Get CR Creator – This is where we had to put some ABAP code to look up the CR Creator ID from the standard MDG Change Request table usmd120c.
    • Send Email – This is where you specify the email recipient and email content.
    • Workflow End
  • I created three workflow containers:
    • CR_CREATOR_NAME – This contains the first and last name of the CR Creator (it is not their ID).  You only need this container if you want to include the creator’s name in your email body.
    • EMAILRECEIVER – This contains the CR Creator’s ID, prefixed with “US”, which I believe tells the system the ID is a USER ID.  The “US” prefix is mandatory to make this work.
    • Change_Request – This contains some basic information about the change request such as the CR Number and CR Description. Unfortunately, it does not contain the Creator ID – that’s why the Get CR Creator workflow step is required.

 

Step-by-Step Sub-workflow Build

 

Here are the steps to create the sub-workflow in transaction SWDD.

 

Create Workflow Containers

 

In the left-hand workflow container pane, create a new container by double-clicking “<Double-Click to Create>”.

 

CR_CREATOR_NAME

 

I defined CR_CREATOR_NAME as follows:

Pic 1.png

I have checked the Import property:

Pic 1.png

EMAILRECEIVER

 

EMAILRECEIVER uses a standard Data Type that is available called TSWHACTOR:

Pic 1.png

Both Import and Export are checked because this container is “imported” to the Get CR Creator step (where ABAP code will set the "US" prefix and the ID), and then “exported” and used by the Send Email step:

Pic 1.png

If you expand this container in the workflow container pane, you will see that Data Type TSWHACTOR has two fields, Object Type and Agent ID:

Pic 1.png

The ABAP code will set these two values in the Get CR Creator step.  Object Type gets set to “US”.

 

Change_Request

 

The Change_Request container uses standard object BUS2250, provided by MDG:

Pic 1.png

This will provide you the CR Number, Material Number (stored in SINGLE_VALUE_OBJ) and CR Description:

Pic 1.png

Create Workflow Steps

To create a new workflow step, click the Create icon and select the Step Type from the dialog:

Pic 1.png

Get CR Creator Step

 

The Get CR Creator step is an Activity Step Type.  It requires a Task to be defined.  I initially tried to create the Task from scratch but soon realized I wasn’t sure of all the necessary settings to get an Activity step to work.  Therefore, I decided to copy an existing task from the standard MDG workflow WS60800086.

 

I opened workflow WS60800086 in transaction SWDD to see the steps.  I decided to copy the task TS75707952, used by the “Check Change Request” step:

Pic 1.png

To copy task TS75707952, I used transaction PFTC_COP.  SAP assigned 99900005 as my new TS task number.  Transaction PFTC_CHG can then be used to maintain your new task:

Pic 1.png

The only things I have changed in my copy of the task are the Name and the Work Item Text:

Pic 1.png

I also created an EMAILRECEIVER container with the same definition as the one defined in my workflow (described earlier):

Pic 1.png

After creating the task, you can now go back to SWDD and add the Activity step called Get CR Creator to your workflow, and assign it your new task - in my case, task TS99900005.

 

Next, click the Binding button to define the binding:

Pic 1.png

Here is the binding I have defined for the step.  The CHANGE_REQUEST is passed into the step so that the CR Number can be retrieved by the Program Exit (our ABAP code).  The two EMAILRECEIVER attributes will be populated in the Program Exit and passed back out of the step:

Pic 1.png

The code written by our ABAP developer for the Program Exit class is attached to this blog - see ABAP Code to Get Email Receiver.txt.  The code retrieves the CR creator's ID from table usmd120c.  Again, I am not an ABAP developer so I am not including the steps to define the class.
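Since the actual code is only available as the attached text file, here is a minimal sketch of what such a Program Exit class can look like. This is not the attached code: it assumes the change request number is bound into a simple container element called CREQUEST (for example from the BUS2250 object key), and the field names of table usmd120c as well as the event name check are assumptions you should verify in your system.

CLASS zcl_wfexit_requester_email DEFINITION PUBLIC CREATE PUBLIC.
  PUBLIC SECTION.
    " Standard interface for workflow step program exits
    INTERFACES if_swf_ifs_workitem_exit.
ENDCLASS.

CLASS zcl_wfexit_requester_email IMPLEMENTATION.
  METHOD if_swf_ifs_workitem_exit~event_raised.
    DATA: lo_container  TYPE REF TO if_swf_ifs_parameter_container,
          lv_crequest   TYPE usmd_crequest,
          lv_created_by TYPE uname,
          ls_receiver   TYPE swhactor.

    " React only once the work item has been executed (event name is an assumption)
    CHECK im_event_name = 'AFTER_EXECUTION'.

    TRY.
        " Read the change request number from the work item container
        lo_container = im_workitem_context->get_wi_container( ).
        lo_container->get( EXPORTING name  = 'CREQUEST'
                           IMPORTING value = lv_crequest ).

        " Look up the creator of the change request (field names are assumptions)
        SELECT SINGLE usmd_created_by FROM usmd120c
          INTO lv_created_by
          WHERE usmd_crequest = lv_crequest.

        " Build the agent structure expected by the Send Mail step: 'US' = user
        ls_receiver-otype = 'US'.
        ls_receiver-objid = lv_created_by.

        " Write the result back into the EMAILRECEIVER container element
        lo_container->set( name  = 'EMAILRECEIVER'
                           value = ls_receiver ).
      CATCH cx_root.
        " In this sketch, container and lookup errors are simply ignored
        RETURN.
    ENDTRY.
  ENDMETHOD.
ENDCLASS.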

 

Once the class is defined, you will reference it in the Program Exits tab of the Get CR Creator step.  We named our class CL_WFEXIT_REQUESTER_EMAIL:

Pic 1.png

Send Email Step

 

Next, add a Send Mail step to your workflow.

 

The Recipient Expression will be &EMAILRECEIVER& - this is our EMAILRECEIVER container.  Define your Subject text and Body text.  In my case, I am including the expressions to display the CR Creator, CR Number, CR Description and the CR Material Number:

Pic 1.png

Click the Insert Expression button below the Subject or body, to get a list of available expressions.

 

Workflow End Step

 

Lastly, add the Workflow End step, which is an Event creator Step Type.  For the Container Element, choose your Change_Request container which uses BOR Object Type BUS2250.  For the Event, choose SUBWORKFLOW_PROCESSED:

Pic 1.png

Click the Binding button and bind CHANGE_REQUEST to _EVT_OBJECT:

Pic 1.png

Add Sub-workflow to BRFplus Workflow

In the MDGIMG (under Process Modeling > Workflow > Rule-Based Workflow > Define Change Request Steps for Rule-Based Workflow), I created a new Change Request Step for my Rule-Based Workflow:

Pic 1.png

I named the new CR Step "Send Email" and defined it as CR Step Number "11":

Pic 1.png

Next, run transaction USMD_SSW_RULE to edit your BRFplus workflow.  Because I want to send an email to the requester upon successful activation of the material, in the Single Value Decision Table, I changed the row for Previous Action "31 (Activation successful)".  I used the new Condition Alias "20" (any unused number) and for the New Chng. Req. Step, selected the new CR Step "11 (Send Email)":

Pic 1.png

In the Non-User Agent Decision Table, Condition Alias "20" was added with a Process Pattern of "03 (Call Sub-Workflow)" and the Service Name equal to my sub-workflow created via transaction SWDD, i.e., "WS99900002":

Pic 1.png

Recipient’s User Profile

 

Each user that will receive email from the sub-workflow must have their email address entered in their User Profile (transaction SU0) and the Communication Method must be set to INT E-Mail.

 

Basis Configuration

 

In addition to the configuration described above, your Basis team has to configure external delivery of email from your SAP system using transaction SCOT.  I am not a Basis person and there is other documentation out there on this so I won’t go into any of the SCOT detail.  We have the SCOT batch job scheduled to run every minute.  The SCOT batch job picks up the notifications you see in transaction SOST and sends them out.  When an email has been successfully generated by either Extended Notification or your BRF+ SendMail step, it will appear in transaction SOST with a yellow status. When the SCOT batch job successfully delivers the send request, the status turns green.


I hope this blog was useful for those of you trying to get email notification configured for MDG.

 

Warm Regards,

Rob Scofield

Validation & Derivation Rules for Various CR Types & CR Steps


Path: MDGIMG -> General Settings -> Data Quality and Search -> Define Validation and Derivation Rules.

There is a naming convention for trigger function nodes in the catalog structure: check trigger function nodes are named CHECK_<name of entity type>, for example, CHECK_MATERIAL.

Step-1: Create a function with the name CHECK_MATERIAL and add the existing data objects to its signature:

MATERIAL_MTART (for Material Type)

SAPFMDM_CREQUEST_STEP (for CR Step)

SAPFMDM_CREQUEST_TYPE (for CR Type)

 

Case-1: Without CR Step

2014-05-10 18_28_10-BRFplus_ Catalog Browser.png

 

 

Step-2 Create a Ruleset

2014-05-10 18_36_12-BRFplus_ Catalog Browser.png

Step-3 Create a Log Action Message :

2014-05-10 18_43_51-BRFplus_ Catalog Browser.png

Step-4: Create a Rule

Insert CR type MAT01, Material Type FERT, and a Log Action.

2014-05-10 20_32_13-BRFplus_ Catalog Browser.png

Click OK, save and activate the ruleset, and also activate the function CHECK_MATERIAL.
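Besides testing through the MDG UI (as in the next step), you can also simulate the trigger function directly in the BRFplus workbench, or call it from a small ABAP snippet via the FDT API. The following is only a sketch under assumptions: the GUID must be replaced with the ID of your CHECK_MATERIAL function, and only the material type is passed as context here.

DATA: lv_function_id TYPE if_fdt_types=>id,
      lv_timestamp   TYPE timestamp,
      lt_context     TYPE abap_parmbind_tab,
      ls_context     TYPE abap_parmbind,
      lo_result      TYPE REF TO if_fdt_result,
      lv_mtart       TYPE mtart VALUE 'FERT'.

" ID of the CHECK_MATERIAL function - copy it from the General tab of the BRFplus workbench
lv_function_id = '0123456789ABCDEF0123456789ABCDEF'.   " example GUID only
GET TIME STAMP FIELD lv_timestamp.

" Pass the context the rule expects; SAPFMDM_CREQUEST_TYPE and
" SAPFMDM_CREQUEST_STEP can be added to lt_context in the same way.
ls_context-name = 'MATERIAL_MTART'.
GET REFERENCE OF lv_mtart INTO ls_context-value.
INSERT ls_context INTO TABLE lt_context.

TRY.
    cl_fdt_function_process=>process(
      EXPORTING iv_function_id = lv_function_id
                iv_timestamp   = lv_timestamp
      IMPORTING eo_result      = lo_result
      CHANGING  ct_name_value  = lt_context ).
    " Inspect lo_result (for example in the debugger) to see the
    " messages produced by the Log Action.
  CATCH cx_fdt INTO DATA(lx_fdt).
    " Handle BRFplus processing errors here
ENDTRY.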

Step-5: Create a material for CR type MAT01. Enter the material number and CR type, then click Continue.

2014-05-10 19_21_04-Material_ New, TEST-CR-STEP-00.png

 

Insert Material Type ROH and click Check:

 

sshot-1dd.png

 

The warning message is displayed:

2014-05-10 19_24_48-Material_ New TEST-CR-STEP-00.png

Now change the Material Type from ROH to FERT, click Check, and then click Submit.

2014-05-10 19_25_56-Material_ New TEST-CR-STEP-00.png

 

Now we check the rule at the final step 90. First we change FERT back to ROH and then click Check:

2014-05-10 19_33_35-Material_ New TEST-CR-STEP-00.png

Change the Material Type and then Check the Status:

2014-05-10 19_28_22-Material_ New TEST-CR-STEP-00.png

 

Case-2: With CR Step & Different CR Type

Create a rule for Case-2:

Insert CR type ZRMAT01, Material Type FERT, CR Step Number 90, and a Log Action.

2014-05-10 19_17_52-BRFplus_ Catalog Browser.png

Click OK, save and activate the ruleset, and also activate the function CHECK_MATERIAL.

 

2014-05-10 20_34_23-BRFplus_ Catalog Browser.png

Step-6: Create a material for CR type ZRMAT01.

Enter the material number and CR type, then click Continue.

2014-05-10 19_35_50-Material_ New.png

Insert Material Type ROH and click Check. No warning message is displayed, as no rule is defined for step 00.

2014-05-10 19_44_10-Material_ New TEST-CR-STEP-90.png

Now we can see that at the processing step (step 00) no rule is triggered. Click Submit.

The CR goes to the final step 90, where we click Check:

2014-05-10 19_45_04-Material_ New TEST-CR-STEP-90.png

Now change the Material Type from ROH to FERT:

2014-05-10 19_46_03-Material_ New TEST-CR-STEP-90.png

 

In this way, we can apply validation rules for various CR types and various CR steps.

 

Derivation Rule:

The naming convention for derivation trigger function nodes of a catalog structure is DERIVE_<name of entity type>, for example, DERIVE_MATERIAL.

Step-1: Create a function with the name DERIVE_MATERIAL and add the existing data objects to its signature:

MATERIAL (entity)

SAPFMDM_CREQUEST_STEP (for CR Step)

SAPFMDM_CREQUEST_TYPE (for CR Type)

 

2014-05-11 09_52_12-BRFplus_ Catalog Browser.png

Case-1: Without CR Step

1a) Create a Ruleset :

 

2014-05-11 11_07_03-sshot-8 - Windows Photo Viewer.png

 

1b) Create a Rule:

Insert CR type MAT01 and Material Group 01.

Click OK, save and activate the ruleset, and also activate the function DERIVE_MATERIAL.

 

2014-05-11 10_00_24-BRFplus_ Catalog Browser.png

 

1c) Create a Material for CR type MAT01 :

2014-05-11 10_07_12-Material_ New.png

 

2014-05-11 10_55_06-Material_ New DER. RULE MAT01.png

 

Here, if you change the Material Group from 01 to XX and click Check, it automatically changes back to 01.


Now we check the final step 90:

 

2014-05-11 10_57_54-Material_ New Detsdd.png

 

Case-2: With CR Step and Different CR Type

2a) Create a Rule:

Insert CR type ZRMAT01, Material Group 01, and CR Step 00.

 

2014-05-11 10_03_45-BRFplus_ Catalog Browser.png


2b) Create a Material for CR type ZRMAT01 :

 

2014-05-11 10_15_19-Material_ New.png

 

2014-05-11 10_50_16-Material_ New DER. RULE ZRMAT01.png

 

Check at the final step 90: we are able to change the Material Group from 01 to XX at the final step, since the derivation rule is defined only for step 00.

 

2014-05-11 10_51_56-Material_ New Test-Rule-Derive.png

 

Thanks

Nikhilesh Agarwal

Pain or Pleasure? Dual Governance after Go-Live of SAP MDG


We have already seen several implementations going live with SAP MDG. Corporates are facing both the pain and the pleasure of having an MDG tool in their application landscape. I had an interesting learning that I would like to share to provoke a discussion.

 

As we all know, typical master data objects like the material master can easily span 50+ attributes. We nail down a few critical fields for data governance, perhaps for pragmatic reasons: to keep the project scope limited and meet aggressive timelines. It is reasonably (?) believed that limiting the fields under governance is important for the sake of operational efficiency.

This strategy results in a few attributes being hosted in the MDG framework and the rest in the typical ECC environment.

 

The real pain is realized in the production environment, when the data steward has to work in two applications/processes to make master data edits. It is very pleasant to have deployed MDG for a few attributes, but the master data object is still not business-ready because the other, supposedly less relevant, attributes are not filled with appropriate values. This strategy has also become a source of incompleteness in the master data object.

 

Hence, is it worth considering bringing all attributes into MDG, even though many of them are not meant for governance and validation? This would give data stewards one user interface and discontinue the use of other editing channels. Do we face limitations or challenges when bringing all the fields from native ECC transactions into the hub?


Innovations in SAP Master Data Governance 7.0 SP02 (Feature Pack)


May the 19th be with you! May 19th marked a great day for our SAP MDG customers. After an entirely overbooked and extremely successful Ramp-Up that started in November, SAP MDG 7.0 was released for general availability. In addition, SAP Master Data Governance 7.0 SP02 (Feature Pack) was released to customers on the same day, and marks the new go-to release that everyone who is planning to install SAP MDG should consider from now on.

 

The MDG SP02 (Feature Pack) is very easy to consume. Similar to SAP MDG 7.0, it can be installed on top of Enhancement Package 6 for SAP ERP 6.0 as well as on top of Enhancement Package 7 for SAP ERP 6.0. For existing installations of SAP ERP 6.0 EhP6 or SAP MDG 6.1, this means that you don’t need to upgrade to any higher Enhancement Package, but can just upgrade to SAP MDG 7.0 SP02 in that system.

 

What’s new in SAP MDG 7.0 SP02 (Feature Pack)?

The Feature Pack builds on SAP MDG 7.0. With that release, we firstly enabled faster search with SAP HANA, duplicate detection, cleansing and matching. We secondly provided better usability for all standard and custom-defined master data objects, and thirdly, we delivered a more flexible foundation for higher business efficiency and refined process control.

 

In addition to that, the Feature Pack now provides a further improved user experience and easy access for business users, for example, by offering device-independent SAP Fiori apps for master data request scenarios and real-time insight into governance processes by enabling “smart business” KPIs.

 

SAP MDG Fiori apps for master data requests

 

SAP Fiori apps enable request scenarios for multiple master data domains like Business Partner, Supplier, Customer, Material, Cost Center, and Profit Center. Business users can access these apps from their device of choice. They will then enter some basic data, optionally run a duplicate check to see if the master data already exists, attach supportive documents to their request as needed, and submit their request. This will trigger the creation of an SAP MDG change request, which is forwarded to the next person for completion or approval.

1_Fiori_CostCenter_iPad.jpg

Figure 1: SAP MDG Fiori app that allows business users to request a new cost center

 

SAP Smart Business for real-time insights into SAP MDG process KPIs

 

The use of SAP HANA is optional in SAP MDG 7.0. However, if you decide to run SAP MDG on top of SAP HANA, SAP MDG uses SAP HANA’s capabilities, for example, for duplicate detection and similarity-ranked search (for more details, see my last blog post on SAP MDG 7.0). You can also make use of HANA being an in-memory, column-based database, and benefit from this in a multi-attribute drill-down selection that allows you to filter and analyze the intrinsic structures in your master data.

 

In the Feature Pack, we provide additional HANA-enabled values allowing for easy access, insight, and follow-up actions for the business. Since SAP HANA allows for fast access to aggregated data across multiple data sources, we enable fast calculation of key performance indicators, for example, to analyze the quality of SAP MDG’s governance process execution. This can be done to present the information in a Smart Business dashboard for real-time insight, and then to allow for follow-up and issue resolution.

 

The appropriate Smart Business content can be configured based on customer-defined HANA views. Examples of such key performance indicators include: the total number of change requests, the number of currently open change requests, change requests with the status final approval, long-running change requests that are still open, change requests with an exceeded processing time compared to an SLA, overdue change requests compared to due dates.

2_Smart_Business.jpg

Figure 2: Key performance indicators in SAP Smart Business showing process quality in SAP MDG

 

Improved user experience: highlighting changes and the undo / redo feature

 

The main purpose of the Feature Pack is to improve the user experience. Let’s look at one example in more detail: the highlighting of saved and unsaved changes. If you switch on this new feature for end-users, the system displays saved and unsaved changes in two different colors on the SAP MDG user interface.

 

When you create a new object, the system highlights all unsaved changes in a certain color. If you change an existing object, the system also highlights unsaved changes, but in addition, it also highlights already saved changes in a second color. These are typically changes that were made by a different person earlier in the change request process, or changes you made yourself, but that have already been saved. “Changes” in this context refer to those values on the screen that would change a master data attribute compared to the last approved version of the master data – that is, compared to the “active” master data in SAP MDG terms.

 

Of course, this feature also works in conjunction with SAP MDG’s editions. If you change an edition-based object, the system uses the active value for comparison, either from the validity period the change refers to, or from the previous validity period. In case the object did not exist before the validity period the change refers to, or if the object was deleted in the previous validity period, the system highlights unsaved changes, but does not highlight saved changes.

The system also highlights table rows that refer to changes you can only see when navigating from a table row to the details of a dependent entity. It also allows you to distinguish new rows from changed rows in a table.

 

When you move your mouse over a highlighted field, the tooltip of the field displays the previous value. If you change a value several times before saving it, the tooltip displays the active value and the last saved value.

3_Highlight_Changes.jpg

Figure 3: Highlighting saved and unsaved changes in two colors and displaying previous values in the tooltip

 

Another example of improved user experience is the new undo / redo feature. When you are processing a change request, the SAP MDG system records all your actions. For this, it collects exactly those actions in one “undo” step that are made in the web applications between two client/server roundtrips. Each of these steps carried out since you last saved the object can be undone. You can undo steps until you save or cancel. After having used the undo feature, you can use “redo” to recover your actions.

4_Undo.jpg

Figure 4: New undo / redo feature allows users to take back their proposed changes step by step

 

Other usability improvements

 

There are further usability improvements provided with SAP MDG 7.0 SP02. Let me just mention a few of them briefly. SAP MDG for Financial Data now allows users to efficiently create general ledger account data for multiple company codes by copying existing data from one company code to a new company code. It also allows creating a new general ledger account and the corresponding cost element in one maintenance step. One other highly desired feature is that you can still change the ID of new financial master data later in the change request process. Often, when requesting new financial master data, the final ID is not yet known to the requesting user. To allow both flexibility for the requestor and adherence to corporate numbering standards, SAP MDG for Financial Data now offers the feature to use a preliminary key first, and change it to the desired final number in a later step in the creation process.

 

In SAP MDG for Customer or Supplier, it is now possible to change the account group within MDG’s single object maintenance user interface. With the Feature Pack, the standard delivery also provides dedicated role-specific user interface configurations for efficient processing by experts. These include a specialized user interface covering company code data for customers or suppliers, a specialized user interface covering sales area data for customers, and one for purchasing organization data for suppliers. Users navigate directly to the specialized user interface from their (workflow) inbox based on their role and based on the step in the change request process. This saves the users several clicks for navigating to the right screen within the full change request.

5_Customer__Sales_Area_Data.jpg

Figure 5: Example of a specialized user interface covering sales area data for customers

 

SAP MDG for Customer and SAP MDG for Material have also been enhanced. For example, there is improved printing and an enhanced copy function for material master data, as well as enriched integration with SAP PLM via Change Number and Material Revision Level, and a display option for thumbnail pictures in the SAP MDG Side Panel. The database search using SAP HANA was improved for material master data. The SAP Document Management System integration was improved. And the standard content was further enhanced: additional attributes are supported, such as storage location and the views containing material data for the warehouse. In addition, several enhancements of previously delivered material master data views are provided in the data model and the corresponding user interfaces.

 

As pointed out in one of my earlier blog posts, whenever we make additional investments in the MDG Application Foundation, the main focus is on extensibility, flexibility, usability, and ease of consumption. This means that we want to allow companies to create very flexible governance processes, with role-based user interfaces, but with very reasonable implementation efforts. One example of this in the Feature Pack is that we now provide customizing and configuration for all relevant SAP MDG user interfaces from one single place. A dedicated WebDynpro application is provided to manage all user interface configurations, for example, across single-object processing, multiple-record processing, and search. You can also copy a standard SAP user interface configuration to the customer namespace and configure it to your requirements.

 

Summary and outlook

 

You may have seen earlier documents about SAP MDG that describe how the focus of SAP MDG is on comprehensive master data ready for use in business processes by offering ready-to-run governance applications for specific master data domains. There, it is also stated that SAP MDG allows for customer-defined master data objects, processes, and user interfaces, and how it can distribute master data to SAP and non-SAP systems. SAP MDG is an important part of SAP’s Information Management portfolio that you might want to combine with additional tools like the SAP Information Steward for the monitoring of master data quality in all relevant systems and correction in SAP MDG, or like SAP Data Services for data quality services, validations and data enrichment.

 

If you look at SAP MDG 7.0 in particular, and at the recent Feature Pack, you’ll see that these enhancements focus on three aspects:

 

Firstly, SAP MDG now provides an even more flexible MDG Application Foundation that allows for refined control in the governance processes, leading to more flexibility and higher efficiency in the business. One example of this is the flexible Edition management for easier and more flexible scheduling of changes, very intuitive access to the different states of master data that is valid in certain timeframes, higher transparency of past and planned changes, and more granular control of replication timing. Another example is parallel change requests combined with the concept of “interlocking”, which allows for many changes to the same master data object at the same time but without the risk of conflicting changes.

 

Secondly, enhancements to SAP MDG enable better usability providing easy access and insight for business people and allowing direct follow-up on found issues. We have looked at SAP Fiori, Smart Business, and several other usability improvements in this post. In addition, there are also the improved user interfaces for single-object maintenance, the cleansing case that allows for a reduction of duplicates during search or creation, and the multi-record processing function that allows changing multiple master data objects simultaneously in one single user interface.

 

Thirdly, as an option, SAP MDG can use SAP HANA for advanced duplicate-detection and search, or for multi-attribute drill-down selection. You can also use SAP HANA for efficient access to key performance indicators that are aggregated across multiple data sources to analyze the quality of governance process execution in your company.

 

We have already started the development of the next SAP MDG release. Stay tuned to read about more exciting enhancements and new capabilities as soon as we are ready to announce them.

 

You can find more information about SAP MDG on the SAP Master Data Governance page on SCN.

SAP FIORI based Lean Request Scenarios for SAP Master Data Governance


As of May 2014, SAP Master Data Governance 7.0 is generally available with Support Package 2 (Feature Pack). MDG blogger Markus Kuppe has already provided useful information on the functional scope of MDG 7.0 and the enhancements added with MDG 7.0 SP02 (Feature Pack).

 

In this context, I'd like to add some news on the MDG 7.0 SP02 (Feature Pack) with a focus on the new SAP Fiori-based lean request scenarios for MDG. These new SAP Fiori apps extend the reach of SAP MDG to business contexts where business users with little or no SAP knowledge need to request master data directly on site (e.g., during a sales event) using their device of choice.

 

 

 

SAP MDG Fiori apps for master data requests

 

SAP Fiori apps for SAP Master Data Governance enable request scenarios for multiple master data domains, i.e., Business Partner, Supplier, Customer, Material, Cost Center, and Profit Center data. Business users can access these apps using the device of choice, such as a tablet or smart phone.

 

An offline demo clearly illustrates how a business user can use the SAP Fiori app, e.g., during a sales event to request the creation of a new B2B customer data record on a tablet PC.

 

Let's assume the following situation:

 

Story flow:

 

Fiori.jpg

 

  1. A sales rep (business user with little SAP knowledge) enters basic data on an iPad, runs a duplicate check to see if the master data already exists, attaches a business card to the form, and finally submits the request.
  2. This triggers the creation of an SAP MDG change request, which is automatically routed to a local data steward for checking and completion in the MDG back-end. Accordingly, the local data steward runs the checks and completes missing data. During the MDG workflow all changes made to the record are highlighted, so that processors can clearly see what has been changed, which is another usability feature coming with SP02 (Feature Pack).
  3. A global data steward checks the completed request and gives final approval to activate the record.
  4. Another SAP Fiori app allows the sales rep to track the processing status of his requests.

 

To run through this end-to-end process that starts with the SAP Fiori lean request for customer data, click the demo link.

(To navigate, simply click the dotted orange-coloured box)

 

 

Hope it is useful for you.

 

Best,

 

Markus

 

 

For more information about SAP FIORI, see also the SCN blog by Masayuki Sekihara

Parallel Workflow Approval with E-mail Notification after Activation of Material (BAdI)



This blog is meant to showcase the MDG parallel workflow and sending an email notification to the requestor after approval. It is based on the documentation available under http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/30681c36-5884-2e10-85a1-cd9499942e37?QuickLink=index&… and also on MDG-M: How To Send an Email Notification to the Requestor at Time of Material Activation. I thank Rob for the good documentation, which helped us to trigger the email notification, and also my colleagues who helped me on this.

 

Business Scenario: During the creation of a material, two departments want to approve the CR at the same time, and an email notification should be triggered after activation of the material.

1. Create the CR type.

Untitled.jpg

 

Untitled1.png

2. Define the services.

Untitled3.jpg

 

3. Define the user and non-user agent decision tables in BRFplus. You have to provide Merge Type B (BAdI) and, as Merge Parameter, the service we defined in the previous step.

Untitled4.jpg

Provide the workflow template created as described in Rob's blog MDG-M: How To Send an Email Notification to the Requestor at Time of Material Activation, as well as the service names, in the non-user agent decision table.

Untitled5.jpg

4. Define the BAdIs using the enhancement spot USMD_SERVICE_PROCESSOR (transaction SE18).

Untitled6.jpg

Go to the BAdI definition USMD_SSW_PARA_RESULT_HANDLER and create an implementation.

Untitled7.jpg

Provide the implementation name and text.

Untitled8.jpg

Provide the BAdI name and the implementing class.

Untitled9.jpg

5. Provide the code: click on the method IF_USMD_PARA_RSLT_HANDLER~HANDLE_PARALLEL_RESULT.

Untitled10.jpg

Check out the step numbers used here; you have to provide the same steps in the decision tables.

Untitled11.jpg
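The screenshot above shows the actual implementation. As a rough, hypothetical illustration of the idea (the real BAdI method IF_USMD_PARA_RSLT_HANDLER~HANDLE_PARALLEL_RESULT has its own signature, which you should take from SE18 and the how-to guide), the merge logic typically inspects the results of all parallel branches and derives one follow-up action:

" Hypothetical local class: only the decision logic is illustrated,
" not the real BAdI method signature. Action codes are examples -
" use the action values from your own decision tables.
CLASS lcl_merge_demo DEFINITION.
  PUBLIC SECTION.
    TYPES: BEGIN OF ty_branch_result,
             step   TYPE string,   " CR step processed by the parallel branch
             action TYPE string,   " action taken by the approver in that branch
           END OF ty_branch_result,
           ty_branch_results TYPE STANDARD TABLE OF ty_branch_result WITH DEFAULT KEY.
    CLASS-METHODS merge
      IMPORTING it_results       TYPE ty_branch_results
      RETURNING VALUE(rv_action) TYPE string.
ENDCLASS.

CLASS lcl_merge_demo IMPLEMENTATION.
  METHOD merge.
    " The CR only moves on if every parallel approver approved;
    " a single rejection wins and sends the CR back.
    rv_action = 'APPROVE'.
    LOOP AT it_results INTO DATA(ls_result).
      IF ls_result-action <> 'APPROVE'.
        rv_action = 'REJECT'.
        RETURN.
      ENDIF.
    ENDLOOP.
  ENDMETHOD.
ENDCLASS.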

 

6. Define the filter types and save.

Untitled12.jpg

Follow a similar procedure to implement the BAdI USMD_SSW_SYSTEM_METHOD_CALLER; the method IF_USMD_SSW_SYST_METHOD_CALLER~CALL_SYSTEM_METHOD should remain empty.

Untitled13.jpg

7. Create the CR for the material.

 

Untitled14.jpg

 

TETWEEWE.jpg

 

Untitled15.jpg

8. Check the status of the workflow: you can see that the CR is in the queues of the two users, and they can approve it in parallel.

Untitled16.jpg

Status of the CR after approval from user 1:

Untitled18.jpg

In the status you can see that both users have approved the CR and it went on to the next user.

Untitled19.jpg

After final approval, the status has changed to Final Check Approved.

 

Untitled20.jpg

 

Check the status of the mail in transaction SOST and check the recipient.

 

SOST.jpg

Open the mailbox and check the mail.

Untitled21.jpg

We are working on Extended Notification for email as well; once done, we will publish it too.

Improved Search and Duplicate check in MDG 7.0


SAP MDG 7.0 introduces an additional search provider that is based on SAP HANA and can be used for search and deduplication. It has some advantages over the search providers available before:

 

  • Improved performance and a more meaningful score
  • Powerful configuration with a lot of options to adapt the search behavior to your needs
  • Lower TCO if MDG runs on SAP HANA

 

This blog post will show you how your landscape will be simplified and reveal some of the advantages you will gain by using the SAP HANA fuzzy search functionality.

 

Prerequisites: Before you can take advantage of SAP HANA fuzzy search within MDG, you must prepare your system accordingly. The steps you will need to take for this are described in the article “Configuring SAP HANA-Based Search for MDG 7.0”, you will find a link at the end of this article.

 

At some point in time, you may find that the search behavior does not fit your or your end users’ expectations when they use the default settings. For example, additional non-matching tokens may affect the score in an unexpected way, or you may want to deal with synonyms and stop words so that the score better reflects what is expected for a particular field. For most cases, switches are available in the search rule set configuration that can help tweak the search behavior to better meet these expectations.

 

In the following, I have chosen a simple example in order to illustrate how search rule sets in HANA are used and how you can change the configuration. In this example, I will deal with abbreviations and the consideration of tokens (matching tokens vs. non-matching).

 

A search for “L Croft” does not meet my expectations. It does not provide any results, although I am quite sure that there is a change request in process dealing with somebody whose last name is “Croft” and whose first name starts with an “L”:

screen1.png

Under the default configuration, the search algorithm does not take abbreviations for the field “MCNAME1” into account. This field is associated with the user interface field “Name1 / Last Name”. This is the case because the Abbreviation Similarity is currently set to zero:

screen2b.png

I will therefore change the Abbreviation Similarity to 0.9:      

screen3b.png

This leads to a much better search result:

screen4b.png

Now I am getting curious: I would like to check what other “Laras” exist in my system. Unfortunately, the search result does not meet my expectations at all:

screen5b.png

I have just seen a “Lara” out there, so why is she not considered by the search algorithm if I do not provide a last name? This might be related to the number of tokens in the search. Let’s tell SAP HANA that the score of the best-matching token should be emphasized. So how about changing the parameter Best Matching Token Weight to 0.9?

screen6b.png

Great, I got my “Lara” back. I now even get a higher and more meaningful score. And I also found “Mara Croft ltd.”, which sounds quite similar to “Lara”. This means the search does pay more attention to the things I am looking for, and does get less disrupted by additional information in the database. This is great for explorative search.

 

For a meaningful duplicate check, I might want to only get results that consider additional information as a relevant differentiation. Accordingly, “non-matching tokens” should lower the score. This can be achieved with the parameter Consider Non-Matching Tokens.

 

screen7b.png
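For the technically curious, the rule set parameters shown above correspond to fuzzy search options of the SAP HANA CONTAINS() predicate, as documented in the HANA Developer Guide linked below. The following ABAP sketch (using ADBC) shows roughly what such a query looks like on SQL level; the view name ZMY_PARTNER_VIEW, the use of column MCNAME1, and the option values are placeholders and assumptions, not the actual MDG-generated search views.

" Sketch only: issues a fuzzy CONTAINS query via ADBC against the HANA database.
TYPES: BEGIN OF ty_hit,
         mcname1 TYPE c LENGTH 40,
         score   TYPE f,
       END OF ty_hit.
DATA lt_hits TYPE STANDARD TABLE OF ty_hit.

TRY.
    DATA(lv_sql) =
      |SELECT mcname1, SCORE() AS match_score FROM zmy_partner_view | &&
      |WHERE CONTAINS(mcname1, 'L Croft', FUZZY(0.7, | &&
      |'abbreviationSimilarity=0.9,bestMatchingTokenWeight=0.9,| &&
      |considerNonMatchingTokens=input')) ORDER BY match_score DESC|.

    DATA(lo_result) = NEW cl_sql_statement( )->execute_query( lv_sql ).
    lo_result->set_param_table( REF #( lt_hits ) ).
    lo_result->next_package( ).   " fetch the result rows into lt_hits
    lo_result->close( ).
  CATCH cx_sql_exception INTO DATA(lx_sql).
    " Handle or log the database error
ENDTRY.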

By using SAP HANA as your database and search backend, you will also reduce your TCO, as you do not need to maintain an additional component like TREX or a third-party search solution. In line with the HANA story, you do not even need to replicate your data in this setup, which simplifies the setup considerably, considering that replication and synchronization were a complex and sensitive topic before.

 

This was just a very simple example. There are plenty of additional parameters to tweak and tune the search behavior as desired. In case you got interested, you may want to have a look at the comprehensive chapter on fuzzy search available in the HANA Developer Guide:

http://help.sap.com/hana/SAP_HANA_Developer_Guide_en.pdf

 

 

Configuring SAP HANA-Based Search for MDG

 

http://help.sap.com/erp_mdg_addon70/helpdata/en/72/93f8516599a060e10000000a44176d/frameset.htm

Global numbers for master data - Is it time to accept it as a best practice?

$
0
0

Every organization today deals with a multi-system ERP landscape. No matter how many IT consolidation projects are undertaken, after a few years it is back to square one. The dynamic business environment constantly gives rise to niche IT solutions; the big players SAP and Oracle may keep gobbling them up to enhance their portfolios, but within those portfolios the niche solutions tend to retain their uniqueness, and all SAP and Oracle do is create a bolt-on to their core products that simplifies the integration.

 

The biggest pain area caused by this multiplicity of IT or ERP systems is that master data numbers differ across systems. To make transactions pass through such multiple systems, interfaces carry all sorts of built-in logic to ensure the different master data numbers are somehow logically matched, so that the same material, vendor, customer, or other common master data is being transacted upon in all systems. But this turns out to be the weak link that breaks regularly.

 

The best practice for master data numbers is never to build logic or intelligence into the numbering system. Put differently: make the number non-speaking, so that it does not identify itself to you. But this is the rule that is always broken in the environment I have painted above, because end users want master data numbers to be intelligent and self-identifying. Also, in an environment with multiple applications or ERP systems, achieving a common numbering system means it has to be an external numbering system. This is sometimes called a “Global Number”.

 

Let us take the concept of a “Global Material Number” for discussion. By default this number will be externally defined, which means some sort of logical process has to be put in place. I would say it need not be a very intelligent numbering system; a simple program that generates sequential numbers with, say, a one-character prefix or suffix is enough. There is no need to even list the advantages of having a common number such as a “Global Material Number” across the IT or ERP systems, as the benefits far outweigh the shortcomings of such a setup.
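As a minimal sketch of what such a generator could look like (here in SAP HANA SQL with assumed object names; an ABAP number range object or any database sequence would serve equally well):

-- Assumed sequence name; one-time setup of the counter behind the global number.
CREATE SEQUENCE zglobal_matnr_seq START WITH 1 INCREMENT BY 1;

-- Each call returns the next non-speaking global material number
-- (G000000001, G000000002, ...): a fixed one-character prefix plus a
-- zero-padded sequential counter, with no business meaning encoded in it.
SELECT 'G' || LPAD(TO_VARCHAR(zglobal_matnr_seq.NEXTVAL), 9, '0')
       AS global_material_number
  FROM DUMMY;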

 

Hence I strongly feel that having externally defined global numbers in a multi-IT or multi-ERP system landscape should also be considered a best practice.

 

Any thoughts or counter views to my conclusion?

Benefit from new SCN site featuring how-to videos on SAP MDG processing

$
0
0

Dear SAP Master Data Governance community,

 

On behalf of SAP MDG development and knowledge management, I am happy to announce the availability of an additional SCN site about SAP Master Data Governance. The new site features learning videos on specific data processing tasks across the master data domains supported with out-of-the-box content, and also covers processing tasks related to custom objects.
The intention is to give onboarding MDG users a quick start on user interaction in MDG processing.

As such, the new site perfectly complements our SCN learning offering, which already features technical best-practice information on MDG configuration and extensibility and is specifically tailored for consultants.

 

As a starting point, the new site already presents three video recordings on MDG processing topics in MDG for Financials: change request processing for single and multiple objects, creating a profit center, and creating an edition.

 


... and of course, this initial scope is planned to be extended.

 

Credit goes to the SAP MDG documentation team who made this possible. I really think this is a perfect means to convey easy-to-digest information about SAP MDG to the user community. Thanks a lot for this great piece of work!

 

We hope you'll find these videos beneficial for your daily work. Please let us know your thoughts. Your comments help us to continue this path of providing you with additional learning videos.


Thanks, and awaiting your feedback.

 

Markus

SAP MDG & SAP Information Steward: Single Sign On (SSO) working

$
0
0

Hi,

just a small update on the work I'm doing on the SAP MDG and SAP Information Steward integration: last week I invested some time in getting SSO working on an internal system. The good news is:

1. It's working as expected.

2. It's pretty simple to configure the SSO.

 

Background:

SAP MDG comes with a DQR (Data Quality Remediation) scenario (see here: http://scn.sap.com/community/mdm/master-data-governance/blog/2013/10/22/sap-mdg-sap-information-steward-a-perfect-combination-for-data-quality-remediation-dqr-scenarios). In general, you can integrate any data quality tool into this framework. Most of the time I have used SAP Information Steward, but I had never tested the SSO between the two stacks. The disadvantage of not having SSO configured is that end users have to log in manually each time the DQ dashboard comes up in the SAP MDG user interface for DQR. With SSO configured between the two stacks, users are automatically logged into the dashboard. Big advantage, big simplification!

 

What to do?

On a high level, you need to do the following:

  1. Define an authentication system in the CMC - a wizard exists for this
  2. Generate and import certificates - using the SAP tools provided with your SAP Information Steward installation
  3. Configure SSO in the configuration of SAP Information Steward
  4. Restart the Tomcat / web server of SAP Information Steward
  5. Define the SAP ERP/MDG roles to be SSO'ed in SAP Information Steward
  6. Give the SAP users the correct rights in SAP Information Steward

 

Instead of writing yet another guide, I would like to point you to these detailed guides:

http://wiki.scn.sap.com/wiki/display/BOBJ/Generate+keystore+and+certificate+for+SAP+BO+BI4.0

http://wiki.scn.sap.com/wiki/display/BOBJ/How+to+setup+SSO+against+SAP+BW+with+SAP+BO+BI4.0+Common+Semantic+Layer+%28UNX%29+or+BICS

 

 

 

Want to see a demo?

If you want to see a demo, please contact your local SAP contact person. The scenario is configured in our sapdemocloud (https://sapdemocloud.com/cloudportal/index.html).

 

Best Regards

Steffen


Unable to Finalize the change request for country code JE -Jersey C I

$
0
0


Issue:

 

Whenever we tried to finalize a change request (CR) belonging to the country Jersey, C.I., we received a process error because of the error message below.

 

JE1.png

 

Solution:

 

To resolve this issue, we followed the steps below.

 

1. According to SAP OSS Note 1901429, the country Jersey, C.I. (code: JE) is related to GB (it is a crown dependency of the United Kingdom). But when we verified the country's global parameters setup, it was pointing to US (screenshot 1). I believe this is the reason we were getting the above error message.

 

Je2.png

 

 

Je3.png

Je4.png

We therefore mapped the country code JE to the GB parameters.

 

2. When we then tried to finalize the change request again, we received another process error. This time the error message was "No geocoder for country JE maintained in system (table GEOCODERS)".

 

To resolve this, we maintained the entry below under SPRO --> SAP NetWeaver --> General settings --> Set Geocoding --> Assign Geocoding Program to Countries.

 

JE5.png
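If you want to double-check the result of this customizing step outside of SPRO, a quick look into the table named in the error message (via SE16 or an SQL console) is enough. The query below is only an illustrative check; the exact field names may differ, so simply look for the row that assigns a geocoding program to country JE.

-- Illustrative check: after the customizing step, table GEOCODERS should
-- contain an entry assigning a geocoding program to country JE.
SELECT * FROM geocoders;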

3. After maintaining this entry, we tried to finalize the CR and received the error message "Geocoder SAP0: Country specification is incomplete (Customizing)". To resolve this, I maintained the values below in the tables V_GEOT005 and V_GEOT005S (for the region details).

 

JE6.png 

I found these values at http://www.theodora.com/country_digraphs.html (Country: JE --> Geography). The geography code is similar to that of country code GB.

 

4. After this, when we tried to finalize the CR, we received the error message "Telephone/fax code for country code JE is not maintained". Based on this, we maintained the entry under the following path:

 

SPRO --> SAP NetWeaver  --> General settings --> Set Countries --> Define Country

 

JE8.png

 

I found the Jersey calling code at http://countrycode.org/jersey. This code is the same as for GB.

 

JE9.png

 

After completing the above configuration, I tried to finalize the CR again; this time it was finalized successfully and the vendor master was created in ECC without any issues.

 

JE10.png

 

I hope you find this useful.

 

Regards,

Sada

MDG Custom Object Number range

$
0
0

In the service master, I changed the number ranges in transaction ACNR. Now we are facing a short dump:

"The ASSERT condition was violated." This is due to the deletion of the number range; how can we revert it?

I am able to create a service master in AC03 with the new number range, but on the NWBC screen the mentioned short dump appears.

MDG menu not visible on NWBC. Instead, It shows an error “The user menu tree for user is empty."

$
0
0
Issue
The MDG menu is not visible in NWBC. Instead, it shows the error “The user menu tree for user is empty.”
Error Screenshot
Solution
The role is assigned, but the folder is defined as a "link collection" instead of a "service map" - see: http://goo.gl/OEq9qa
Steps to solution
Step-1: Goto transaction PFCG

 

Step-2: Open your Role in edit mode.

 

Step-3: Goto ‘Menu’ Tab.

 

Step-4: Double-click the folder in which your views are defined. If no folder is defined, create one and move the views into it.

 

Step-5: Click on ‘Other Node Details’ button

 

Step-6: Set the ‘Folder Option’ to ‘As Service Map’

 

Step-7: Save & Execute NWBC
Solution Screenshot
References

 

https://websmp208.sap-ag.de/~sapidb/012003146900001035042013E/empty_user_menu.pdf

 

1378659 - NWBC known issues & what to check when opening a ticket


   

Error while creating relationship - "Choose the key from allowed namespace"

$
0
0

 

Scenario

 

You are trying to enhance a standard data model by creating new entity types and relationships.

 

Issue


An error occurs when trying to create a leading relationship for a new entity type (SU type 4) on an entity type of SU type 1.

 

Error Screenshot


 

 

Solution


Create the new relationship with a name in the customer namespace, i.e. prefixed with ZZ (for example, ZZMATMARC1).

SAP Master Data Governance: Updated Product Roadmap Now on SAP Service Marketplace

$
0
0

Today, SAP Master Data Governance (MDG) provides applications to centrally create, change, and distribute master data. Being highly integrated with SAP Business Suite, MDG re-uses data models, business rules, and process logic. It provides pre-built, domain-specific content and a flexible framework to define data models, processes, and user interfaces, also for custom-built master data.

 

If you'd like to know what lies ahead, you can check the product roadmap for SAP MDG, which has just been updated and provides the current picture of the solution today (SAP MDG 7.0 SP02), planned innovations, and future trends.

 

SAP customers and partners can check out the document on SAP Service Marketplace at http://service.sap.com/roadmaps => Product and solution roadmaps => Database & Technology (SAP Master Data Governance: Edition 2014Q3) (SMP log-in required).

