The Socratic method tries to extract the truth rather than people's opinions. The method is skeptical, conversational, conceptual/definitional, and empirical/inductive as well as deductive.
The second article is a nice summary of problem-solving techniques: the Socratic method, wind tunnelling and brainstorming.
Friday, March 19, 2004
KANBAN in SAP
Some research on the Net led to a better understanding of KANBAN systems. This overview provided the business-side explanation of KANBAN. However, SAP has implemented things a little differently. Their help page describes the methods supported in SAP. The difficult part is mapping what SAP says back to business terms.
In SAP, the main advantage of KANBAN is that the EMPTY signal automatically creates the replenishment documents. A FULL signal automatically does the goods receipt.
Classic KANBAN - Only need to use FULL and EMPTY. Other statuses are optional.
One-Card KANBAN - Only 2 Kanbans. You need to use special statuses WAIT / INUSE to trigger replenishment.
Quantity KANBAN - Same as classic KANBAN, except that instead of changing the status of a KANBAN, you tell it how many items you have used up. The system calculates when the KANBAN quantity reaches zero and changes it to EMPTY. The important thing here is that a KANBAN quantity signal is a separate transaction code. It is not the same as a production order backflush. (In fact in KANBAN, goods issue is not done until the KANBAN is EMPTY.)
Event-driven KANBAN - When the quantity is highly irregular, use this method. EMPTY does not trigger replenishment. Instead you go to the Event menu item to specify that you want a new KANBAN and what quantity.
Later on I went ahead and created a SWARM simulation for KANBAN. It simulates KANBAN as implemented in SAP. It was interesting to view the behaviour of the KANBAN system when demand was erratic. The things that I was studying included
1. Number of refill transactions
2. Number of times the production had to stop because of no stock.
3. Inventory on hand.
These were modelled in SWARM which makes it easy to construct a GUI for the underlying model. The GUI makes things much more interesting and intuitive.
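For readers without SWARM, the same classic-KANBAN behaviour can be sketched in a few lines of Python. The function, its parameters and the demand series below are illustrative assumptions, not the actual SWARM model; it just tracks the three metrics listed above.

```python
def simulate_kanban(num_kanbans, kanban_qty, lead_time, demand):
    """Toy classic-KANBAN loop: consuming a full container's worth of
    pieces sets one kanban to EMPTY, which (as in SAP) automatically
    triggers a replenishment that arrives after lead_time periods."""
    full = num_kanbans
    stock = num_kanbans * kanban_qty
    consumed = 0                 # pieces taken from the kanban in use
    in_transit = []              # arrival times of open replenishments
    refills = stockouts = 0
    inventory = []               # on-hand stock per period
    for t, d in enumerate(demand):
        # FULL signal: goods receipt for replenishments that arrived
        arrived = [a for a in in_transit if a <= t]
        in_transit = [a for a in in_transit if a > t]
        full += len(arrived)
        stock += len(arrived) * kanban_qty
        if d > stock:
            stockouts += 1       # production stops: no stock
            d = stock
        stock -= d
        consumed += d
        while consumed >= kanban_qty and full > 0:
            # EMPTY signal: replenishment document created automatically
            consumed -= kanban_qty
            full -= 1
            in_transit.append(t + lead_time)
            refills += 1
        inventory.append(stock)
    return refills, stockouts, inventory
```

Running it with erratic demand series makes the tradeoff between refill transactions, stoppages and inventory visible, which is what the SWARM GUI showed interactively.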
Interesting techniques in SAP
1. The global variables in a module are internally stored as (Module)VarName. Arrays are stored as (Module)ArrName[]. You can use this to access variables in SAP's modules. We faced the following situation: sales orders automatically create a purchase order to a vendor, and we needed to check the order reason on the sales order to determine whether a discount pricing condition should be applied in the purchase order. Unfortunately, the link to the sales order was only in memory when the user exit was called; it was saved to the database only after the user exit. So we located this data in memory and used a field-symbol ASSIGN to access it.
2. PDF printing within SAP. When SAP prints using SAP-Script or SmartForms, the output can be saved into an internal table in a format called OTF. SAP provides a function to convert OTF to PDF. We used this to let users view a PDF printout of their purchase orders from the web. The web interface (Business Connector DSP) calls a custom function. This function queries the output condition table Bxxx to get the output type (KSCHL). Then it looks up TNAPR to get details about which program, form and SAP-Script to use. At this point, I realized that we needed to modify core SAP to let it output OTF format. I made two changes - one at the beginning to turn on capturing (itcpo-tdgetotf = 'X') and one at the end to retrieve the OTF data from function 'CLOSE_FORM' and store it in memory. Once we have the OTF output, calling CONVERT_OTF_TO_PDF converts it to PDF. The resulting hex encoding is sent back to BC, which decodes it and sends it to the user.
3. KANBAN - Did a quick review of the various KANBAN methods supported in SAP. SAP supports classic KANBAN, One-card Kanban, Kanban with quantity and event driven Kanban. SAP's automatic calculation can calculate either the number of KANBAN's or the KANBAN quantity that is required to meet the average consumption. The tradeoff is between number of transactions, production area space and inventory cost. We need to look for a model that addresses these three and build a linear model.
4. Siebel - Finally was able to read about the product configurator in v7. Interesting aspects :- Product features, class hierarchy, concept of a model, XML import/export, web services.
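On the KANBAN sizing question in point 3, a back-of-the-envelope Python sketch illustrates the tradeoff. The formula below is a common rule of thumb (cover demand during the replenishment lead time, plus a safety kanban), not SAP's actual automatic calculation, and all numbers are made up:

```python
import math

def kanbans_needed(daily_demand, lead_time_days, kanban_qty, safety=1):
    """Rule-of-thumb kanban count: enough containers to cover demand
    during the replenishment lead time, plus safety containers."""
    return math.ceil(daily_demand * lead_time_days / kanban_qty) + safety

# Larger containers mean fewer kanbans (fewer refill transactions)
# but more floor space and a higher maximum inventory:
tradeoff = {k: kanbans_needed(100, 2, k) for k in (10, 25, 50)}
```

Sweeping the container size like this exposes exactly the three cost drivers mentioned above: transaction count, production area space, and inventory.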
Tuesday, March 16, 2004
Setup SSHD on Windows
Found this excellent page to install SSH services on Windows.
Use Cygwin's SSHD to run a SSH server on Windows.
This page is more specific and also shows how you can use SSH at home.
I got stuck at the point on how to login. Which username should I use?
1. Tried the domain id (domain\username).
2. Tried creating a local account.
Neither worked.
The answer lies in the way Cygwin maps Windows users to Unix users (LINK).
Here are the steps that I took to resolve the problem.
Windows does not do a great job of logging - you have to use eventvwr all the time. So do the following to create a log file instead.
1. Ensure c:\cygwin\bin is in your system path.
2. net stop sshd
3. Start the Cygwin shell (click the icon in your start menu).
4. cygrunsrv -R sshd
5. cygrunsrv -I sshd -p /usr/sbin/sshd -a "-D -e" -2 /var/log/sshd.log
This will set up sshd to log to a text file, /var/log/sshd.log, so that you can actually see what is going on.
6. net start sshd
7. tail -f /var/log/sshd.log
This will display the log while you are trying to log in.
8. Start another Cygwin window
9. ssh localhost
If you get "Illegal user xxxx from 127.0.0.1" then you must add an entry for the user in /etc/passwd.
10. mkpasswd -l -c
The last line(s) displays the information you must put into /etc/passwd.
usd05813:unused_by_nt/2000/xp:23996:10513:usd05813,U-CODE1\usd05813,S-1-5-21-2052333302-790525478-839522115-13996:/c/Documents and Settings/usd05813:/bin/bash
Copy this line. Change the group id (10513 in the above example) to 513 or you will get an invalid GID error while logging in with SSH.
11. ssh localhost
If you get "Illegal GID" then change your group id to 513 (see above).
12. If you keep getting "POSSIBLE BREAKIN ATTEMPT", you need to do the following
a. vi /etc/sshd_config
b. replace the line "#UseDNS yes" with "UseDNS no".
c. net stop sshd
d. net start sshd
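For the /etc/passwd fix in the steps above, a tiny hypothetical Python helper shows the GID substitution (in practice you would just edit the line by hand after running mkpasswd):

```python
def fix_gid(passwd_line, new_gid="513"):
    """Replace the GID (the 4th colon-separated field) of an
    /etc/passwd entry, as needed to clear Cygwin sshd's
    "invalid GID" login error."""
    fields = passwd_line.split(":")
    fields[3] = new_gid
    return ":".join(fields)
```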
Check this Free SSH copy / SSH FTP client.
PS: Love W.Bloggar.
COMCAST upgrades its network
This message is for those who have Comcast as their Internet service provider; if you don't have Comcast, you can ignore it. Recently Comcast tripled their Internet speed from about 1 Mbps to 3.5 Mbps. To take advantage of this you will need to reset your cable modem:
1- Turn it off
2- Unplug it from your cable
3- Wait 60 seconds
4- Plug the cable back in
5- Turn the modem back on
6- Reboot your computer.
You should now have a significantly faster connection.
You can test your speed before and after doing this by visiting the following URL and performing their speed test and comparing the results:
http://google2-cnet.com.com/Bandwidth_meter/7004-7254_7-0.html?tag=tool
When I did this, mine went from just below 1 Mbps to 3.4 Mbps when directly connected to Comcast, and 1.5 Mbps when connected to PMS CMS via VRAS.
Monday, March 15, 2004
Desktop blog tool
I installed W.Bloggar. This is a very nice tool for creating blog entries from your desktop, so rather than being on the Internet, you can work offline.
It also introduced me to a nice installation tool. Ghost Installer - seems like a very easy to use installation package. See the viewlet to get an idea of it.
The Information Society Technologies had some interesting IS related projects and proposals.
Saturday, March 13, 2004
Compiere downloads decreasing at an alarming rate
The Compiere blurb says 640000 downloads. But what does that really mean?
I sat down and analyzed the file download statistics on Sourceforge. Out of the 640000 downloads, only 420000 are for the server module. Drilling down further, I realized that the number of downloads is decreasing at an alarming rate.
2001 - 98911 downloads
2002 - 228868 downloads
2003 - 87268 downloads
2004 - 5708 downloads
In terms of versions, 2.5 hasn't taken off as yet.
2.3 - 17613 downloads
2.4 - 359831 downloads (2.4.2 was the most downloaded; it accounts for 173000 downloads)
2.5 - 43311 downloads
I've posted the detailed analysis on my web site http://compiere.dmahajan.org
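The figures above are easy to sanity-check with a few lines of Python (the numbers are taken straight from this post):

```python
# Yearly server-module downloads as reported above
downloads = {2001: 98911, 2002: 228868, 2003: 87268, 2004: 5708}

server_total = sum(downloads.values())   # roughly the 420000 server downloads

# Year-over-year percentage change; the 2003 and 2004 drops
# are what "decreasing at an alarming rate" refers to
years = sorted(downloads)
yoy = {y: round(100 * (downloads[y] / downloads[y - 1] - 1))
       for y in years[1:]}
```

The year-over-year numbers make the trend stark: a big jump into 2002, then drops of over 60% and then over 90% (2004 is a partial year, so that last figure overstates the decline somewhat).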
Using RSS feeds in your own pages
This excellent site describes how you can pull in RSS feeds in your own page.
The Javascript example (http://forever.p3k.org/rss/) embeds the RSS feed in your page without XSLT or any server side code.
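A rough server-side equivalent of that client-side include can be sketched in Python; the feed below is a made-up sample and rss_to_html is a hypothetical helper, not the p3k.org script:

```python
import xml.etree.ElementTree as ET

def rss_to_html(rss_xml, limit=5):
    """Render the first few RSS 2.0 items as an HTML list of links,
    roughly what the JavaScript include does in the browser."""
    root = ET.fromstring(rss_xml)
    items = root.findall("./channel/item")[:limit]
    links = "".join(
        '<li><a href="%s">%s</a></li>'
        % (i.findtext("link", ""), i.findtext("title", ""))
        for i in items)
    return "<ul>%s</ul>" % links

# A minimal sample feed for illustration
feed = """<rss version="2.0"><channel><title>Demo</title>
<item><title>First post</title><link>http://example.com/1</link></item>
</channel></rss>"""
```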
Moving into Portals
I've decided to host my own web site and use a portal architecture. This also meant that I had to integrate a weblog and a Wiki site into the portal.
The DotNetNuke (http://www.dotnetnuke.com) portal was easy to set up and deploy to my ISP. It is an ASP.NET application that uses Microsoft SQL Server or Microsoft Access as the backend database. Since it is open source, it has a lot of developer support and tons of modules. Check my portal at http://dnn.dmahajan.com
The next part that I wanted was to integrate a Wiki site at my web site. So I've installed a Wiki site which is at http://wiki.dmahajan.com . I plan to integrate it into the portal when I get some free time.
The last piece was to integrate the web log. This was easy. I changed Blogger to post the published files to my server. Next an XSLT transform of the RSS.XML file provided a neat list of articles, that is on the portal's home tab.
Enough fun for now!
Sunday, January 11, 2004
Mixing XSLT , Access , SAP and .NET
It's fun when the technology that you hear about actually makes it easier to build something relevant.
In this case, I am extracting planning information from SAP, aggregating and analysing the data in Microsoft Access, and providing web access to it using ASP.NET. The next piece was to make the data accessible in XML and Excel.
Here XSLT steps in. I use the native XML support in .NET to convert the Access data into an XML string. The XML is then transformed by an XSLT stylesheet into Excel 2000 HTML spreadsheet format. (Incidentally, Excel 2002 has a native XML format, but it isn't supported by Excel 2000.)
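A minimal sketch of the idea, done directly in Python rather than XSLT and using a hypothetical helper: Excel 2000 opens a plain HTML table as a spreadsheet when it is served with Excel's MIME type (application/vnd.ms-excel).

```python
def rows_to_excel_html(headers, rows):
    """Emit a plain HTML table that Excel 2000 will open as a
    spreadsheet when served with Excel's MIME type - a stand-in
    for the XSLT transform described above."""
    head = "".join("<th>%s</th>" % h for h in headers)
    body = "".join(
        "<tr>%s</tr>" % "".join("<td>%s</td>" % c for c in row)
        for row in rows)
    return ("<html><body><table><tr>%s</tr>%s</table></body></html>"
            % (head, body))
```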
The next step is to graph this data, since graphs is what makes planning easy.
The first idea was to use VML since it has native support in IE. However, VML is not an approved standard (even though Microsoft will continue supporting it); SVG is the next-generation vector markup language. So the graphs must be SVG.
To view SVG, you'll need the Adobe SVG Viewer plugin.
Now, the fun part. The XSLT cookbook by Sal Mangano is brilliant. It already has an XSLT template to generate SVG graphs. After breaking my head trying some trivial stuff in XSLT, I've turned to the guru. He is amazing. The XSLT techniques that he uses are mind blowing and open your eyes. You feel humble after comparing your feeble attempts to his elegant solutions.
But another question that arises is whether there is a real user requirement for graphing the data. I'm not so sure about it. There is a strong request to send the updated percentages back into SAP automatically. To this end, I've written an RFC-enabled function module that updates the percentages in planning configuration points. Now I have to write the .NET code to provide an update mechanism.
Day-to-day activities have no coherent purpose. I think I've got to get some coherence into these activities.
Signing off.
Wednesday, January 07, 2004
Connecting to a mainframe
IBM provides Java connectors to connect to a mainframe.
The CCF connector is one example. It is a J2EE resource provider.
Check out the IBM redpages www.redpages.ibm.com to get a PDF on using the CCF connector.
Putting the KMAT option series on the web
After looking at SIS and APO, it is clear that they will not be sufficient to meet my requirements.
So I had to go back to the Access database (which processes an extract from SAP).
1. Reorganize the database to use a star schema and behave like a DW.
a. Pre-aggregate at three different levels.
b. Separate the database load application and the DW database (separate MDBs).
c. Have a single Access form to query all data.
2. Provide Web-page access to this data.
a. Used ASP.NET to query the data from Access. [It took a bit of digging to figure out how to pass parameter values to Access, especially when the query uses form variables. Also figured out the other way around - i.e. how to use query parameters in Access forms: this involves creating your own recordset and binding it to the form at runtime.]
I've borrowed the concept of using a JSP "session" bean to store parameters. This provides an elegant method of sharing parameter selections across pages. Page ViewState is a bit restrictive.
b. Add the company trimmings
c. Write an XSLT transform that converts the data into MSExcel 2000 "HTML" format. When you associate this with Excel's Mime type, Excel will open the file as a standard spreadsheet.
d. Provide raw XML access also. [Must override the Render function, if you don't want the HTML in the page to get rendered]
3. Next, time for some graphics.
a. VML is easy and integrated in IE. The syntax is not too difficult either. Microsoft provides a VMLGenerator editor to play around with VML.
b. Found a tool to generate graphs in VML or SVG (www.grapl.com).
[This led to an interesting diversion into the languages APL and J. May take some time to study the language J.]
c. SVG is the W3C approved vector language, but you need to download a plugin for it. For some reason it does not work on my system (Adobe SVG Viewer 3.0).
d. Need an XSL template to convert the XML into a set of graph data.
[ Rather challenging, since I've forgotten XSLT. The site www.exslt.org provides a lot of handy subroutines/functions.]
[ The site http://purl.org provides a place to put up permanent URLs. Registered there as dmahajan.]
So for now, I've just got as far as transforming the data into a graph series. Next step is to transform it into VML lines.
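As a sketch of where this is heading, here is the graph-series-to-vector-markup step in Python (SVG rather than VML, and the function is a made-up illustration, not the XSLT template):

```python
def series_to_svg(series, width=300, height=100):
    """Turn a numeric series (at least two points) into a minimal
    SVG line chart by scaling values into the viewport and emitting
    a <polyline>."""
    lo, hi = min(series), max(series)
    span = (hi - lo) or 1                 # avoid divide-by-zero on flat data
    step = width / (len(series) - 1)
    points = " ".join(
        "%.1f,%.1f" % (i * step, height - (v - lo) / span * height)
        for i, v in enumerate(series))    # SVG y grows downward, so flip
    return ('<svg xmlns="http://www.w3.org/2000/svg" '
            'width="%d" height="%d">'
            '<polyline fill="none" stroke="black" points="%s"/></svg>'
            % (width, height, points))
```

The XSLT version does the same scaling with recursive templates; doing it in a scripting language first makes the coordinate arithmetic easy to verify.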
Wednesday, December 24, 2003
Patterns for developing Web applications
Interesting site which discusses Patterns for Web applications.
Patterns for e-business are a group of reusable assets that can help speed the process of developing Web-based applications. This site breaks down these reusable assets. The Patterns leverage the experience of IBM architects to create solutions quickly, whether for a small local business or a large multinational enterprise. Customer requirements are quickly translated through the different levels of Patterns assets to identify a final solution design and product mapping appropriate for the application being developed.
Click to go to the IBM site.
Results of SIS
I've put a lot of effort into understanding SIS and getting it set up.
Was it worth it?
My answer is mixed.
The original goal was to get proposals for configuration supporting points in planning.
SIS is not sufficient for that goal. The closest it gets is showing historical data from S138 on the configuration supporting point screen.
However, it was interesting from other points of data reporting.
The research gave me a better understanding of how SIS can support SAP's flexible planning (the precursor to APO). I've also gained an insight into integration between SIS and planning.
The idea of piggybacking on LIS to calculate custom statistics is very appealing. It is real-time and better integrated into SAP than custom Z-reports.
I've found an interesting paper on using SIS to calculate supply chain metrics for sales orders. The paper was written in 1997 at the University of St. Gallen. It creates a new infostructure and uses that to monitor metrics on whether a sales order was executed on time, late, etc.
Legner, C.; Muschter, S.; Brecht, L.; Österle, H.: Process Measurement and Benchmarking: The SAP Process Information System, Institut für Wirtschaftsinformatik der Universität St. Gallen, St. Gallen, 1997. Click to download.
Using infostructures in SAP
It was worth the effort. SAP put a lot of work into its preliminary data warehousing efforts, variously known as OIW (Open Info Warehouse), LIS, SIS and "Infostructures".
Infostructures hold aggregated data extracted from the transaction tables in SAP.
For example, information from sales orders, deliveries and billing documents is aggregated into S001 (Customer Infostructure).
SAP's documentation links.
What is LIS and how do things work.
Using LIS and its tools, also has a nice introduction to LIS.
Details from above link for SIS.
Configuration guide for Logistics Information System IMG.
I was interested in SIS to see if the system can automatically populate configuration supporting points for KMATs. Briefly, config points are used when you have configurable materials (KMATs). So in addition to planning the number of units, you must plan the percentage of the various possible options. These percentages are stored in MD63 as configuration supporting points.
Prereqs: KMAT, Planning profile (MDPH and MDP6), Setup SIS
Overview:
1. Customize statistical groups for materials, customers, order types, etc. in SIS.
2. Update master data to set statistical groups for materials, customers etc.
3. Customize update groups for these combinations.
4. Customize - Activate update for the SIS infostructures that you want to update. I chose V2 updates for S001 to S006. For variant configuration you must activate S126.
5. Customize - Under update control, simulate an update with a sales order, billing document and a delivery. In the analysis log, check that the correct update group is being calculated for your document and its items. If the update group is initial, then the item / document is ignored.
6. Clear out old data in the infostructures using transaction OLIX.
7. Run the statistical update (i.e read existing documents to update infostructures) to update the infostructures.
In SIS, the update group is calculated for the sales order. It is then copied into the delivery. Finally the update group is copied from the delivery into the billing document.
When you run a statistical update, ensure you select re-calculate update groups and update document. The order is important, sales orders followed by deliveries and then billing documents.
8. Run copy management. For variant configuration, you must first populate S126 as above. Then use copy management from S126 to S137 (method V003). Finally use copy management from S137 to S138. S138 is the infostructure used by transaction MC(B to analyze variant configuration data.
SAP documentation link.
How to setup SIS variant configuration.
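The percentage proposal itself is simple arithmetic. A hypothetical Python sketch, assuming consumption counts per characteristic value have already been aggregated (e.g. from S138):

```python
def option_percentages(consumption):
    """Turn historical consumption counts per characteristic value
    into the usage percentages that configuration supporting points
    (MD63) need for KMAT planning."""
    total = sum(consumption.values())
    return {opt: round(100 * qty / total, 1)
            for opt, qty in consumption.items()}
```

The hard part, as described above, is not this calculation but getting SIS to populate the consumption history (S126 -> S137 -> S138) reliably in the first place.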
Problem areas:
Infostructures not listed during update simulation.
a. Check the infostructure is active for update.
b. Check the document update group and the item update group are not initial.
S126 is populated, but S137 is empty after running copy management.
a. The copy method V003 ignores any material that does not have the "Update SIS" flag set on the planning profile header. This flag is a radio button on the profile list screen of MDPH.
I can't see any data for a particular combination.
a. Check the material's planning profile has this combination.
Infostructures are aggregated data that is extracted from the transaction tables in SAP.
For example, information from sales orders, deliveries and billing documents are aggregated into S001 (Customer Infostructure).
SAP's documentation links.
What is LIS and how things work.
Using LIS and its tools, also has a nice introduction to LIS.
Details from above link for SIS.
Configuration guide for Logistics Information System IMG.
Friday, December 05, 2003
An unusual error trying to remove the SAP .NET connector
I had uninstalled MS Visual Studio 2002 but left the SAP .NET connector. After reinstalling VS2002, the .NET connector was not visible in the IDE. So I tried to uninstall / repair it.
Hit a road block
"Unable to get installer types in the c:\tools\Visual Studio\Common7\IDE\SAP.Connector.Design.dll assembly. --> One or more of the types in the assembly unable to load."
Looked on Google and someone suggested that this was related to registration with COM.
So I tried regsvcs on this DLL (no luck) and then regasm on this DLL.
Running regasm resolved a few problems and the uninstall got a little further. But the error persisted.
Finally I moved the files from the old VS location to the new Visual Studio location (C:\Program Files\Microsoft Visual Studio .NET\Common7\IDE).
This still did not resolve the problem. For some reason, it kept looking at the old location.
Desperate, I went to the registry and renamed all references of the old DLL location.
This fixed the problem (whew).
Uninstalled the component and rebooted. (The uninstall broke something in Visual Studio; the installer kicked in when I next started Visual Studio and fixed things up by itself.)
Finally re-installed the SAP .NET connector and it now is visible in VS.NET 2002!
Earlier, I created a small security web application to explore the built-in authentication and authorization concepts of .NET. I was impressed by the ease (it always helps to have an example in front of you to get moving fast!).
I copied the project to a new project. When I copied the ASPX files, it gave me compile warnings. Digging deeper, it turns out the default namespace for a project is the project name. So when you copy an ASPX to a new project, you must go into the C# code behind each ASPX and change the namespace. (The alternative is to simply change the default namespace of the entire project.)
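That per-file namespace rename is mechanical enough to script. A small illustrative sketch (the project names and the code snippet are made up; a real script would walk the copied project's .cs files):

```python
# Illustrative sketch: rewrite the namespace declaration in copied C#
# code-behind files. "OldProject" / "NewProject" are invented names.
import re

def rename_namespace(source, old_ns, new_ns):
    # Replace only the namespace declaration, not arbitrary occurrences
    # of the old name elsewhere in the file.
    return re.sub(r"\bnamespace\s+" + re.escape(old_ns) + r"\b",
                  "namespace " + new_ns, source)

code = ("using System;\n\n"
        "namespace OldProject\n{\n    public class Login { }\n}\n")
print(rename_namespace(code, "OldProject", "NewProject"))
```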
BTW the Brinkster account didn't let me upload this to the "free/general" account. No idea what happened.
Tinkering again with ASP.NET
I've finally re-installed Visual C# with Visual Studio .NET on my machine. Got all the posters out and am challenging myself to make something interesting: get a feel for the technology with a real project. I'll get down to reading the various .NET self-paced training books first.
Did get a new web account to host my non-existent apps. Link to my Brinkster account.
BTW, PDM is really cool. I am very impressed with the ease of use and its speed. I haven't quite figured out the "Query" tab as yet (need more time to read the manual!!!)
Almost finished the returns "Create Variant" program. Still a bug with missing accounting views. I've got a demo tomorrow.
Looking on the web, I've found an interesting manual from SAP on using Order BOMs.
Another interesting article on setting up/ using class hierarchies and nested KMATs.
The table CUXREF can be used to see which constraints are setting up a Z2 sales order block (GLOBAL_SDCOM_ZZLIFSK). Gave a short overview on the tables CABN, CABNT, CAWN, CAWNT, CUKB, CUXREF.
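As a rough illustration of how those tables fit together: CABN keys a characteristic by its internal number (ATINN), CABNT carries the language-dependent description, and CUXREF cross-references the dependencies that use the characteristic. A mocked-up join in Python (all data values are invented, and the field names are approximations of the SAP columns):

```python
# Mock rows standing in for the SAP characteristic tables (values invented,
# field names approximated from CABN / CABNT / CUXREF).
CABN   = [{"ATINN": "0000000042", "ATNAM": "ZZ_COLOR"}]
CABNT  = [{"ATINN": "0000000042", "SPRAS": "E", "ATBEZ": "Colour"}]
CUXREF = [{"ATINN": "0000000042", "KNNUM": "CONSTRAINT_Z2"}]  # dependency ref

def describe_characteristic(atnam):
    # Resolve the characteristic name to its internal number, then pull
    # the English description and any dependencies that reference it.
    char = next(c for c in CABN if c["ATNAM"] == atnam)
    text = next(t["ATBEZ"] for t in CABNT
                if t["ATINN"] == char["ATINN"] and t["SPRAS"] == "E")
    deps = [x["KNNUM"] for x in CUXREF if x["ATINN"] == char["ATINN"]]
    return text, deps

print(describe_characteristic("ZZ_COLOR"))
```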
It's getting late...
Thursday, November 27, 2003
Reading SAP data from Access
Hey this sounds interesting
Personal Data Miner for SAP - gives you the ability to reach into the SAP database directly from MS Access.
http://www.pdm.lu
I should try and remember to try this one out.
Further SAP projects
Processing returns of configurable & serialized materials in SAP
Return processing in SAP is pitiful. Try adding configurable and serialized products. It blows SAP out of the water.
Move the configurable material out of sales stock and into inventory. But first, create a material variant for this configurable material. Here lies the next problem: SAP does not create a variant which references the configuration in a sales order. So I had to write my own program to create material variants. (The steps involved are: create a new material master, configure it, attach a BOM and a routing, create and mark a cost estimate.)
Once this is in inventory, you need to break it down to its components. More problems. Since it is configurable and changes frequently, the components in the current BOM may not match what you built and shipped out. So read the components from the original production order and compare this with today's BOM. At the end, perform a 561/562 (or something similar) to swap the variant for the components.
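The comparison between the original production order components and today's BOM boils down to a set difference: post the goods movements against what was actually built, and note where the current BOM has drifted. A minimal sketch (the material numbers are invented):

```python
# Components actually built (from the original production order) vs. the
# components the current BOM would propose -- invented material numbers.
built       = {"CPU-100": 1, "DISK-20GB": 2, "CASE-A": 1}
current_bom = {"CPU-100": 1, "DISK-40GB": 2, "CASE-A": 1}

# The 561/562-style movements should use the components actually shipped
# (the production order list), not what today's BOM would explode to.
removed_since = set(built) - set(current_bom)   # shipped, no longer in BOM
added_since   = set(current_bom) - set(built)   # in BOM now, never shipped
print(sorted(removed_since), sorted(added_since))
```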
Now the serial numbers. If you don't reuse your serial numbers, there is no problem. Sadly, that is not the case in most companies. The serial numbers of purchased components need to be reused. So when we break down the KMAT, we need to make these component serial numbers available for use. Also, if there are configured serial numbers, we need to remove the configuration information to avoid problems during delivery (in an inbound delivery, the system checks the serial number configuration against the purchase order configuration).
It's been a long time since my last post. Life got in the way.
With so many items trying to grab your attention, it is easy to forget the non-urgent activities.
Recent trends
SAP - Using Javascript to script and automate the SAPGUI frontend.
(I should write an article about this, since it is a lot of fun)
SAP - Planning for configurable materials. Despite the rich functionality that SAP provides in 4.6C, there is a lack of clear instructions on planning KMATs. Working with the planning folks, I've learnt a lot about the complex nature of planning these beasts, and about what information must drive your forecasting. I wrote two SAP extract programs that get production order information from SAP. The extract files are imported into Access (500K + 3M rows) where I do some crunching to reduce the data volume. The resulting time series is something that the planners can use. But I went a step further: I looked at all KMAT BOMs and extracted the characteristics used in any selection condition/procedure in a BOM. That list of characteristics is what is relevant for planning. By comparing this list with the planning profiles of your KMATs, you can analyze how much coverage you have for each KMAT.
The next step is to compare the percentages in your configuration points with the actual characteristic - value percentages. Again this was accomplished in Access.
Now all this information is not useful if it is difficult to update percentages. So I wrote an update program that can go in and change percentages on each configuration point.
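Both checks above reduce to set arithmetic per KMAT: planning coverage is BOM selection-condition characteristics versus planning-profile characteristics, and the percentage comparison is planned option shares versus shares observed in the production order history. A simplified sketch with invented characteristic names and numbers:

```python
# Planning coverage for one KMAT: characteristics used in BOM selection
# conditions vs. those maintained in its planning profile (names invented).
bom_chars     = {"ZZ_COLOR", "ZZ_VOLTAGE", "ZZ_DISK_SIZE"}
profile_chars = {"ZZ_COLOR", "ZZ_VOLTAGE"}

missing  = bom_chars - profile_chars          # planning-relevant, not planned
coverage = len(bom_chars & profile_chars) / len(bom_chars)

# Planned vs. actual option percentages for one characteristic
# (invented data standing in for the Access crunching).
planned = {"RED": 0.50, "BLUE": 0.50}
actual  = {"RED": 0.70, "BLUE": 0.30}
drift = {value: actual[value] - planned[value] for value in planned}

print(sorted(missing), f"{coverage:.0%}", drift)
```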
Interested? Give me a yell.