
Wednesday, 2 May 2012

How a DSO Maintains Delta Records in the Change Log Table

Hi All,

This post explains how a DSO manages the change log table to supply correct delta records to a target InfoProvider (DSO/InfoCube), and why 0recordmode matters.

A DSO has three tables:
  i) Activation queue/new table
 ii) Active table
iii) Change log table

The DSO uses the change log table to manage delta records.

In a DSO you can set, per key figure, whether values are overwritten or summed based on the key fields.

Now let’s take an example where today we receive one sales order (order 111) with quantity 50:

111          50

When it is loaded into the DSO and activated, we get the records below in the active table and the change log table (after activation there is no data in the activation queue/new table).

Active Table:

111          50

Change Log Table

111          50   N (0recordmode - New Image)

When this record is loaded into the cube, it looks as below (an InfoCube is always additive):

111          50

Once this record is loaded into the CUBE, the delta timestamp for this source/target combination is updated in table ROOSPRMSC with all the details.

Now suppose tomorrow the same sales order arrives with quantity 70. This change has to reach both the DSO and the cube. Since the DSO is set to overwrite, the new value replaces the old one, so the DSO ends up with the correct value.

What about the CUBE? A cube is always additive, so if we loaded this new record straight into the cube, the quantity would become 120, which is wrong.

This is where 0recordmode comes into the picture. It maintains images for changed and new records so that they are loaded correctly into the cube.

Now that the quantity has changed to 70, activation produces the records below in the active and change log tables of the DSO.

Active Table

111            70 (overwrite)

Change Log Table

111            -50        (X -- before image)
111             70        (' ' -- after image; the symbol for the after image is a space)

So when we run a delta load to the target, the two records above from the change log table are loaded into the CUBE.

Now what happens?

111            50 (already Existing)
111           -50
111            70
------------------
111            70 (Which is expected)
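The mechanics above can be sketched in a few lines of Python (an illustrative simulation, not SAP code): the active table overwrites by key, activation writes before/after images with 0recordmode flags to the change log, and the cube simply adds up whatever delta it receives.

```python
# Illustrative sketch of standard-DSO delta handling (not SAP code).
# Key field: order number; data field: quantity (aggregation: overwrite).

active_table = {}   # order -> quantity (overwrite semantics)
change_log = []     # (order, quantity, 0recordmode image)
cube = {}           # order -> quantity (additive semantics)

def activate(order, quantity):
    """Activate one record: overwrite the active table, write delta images."""
    if order not in active_table:
        change_log.append((order, quantity, "N"))    # new image
    else:
        old = active_table[order]
        change_log.append((order, -old, "X"))        # before image, sign reversed
        change_log.append((order, quantity, " "))    # after image (space)
    active_table[order] = quantity

def load_delta_to_cube(delta):
    """The cube is always additive: it just sums the delta images."""
    for order, qty, _image in delta:
        cube[order] = cube.get(order, 0) + qty

# Day 1: order 111 arrives with quantity 50.
activate("111", 50)
load_delta_to_cube(change_log)          # cube now holds 50

# Day 2: the same order arrives with quantity 70.
mark = len(change_log)
activate("111", 70)
load_delta_to_cube(change_log[mark:])   # loads (-50, 'X') and (70, ' ')

print(active_table)   # {'111': 70}
print(cube)           # {'111': 70}  -> 50 - 50 + 70, as expected
```

The reversed sign on the before image is what cancels the value already sitting in the additive cube, which is exactly the 50 / -50 / 70 arithmetic shown above.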

Hope this gives you a clear idea...

Wednesday, 28 March 2012

Integration in the Data Flow

Metadata

DataStore objects are fully integrated with BI metadata. They are transported in the same way as InfoCubes and are installed from BI Content (more information: Installing BI Content in the Active Version). DataStore objects are grouped with InfoCubes in the InfoProvider view in the Data Warehousing Workbench - Modeling and are displayed in a tree. They also appear in the data flow display.

Update

Transformation rules define the rules that are used to write data to a DataStore object. They are very similar to the transformation rules for InfoCubes. The main difference is the behavior of data fields in the update. When you update requests into a DataStore object, you have an overwrite option as well as an addition option.

More information: Aggregation Type.
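As a rough illustration (plain Python, not SAP code), the difference between the two update options comes down to how a data field with a repeated key is combined:

```python
# Sketch of the two aggregation options for a data field (illustrative only).

def update(target, key, value, mode):
    """Write one record into a key/value target using the chosen aggregation."""
    if mode == "overwrite":                 # available for DSO data fields
        target[key] = value
    elif mode == "addition":                # available for DSOs and InfoCubes
        target[key] = target.get(key, 0) + value
    return target

dso = update({}, "111", 50, "overwrite")
dso = update(dso, "111", 70, "overwrite")   # -> {'111': 70}

cube = update({}, "111", 50, "addition")
cube = update(cube, "111", 70, "addition")  # -> {'111': 120}
```

The second result is why a changed record cannot simply be reloaded into an additive target without the before/after images described earlier.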

Requests in DataStore Objects

Use

This tab page provides information about all requests that have run in the DataStore object. You can also delete requests here or schedule an update.

Features

Request Is Available for Reporting
The Request is available for reporting status is displayed when activation has been started.

The system does not check whether the data has been activated successfully.
Data Mart Status

Scenario for Using DataStore Objects for Direct Update

The following graphic shows a typical operational scenario for DataStore objects for direct update:

DataStore objects for direct update ensure that the data is available quickly. The data from this kind of DataStore object is accessed transactionally. The data is written to the DataStore object (possibly by several users at the same time) and reread as soon as possible.

Scenario for Using Write-Optimized DataStore Objects

A plausible scenario for write-optimized DataStore objects is exclusive saving of new, unique data records, for example in the posting process for documents in retail. In the example below, however, write-optimized DataStore objects are used as the EDW layer for saving data.
There are three main steps to the entire data process:

Tuesday, 27 March 2012

Scenario for Using Standard DataStore Objects

The diagram below shows how standard DataStore objects are used in this example of updating order and delivery information, and tracking order status, that is, which orders are open, which are partially delivered, and so on.
There are three main steps to the entire data process:

Further Processing of Data in DataStore Objects

Purpose

If you have loaded data into a DataStore object, you can use this DataStore object as the source for another InfoProvider. To do this, the data must be active. Use process chains to ensure that one process has ended before any subsequent processes are triggered.

Process Flow

Process flow for updating DataStore object data:

Management of DataStore Objects

Features

The DataStore object is displayed in the top table. You only have to select a DataStore object from the DataStore objects available if you called DataStore object administration from the monitor. 

In the top toolbar, choose Contents to display the contents of the table of active data for the DataStore object you have selected. With Delete Contents, you can delete the contents of the DataStore object. You can also display an application log and a process overview.
Tab Page: Contents
You can display the content of the change log table, the newly loaded data table (activation queue), or the active data (A table). You can also selectively delete requests from the DataStore object.

DataStore Objects for Direct Update

Definition

The DataStore object for direct update differs from the standard DataStore object in terms of how the data is processed. In a standard DataStore object, data is stored in different versions (active, delta, modified), whereas a DataStore object for direct update contains data in a single version. Therefore, data is stored in precisely the same form in which it was written to the DataStore object for direct update by the application. In the BI system, you can use a DataStore object for direct update as a data target for an analysis process. 

More information: Analysis Process Designer.

The DataStore object for direct update is also required by various applications, such as SAP Strategic Enterprise Management (SEM), as well as other external applications.

Structure

Write-Optimized DataStore Objects

Definition

A DataStore object that consists of just one table of active data. Data is loaded using the data transfer process.

Use

Data that is loaded into write-optimized DataStore objects is available immediately for further processing. 

They can be used in the following scenarios:

      • You use a write-optimized DataStore object as a temporary storage area for large sets of data if you are executing complex transformations for this data before it is written to the DataStore object. The data can then be updated to further (smaller) InfoProviders. You only have to create the complex transformations once for all data.

Standard DataStore Object

Use

The standard DataStore object is filled with data during the extraction and loading process in the BI system.

Structure

A standard DataStore object is represented on the database by three transparent tables:

Activation queue: Used to save DataStore object data records that need to be updated, but that have not yet been activated. After activation, this data is deleted if all requests in the activation queue have been activated. 

DataStore Object

Definition

A DataStore object serves as a storage location for consolidated and cleansed transaction data or master data on a document (atomic) level.
This data can be evaluated using a BEx query.

A DataStore object contains key fields (such as document number, document item) and data fields that, in addition to key figures, can also contain character fields (such as order status, customer). The data from a DataStore object can be updated with a delta update into InfoCubes (standard) and/or other DataStore objects or master data tables (attributes or texts) in the same system or across different systems.

Write-Optimized DSO

The objective of the write-optimized DSO is to save data as efficiently as possible and to process it further without activation, without the additional effort of generating SIDs, and without aggregation or data-record-based delta handling. It is a staging DataStore used for faster uploads.

Write-Optimized DSO has been primarily designed to be the initial staging of the source system data from where the data could be transferred to the Standard DSO or the InfoCube.



  • The data is saved in the write-optimized DataStore object quickly. Data is stored in its most granular form. Document headers and items are extracted using a DataSource and stored in the DataStore.

Write Optimized DSOs and Business Content

Those who work with BI 7 were introduced to the concept of “write-optimized DataStore objects” together with “direct update DSOs”. In this post I will review an example of how BI 7 objects, such as write-optimized DSOs, can be used together with 3.x Business Content objects in the same dataflow.

By definition, write-optimized DSOs do not have the three relational tables of standard DSOs (formerly known as ODSs); they only have an active table. Clearly, loading data from the source into a write-optimized DSO takes less time and requires less disk space. In write-optimized DSOs, however, there is no option for generating SIDs (formerly known as the “BEx reporting” option). Data loads therefore run quicker, but running queries on write-optimized DSOs is not a good idea.
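The structural difference can be sketched as follows (illustrative Python, not SAP code): a write-optimized DSO appends every record under a purely technical key (here request ID and record number, as an assumption for the sketch), while a standard DSO overwrites by semantic key during activation.

```python
# Illustrative contrast (not SAP code): write-optimized vs standard DSO storage.

write_optimized = []   # active table only; records appended under a technical key
standard_active = {}   # active table keyed by semantic key; overwrite on activation

def load_write_optimized(request_id, records):
    """Append each record with a technical key; no activation, no SIDs."""
    for recno, (order, qty) in enumerate(records):
        write_optimized.append(((request_id, recno), order, qty))

def activate_standard(records):
    """Activation collapses records with the same semantic key (overwrite)."""
    for order, qty in records:
        standard_active[order] = qty

load_write_optimized("REQ1", [("111", 50)])
load_write_optimized("REQ2", [("111", 70)])
activate_standard([("111", 50), ("111", 70)])

print(len(write_optimized))   # 2 -> both versions of the record are kept
print(standard_active)        # {'111': 70} -> only the latest version survives
```

Because the technical key is always unique, the write-optimized table never needs an activation step, which is exactly why loads are fast and why it suits an EDW staging layer.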

Types of DSOs

There are three types of DSO in BI 7.0.

1) Standard DSO


It consists of three tables: the activation queue, the active table, and the change log table. It is completely integrated in the staging process. Using the change log table means that all changes are also written and are available as delta uploads for connected data targets.

Data population is done via a standard DTP.
SIDs are generated for a standard DSO.
Data records with the same key are aggregated during the activation process.
A request activation step is needed.


2) Write-optimized DSO

Thursday, 15 March 2012

DSO (DataStore Object) in BI 7.0

Features of the Change Log & Activation Queue of a Standard DataStore Object (DSO) in BI 7.0:

 Business Scenario

Sometimes it is desirable to combine data from different DataSources before it is stored in the InfoCubes. There are also analyses that need access to more detailed data than that found in the cubes.

Types of DSO

  • Standard DataStore object (Ref. Fig. A)
  • DataStore object with direct update (transactional ODS in 3.x)
  • Write-optimized DataStore object (BI 7.0): contains only an active data table; used to manage huge data loads

Motivation for DSO