Premium Outbound Integration in SAP Datasphere

 

Recently, SAP introduced Premium Outbound Integration as an offering in SAP Datasphere. This is the way forward for customers who want to extract data from their SAP S/4HANA system and send it to a non-SAP Data Warehousing environment, especially since the alternative, using ODP to extract data to external systems, is no longer allowed as per note 3255746.

Transferring your S/4HANA data outside of the SAP ecosystem comes at a price

Premium Outbound Integration uses Replication Flows in SAP Datasphere to set up a connection from a source to a target system without storing any data in SAP Datasphere itself. This is the same mechanism that was already available in SAP Data Intelligence and is part of the roadmap to move all Data Intelligence functionality to SAP Datasphere. In this blog, we will not dive into the details of how replication flows work. In case you are interested in this topic, please check out this blog written by my colleague Dennis van Velzen.

Recently, SAP made a couple of changes to its pricing model, which reduced the cost of Premium Outbound by 50%. Currently, the monthly cost is 500 Capacity Units per 20GB of used outbound data volume. For more information on pricing, you can check the pricing calculator for Datasphere.

But what exactly does the phrase ‘used outbound data volume’ mean? Well, that is not so clear. One might think that it refers to the size of the data stored in the HANA database, or to the size of the file that is written (compressed or uncompressed) to the target system. Actually, neither of these is true. SAP note 3470559 attempted to explain how the calculation actually takes place. This note has temporarily been moved to 'In Progress' in the support portal, most likely because it was overly technical and did not give any real insight into what you can expect to be charged for Premium Outbound flows.

The bottom line is that the calculation is done based on the uncompressed size of the data being exported:

Each cell contributes to the outbound data volume based on its data type and value. The resulting volume to be consumed from the available premium outbound integration blocks is the sum of bytes of individual cells for each record and column.

After that, the note explains how to calculate the number of bytes for individual data types. Although this provides detail at the lowest level, it doesn’t indicate what can be expected at a more global level. What does this mean if you have a table that takes up 1GB of storage in the SAP HANA database? How much will be charged for Premium Outbound if this table is sent via an initial load to, for example, an Azure Data Lake?
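To make the metering principle more tangible, below is a small Python sketch of how such a per-cell calculation could look. The per-type byte counts used here are purely illustrative assumptions; the actual rules per data type are the ones defined in SAP note 3470559.

```python
# Illustrative sketch of the metering principle: the charged outbound volume
# is the sum of the byte contributions of every individual cell. The per-type
# byte counts below are assumptions for illustration only; SAP note 3470559
# defines the actual rules per data type.
from typing import Any


def cell_bytes(value: Any) -> int:
    """Rough, assumed byte contribution of a single cell."""
    if value is None:
        return 1                               # assumption: minimal charge for NULL
    if isinstance(value, str):
        return len(value.encode("utf-8"))      # character data: uncompressed length
    if isinstance(value, (int, float)):
        return 8                               # assumption: fixed size for numeric types
    return len(str(value))


def record_bytes(record: dict) -> int:
    """Sum the contributions of all cells in one record."""
    return sum(cell_bytes(v) for v in record.values())


# Example: two BSEG-like records; under these assumptions, the total is what
# would count towards the Premium Outbound data volume.
records = [
    {"BUKRS": "1000", "BELNR": "4900000001", "DMBTR": 1250.75},
    {"BUKRS": "2000", "BELNR": "4900000002", "DMBTR": 80.00},
]
total = sum(record_bytes(r) for r in records)
print(f"Charged volume for {len(records)} records: {total} bytes")
```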

To determine the ratio that needs to be applied on top of your HANA database size, we did a full extract of the BSEG table, which took up 5.5GB on the source side. In the Database Explorer of SAP Datasphere, the number of GB charged to Premium Outbound can be shown per Replication Flow in the DS_METERING table.
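If you prefer to read this programmatically rather than via the Database Explorer UI, a minimal sketch using the SAP HANA Python client (hdbcli) could look like the one below. The connection details are placeholders, and we simply select everything from DS_METERING rather than assuming specific column names; check your own tenant for the exact structure and the required database privileges.

```python
# Minimal sketch: reading the Premium Outbound metering data with the
# SAP HANA Python client (hdbcli). Host, user and password are placeholders;
# the exact columns of DS_METERING are not assumed here.
from hdbcli import dbapi

conn = dbapi.connect(
    address="<your-datasphere-host>",   # placeholder hostname
    port=443,
    user="<database-user>",             # placeholder database user for the space
    password="<password>",
    encrypt=True,
)

cursor = conn.cursor()
# DS_METERING shows the charged outbound volume per Replication Flow.
cursor.execute("SELECT * FROM DS_METERING")
for row in cursor.fetchall():
    print(row)

cursor.close()
conn.close()
```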

[Screenshot: DS_METERING output in the Database Explorer, showing the charged outbound volume for the BSEG replication flow]

As you can see, the total charge towards Premium Outbound in this case is 167GB. That is a factor of 30 compared to the original size of the data. How big the impact is depends on the type of data stored in your table. We have also looked at other tables and saw a similar factor for ACDOCA. The MARA table even has a factor of 45, whereas the PROJ table ‘only’ has a factor of 3. Needless to say, this can quickly result in very high Premium Outbound consumption figures, knowing that there are customers with far larger tables than the BSEG table we used in our example.
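As a rough rule of thumb, such a factor can be used to estimate what an initial load of a given table will cost in outbound volume. The sketch below uses the factors we observed; they are indicative only and will differ per system and per table.

```python
# Rough estimate of the Premium Outbound volume of an initial load, based on
# the HANA storage size of a table and the factors we observed. These factors
# are indicative only; measure your own tables before committing to a sizing.
OBSERVED_FACTORS = {"BSEG": 30, "ACDOCA": 30, "MARA": 45, "PROJ": 3}


def estimate_initial_load_gb(table: str, hana_size_gb: float) -> float:
    """Expected outbound GB = HANA storage size x observed factor (default 30)."""
    return hana_size_gb * OBSERVED_FACTORS.get(table, 30)


# The 5.5GB BSEG table from our test: roughly 165GB estimated vs. 167GB measured.
print(f"BSEG: ~{estimate_initial_load_gb('BSEG', 5.5):.0f} GB outbound")
```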

What does this mean for you if you want to send data via SAP Datasphere to your external Data Lake or Data Warehouse? Well, first of all, think carefully about what you send and how you send it.

  1. Always use delta loads where possible. This might seem like a no-brainer, but given the amount of data that is generated, it is a quick win as well.
  2. Is the data already available in your data lake to a large extent? Do not send it again, but re-use what is there. This means that if you migrate from ODP to Datasphere, you do not need to resend most of the data that was created more than a year ago, since it is unlikely to change. You can simply run an initial load and delta for the data that you still expect to change.
  3. If you do need to run full loads (for example, inventory status), think about an optimal schedule and also look at the level of detail you need in your external system.
  4. Remove unused fields from the extraction. Typically, the receiving party will say “Give me all data”, but it is worth challenging that here, since sending 200 instead of 400 fields from a table can easily reduce the amount of data consumed.
  5. Apply filters where possible. If you have multiple ledgers, do you need all of them to be sent in the ACDOCA interface, or is one enough?


Based on these recommendations, you should estimate the table sizes in S/4HANA and calculate the expected monthly increase in storage. To calculate your expected monthly outbound consumption, it is recommended not only to look at this monthly increase but also to take potential re-inits or ad-hoc loads into account. There are many factors that play a role here, but as a minimum it is recommended to double the size of your delta to cater for these loads as well. This means that if your expected delta is 25GB/month, you should calculate with 50GB as the total amount of Premium Outbound. Since this comes in blocks of 20GB, you would need 3 blocks at a total cost of 1500 Capacity Units.
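Put into a small calculation, this sizing logic looks as follows. The 20GB block size and 500 Capacity Units per block reflect the pricing as of July 2024, and the safety factor of 2 is the minimum recommendation from this blog.

```python
import math

# Sizing sketch: double the expected monthly delta to cover re-inits and
# ad-hoc loads, round up to blocks of 20GB and multiply by 500 Capacity Units
# per block (pricing as of July 2024).
GB_PER_BLOCK = 20
CU_PER_BLOCK = 500
SAFETY_FACTOR = 2  # minimum recommendation: double the expected delta


def monthly_outbound_cost(expected_delta_gb: float) -> tuple[int, int]:
    """Return (blocks needed, Capacity Units per month)."""
    planned_gb = expected_delta_gb * SAFETY_FACTOR
    blocks = math.ceil(planned_gb / GB_PER_BLOCK)
    return blocks, blocks * CU_PER_BLOCK


# Example from the text: 25GB/month delta -> 50GB planned -> 3 blocks -> 1500 CU
blocks, capacity_units = monthly_outbound_cost(25)
print(f"{blocks} blocks of {GB_PER_BLOCK}GB = {capacity_units} Capacity Units per month")
```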


Disclaimer


The content of this blog is based on the situation (pricing and Premium Outbound calculation) as of July 2024. Both the pricing and the calculation logic may be subject to change. Expertum will try to keep this blog as up-to-date as possible, but no rights can be derived from it.


About the author

Rogier Schipper

Rogier Schipper is a Project Manager and Solution Expert for SAP BI solutions at Expertum.

