One night in Walldorf: a visit to SAP headquarters

 

After a hiatus of two years, the Dutch community of SAP users (VNSG) once again organized its annual trip to SAP’s headquarters in Walldorf, or more specifically for this year, Sankt Leon-Rot (as SAP’s main complex was undergoing renovations). As a member of the core team of the data and analytics focus group, I was lucky enough to tag along for this edition of the SIG Lead Event, and I will share my experiences in this blog post.

Opening keynote on sustainability

After the last delegation of international SAP user communities arrived in the large Audiomax room on SAP’s campus, the company’s SVP and Chief Marketing and Solutions Officer Sven Denecken opened the event with an update on SAP’s global strategy. Similar to last year’s TechEd, a big emphasis was placed on sustainability as a business goal in itself. Denecken sketched a future in which CO2 becomes a currency and sustainability audits a regular fact of corporate life. Organizations would need to scrutinize their product or service in the context of their respective value chains and understand which processes touch it before and after they add ‘their part’ to it. In this context, he presented SAP’s new visualization of its well-known ‘hamburger’ portfolio model (shown in the picture below).

[Image: SAP’s updated visualization of the ‘hamburger’ portfolio model]

Part of this realization is, according to SAP, the consideration of business semantics in an open system and not just in an SAP context. The challenge is to find the right partner to provide the tools to attain these insights, whether through low-code, no-code or pro-code. All of this is crucial because the ‘mastercode’ of industries is changing. Denecken gave the example of a gas station that had extended its refueling and recharging services with a storefront, giving customers discount coupons whenever they picked the slow-charge option for their vehicle instead of the fast-charge option. This is a rethought process that considers a world beyond the core business model and acts upon it; a case that SAP aspires to facilitate for many other companies in the near future.

The challenges of Cloud adoption

Another main session during the first day was the discussion panel on next-generation SAP Cloud implementations, especially S/4HANA. The panel touched upon the challenges customers face and have to overcome in order to realize successful deployments, and it unfolded more into a business conversation than anything else. Bert Schulze, head of S/4HANA product success & adoption programs, remarked that the business blueprint companies drew up years ago is often still the same today. These companies take SAP’s new solutions, project their existing, unchanged processes onto them and ask ‘where does SAP deliver the speed and the benefits?’. Schulze stressed that more often than not, too little thought goes into rethinking and remodeling existing processes and into how the new software supports them from a different perspective, or not at all. Business processes should not simply be dressed in a new coat, as the same input will yield the same output no matter how it is framed. There is a reason why SAP changes its solutions, and it advises customers to follow a phased approach in which new functionalities are first deployed in a small supporting role or to a limited user group, which in turn creates a snowball effect once other people in the organization start noticing the new way of working (with the new technology). Something that can help users identify their processes within new technology is the SAP Business Technology Platform (BTP).


Steffen Schad (head of BTP outbound product management) stated that very few customers use the BTP just as a platform; most of them see it in the context of a HANA back-end and use the new functionalities simply because they are new, without sound consideration of what these could do to improve existing processes. The reason for this is that, if someone proposes to adjust a process based on whatever research has been done or whatever new technology is available, the default response often is: ‘We don’t do that here, our process works fine and has done so for years, go try it somewhere else’. The question that should be asked, according to Schad, is: are there still valid reasons why the process looks the way it does? SAP Signavio, Walldorf’s go-to solution for process mining and process modeling, was also mentioned briefly as an option to reassess the viability and efficiency of existing processes. To summarize: the observations shared by the panelists reaffirm why a Cloud implementation, or any IT implementation that affects a larger portion of an organization, should be done not just with IT at the table, but primarily with business representatives, in order to reflect a correct image of the key processes involved. Even though this sounds like a decade-old cliché in the world of IT, it still appears to hold true today.

To close out the session on a technical note, the panelists emphasized SAP’s ‘Keep the Core Clean’ policy in the wake of the business discussion summarized above. Even when carefully considered, there will always be processes that need to be customized in the context of new software. The BTP offers customers multiple ways to realize this: either through SAP Build (the low-code/no-code option), intended for smaller functionalities, or through the ABAP environment and Cloud development tools such as Cloud Foundry for larger requirements. To identify the gaps that might still exist even when considering these options, SAP once again pointed towards the readiness check and the custom code analyzer, both of which are already frequently used by customers considering the move to S/4HANA.

SAP Datasphere

The second day of our visit revolved around the Expert sessions with the men and women behind the SAP solutions. Heiko Schneider, senior product manager for SAP Datasphere, kicked off with the evolution of Data Warehouse Cloud. In comparison to BW/4HANA, Datasphere’s key focus is federated data access. What helps in this regard is that customers no longer need to install a separate agent, even for Cloud-to-Cloud connections, now that Datasphere offers data and replication flows (for which, SAP states, the formal abbreviation is still pending!). Remote tables are a good example of this federation and virtual access, and SAP confirmed that adoption of these objects is high among current customers, which is a good signal.

In addition to the information that was shared during the reveal of Datasphere on the 8th of March, Schneider elaborated on some of the partnerships announced that day. The bi-directional integration with Collibra, for example, which acts as a metadata catalog, is intended to complement Datasphere’s own catalog by fulfilling that role for non-SAP data connected to Datasphere. This would enable Datasphere to maintain the crucial semantics for non-SAP data, which it needs in order to realize SAP’s concept of the Business Data Fabric architecture. Even though this concept was once again presented as something new within SAP’s portfolio, I have already highlighted in our Datasphere whitepaper (you can find that here) that the core concept is not new to the broader IT world. Terminology aside, this means that proper integration and utilization of Collibra is in fact vital for customers who also want to use non-SAP data in their Datasphere solution. Whether or not this functionality will come packaged with SAP Datasphere as part of this partnership was something Schneider could not comment on, although I expect that some sort of separate license for Collibra will be required.

A final important takeaway from this part of the session was that SAP is putting the new Analytic Model front and center with regard to the BW Bridge as well. The model transfer of BW queries through the BW Bridge (towards Datasphere) will be primarily focused on conversion to this Analytic Model, which will serve as the central, natively generated Datasphere equivalent of BW queries. This means that, in addition to the Analytic Model becoming the main source for consumption in SAP Analytics Cloud, SAP is striving for feature parity between the old BEx queries and the new Analytic Model in the (near) future. Semantics are taken into account through, for example, specific columns or associations with other objects generated in Datasphere during BW query transfer. Note that PowerDesigner models are not yet supported for transfer, and that SAP is currently considering whether or not to enable the OLAP layer (limited to the query object) in the BW Bridge. BW query metadata also remains neither visible nor executable in the BW Bridge, but will thus be usable in the form of Datasphere objects (along with already supported objects such as ADSOs, Composite Providers, DTPs and Transformations).

SAP Analytics Cloud (SAC)

SAP’s flagship analytics solution could of course not be left undiscussed, so Tobias Mueller started his segment with a clear takeaway: in 2022, 110 of the 166 delivered features originated from the Customer Influence portal, showing that SAC customers have a real, profound impact on the course of the solution. The top three categories of delivered functionalities, from highest number of deliverables to lowest, were Planning (by quite a margin), Story & Application Design, and finally Analytical Platform Services.

With the opener out of the way, Mueller continued with the general announcement that Search to Insights will be considerably updated with new functionalities in the wake of SAP’s acquisition of Askdata last year. Although he was not able to disclose more specifics, I am eager to see how powerful SAP can make this, in my opinion, rather underutilized feature. Moving on, the Unified Story, due for release in the second quarter of this year, was also elaborated upon. Currently, SAC users can build Stories either with the classic approach or in the Optimized mode. Important to note is that, while the Unified Story will merge Applications and Stories into a single asset, this asset will in fact not be labeled as such. The feature set of the current Analytical Designer will instead be added to the Optimized mode (or Experience, if you will), so that customers will not have to deal with a third term while building their dashboards. Also important to realize is that Stories already built in the Optimized mode will be migrated automatically to the ‘unified’ version (as it will be one and the same mode). Classically built Stories, however, will still have to be migrated by hand, where users will have to determine how they want to close the functionality gap that still exists between the classic and Optimized approaches. The elephant in the room here is the data blending function, which is still missing from the Optimized mode. Mueller explained the challenges SAP is facing with this conversion, but assured us that the team is committed to closing this gap so that the Optimized mode becomes the new standard in the user’s eyes as well.

One of the most intriguing advantages of the Unified Story update, from my perspective, was the presented Lightweight Viewer. This view mode, which is consequently only available for Stories that were either built in or converted to the Optimized mode, allows users to run a Story without any design-time code being generated. Note that this code is currently always generated, even if a user only has display rights on a Story. Internal tests at SAP already show an improvement of five seconds (from thirteen seconds down to eight) when running a medium-sized, moderately complex Story, and the developers’ ambition is to bring that performance gain up to an average of eight seconds. This would be a major win for some of the customers I am currently working with, and something I was very happy to see.

Another highlight was the showcase of the increased integration with Microsoft Office: during a demo, SAP showed fully functioning SAC tiles embedded in a PowerPoint presentation. An interactive widget, including drill-down and data refresh functionalities, was also displayed in the context of a marketing presentation. Additionally, we were able to witness a demo of a live iFrame being pasted into an Outlook mail (again, including full refresh and drill-down options). The difference with the PowerPoint demo was that the PowerPoint embed would actually contain SAC data, whereas the Outlook iFrame would only stream it based on the respective user’s rights in SAC. I believe this to be another feature that could prove quite powerful for business users who want to share their SAC content with colleagues.

With regards to integration with SAP Datasphere, the existing roadmap for the integration with Planning is still valid. The current situation allows for a direct import connection of Datasphere data into SAC models. In practice, this means that actuals are usually imported into SAC, where they are used to plan ahead for a certain period of time. Afterwards, these forecasts can be written back to Datasphere and used for other (reporting) purposes. SAP’s roadmap envisions Datasphere as the data foundation for planning across the board, with real-time planning enabled between the two solutions and data coming from Datasphere treated in SAC as data coming from a native model.


Finally, Mueller explained SAP’s ambition to streamline the overall modeling experience in SAC. The current gap between creating models from scratch and creating them from existing data will be closed, and changing an existing model after creation will also become easier and more flexible. For example, you should be able to change, add and remove columns without any impact on the transaction data or on the usage of the respective model across Stories, all in a single place. In the end, the goal is to have a single model type for import models (the live model will of course remain a separate object). Whether that will turn out to be the current ‘new model’ or something else is not yet clear, but the move towards a more concise and consistent modeling environment is something that I, for one, am very much looking forward to.

About the author

Lars van der Goes

Lars van der Goes is an SAP Data & Analytics Lead at Expertum. Lars combines strong functional skills with broad knowledge of analytics principles, products and possibilities.

Read more articles by Lars van der Goes