
April 23, 2012

SAP Accounts for Sybase, HANA Futures

Datanami Staff

SAP says it is making its way from an applications company to a data management company with help from the technologies it has brought in from Sybase, an acquisition that SAP says is critical to the future of its data management and data movement strategy.

SAP recently held a webcast featuring its own Dan Lahl, which hit on the ways a traditional company like SAP is looking ahead to the new era of data management and use.

The following is a transcription of the webcast for those who need a skimmable version instead of a 30-minute recording.

Twenty years ago, nearly every business innovation started with a supply chain question, some way to automate billing, inventory, accounts receivable and the like. Now, however, the connected era has led to a great shift: business innovation today starts with a new set of questions, like “How can I better offer my customers new products and new services? How can I go after adjacent markets?” Those “what if” questions are really driving the business forward today.

The systems of record for ERP, supply chain management and CRM are only going to get squeezed a little bit going forward, but the real answers are driven by data. Data is the new game changer. At Sybase, we’ve known that for a long time, and I think SAP is waking up to that fact.

Today, the big change in the marketplace is that data has become real-time. Data has to be looked at in the context of what’s going on, either within transactions or in the analysis of that data. Data has become on demand for all users, and I don’t know if that’s a Palm Pilot that guy is working on, or a Treo, but it is on demand, on every device, in every place, everywhere.

Data is also becoming more seamless, and the more seamless it becomes, the better businesses will do. We have to knock down the silos we’ve historically had. At the same time, we have to make sure that data is secure: not only secure on the drive, with data at rest, but encrypted on the fly as well. It has to be secure from place to place, and finally, data has to flow efficiently.

Now if we take an external view, we also see that having data, at the end of the day, is not enough on its own. What customers, what you, are really looking to do is drive toward a real-time business. From your customers’ standpoint, you are looking to deliver systems that provide real-time performance.

You’re trying to find out which customer profiles are the right ones for things like loyalty rewards or adjacent product offers. You’re looking to segment your customers down to an audience of one, and you have to be able to do that in real-time.

So, from an innovation standpoint, we see that your products have to be delivered better than your competition’s. You have to do things like tracking complaints from call centers and social data websites in real-time. You need to know how your call center calls track to what’s going on on those social sites and then take advantage of that.

And then you have to drive excellence within IT in the business as well, delivering ahead of the business from an IT perspective and predicting supply chain disruptions and so forth. These are the kinds of things that have to be applied to data that’s moving seamlessly and securely, without silos, throughout the enterprise.

In addition to that, the landscape is actually changing. We are seeing trends like big data and analytics driving the business. We see new cloud applications and cloud data management being embraced in the enterprise. We’re seeing data being delivered to mobile and interacting with mobile. We also see data from social networks being analyzed.

With all of that, we see that what is really needed is a real-time platform. That’s what we announced on Tuesday of this week in San Francisco: the SAP real-time data platform, with the extreme capabilities to take transactional data, move that data into different assets and applications around the enterprise, store that data, process that data, analyze that data and make all of that data available across the enterprise. We’re doing that, again, through the real-time data platform.

This allows you to really rethink old paradigms and innovate in new and different ways. I was just thinking about something that happened to my family last week. We got a call from Wells Fargo, which is our bank, in the middle of the week. They called us and said, “Has your son been taking money out of his account at ATMs in the U.K.?” We said he’s never been to the U.K., he’s only sixteen, so I don’t think he’s over there himself; let me go check his bedroom. He’s in his bedroom, so what has happened? The fraud department at Wells Fargo said that over the last three days, someone had been draining his account around the city of London with ATM transactions. For three days, someone who had gotten hold of his PIN and his bank account had been drawing money out, about six to seven hundred dollars in all.

Why do I bring this story up? Because the innovation of today is not the innovation of tomorrow. Wouldn’t it be great if you had a real-time data platform that could understand, in near real-time, that first transaction pulled out of an ATM against my son’s account? The fraud department would be alerted that a person who lives in California, who is 16 years old, is having money taken out in London. That call should happen right away, or a temporary hold should be put on the account so that no more money can flow out of it. That’s a new paradigm of fraud management in real-time.
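
What would such a rule look like when it is evaluated on every incoming transaction rather than in a nightly batch? Here is a minimal sketch in Python; the event fields, the account profile and the thresholds are hypothetical, and this is not the actual Sybase ESP or HANA implementation, just the shape of the check.

    # Minimal sketch of a per-transaction fraud rule; field names, the account
    # profile and the thresholds are hypothetical, not SAP's implementation.
    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class Withdrawal:
        account_id: str
        amount: float
        country: str        # country where the ATM sits
        timestamp: datetime

    HOME_COUNTRY = {"acct-123": "US"}   # account holder's home country on file
    RECENT = {}                         # account_id -> recent withdrawals

    def check(tx, window=timedelta(hours=24), max_withdrawals=5):
        """Return alert reasons for this withdrawal; an empty list means no alert."""
        alerts = []
        if tx.country != HOME_COUNTRY.get(tx.account_id, tx.country):
            alerts.append("ATM withdrawal outside the account holder's home country")
        history = [w for w in RECENT.get(tx.account_id, [])
                   if tx.timestamp - w.timestamp <= window]
        if len(history) + 1 > max_withdrawals:
            alerts.append("unusually many withdrawals within 24 hours")
        RECENT[tx.account_id] = history + [tx]
        return alerts

    # The very first London withdrawal against a California account trips the rule:
    print(check(Withdrawal("acct-123", 200.0, "UK", datetime(2012, 4, 16, 3, 0))))

Evaluated in-stream, a rule like the first one would have put a hold on the account after the first London withdrawal rather than after three days of them.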

How do we think that’s going to happen? We think it’s going to happen through the adoption of in-memory and real-time data management that is taking hold and that SAP has actually pioneered with some of the products that we’re going to talk about.

Let’s do a little history lesson and see why we think these in-memory and real-time capabilities are now available to us. If you look back in the 1990s, the cost of disk was about $9 million per terabyte. That’s not cheap, that’s expensive, but look at the cost of memory: $106 million per terabyte. So all of the database companies rightly chose to store and process information with small memory and big disk. Sybase was one of those, and Oracle, IBM, Ingres, Informix and others all optimized for the lower-cost disk and much less memory for processing.

Let’s fast forward to today, 2012. Disk has dropped an enormous amount, to just $60 per terabyte, but look at the cost of memory: it is down to $4,900 per terabyte. If you compare the access speeds for doing transactions, analytics or data movement on that data, you can see that the difference between disk speed and memory speed is dramatic; it’s actually ten million times faster to do it in memory than on disk.
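
For reference, the price drops quoted above work out to roughly the following ratios; this is just a back-of-the-envelope calculation using only the figures from the talk.

    # Back-of-the-envelope ratios using the figures quoted in the talk.
    disk_1990s_per_tb = 9_000_000      # dollars per terabyte of disk, 1990s
    mem_1990s_per_tb  = 106_000_000    # dollars per terabyte of memory, 1990s
    disk_2012_per_tb  = 60             # dollars per terabyte of disk, 2012
    mem_2012_per_tb   = 4_900          # dollars per terabyte of memory, 2012

    print(disk_1990s_per_tb / disk_2012_per_tb)   # disk: ~150,000x cheaper
    print(mem_1990s_per_tb / mem_2012_per_tb)     # memory: ~21,600x cheaper
    print(mem_2012_per_tb / disk_2012_per_tb)     # memory still ~80x the price of disk

The point of the comparison is the last line: memory is no longer priced so far above disk that an in-memory architecture is out of reach.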

If you take that speed and the cost of memory together, there is now the capability to do much more data management and analysis in memory. That’s what we are basing our real-time data platform on. At the bottom there, you see SAP HANA, an in-memory, real-time database that will be the core of the real-time data platform moving forward.

Moreover, you’ll see capabilities from Sybase ASE, the great OLTP engine that many of you have used to build awesome applications. That will be integrated with SAP HANA, so the rock-solid, disk-based OLTP capabilities of ASE are brought into HANA and the real-time capabilities of HANA are brought into ASE.

The same goes for Sybase IQ as we build out the platform. You’ll see IQ’s great big data capabilities, such as MapReduce, Hadoop integration, in-database processing, and its performance optimization and indexing technologies, brought into the real-time capabilities of SAP HANA, and again, the best things about HANA brought into Sybase IQ going forward.

Many of you have implemented applications using SQL Anywhere, the number one mobile embedded database on the market today, with over 10 million instances throughout the world, in many cases running lights-out, where no one knows what’s going on with those databases underneath the application. That will be tightly integrated with the real-time data platform so that you can move from the data center out to the device in a seamless fashion.

Finally, if you look at our data movement capabilities across Sybase and SAP, we bring very low latency data movement with Sybase ESP, the event stream processor and CEP technology, along with the rock-solid Replication Server, and we are bringing those together with SAP’s classic ETL capabilities into the real-time data platform. That allows customers to choose what flavor of data movement they require: super low latency with ESP, low latency with Replication Server, or transform-heavy with Data Services from SAP.

PowerDesigner will be utilized across all of the assets, from HANA to ASE to IQ to SQL Anywhere to the data movement products, to be the orchestrator of the real-time data platform for data modeling, orchestration between the databases, data movement between them, and metadata management as well. We’re taking all of those and moving them into the platform, and also allowing partners to plug into the platform as well.

How are we going to do this? It’s not all going to happen tomorrow; what I described is not something we can accomplish with a snap of the fingers. To get from what we have today to where we’re going, we have a three-phase approach.

The first is to integrate, through semi-loose coupling of the Sybase database technology, as well as the data movement products, to SAP HANA and the real-time platform. In the integrate phase you can, for example, replicate an OLTP application into HANA and use HANA as the analytics engine. Or you can share information between HANA and IQ, with IQ as a near-line store, so you can do real-time analysis of this quarter’s information on HANA and analysis of the previous seven years with IQ, loosely coupled using our data integration technology.
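
To make the near-line pattern concrete, here is a minimal sketch of the date-based routing it implies. The connection names and the 92-day hot window are hypothetical, and in practice the split is handled by SAP’s replication and federation tooling rather than by application code like this.

    from datetime import date, timedelta

    HOT_WINDOW = timedelta(days=92)    # keep roughly the current quarter in HANA

    def pick_store(query_start, query_end, today=None):
        """Route a date-bounded query to the in-memory store, the near-line store, or both."""
        today = today or date.today()
        hot_cutoff = today - HOT_WINDOW
        if query_start >= hot_cutoff:
            return ["hana"]                # recent data only
        if query_end < hot_cutoff:
            return ["sybase_iq"]           # historical data only
        return ["hana", "sybase_iq"]       # the query spans both; federate results

    print(pick_store(date(2012, 3, 1), date(2012, 4, 20), today=date(2012, 4, 23)))
    print(pick_store(date(2006, 1, 1), date(2011, 12, 31), today=date(2012, 4, 23)))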

When we get to the optimize phase, you’ll see more of an integration of the Sybase databases and the Sybase and SAP data movement products into this real-time platform. The real-time capabilities of HANA will take on some of the rock-solid OLTP capabilities that ASE brings, and some of the big data, MapReduce and Hadoop integrations will be brought from Sybase IQ into the real-time platform. Again, the data movement products will be more closely orchestrated together, so it looks more seamless to you, our customer, for doing data movement across the enterprise.

Finally, in the synthesize phase, when we fully realize the SAP real-time data platform, you will see ASE taking on HANA innovation and HANA fully implementing the OLTP capabilities of ASE. Similarly, you’ll see Sybase IQ taking on innovation from HANA, and HANA taking on the optimization and indexing technologies from IQ, all fully integrated with the data movement technologies across the platform.

Our whole goal is to move this forward with you with minimal customer disruption. Hopefully, over time, your TCO at the data management level will come down for the applications that you build, whether they are custom, SAP applications, or a combination of the two. We will do that as we build the platform out, and we will do it with minimal disruption for you.

Again, this goes through the talk track I described: integrating today through data movement, loosely coupled; optimizing by bringing the technologies closer together; and synthesizing by really bringing together the platform, the orchestration and the metadata management, and deploying that either on premise or in the cloud at the end of the day.

When we fully realize this data platform, when we get to the synthesize phase, what you’ll see, going from the bottom up, is the capability to do replication, events, data services, ETL, data quality and master data management. You’ll see all of that capability in the platform itself. You’ll see all of those engines that you can pick and choose from, orchestrated for you through PowerDesigner, tuning your applications to the needs of the data management architecture, with common administration, common modeling, and common lifecycle and maintenance of all of it. That lets you work between your systems of record, your classic OLTP and supply chain applications, and your systems of engagement, which give you more capabilities to do analytics and deliver them down to your knowledge workers. Those three bars really will become the visual coming together of our real-time data platform.

So that’s what we’re aiming toward, and as we do that, our promise to you is that we are actually going to innovate in the database market. We’re not going to take existing technologies, as some companies are doing, package them up on new hardware and make you buy hardware for old database technologies. We’re going to innovate with new and interesting technology going forward, just as we’ve always done at Sybase, and now we’re adding SAP HANA to that.

We’re also going to protect and extend your investment as a customer. As you implement the products we talked about today in a stand-alone manner, they will come together in the real-time data platform, so that you can choose today what you want and then move forward with us into the real-time data platform.

We’ll also make sure that the data management portfolio is open and optimized. If you have other databases that you utilize, those can snap into the platform. Other data movement technologies will snap into the platform as well.

Finally, I hope you can see from what I’ve talked about that we will innovate across the entire product portfolio: bringing great OLTP to HANA, bringing real-time in-memory capabilities to ASE and IQ, and seamlessly going between the real-time analytics available in HANA and delivering them through SQL Anywhere, in a synchronized way, out to the edge of the enterprise, on any device, to any person, at any time.

We believe the future looks bright for SAP and Sybase. As a Sybase employee going on 17 years, I find it really heartwarming to see SAP talking about how many great assets Sybase brings to this data management platform that SAP has the vision to deliver. We have talked about becoming the number two database player by 2015, and we announced that we are now not just an applications company but a database company. These are the keys to how we’re going to do that: game-changing innovation, freedom of choice for you, our customers, a portfolio that is open and optimized for your applications, and application and investment protection.

So, what is the road ahead for you? We believe it is going to be great. We’re bringing all these things together so that you can dream up the new applications you want to build, and so the applications you run today run better, run faster and run more easily, with fewer moving parts and pieces. We’re co-innovating with you, with SAP and Sybase together. We’re integrating our development teams between Sybase and SAP so that all of these products move forward together. We’re aligning the product lifecycles so that there won’t be big maintenance, upgrade and patching headaches. We’re streamlining our support. Going forward, we believe we can be your one trusted advisor as a database and data management company, bringing what SAP brings, which is great application knowledge, and what Sybase brings, which is great infrastructure and data management knowledge. That’s our view of the right choice for real-time business.

I also want to talk specifically about data warehousing, because we feel there has been some confusion between Sybase IQ and HANA. There’s some overlap, and we want to make sure that you understand where we see the fit and the affinities between those products.

If we look at what’s going on in the area of data warehousing and analytics, we see, from a stakeholder requirement standpoint, a number of different types of people who need analytics and need data analyzed. Starting on the right, there are data scientists, who are really doing data mining and predictive analytics; executives, who are looking at daily reports or dashboards; middle managers, who are doing ad hoc reporting; front-line workers, who are looking at operational and technical delivery of analytics; and, all the way down, business processes and information streams, which are really proxies for people. Those are application-to-application consumers, and they need much lower-latency data for what they’re looking to do from a delivery standpoint, from streaming to real-time. If you follow that forward, the data scope is also very different for all of those audiences.

Again, starting with data scientists: they may need to take in not only information from across the whole enterprise, but also information from across the market in which the company participates, to analyze data and make decisions. Executives are looking for enterprise-wide data; middle managers and front-line workers are looking at departmental and operational data; and a business process might need a data scope that is source-specific or even process-specific.

Finally, if you look at what stakeholders need in terms of data latency, it ranges from streams, where you may need milliseconds or microseconds, maybe down to nanoseconds for some of you in the financial services industry, to business processes that need data latency measured in seconds, out to data scientists, who really don’t need real-time data per se to find long-term trends; they may be fine with weekly data latency. That covers the velocity and the volume of data, and variety cuts across all of those categories.

The problem is that, across these different data latencies, data scopes and data deliveries, and across all of these constituencies, classic analytics and enterprise data warehouses have largely failed them. As Gartner wrote in 2011, “The vast majority of organizations select a single deployment style for what they term an enterprise data warehouse (EDW).” And then they compromise: maybe they choose executives as the people they’re going to serve, and then they’re not actually meeting business process needs from an SLA standpoint.

Interestingly, in the survey work Gartner has done, they’ve seen that more than 70 percent of so-called enterprise data warehouses actually serve just one back-office function or one departmental use case. What they believe going forward is that a data warehousing strategy has to support different workloads, ranging from simple to complex, and different stakeholders, and those SLAs are shortening and getting harder to meet.

How do we believe a next-generation data warehouse, or what Gartner calls the logical data warehouse, is actually going to be delivered? At the top, you have a number of different ways to visualize the data, from dashboards to reporting tools to analytical workbenches to visualization and data discovery tools. In the middle, you have a class of data warehouse applications, something like Business Warehouse from SAP or some of the things Oracle has put out in its Fusion Middleware, and so on.

At the data management layer, which is what we’ve been talking about with this real-time platform, you need to go from streams data management through the real-time data foundation, all the way out to an enterprise data warehouse and a big data store as well. Then, at the bottom of course, you need real-time data integration and data services to serve all of that.

When we look at that, we see that today we offer capabilities in all of these areas. From a data warehousing application standpoint, we offer SAP NetWeaver Business Warehouse; if you’re an SAP customer, that’s a ready-built reporting and ad hoc analysis application for your SAP data. For streams data management, we supply Sybase ESP, our event processor for very low latency stream data management. We supply the real-time in-memory capability with SAP HANA. And from a big data standpoint, we provide Sybase IQ.

If you look at streams data management, what Sybase ESP provides is the capability not only for alert monitoring but also for continuous optimization. You’re not only alerting when you correlate something that is happening through time-window analysis; you can also use it for continuous optimization of, say, a telecom data network or a cellular network.
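
As a rough illustration of what a time-window rule of that kind looks like, here is a minimal sketch in plain Python rather than in ESP’s own event-processing language; the event fields, the window size and the threshold are hypothetical.

    # Sliding-window alert of the sort a CEP engine evaluates continuously.
    from collections import deque
    from datetime import datetime, timedelta

    WINDOW = timedelta(minutes=5)
    DROP_THRESHOLD = 0.05     # alert if more than 5% of calls drop in the window
    MIN_CALLS = 100           # require a minimum sample before alerting
    events = deque()          # (timestamp, dropped) pairs inside the window

    def on_call_event(ts, dropped):
        events.append((ts, dropped))
        while events and ts - events[0][0] > WINDOW:
            events.popleft()                      # slide the window forward
        drop_rate = sum(1 for _, d in events if d) / len(events)
        if len(events) >= MIN_CALLS and drop_rate > DROP_THRESHOLD:
            print(f"{ts}: dropped-call rate {drop_rate:.1%} over the last 5 minutes")

    # Feed a burst of events where the last ten calls drop, so the alert fires:
    start = datetime(2012, 4, 23, 12, 0)
    for i in range(120):
        on_call_event(start + timedelta(seconds=i), dropped=(i >= 110))

The same window and the same aggregate could just as easily feed a control loop that rebalances the network, which is the continuous-optimization use described above.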

The real-time data foundation with HANA supplies the capabilities for an operational datamart for reports, the classic ODS type of capability, but also an agile datamart for doing real-time analytics; not super low latency, but maybe the intraday datamart capabilities and analytics that you need. HANA also acts as the NetWeaver Business Warehouse data foundation, so you can combine HANA and BW for real-time analysis of your SAP applications, if you will.

And then Sybase IQ provides what it has done very well for many years, an open data warehouse, plus a new capability we’re calling HANA Smart Store. I’ll get into that in just a second.

If you look at how we see the logical data warehouse being implemented today, this is our first step toward the real-time data platform from a data warehousing view. ESP is able to feed data into HANA; in the press release we did on Tuesday, we had ESP feeding HANA in a hundred-terabyte benchmark, with ESP loading 1.2 million records per second into HANA. I believe that was tick data being loaded into HANA.

We’ve already got that working together.

Then you can take that real-time capability of HANA and export data out to the Sybase IQ layer for the larger big data and historical analytics, and you can go between any of those three.

Again, at the bottom are the batch and real-time data integration capabilities, and NetWeaver is close to the top. The Business Warehouse sits on top of that architecture for doing SAP analytics, if you’re an SAP customer.

What does that look like in a real customer example? McKesson has been a long-time SAP customer, and here’s what they decided to do. They had an EDW-and-datamart scenario that was not working for them in a traditional relational database environment. So they decided to take SAP data, that’s the ECC down at the bottom, keep the current three months of data and use that for analysis in SAP HANA. Then, using the Data Services product, which is an ETL capability, they loaded Sybase IQ with 19 months’ worth of data. Now they can federate across the two at the top, using BusinessObjects BI 4.0 with federation between HANA and IQ, and they have a logical data warehouse working: real-time analysis of the current quarterly data coming in, and historical trending using Sybase IQ for the 19 months they initially loaded, and I believe they are over two years now, again federating the reports and queries they want to run with BusinessObjects. That’s pretty exciting, and they’re very happy with the performance of both HANA and IQ.

Let me give you another example, Coinstar/Redbox. If you’ve been to a Blockbuster store recently, consider yourself lucky, because I think they’re all closing. In my town, I think there were three and now there are zero. They’re being replaced by these kiosks in the grocery store, and that’s what the Redbox business is all about.

They are a company that has done the Coinstar piece for a while, but now they’re in the business of renting DVDs in your store. They’re a very forward-looking IT organization that wants to run two big applications: one is serviced by HANA, the second by IQ.

The first application is the one HANA is powering, called the thinning application. They take transactional information from places like redbox.com, where people can stream data in, and from the kiosks all around the country. They are thinning that down so they can get the right DVDs placed into the right kiosks on the right day, because they lose two dollars every time someone comes to a kiosk and asks for a movie that isn’t in that specific kiosk. Every day they run this thinning application to get the right DVDs, at the right time, to the right kiosks all around the country.

The second thing they are doing, and this is the IQ use case, is taking all of the information they’re gathering and loading it into IQ for more trend analysis. They want to know if, say, the city I live in, San Ramon, California, is the romantic comedy capital of the world. If it is, they know that over time they will need to stock those kiosks with more romantic comedies than, say, horror movies or action movies.
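
The trending side is essentially an aggregation over rental history. Here is a hypothetical sketch of that kind of question in Python; in practice the work would be queries against the history stored in IQ, and the data below is made up.

    # Hypothetical rental-history aggregation: which genre leads in each city?
    from collections import Counter, defaultdict

    rentals = [                        # (city, genre) pairs from kiosk history
        ("San Ramon", "romantic comedy"),
        ("San Ramon", "romantic comedy"),
        ("San Ramon", "action"),
        ("Detroit", "horror"),
        ("Detroit", "horror"),
    ]

    by_city = defaultdict(Counter)
    for city, genre in rentals:
        by_city[city][genre] += 1

    for city, counts in by_city.items():
        top_genre, n = counts.most_common(1)[0]
        print(f"{city}: stock more {top_genre} titles ({n} rentals)")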

So they have the thinning with SAP HANA for the daily delivery of DVDs to the right kiosks, and the trending of the right types of movies over time to the right kiosks, down to the city, town and region level, so they can order exactly the right movies from the studios. Those two working together, they believe, make for a great future for them. Again, that’s HANA and IQ working together in this logical data warehouse type of approach, with BusinessObjects sitting on top of them.

Let’s take a quick look at Business Warehouse; my guess is not all of you are Business Warehouse customers.

We announced on Tuesday that Business Warehouse will run on top of HANA, and the capability is also there for real-time analysis in HANA. You can also use Sybase IQ as a smart store, connecting HANA together with IQ, so you have in-memory real-time with HANA and then spill out to Sybase IQ. Or you can use a traditional database: if you want to use ASE to run your Business Warehouse, you can do that, and again use Sybase IQ as the smart store. That’s the BW application right there.

Here’s a customer called Empresas Polar. They got, I believe, over a thousand-fold reporting improvement once they moved their Business Warehouse from the traditional RDBMS they were using onto HANA. That RDBMS box should be red, not gray, if you catch my drift. Again, that’s innovation for our customers using BW.

Finally, customers can choose to use any of these technologies alone: ESP as the alert monitor, HANA as the agile datamart, and Sybase IQ as the open EDW.

So that goes through the second piece, the next-generation data warehousing solution. To sum it up, it’s really the first instantiation of how we’re moving toward this SAP real-time data platform. We’re solving today’s complex and diverse challenges, real-time and big data. We’re powering very sophisticated information landscapes, as you can see from the Coinstar/Redbox example, and again, we’re providing customers choice, so the customer is not forced into one technology; they can use any of the technologies we bring, or other technologies as well.
