New hardware is enabling a combination of scaling up, which leads to faster insights, and scaling out, which leads to more data available for analysis, says Intel's Eddie Toh, Data Center Platform Marketing Manager. Speaking at the IDC & Intel Big Data Analytics Forum in Singapore, Toh noted that the confluence of these two trends has been a catalyst for increasingly useful insights from big data and analytics.
Real-time analytics can lead to more personalised products and services, Toh added. In genomic analytics, for example, the cost has fallen from an average of US$100 million per analysis in 2001 to just US$4,500 in 2014, while the time taken per analysis has been cut from 22 hours to 1.6 minutes on Intel reference architecture.
Such advances allow more personalised healthcare, Toh said. Other industries that can benefit from more personalised offers include retail and telecommunications.
Toh holds up a silicon wafer for the new Intel Xeon processor.
As new hardware offering exponential leaps in performance comes online, things can change quickly. Toh noted that Intel's Xeon E7 v3 family, launched about six weeks ago, contributes to scaling up with a 6X performance increase, as the family supports up to 18 cores on one chip.
Ramanathan discusses some of the issues facing analytics practitioners today.
Deepak Ramanathan, CTO, SAS Asia Pacific, said that analytics is scaling out from internal data that the company already has, such as credit card data or loan data for a bank, to data from other sources. “That's where the real value is,” he said. “Try to model the external environment of the customer rather than the internal environment.”
Ramanathan also noted that scaling up has changed what can be done with the data. “Historically what's been happening with analytics is that there used to be a data infrastructure, which used to be a storage area network, and then they used to have the compute tier. In the last two to three years there has been a collapse of those two tiers,” he said. “Eighteen cores on one chip – that's a lot of processing. That's what is changing a lot of paradigms. That, to us, is why we are in a position to analyse all of the data.”
The willingness to analyse large volumes of unstructured
data can lead to big dividends, Ramanathan shared. Advanced analytics has led the Lenovo Corporate Analytics Unit to report a 50% reduction in issue detection time, 10% to 15% reduction in warranty costs, as well as a 30% to 50% reduction in general information calls to the contact centre.
Analytics was how the Lenovo Perceptual Quality Project uncovered a single product review of its gaming computer that had attracted 2,000 comments. Such a review would have been completely overlooked by traditional monitoring methods, Ramanathan said.
The same project was able to analyse call centre and
social media comments with SAS technology to discover a problem with docking,
even though the complaints were about systems hanging, screen issues, or
unexplained shutdowns, and docking was not always mentioned. “Everybody has a
problem but the way they surface the problem is completely different,” he
noted.
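Ramanathan did not go into the underlying implementation, but the general idea – grouping differently-worded complaints so that a shared theme such as docking can surface – can be sketched with open-source tools. The snippet below is only an illustration using scikit-learn rather than SAS technology, and the complaint texts are invented.

```python
# Illustrative sketch only: this is not SAS's pipeline. It shows the general
# idea of clustering differently-worded complaints so a common theme surfaces.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

complaints = [  # hypothetical call-centre / social media snippets
    "laptop hangs whenever I plug it into the dock at work",
    "screen flickers after connecting to the docking station",
    "system shuts down unexpectedly when docked",
    "battery drains too fast on long flights",
]

# Turn free text into TF-IDF vectors, then cluster similar complaints together.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(complaints)
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# The highest-weighted terms in each cluster hint at the underlying theme
# (e.g. "dock"/"docking"), even when individual complaints describe
# different symptoms such as hangs, screen issues or shutdowns.
terms = vectorizer.get_feature_names_out()
for cluster_id in range(kmeans.n_clusters):
    top = kmeans.cluster_centers_[cluster_id].argsort()[::-1][:5]
    print(cluster_id, [terms[i] for i in top])
```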
Advances in hardware performance go hand in hand with sophisticated software to achieve insights more quickly than ever before. SAP, for example, has seen significant benefits from running its own business on SAP S/4 HANA, says Manik Narayan Saha, CIO, SAP Asia Pacific & Japan. “We asked ourselves how we could become a real-time business, with a 360-degree view of the customer, and get this information to the account executive who is meeting the customer today,” said Saha.
The migration started in 2013, and things have sped up since. “In the past, analyses of data were extremely slow,” Saha explained. Instead of extracting information from the data warehouse and then analysing it, a process that could take up to 24 hours, SAP now bypasses the data warehouse altogether and gets answers directly from the core system.
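Saha did not describe the implementation, but the pattern he outlines – running the report query directly against the operational store rather than extracting data into a separate warehouse first – can be illustrated roughly. In the sketch below, Python's built-in sqlite3 in-memory database merely stands in for an in-memory platform such as SAP HANA, and the table and figures are hypothetical.

```python
# Minimal illustration of the pattern described above, not SAP's implementation:
# the report aggregates straight off the operational table, with no
# intermediate extract/transform/load step into a warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for an in-memory column store
conn.execute("CREATE TABLE sales (region TEXT, amount REAL, booked_on TEXT)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("APJ", 1200.0, "2015-06-01"),
     ("EMEA", 800.0, "2015-06-01"),
     ("APJ", 450.0, "2015-06-02")],
)

# The "real-time" report: query the core system directly.
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"
):
    print(region, total)
```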
Saha also spoke about being able to pre-announce financial results for the entire year in early January 2014. “We were able to do that by putting (all data) on SAP S/4 HANA,” he said. He also shared that SAP S/4 HANA had enabled an analysis of 500 million general ledger records, which would have taken a traditional database system a few hours, to be completed in just 5 seconds.
Saha envisions that the boardroom of the future will be one where “everyone can sit at the same table and have a view of a real-time dashboard to see how the company is doing”. The next step, he said, will be predictive analytics,
such as looking at how currency shifts may
impact revenues. “We're ready to start these simulations today,” Saha
said. “It's a side-by-side journey with the business.”
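The article does not describe how such simulations are built; as a loose illustration of the idea, the sketch below applies a hypothetical 5% weakening of local currencies to made-up regional revenue figures and compares the totals in the reporting currency.

```python
# Rough sketch of a currency what-if simulation. All figures and exchange
# rates are invented for illustration; this is not SAP's model.
local_revenue = {"SGD": 4_000_000, "JPY": 500_000_000, "USD": 3_000_000}
base_rates = {"SGD": 0.66, "JPY": 0.0073, "USD": 0.92}  # to reporting currency

# Scenario: every local currency weakens 5% against the reporting currency.
shocked_rates = {ccy: rate * 0.95 for ccy, rate in base_rates.items()}

base_total = sum(local_revenue[c] * base_rates[c] for c in local_revenue)
shocked_total = sum(local_revenue[c] * shocked_rates[c] for c in local_revenue)

print(f"Reported revenue at current rates: {base_total:,.0f}")
print(f"Reported revenue if local currencies weaken 5%: {shocked_total:,.0f}")
```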
“We are living in a world which is not getting simpler.
The challenge is how to make it simple for your customers,” he added. “Hide
the complexity so that they get the best solutions from you and that you can
easily deal with the complexity that you have.”
Read the Intel case study on genomics