Graph computing technology is essentially a form of augmented intelligence (also abbreviated AI; unlike artificial intelligence, it focuses on leveraging the graph's superior computing power to augment human intelligence, analyzing data with celerity, depth, finer granularity, and flexibility). Graph-augmented intelligence is much needed in many BI scenarios. According to a recent Gartner report (2021), 80% of BI innovation will be powered by graph analytics by 2025; Gartner's 2020 projection had put that figure at 50% by 2023.


BI encompasses how an enterprise acts upon its valuable business data. With graph-augmented intelligence, a.k.a. BI+AI, many unprecedented business scenarios can be empowered and realized with much faster time-to-value, lower TCO, and higher ROI. The innovation and benefits of graph are illustrated in previous use cases such as Liquidity Risk Management, Asset & Liability Management, and Real-time Decision Making. In this use case, we'll show how graph computing and machine learning join forces in a 1+1 >> 2 way to achieve what traditional AI/ML couldn't.

Pain Points of a Case Study

A major bank's credit card department has voracious needs for data intelligence: more accurate bank-wide credit card spending prediction (turnover prediction), more effective fraud detection (identity fraud, payment fraud, etc.), and more intelligent marketing promotions (by identifying valuable merchants and customers).

Inaccuracy

Take the prediction of credit card turnover (how much the bank's clients will spend) in the following month as an example. The ML (machine learning) based prediction method has a monthly mismatch rate of over 2.2%, which translates to over $1 billion. This inaccuracy affects the bank's liquidity arrangement and profitability.

Sluggishness

The ML system takes a long time to go through the data sampling and training process; it may take days or even weeks to churn out prediction results, which makes them far less actionable (a common problem with many of today's so-called AI systems).

Innovation

The bank's IT team decided to leverage Ultipa Graph to greatly accelerate sea-volume data sampling and training by processing all historical credit card transactions in the graph.

  • Graph AI’s Advantages Over ML/DL
  • 50% Accuracy Improvement

Graph AI’s Advantages Over ML/DL

| Credit Card Turnover Prediction | Ultipa Graph | Machine Learning Based |
| Cluster Size (Instances) | 3 (HTAP) | ≥20 |
| Data Volume (Monthly Transactions) | 36 months of data (billions of transactions) | 36 months of data |
| Latency | T+0 (same day) | T+N (weeks) |
| Prediction Mismatch (%) | ≤1% (avg. 0.5%) | ≥2% |
| Key Mechanism | Graph algorithms generate results used as input parameters for downstream ML/DL frameworks | Traditional ML/DL |

The table above shows three key things:

  • Data sampling/training time is dramatically reduced from weeks to hours (~100x improvement).
  • Prediction accuracy is improved by 50%: the mismatch rate is reduced from 2% to 1% or lower.
  • The hardware investment is also greatly reduced, from a 20-instance big-data cluster to a mere 3-instance HTAP graph cluster.
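
To make the Key Mechanism row in the table concrete, below is a minimal sketch, assuming transaction records with card_id, merchant_id and amount columns (illustrative names only, not Ultipa's actual API or the bank's schema), of how one month of card transactions could be assembled into a weighted cardholder–merchant graph using the open-source networkx library:

```python
# Illustrative sketch only: build one month's cardholder–merchant transaction
# network with networkx. Column names (card_id, merchant_id, amount) are assumed.
import networkx as nx
import pandas as pd

def build_monthly_graph(transactions: pd.DataFrame) -> nx.Graph:
    """Aggregate raw card transactions into a weighted cardholder–merchant graph."""
    g = nx.Graph()
    # Sum transaction amounts per (cardholder, merchant) pair for the month.
    monthly = (
        transactions.groupby(["card_id", "merchant_id"])["amount"].sum().reset_index()
    )
    for row in monthly.itertuples(index=False):
        card, merchant = f"card:{row.card_id}", f"merchant:{row.merchant_id}"
        g.add_node(card, kind="cardholder")
        g.add_node(merchant, kind="merchant")
        # Edge weight = this cardholder's total spending at this merchant this month.
        g.add_edge(card, merchant, weight=float(row.amount))
    return g
```

In the case study itself, this modeling and the subsequent algorithms run inside the 3-instance HTAP graph cluster rather than in client-side Python; the sketch only illustrates the data model.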

50% Accuracy Improvement

The diagram below shows how the prediction accuracy has been improved. The gist is graph data modeling: credit card transactions involve two types of entities, card holders and merchants. By building monthly transaction networks over these entities, the networks essentially capture all card holders' spending behaviors. Running graph algorithms such as Degree and PageRank, together with template-based path queries, then generates fact-reflecting parameters that serve as inputs for the downstream ML system to predict future spending more accurately.


Turnover Prediction: ML vs. Graph, with 50% Accuracy Improvement
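
As a hedged illustration of this pipeline (reusing the monthly graph from the earlier sketch; the feature set and the GradientBoostingRegressor are illustrative choices, not the bank's actual downstream ML system), per-cardholder graph metrics such as Degree and PageRank can be computed and handed to a conventional regressor that predicts next month's spending:

```python
# Illustrative sketch: turn the monthly transaction network into per-cardholder
# features (Degree, PageRank, total spend) and train a downstream regressor.
import networkx as nx
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

def graph_features(g: nx.Graph) -> pd.DataFrame:
    """Compute per-cardholder graph features from one monthly transaction network."""
    pagerank = nx.pagerank(g, weight="weight")   # importance within the spending network
    degree = dict(g.degree())                    # number of distinct merchants visited
    spend = dict(g.degree(weight="weight"))      # total monthly spending (sum of edge weights)
    cards = [n for n, attrs in g.nodes(data=True) if attrs.get("kind") == "cardholder"]
    return pd.DataFrame(
        {
            "degree": [degree[c] for c in cards],
            "pagerank": [pagerank[c] for c in cards],
            "monthly_spend": [spend[c] for c in cards],
        },
        index=cards,
    )

def train_turnover_model(features: pd.DataFrame, next_month_spend: pd.Series):
    """Fit a regressor that predicts each cardholder's spending in the following month."""
    y = next_month_spend.reindex(features.index).fillna(0.0)
    model = GradientBoostingRegressor()
    model.fit(features, y)
    return model
```

The template-based path queries mentioned above would contribute further fact-reflecting features in the same fashion; per the comparison table, it is this richer, behavior-aware input that narrows the mismatch from ≥2% toward ≤1%.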

Significance

This use case illustrates that graph technology can be leveraged as a booster for traditional ML/DL technologies, and it works remarkably well. The core concept is to unify BI and AI so that business can be propelled forward swiftly. At the same time, we all need to examine the efficiency, accuracy, and efficacy of big-data and AI systems: as they are deployed everywhere, do they deliver what they originally promised? The future of AI lies in augmented intelligence, which is a perfect companion to BI.
