SlideShare a Scribd company logo
1 of 19
Download to read offline
Asia Pacific Equity Research
16 April 2023
J P M O R G A N
www.jpmorganmarkets.com
Correction (first published 15 April 2023) (See disclosures for details)
Technology - Hardware
Albert Hung AC
(886-2) 2725-9875
albert.hung@jpmchase.com
Bloomberg JPMA AHUNG <GO>
J.P. Morgan Securities (Taiwan) Limited/ J.P.
Morgan Securities (Asia Pacific) Limited/ J.P.
Morgan Broking (Hong Kong) Limited
Gokul Hariharan
(852) 2800-8564
gokul.hariharan@jpmorgan.com
J.P. Morgan Securities (Asia Pacific) Limited/ J.P.
Morgan Broking (Hong Kong) Limited
JJ Park
(82-2) 758-5717
jj.park@jpmorgan.com
J.P. Morgan Securities (Far East) Limited, Seoul
Branch
Jerry Tsai
(886-2) 2725-9867
jerry.tsai@jpmorgan.com
J.P. Morgan Securities (Taiwan) Limited
Anthony Leng
(886-2) 2725-9240
anthony.leng@jpmorgan.com
J.P. Morgan Securities (Taiwan) Limited
Robert Hsu
(886-2) 2725-9864
robert.hsu@jpmorgan.com
J.P. Morgan Securities (Taiwan) Limited
Jennifer Hsieh
(886-2) 2725-9868
jennifer.hsieh@jpmorgan.com
J.P. Morgan Securities (Taiwan) Limited
Sangsik Lee
(82-2) 758 5146
sangsik.lee@jpmorgan.com
J.P. Morgan Securities (Far East) Limited, Seoul
Branch
Following our deep-dive note on AI semis in 2018 (link), we update our AI server
shipment forecast and provide a BoM cost analysis of Nvidia HGX H100/A100
servers. We estimate 3% AI server shipment mix in 2022 and forecast a 42%
shipment CAGR in the next 5 years driven by increasing investment by ISPs in
machine learning, monetization in inference applications, and rising adoption in
AI cloud service platforms. We estimate the costs of AI servers to be around
15x/32x (A100/H100 server, respectively) higher than that of regular servers,
driven mainly by silicon value increase (especially GPUs and memory) and higher
hardware specs requirement. In the Asian tech space, we identify TSMC, SK
Hynix, Unimicron, ASPEED, Wistron, Quanta, Delta and Sunonwealth as the
major beneficiaries of exponential growth of AI servers.
• How to define AI servers? We define AI servers as servers with GPUs and AI
ASIC (ex: Google’s TPU). We believe most of the AI activities remain at the
training stage now and inference applications are still limited. Given Nvidia’s
lion’s share in the GPU server market (85%-90% market share, according to
IDC), we use Nvidia’s GPU server shipment to derive our current AI server
volumes and forecasts. We estimate ~3% of total servers to be AI related, with
the higher-end SKUs possibly representing only one-third of that.
• Inference is the key to drive volume upsides. Despite recent market hype on
AI, we have seen limited order increase in AI servers. We attribute the limited
volume upsides to the reuse of AI servers to train different algorithms and
limited monetization in AI applications. Still, we forecast AI server market to
grow by 3.5x in 5 years with accelerating investment in machine learning and
mature monetization in inference applications. We believe the rising AI server
trend, along with market focus shifting to 2H23 demand recovery, is likely to
drive valuation re-rating on the related stocks despite limited revenue
contribution now.
• Datacenter GPUs drive leading edge node migration, memory also a key
beneficiary. Semiconductor comprises 90%+ of total AI server BoM cost vs.
65-70% of regular servers, driven mainly by high-end CPUs, incremental
GPUs and rising requirement for memory (~4x/5x DRAM/NAND content vs.
regular servers). The high silicon costs have incentivized hyperscalers to
develop in-house datacenter ASICs, in our view. The heterogeneous
computingtrendbodeswellforTSMC,inourview.Besides,wealsoseehigher
BMC content in AI servers.
• Rising power consumption and more complex system integration for AI
servers. We see ODM, PCB, power supply, and heat dissipation as major
beneficiaries of AI servers in the hardware space. Server ODMs benefit from
themorecomplexdesignandsystemintegration.PCBmakersenjoyincreasing
layer counts in AI server PCB. The high-end CPUs and incremental GPUs
consumemorepowerandthisleadstocontentgrowthinpowersupplyandheat
dissipation modules.
See page 15 for analyst certification and important disclosures, including non-US analyst disclosures.
J.P. Morgan does and seeks to do business with companies covered in its research reports. As a result, investors should be aware that
the firm may have a conflict of interest that could affect the objectivity of this report. Investors should consider this report as only a single
factor in making their investment decision.
AI Servers
Deconstructing the BoM and understanding potential
upside for Asia Tech hardware
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
2
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
Key charts and tables
Figure 1: AI server ecosystem and key drive factors
Source: J.P. Morgan.
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
3
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
Table 1: AI related exposure in our tech coverage
Company Nature of involvement % of revenue likely in 2023 Comments
TSMC Foundry + backend for AI (CoWoS) 2-3% 100% market share for GPU for Gaming and AI as well as Mellanox DPUs, CoWoS for AI Chips
Unimicron AI chip substrates 1-2% Unimicron is the secondary supply for the substrates used in the AI chips.
SK Hynix HBM or GDDR6 DRAM Mid-HSD% Sole supplier of HBM to NVIDIA. 60-70% M/S in HBM market
ASPEED AI server BMC 1%
Delta AI server power supply and fans Limited Has great potential in power supply and fans given larger power consumption in AI server
Sunonwealth AI server fans 1-2% Has great potential in fans given larger power consumption in AI server
FII GPU server/module ODMs 5-10%
Wistron GPU server/module ODMs 2-3%
Wiwynn GPU server ODMs Teens % 50% of current project pipeline (in terms of project number) are AI related.
Inventec GPU server mainboard ODMs 5-10% Mainly MSFT, Google and small contribution from Amazon AI projects.
Quanta GPU server ODMs 5-10% Key supplier of MSFT AI servers.
Source: Company data, J.P. Morgan estimates.
What are AI servers?
Our definition of AI servers is the servers with GPU and AI ASIC (TPU). We believe
the majority of AI server activities are in the training phase, while there are limited
applications in the inference. As Nvidia’s GPU has dominant market shares (85%-90%
market share, according to IDC) in AI training due to the multi-cores structure, we
derived AI server volumes from Nvidia’s GPU shipment.
Figure 2: Block diagram of a heterogeneous compute implementation
GPU
Source: Gartner, J.P. Morgan. (Link). *Note: ASIC includes Google’s TPU, Amazon’s Inferentia etc.
AI server shipment still small, but revenue contribution
reached teens %
There were 3.2mn datacenter GPUs in 2022, of which ~35% was Nvidia’s A100/V100
GPU. If we assume each AI server has 8 GPUs, the implied GPU server shipment was
400k units in last year. We estimate 90% of AI servers to be GPU servers. This implies
total AI server shipment to be 440k in 2022, which was 3.3% of the total server market.
In a simplified sense, the process of AI /
Machine Learning can be divided into 2
main steps – (1) Training, for the system
to learn and perfect a model or algorithm
from massive datasets, and (2) Inference,
for the system to apply the model in a
real-life scenario or use case like facial
recognition, speech recognition.
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
4
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
According to IDC, servers that cost US$25k and above comprised ~2% /~11% of total
server shipment/value in 2022. While AI server shipment contribution remains limited,
we estimate teens % revenue contribution of AI servers given the multiple times of ASP
vs. regular servers.
Table 2: AI server penetration estimate
2022
Datacenter GPU shipment (mn) 3.2
Nvidia A100/V100 shipment 1.1
A100/V100 shipment mix 34%
GPU per servers 8
GPU server volume (mn) 0.40
% of GPU for AI applications 100%
GPU servers as % of AI servers 90%
AI server volumes (mn) 0.44
Global server shipment 13.5
AI server as % of total mix 3.3%
Source: IDC, Gartner, J.P. Morgan estimates.
Accelerating AI server market growth
We forecast a 42% AI server shipment CAGR from 2022-27, buoyed by accelerating
investment in machine learning, proliferation of inference applications and higher rate
of adoption of AI cloud service. Consequently, we expect the AI server penetration rate
to increase from 3% in 2022 to 15% in 2027.
Figure 3: AI server shipment forecast
0.9% 1.3%
2.3%
3.3%
5.3%
7.5%
9.8%
12.4%
15.0%
0%
2%
4%
6%
8%
10%
12%
14%
16%
-
0.5
1.0
1.5
2.0
2.5
2019 2020 2021 2022 2023E 2024E 2025E 2026E 2027E
AI server volumes (in mn units) AI server % mix
Source: IDC, J.P. Morgan estimates.
Increasing complexity in model training
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
5
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
According to OpenAI, the total computing workload used in AI training has been
doubling every 2 years starting from 1959, and the doubling time has accelerated to 3.4
months since 2012. Looking forward, the Research Lab believes that the doubling time
is likely to speed up further given the increasing trend of algorithmic innovations (i.e.
increasing genres of AI-specific chips) and improving cost burden (better affordability
of hardware chips for AI training).
Take generative large language model (LLM) training as an example. Total computing
flops requirements have increased to 3,640 training petaflop/s per day in GPT-3 175B
model (launched in 2H20), ~10x more than 382 petaflop/s in T5-11B model (launched
in 2H19).
Figure 4: Compute requirement of Large Language Model training
Source: OpenAI (link).
How to translate required training parameters into GPU consumption?
Nvidia provides various computing power levels under different learning structures for a
single GPU. Take A100 as an example. A single 4-GPU based DGX A100 server could
generate 1.25 PetaFLOP per second under Tensor Float 32 structure. As the required
total training compute of Chat GPT-3 is 3.14 * 10^23 FLOPs, this implies that it will
take ~300 4-GPU A100 server units to keep the training time within 30 days, on our
estimates.
Of note, the required training compute of each model is correlated with the number of
parameters and training tokens. The sharp increase in the parameters of new language
models implies higher computing power consumption.
Floating point operations per second
(FLOPS) is a commonly used
performance indicator of machine
learning hardware, due to the prevalence
of using floating point, instead of integer,
in deep learning.
Of note, GigaFLOPs= 10^9, TeraFLOPs=
10^12, PetaFLOPs= 10^15.
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
6
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
Table 3: GPU consumption estimates in AI training (ChatGPT 3.0)
Note
Single A100 card computing power 312 TeraFLOP/s; under TF32
4-GPUs A100 server machine computing power 1.25 PetaFLOP/s
redundancy rates 67%
A single 4-GPUs A100 server computing power per day 35,583 PetaFLOP/day
GPT3 175B model training requirements 314,496,000 PetaFLOP/day
Required days to train GPT3 model on single A100 server 8,838 Days
How many 4-GPUs A100 server machine to keep training time in 1 month 295 servers
Source: J.P. Morgan estimates.
More complete AI ecosystem attracts new investment
AI is playing a critical role in the world. There are more use cases relying on AI
algorithms and machine learning to save time and costs across industries. People are
more frequently using ChatGPT to collect data and make better decisions in a shorter
time frame. Businesses are leveraging AI algorithms to automate operational processes
to improve efficiency and save costs. According to a McKinsey & Company survey, AI
adoption in enterprises has more than doubled in 2022 (50%) vs. 2017 (20%).
The rise of underlying AI demand will lead to a more complete AI ecosystem, in our
view. During Nvidia’s GTC investor day, the company indicated strong growth in the
number of developers, CUDA downloads, AI startups, and GPU-accelerated
applications. Similarly, Intel saw 85%+ install base increase in its open accelerated
computing platform during the investor webinar. Given that server capacity and
computing power are the key factors of AI infrastructure, we believe the pent-up AI
demand and emerging AI applications will lead to a fast growth of AI server spending.
According to IDC, AI server infrastructure will grow at a 17% CAGR in 2021-26.
Figure 5: AI demand growth
Source: Nvidia.
Inference is the key for volume upside
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
7
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
We are still in the early stage of AI, with most applications focused on machine learning
and AI training. As hyperscalers could leverage the same servers to train different
algorithms under various time frames, this implies limited volume upsides in the
training phase for the same customers. Of note, we estimate the OpenAI organisation to
have 2-3k GPU servers while it only takes several hundred AI servers to train a single
model in one month.
Besides, training is a cost factor for hyperscalers so stringent investment in AI learning
is important to keep their margins stable. But, if there were heightened competition in
the existing business areas, Internet service providers could prioritize market shares over
profitability and invest heavily on front-end research. For example, we believe Google
would be likely to accelerate its investments in search engine related areas to defend
against competition from MSFT’s GPT-powered Bing.
In the medium to near term, we expect more internet service providers to join AI
research competition and drive front-end investments in AI servers for training. In the
medium to long term, we believe inference plays a more important role in AI server
volume scale. Take ChatGPT-3 as an example in Table 4. The GPU server consumption
could grow by multiple times if the number of users or the frequency of queries
increases. Besides, the higher number of parameters in each generation could also
increase the FLOPs consumption proportionally. As the rising traction of inference
applications will incur meaningful costs, the key watch point is whether hyperscalers are
able to monetize the product and support the expansion.
Table 4: Generative AI running cost - Inference (ChatGPT 3.0)
Scenario 1 Scenario 2 Scenario 3 Note
Chat GPT users (mn) 100 200 300 Achieved 100 mn users in two months
Monthly visit times 3.9 4.5 5.2 13mn daily users
Query each time 4 8 12
Words each Query 400 500 600 Input+ output
Generated words each month (mn) 624,000 3,588,000 11,140,740
Generated words per second 240,741 1,384,259 4,298,125
Implied tokens per second 180,556 1,038,194 3,223,594 1 word = 0.75 token
Chat GPT Parameters (bn) 175 175 175
A100 computing power (TFLOPs) 624 Under FP16 or INT8
Required second per token by single A100 card 0.00056
GPU UTRs 50%
Average word outputs per second on Chat GPT 5.6
Required A100 cards 1,134 6,522 20,251
Nvidia A100 cost 11,000
Required monthly running costs (USD mn) 12 72 223
Source: J.P. Morgan estimates.
Key assumptions of the exercise:
• Monthly visit times: We assume 13mn daily users in Scenario 1.
• Words in each query: This include the input words and generated words.
• Implied tokens per second: We assume 1 English word = 0.75 token.
• A100 computing power: 624 TFLOPs under FP16 or INT8 structure.
• Required second per token by single A100 card: (2* 175bn parameters) /A100
computing power (TFLOPs).
• GPU UTRs: GPU cannot run at peak levels all the time. Also need to build extra
capacity as ChatGPT users could be more concentrated in certain regions.
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
8
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
• Average word outputs per second on ChatGPT: The responsive time/ word has been
largely consistent during our test.
• Required A100 cards: We assume OpenAI to keep responsive time the same.
• Nvidia A100 cost: We do not include other hardware costs such as server power,
chassis, etc.
Have we seen upsides in AI server shipment recently?
We have seen increasing market hype on AI topics given the positive feedback on
ChatGPT and generative AI models. Although some hyperscalers have recently
accelerated their investments in machine learning and AI training, the magnitude of
upward revision on server units appeared milder than expected. Our research indicates
limited AI server volume upside while there is supply chain bottleneck in TSMC’s
CoWoS HBM3 due to lower yields.
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
9
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
BoM cost comparison between AI and
regular servers
In the following table, we estimate the component costs of regular servers and AI
servers. The key difference of AI servers vs. regular servers is GPU (or accelerators),
which account for 70%+ of AI server BoM cost. The silicon content increase also leads
to higher requirements of memory/storage, networking transmission speed, power
consumption and heat dissipation. Overall, we estimate AI server BoM cost to be
15x/32x higher than that of regular servers.
Table 5: Server BoM cost analysis: Regular server vs. GPU/AI server
Content value % of Total BoM Content value % of Total BoM Content value % of Total BoM
CPU 2,166 29% 13,900 12% 21,420 9%
GPU 0 0% 80,000 71% 200,000 83%
CPU DIMM (DDR5) 1,380 18% 4,600 4% 4,600 2%
Storage SSD 1,365 18% 6,825 6% 6,825 3%
Network Cards (NIC) 155 2% 1,000 1% 1,000 0%
Chassis Costs 20 0% 40 0% 40 0%
Motherboard: Dual Sockets 300 4% 360 0% 360 0%
Power Supply 300 4% 1,200 1% 1,200 0%
Storage Backplane 83 1% 83 0% 83 0%
Drive Caddies 57 1% 57 0% 57 0%
Fans 75 1% 270 0% 270 0%
Heat dissipation module excl. fans (heat pipe) 30 0% 100 0% 100 0%
Internal Cables 20 0% 20 0% 20 0%
Riser Cards 20 0% 20 0% 20 0%
Sheet Metal Case 100 1% 200 0% 200 0%
PCB 325 4% 650 1% 650 0%
Assembly Labor and Test 495 7% 1,485 1% 1,485 1%
Markup 689 9% 2,067 2% 2,067 1%
Total Cost 7,580 100% 112,877 100% 240,397 100%
AI server BoM vs. Regular server BoM 14.9x 31.7x
Regular servers GPU/AI servers (A100x8) GPU/AI servers (H100x8)
Source: Company data, J.P.Morgan estimates.
Figure 6: Key component BoM breakdown - regular server
CPU, 29%
CPU DIMM,
18%
NAND
storage,
18%
Others, 35%
Source: J.P. Morgan estimates.
Figure 7: Key component BoM breakdown - H100 server
GPU, 83%
CPU, 9%
CPU DIMM,
2%
NAND
storage, 3%
Others, 3%
Source: J.P. Morgan estimates.
GPU: The key BoM cost boost, good for leading-edge Foundry vendors
A single AI server usually includes 2/4/8 GPUs for parallel processing to accelerate the
computing. In the BoM cost analysis, we assume 8 GPUs in a single AI server and
estimate US$10k/25k for a A100/H100 module.
Nvidia has been improving the computing power of datacenter GPUs, resulting in
meaningful price upticks but lower cost per computing power. We expect such GPU
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
10
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
price trend to continue in the next few generations.
Figure 8: Nvidia datacenter GPU computing power comparison
In TFLOPs (Tera floating point operations per second)
10.6
15.7
19.5
60.0
0
10
20
30
40
50
60
70
P100 V100 A100 H100
Source: Company data, J.P. Morgan.
Figure 9: Nvidia’s datacenter GPU prices and cost per compute
US$k,US$/TFLOPs
9
11 10
25
0.9
0.7
0.5
0.4
0.0
0.1
0.2
0.3
0.4
0.5
0.6
0.7
0.8
0.9
1.0
0
5
10
15
20
25
30
P100 V100 A100 H100
Source: Company data, J.P.Morgan calculations. Prices refer to SXM model.
CPU: Higher specs requirement but not the key silicon
Dual socket CPU configuration comprises 80%+ of current servers. As GPUs and AI
ASIC (TPU) are key drivers to accelerating processing, the number of CPUs do not
increase in AI servers. However, the CPU specs of AI server are much higher than the
regular ones. According to Nvidia, the default CPUs in DGX A100 and DGX H100 are
AMD EPYC 7742 (Rome) and Intel 4th gen Xeon 8480C (EagleStream) processors.
The prices of both CPUs are 5-10x higher than mainstream server CPUs.
Memory and storage: Meaningful content increase to facilitate AI workload
The DRAM content is around 600 GB per regular server while the default DRAM specs
of Nvidia’s HGX/DGX series is 2 TB. There is also 40GB/80GB GDDR per GPU.
Assuming 8 GPUs in a single AI server, total DRAM content could be 2 TB+
320GB/640GB GDDR. Besides, we also see higher specs of DRAM in AI servers such
as DDR5 adoption in CPU DIMM and high bandwidth memory (HBM) for GDDR.
NAND content also increases in AI servers due to higher requirements to store the data
set. We assume 20 TB of average NAND storage per AI server, while Nvidia’s HGX/
DGX series support 30 TB NVMe SSD storage and mainstream regular server NAND
content is 4 TB now.
Our Korean memory team (led by analyst JJ Park) estimates that AI server contribution
was ~4% of total memory revenue in 2022 and it will increase to ~9%-12% from FY24-
FY27E.
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
11
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
Table 6: AI server contribution to total memory revenue
2022 2023E 2024E 2025E 2026E 2027E
DRAM revenue (US$mn) 77,768.9 41,529.8 58,119.6 63,931.5 70,324.7 77,357.1
NAND revenue (US$mn) 47,079.1 34,254.8 46,801.3 53,119.5 60,290.6 68,429.9
Total MM (US$mn) 124,848.0 75,784.6 104,920.9 117,051.0 130,615.3 145,787.0
Total AI server revenue (DRAM/NAND) 4,780.9 6,517.6 9,048.3 11,821.7 14,883.9 18,031.6
% of Memory Market 4% 9% 9% 10% 11% 12%
DRAM
CPU
AI Server demand (8Gb, M) 907 1,393 2,176 3,084 4,209 5,525
AI server shipment (M) 0.4 0.7 1.0 1.4 1.8 2.2
Server density (GB/system) 2,048 2,048 2,150 2,258 2,371 2,489
DDR5 ASP (US$/GB) 2.0 1.9 1.8 1.7 1.6 1.5
(A) DRAM Revenue catered to CPU 1,814.5 2,646.5 3,926.9 5,288.0 6,857.3 8,550.2
GPU
AI server demand (8Gb, M) 191 277 408 578 789 1,036
HBM server density (GB/system) 480 480 504 529 556 583
GPU server shipment (M) 0.4 0.6 0.8 1.1 1.4 1.8
HBM ASP (US$/GB) 3.0 2.7 2.4 2.2 2.0 1.8
(B) DRAM Revenue catered to GPU 574.1 749.2 991.2 1,264.6 1,553.5 1,835.1
(A) + (B) 2,388.7 3,395.8 4,918.1 6,552.6 8,410.8 10,385.3
% of DRAM revenue 3.1% 8.2% 8.5% 10.2% 12.0% 13.4%
NAND (Storage SSD)
AI server demand (8Gb, M) 7,974 11,562 16,997 24,093 32,886 43,163
Storage density (GB/system) 20,000 20,000 21,000 22,050 23,153 24,310
GPU server shipment (M) 0.4 0.6 0.8 1.1 1.4 1.8
ASP 0.3 0.3 0.2 0.2 0.2 0.2
Revenue (US$mn) 2,392.2 3,121.8 4,130.2 5,269.1 6,473.0 7,646.3
% of NAND revenue 5.1% 9.1% 8.8% 9.9% 10.7% 11.2%
Source: iSuppli, Gartner, WSTS, J.P. Morgan Korean memory team estimates.
PCB: Higher layer counts and lower yields drive ASP upticks
We estimate 10-12 layers of regular server PCB while AI servers require higher end
PCB including 18-20 layers. The higher number of PCB layers implies not only more
content value but also higher difficulty in production yields. Consequently, we estimate
50-100% ASP upticks of AI server PCB vs. regular server PCB.
ODM: More complex design and system integration
While the general ODM feedback suggests similar margin for AI servers, we believe
ODM margins should be diluted by the higher GPU costs. Still, the more complex
configuration design, longer testing time, and pricing premiums for niche models will
likely drive higher profit dollars.
We believe several server ODMs are considering to change the pricing model from “buy
and sell” to “consign” in AI servers. In this case, the AI server price could be reduced
by 60-70% while margins could be higher.
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
12
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
Figure 10: GPU module (A100)
Source: Nvidia.
Figure 11: GPU server (DGX H100)
Source: Nvidia.
Power supply: A single GPU chip consumes similar power of a regular server
Processing chip is the key component of power consumption in servers. The TDP
(Thermal Design Power) of single server CPU is around 300W and a two socket regular
server could consume 1200-1600W. The TDP of GPU ranges from 300W to 700W and
it is around 50W per Smart NIC card. Therefore, most regular servers require 1+1 1.2k-
1.8kW server power supply while AI server power supply requirement could range from
2 to 4 *3kW.
Figure 12: Nvidia’s datacenter GPU power consumption in TDP
Watt
250
300
400
700
0
100
200
300
400
500
600
700
800
P100 V100 A100 H100
Source: Company data.
Heat dissipation: Air cooling still the majority, liquid cooling the next trend
Air cooling (fans + heat pipe/ vapor chamber) is still the mainstream heat dissipation
solution for servers now. Nvidia’s A100 servers have much higher total design power
and require more advanced heat dissipation. Our research indicates prices are 3x-5x for
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
13
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
AI heat dissipation solutions vs. regular server solutions.
Currently some ISPs adopt immersion liquid cooling as a transitional solution, which is
costly but has better heat dissipation performance vs. air cooling. We believe solution
providers have been researching on a more economical liquid cooling (cold-plate)
solution to tackle the much higher heat generation of AI servers (such as H100 server).
Figure 13: Immersion liquid cooling (two phase)
Source: Wiwynn.
Figure 14: Cold-plate liquid cooling system
Source: Wiwynn.
Networking: NVlink and NVSwitch
It takes multiple AI servers to train a single algorithm so the data transmission between
GPUs in different AI servers is important to reduce the latency. Nvidia has launched
NVlink to improve the communication between GPUs within a single AI server and
NVSwtich to connect the various AI rack servers. Consequently, we expect the rising AI
server mix to drive the networking upgrade in datacenter switch and volume upsides of
Smart NIC/DPU.
Figure 15: Nvidia’s NVLink
Source: Company data.
Figure 16: Nvidia’s NVSwitch
Source: Company data.
AI beneficiaries in Asia tech
In the Asian tech space, we identify key beneficiaries under the AI explosion trend,
including TSMC (key datacenter GPU foundry), SK Hynix (key HBM3 supplier),
Unimicron (AI chips substrate), ASPEED (AI server BMC content growth), Wistron
(GPU server subsystem supplier), Quanta (key GPU server ODM), Delta and
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
14
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
Sunonwealth (increasing power supply and fan contents in AI server).
Table 7: AI related exposure in our tech coverage
Company Nature of involvement % of revenue likely in 2023 Comments
TSMC Foundry + backend for AI (CoWoS) 2-3% 100% market share for GPU for Gaming and AI as well as Mellanox DPUs, CoWoS for AI Chips
Unimicron AI chip substrates 1-2% Unimicron is the secondary supply for the substrates used in the AI chips.
SK Hynix HBM or GDDR6 DRAM Mid-HSD% Sole supplier of HBM to NVIDIA. 60-70% M/S in HBM market
ASPEED AI server BMC 1%
Delta AI server power supply and fans Limited Has great potential in power supply and fans given larger power consumption in AI server
Sunonwealth AI server fans 1-2% Has great potential in fans given larger power consumption in AI server
FII GPU server/module ODMs 5-10%
Wistron GPU server/module ODMs 2-3%
Wiwynn GPU server ODMs Teens % 50% of current project pipeline (in terms of project number) are AI related.
Inventec GPU server mainboard ODMs 5-10% Mainly MSFT, Google and small contribution from Amazon AI projects.
Quanta GPU server ODMs 5-10% Key supplier of MSFT AI servers.
Source: Company data, J.P. Morgan estimates.
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
15
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
Correction: The following corrections have been made in the report: 1) Page 7, first paragraph, the table reference has been corrected; 2) Page 9
text, ''xPUs'' has changed to ''GPUs and AI ASIC (TPU)''; 3) Page 11, the text on AI server power supply requirement has been corrected; 4)
Page 8, the text on AI server BoM cost has been corrected; 5) Figure 1, Figure 7 and Table 5 have been updated to fix inadvertent errors.
Companies Discussed in This Report (all prices in this report as of market close on 14 April 2023)
ASPEED Technology Inc.(5274.TWO/NT$2,755.00/OW), Delta Electronics, Inc.(2308.TW/NT$314.50/OW), Quanta Computer
Inc.(2382.TW/NT$81.80/N), SK hynix(000660.KS/W89,300/OW), Sunonwealth(2421.TW/NT$52.90/OW), TSMC(2330.TW/
NT$516.00/OW), Unimicron(3037.TW/NT$139.00/OW), Wistron Corporation(3231.TW/NT$43.95/N)
Analyst Certification: The Research Analyst(s) denoted by an “AC” on the cover of this report certifies (or, where multiple Research Analysts
are primarily responsible for this report, the Research Analyst denoted by an “AC” on the cover or within the document individually certifies,
with respect to each security or issuer that the Research Analyst covers in this research) that: (1) all of the views expressed in this report
accurately reflect the Research Analyst’s personal views about any and all of the subject securities or issuers; and (2) no part of any of the
Research Analyst's compensation was, is, or will be directly or indirectly related to the specific recommendations or views expressed by the
Research Analyst(s) in this report. For all Korea-based Research Analysts listed on the front cover, if applicable, they also certify, as per KOFIA
requirements, that the Research Analyst’s analysis was made in good faith and that the views reflect the Research Analyst’s own opinion,
without undue influence or intervention.
All authors named within this report are Research Analysts who produce independent research unless otherwise specified. In Europe, Sector
Specialists (Sales and Trading) may be shown on this report as contacts but are not authors of the report or part of the Research Department.
Important Disclosures
Market Maker/ Liquidity Provider: J.P. Morgan is a market maker and/or liquidity provider in the financial instruments of/related to SK
hynix.
Client: J.P. Morgan currently has, or had within the past 12 months, the following entity(ies) as clients: SK hynix.
Client/Non-Investment Banking, Securities-Related: J.P. Morgan currently has, or had within the past 12 months, the following entity(ies)
as clients, and the services provided were non-investment-banking, securities-related: SK hynix.
Potential Investment Banking Compensation: J.P. Morgan expects to receive, or intends to seek, compensation for investment banking
services in the next three months from SK hynix.
Non-Investment Banking Compensation Received: J.P. Morgan has received compensation in the past 12 months for products or services
other than investment banking from SK hynix.
Debt Position: J.P. Morgan may hold a position in the debt securities of SK hynix, if any.
Gartner: All statements in this report attributable to Gartner represent J.P. Morgan's interpretation of data opinion or viewpoints published as
part of a syndicated subscription service by Gartner, Inc., and have not been reviewed by Gartner. Each Gartner publication speaks as of its
original publication date (and not as of the date of this report). The opinions expressed in Gartner publications are not representations of fact,
and are subject to change without notice.
Company-Specific Disclosures: Important disclosures, including price charts and credit opinion history tables, are available for compendium
reports and all J.P. Morgan–covered companies, and certain non-covered companies, by visitinghttps://www.jpmm.com/research/disclosures,
calling 1-800-477-0406, or e-mailing research.disclosure.inquiries@jpmorgan.com with your request.
Explanation of Equity Research Ratings, Designations and Analyst(s) Coverage Universe:
J.P. Morgan uses the following rating system: Overweight [Over the next six to twelve months, we expect this stock will outperform the average
total return of the stocks in the analyst’s (or the analyst’s team’s) coverage universe.] Neutral [Over the next six to twelve months, we expect this
stock will perform in line with the average total return of the stocks in the analyst’s (or the analyst’s team’s) coverage universe.] Underweight
[Over the next six to twelve months, we expect this stock will underperform the average total return of the stocks in the analyst’s (or the
analyst’s team’s) coverage universe.] Not Rated (NR): J.P. Morgan has removed the rating and, if applicable, the price target, for this stock
because of either a lack of a sufficient fundamental basis or for legal, regulatory or policy reasons. The previous rating and, if applicable, the
price target, no longer should be relied upon. An NR designation is not a recommendation or a rating. In our Asia (ex-Australia and ex-India)
and U.K. small- and mid-cap equity research, each stock’s expected total return is compared to the expected total return of a benchmark country
market index, not to those analysts’ coverage universe. If it does not appear in the Important Disclosures section of this report, the certifying
analyst’s coverage universe can be found on J.P. Morgan’s research website, https://www.jpmorganmarkets.com.
Coverage Universe: Hung, Albert: ASPEED Technology Inc. (5274.TWO), ASUSTek Computer (2357.TW), Chindata (CD), Compal
Electronics, Inc. (2324.TW), Inventec (2356.TW), Lenovo Group Limited (0992) (0992.HK), Micro-Star International Co., Ltd. (2377.TW),
Pegatron Corp (4938.TW), Quanta Computer Inc. (2382.TW), VNET Group (VNET), Wistron Corporation (3231.TW), Wiwynn Corp
(6669.TW)
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
16
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
J.P. Morgan Equity Research Ratings Distribution, as of April 01, 2023
Overweight
(buy)
Neutral
(hold)
Underweight
(sell)
J.P. Morgan Global Equity Research Coverage* 47% 38% 15%
IB clients** 47% 44% 34%
JPMS Equity Research Coverage* 46% 41% 13%
IB clients** 66% 65% 53%
*Please note that the percentages might not add to 100% because of rounding.
**Percentage of subject companies within each of the "buy," "hold" and "sell" categories for which J.P. Morgan has provided
investment banking services within the previous 12 months.
For purposes only of FINRA ratings distribution rules, our Overweight rating falls into a buy rating category; our Neutral rating falls
into a hold rating category; and our Underweight rating falls into a sell rating category. Please note that stocks with an NR designation
are not included in the table above. This information is current as of the end of the most recent calendar quarter.
Equity Valuation and Risks: For valuation methodology and risks associated with covered companies or price targets for covered companies,
please see the most recent company-specific research report at http://www.jpmorganmarkets.com, contact the primary analyst or your J.P.
Morgan representative, or email research.disclosure.inquiries@jpmorgan.com. For material information about the proprietary models used,
please see the Summary of Financials in company-specific research reports and the Company Tearsheets, which are available to download on
the company pages of our client website, http://www.jpmorganmarkets.com. This report also sets out within it the material underlying
assumptions used.
A history of J.P. Morgan investment recommendations disseminated during the preceding 12 months can be accessed on the Research &
Commentary page of http://www.jpmorganmarkets.com where you can also search by analyst name, sector or financial instrument.
Analysts' Compensation:The research analysts responsible for the preparation of this report receive compensation based upon various factors,
including the quality and accuracy of research, client feedback, competitive factors, and overall firm revenues.
Registration of non-US Analysts: Unless otherwise noted, the non-US analysts listed on the front of this report are employees of non-US
affiliates of J.P. Morgan Securities LLC, may not be registered as research analysts under FINRA rules, may not be associated persons of J.P.
Morgan Securities LLC, and may not be subject to FINRA Rule 2241 or 2242 restrictions on communications with covered companies, public
appearances, and trading securities held by a research analyst account.
Other Disclosures
J.P. Morgan is a marketing name for investment banking businesses of JPMorgan Chase & Co. and its subsidiaries and affiliates worldwide.
UK MIFID FICC research unbundling exemption: UK clients should refer to UK MIFID Research Unbundling exemption for details of
JPMorgan’s implementation of the FICC research exemption and guidance on relevant FICC research categorisation.
All research material made available to clients are simultaneously available on our client website, J.P. Morgan Markets, unless specifically
permitted by relevant laws. Not all research content is redistributed, e-mailed or made available to third-party aggregators. For all research
material available on a particular stock, please contact your sales representative.
Any long form nomenclature for references to China; Hong Kong; Taiwan; and Macau within this research material are Mainland China; Hong
Kong SAR (China); Taiwan (China); and Macau SAR (China).
J.P. Morgan Research may, from time to time, write on issuers or securities targeted by economic or financial sanctions imposed or administered
by the governmental authorities of the U.S., EU, UK or other relevant jurisdictions (Sanctioned Securities). Nothing in this report is intended to
be read or construed as encouraging, facilitating, promoting or otherwise approving investment or dealing in such Sanctioned Securities. Clients
should be aware of their own legal and compliance obligations when making investment decisions.
Any digital or crypto assets discussed in this research report are subject to a rapidly changing regulatory landscape. For relevant regulatory
advisories on crypto assets, including bitcoin and ether, please see https://www.jpmorgan.com/disclosures/cryptoasset-disclosure.
The author(s) of this research report may not be licensed to carry on regulated activities in your jurisdiction and, if not licensed, do not hold
themselves out as being able to do so.
Exchange-Traded Funds (ETFs): J.P. Morgan Securities LLC (“JPMS”) acts as authorized participant for substantially all U.S.-listed ETFs. To
the extent that any ETFs are mentioned in this report, JPMS may earn commissions and transaction-based compensation in connection with the
distribution of those ETF shares and may earn fees for performing other trade-related services, such as securities lending to short sellers of the
ETF shares. JPMS may also perform services for the ETFs themselves, including acting as a broker or dealer to the ETFs. In addition, affiliates
of JPMS may perform services for the ETFs, including trust, custodial, administration, lending, index calculation and/or maintenance and other
services.
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
17
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
Options and Futures related research: If the information contained herein regards options- or futures-related research, such information is
available only to persons who have received the proper options or futures risk disclosure documents. Please contact your J.P. Morgan
Representative or visit https://www.theocc.com/components/docs/riskstoc.pdf for a copy of the Option Clearing Corporation's Characteristics
and Risks of Standardized Options or http://www.finra.org/sites/default/files/Security_Futures_Risk_Disclosure_Statement_2018.pdf for a copy
of the Security Futures Risk Disclosure Statement.
Changes to Interbank Offered Rates (IBORs) and other benchmark rates: Certain interest rate benchmarks are, or may in the future
become, subject to ongoing international, national and other regulatory guidance, reform and proposals for reform. For more information, please
consult: https://www.jpmorgan.com/global/disclosures/interbank_offered_rates
Private Bank Clients: Where you are receiving research as a client of the private banking businesses offered by JPMorgan Chase & Co. and its
subsidiaries (“J.P. Morgan Private Bank”), research is provided to you by J.P. Morgan Private Bank and not by any other division of J.P. Morgan,
including, but not limited to, the J.P. Morgan Corporate and Investment Bank and its Global Research division.
Legal entity responsible for the production and distribution of research: The legal entity identified below the name of the Reg AC Research
Analyst who authored this material is the legal entity responsible for the production of this research. Where multiple Reg AC Research Analysts
authored this material with different legal entities identified below their names, these legal entities are jointly responsible for the production of
this research. Research Analysts from various J.P. Morgan affiliates may have contributed to the production of this material but may not be
licensed to carry out regulated activities in your jurisdiction (and do not hold themselves out as being able to do so). Unless otherwise stated
below, this material has been distributed by the legal entity responsible for production. If you have any queries, please contact the relevant
Research Analyst in your jurisdiction or the entity in your jurisdiction that has distributed this research material.
Legal Entities Disclosures and Country-/Region-Specific Disclosures:
Argentina: JPMorgan Chase Bank N.A Sucursal Buenos Aires is regulated by Banco Central de la República Argentina (“BCRA”- Central
Bank of Argentina) and Comisión Nacional de Valores (“CNV”- Argentinian Securities Commission” - ALYC y AN Integral N°51). Australia:
J.P. Morgan Securities Australia Limited (“JPMSAL”) (ABN 61 003 245 234/AFS Licence No: 238066) is regulated by the Australian
Securities and Investments Commission and is a Market, Clearing and Settlement Participant of ASX Limited and CHI-X. This material is
issued and distributed in Australia by or on behalf of JPMSAL only to "wholesale clients" (as defined in section 761G of the Corporations Act
2001). A list of all financial products covered can be found by visiting https://www.jpmm.com/research/disclosures. J.P. Morgan seeks to cover
companies of relevance to the domestic and international investor base across all Global Industry Classification Standard (GICS) sectors, as well
as across a range of market capitalisation sizes. If applicable, in the course of conducting public side due diligence on the subject company(ies),
the Research Analyst team may at times perform such diligence through corporate engagements such as site visits, discussions with company
representatives, management presentations, etc. Research issued by JPMSAL has been prepared in accordance with J.P. Morgan Australia’s
Research Independence Policy which can be found at the following link: J.P. Morgan Australia - Research Independence Policy. Brazil: Banco
J.P. Morgan S.A. is regulated by the Comissao de Valores Mobiliarios (CVM) and by the Central Bank of Brazil. Ombudsman J.P. Morgan:
0800-7700847 / ouvidoria.jp.morgan@jpmorgan.com. Canada: J.P. Morgan Securities Canada Inc. is a registered investment dealer, regulated
by the Investment Industry Regulatory Organization of Canada and the Ontario Securities Commission and is the participating member on
Canadian exchanges. This material is distributed in Canada by or on behalf of J.P.Morgan Securities Canada Inc. Chile: Inversiones J.P. Morgan
Limitada is an unregulated entity incorporated in Chile. China: J.P. Morgan Securities (China) Company Limited has been approved by CSRC
to conduct the securities investment consultancy business. Dubai International Financial Centre (DIFC): JPMorgan Chase Bank, N.A., Dubai
Branch is regulated by the Dubai Financial Services Authority (DFSA) and its registered address is Dubai International Financial Centre - The
Gate, West Wing, Level 3 and 9 PO Box 506551, Dubai, UAE. This material has been distributed by JP Morgan Chase Bank, N.A., Dubai
Branch to persons regarded as professional clients or market counterparties as defined under the DFSA rules. European Economic Area
(EEA): Unless specified to the contrary, research is distributed in the EEA by J.P. Morgan SE (“JPM SE”), which is subject to prudential
supervision by the European Central Bank (“ECB”) in cooperation with BaFin and Deutsche Bundesbank in Germany. JPM SE is a company
headquartered in Frankfurt with registered address at TaunusTurm, Taunustor 1, Frankfurt am Main, 60310, Germany. The material has been
distributed in the EEA to persons regarded as professional investors (or equivalent) pursuant to Art. 4 para. 1 no. 10 and Annex II of MiFID II
and its respective implementation in their home jurisdictions (“EEA professional investors”). This material must not be acted on or relied on by
persons who are not EEA professional investors. Any investment or investment activity to which this material relates is only available to EEA
relevant persons and will be engaged in only with EEA relevant persons. Hong Kong: J.P. Morgan Securities (Asia Pacific) Limited (CE
number AAJ321) is regulated by the Hong Kong Monetary Authority and the Securities and Futures Commission in Hong Kong, and J.P.
Morgan Broking (Hong Kong) Limited (CE number AAB027) is regulated by the Securities and Futures Commission in Hong Kong. JP Morgan
Chase Bank, N.A., Hong Kong Branch (CE Number AAL996) is regulated by the Hong Kong Monetary Authority and the Securities and
Futures Commission, is organized under the laws of the United States with limited liability. Where the distribution of this material is a regulated
activity in Hong Kong, the material is distributed in Hong Kong by or through J.P. Morgan Securities (Asia Pacific) Limited and/or J.P. Morgan
Broking (Hong Kong) Limited. India: J.P. Morgan India Private Limited (Corporate Identity Number - U67120MH1992FTC068724), having its
registered office at J.P. Morgan Tower, Off. C.S.T. Road, Kalina, Santacruz - East, Mumbai – 400098, is registered with the Securities and
Exchange Board of India (SEBI) as a ‘Research Analyst’ having registration number INH000001873. J.P. Morgan India Private Limited is also
registered with SEBI as a member of the National Stock Exchange of India Limited and the Bombay Stock Exchange Limited (SEBI
Registration Number – INZ000239730) and as a Merchant Banker (SEBI Registration Number - MB/INM000002970). Telephone: 91-22-6157
3000, Facsimile: 91-22-6157 3990 and Website: http://www.jpmipl.com. JPMorgan Chase Bank, N.A. - Mumbai Branch is licensed by the
Reserve Bank of India (RBI) (Licence No. 53/ Licence No. BY.4/94; SEBI - IN/CUS/014/ CDSL : IN-DP-CDSL-444-2008/ IN-DP-NSDL-285-
2008/ INBI00000984/ INE231311239) as a Scheduled Commercial Bank in India, which is its primary license allowing it to carry on Banking
business in India and other activities, which a Bank branch in India are permitted to undertake. For non-local research material, this material is
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
18
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
not distributed in India by J.P. Morgan India Private Limited. Indonesia: PT J.P. Morgan Sekuritas Indonesia is a member of the Indonesia Stock
Exchange and is registered and supervised by the Otoritas Jasa Keuangan (OJK). Korea: J.P. Morgan Securities (Far East) Limited, Seoul
Branch, is a member of the Korea Exchange (KRX). JPMorgan Chase Bank, N.A., Seoul Branch, is licensed as a branch office of foreign bank
(JPMorgan Chase Bank, N.A.) in Korea. Both entities are regulated by the Financial Services Commission (FSC) and the Financial Supervisory
Service (FSS). For non-macro research material, the material is distributed in Korea by or through J.P. Morgan Securities (Far East) Limited,
Seoul Branch. Japan: JPMorgan Securities Japan Co., Ltd. and JPMorgan Chase Bank, N.A., Tokyo Branch are regulated by the Financial
Services Agency in Japan. Malaysia: This material is issued and distributed in Malaysia by JPMorgan Securities (Malaysia) Sdn Bhd (18146-
X), which is a Participating Organization of Bursa Malaysia Berhad and holds a Capital Markets Services License issued by the Securities
Commission in Malaysia. Mexico: J.P. Morgan Casa de Bolsa, S.A. de C.V.and J.P. Morgan Grupo Financiero are members of the Mexican
Stock Exchange and are authorized to act as a broker dealer by the National Banking and Securities Exchange Commission. New Zealand: This
material is issued and distributed by JPMSAL in New Zealand only to "wholesale clients" (as defined in the Financial Markets Conduct Act
2013). JPMSAL is registered as a Financial Service Provider under the Financial Service providers (Registration and Dispute Resolution) Act of
2008. Pakistan: J. P. Morgan Pakistan Broking (Pvt.) Ltd is a member of the Karachi Stock Exchange and regulated by the Securities and
Exchange Commission of Pakistan. Philippines: J.P. Morgan Securities Philippines Inc. is a Trading Participant of the Philippine Stock
Exchange and a member of the Securities Clearing Corporation of the Philippines and the Securities Investor Protection Fund. It is regulated by
the Securities and Exchange Commission. Russia: CB J.P. Morgan Bank International LLC is regulated by the Central Bank of Russia.
Singapore: This material is issued and distributed in Singapore by or through J.P. Morgan Securities Singapore Private Limited (JPMSS) [MCI
(P) 060/08/2022 and Co. Reg. No.: 199405335R], which is a member of the Singapore Exchange Securities Trading Limited, and/or JPMorgan
Chase Bank, N.A., Singapore branch (JPMCB Singapore), both of which are regulated by the Monetary Authority of Singapore. This material is
issued and distributed in Singapore only to accredited investors, expert investors and institutional investors, as defined in Section 4A of the
Securities and Futures Act, Cap. 289 (SFA). This material is not intended to be issued or distributed to any retail investors or any other investors
that do not fall into the classes of “accredited investors,” “expert investors” or “institutional investors,” as defined under Section 4A of the SFA.
Recipients of this material in Singapore are to contact JPMSS or JPMCB Singapore in respect of any matters arising from, or in connection
with, the material. As at the date of this material, JPMSS is a designated market maker for certain structured warrants listed on the Singapore
Exchange where the underlying securities may be the securities discussed in this material. Arising from its role as a designated market maker for
such structured warrants, JPMSS may conduct hedging activities in respect of such underlying securities and hold or have an interest in such
underlying securities as a result. The updated list of structured warrants for which JPMSS acts as designated market maker may be found on the
website of the Singapore Exchange Limited: http://www.sgx.com. South Africa: J.P. Morgan Equities South Africa Proprietary Limited and
JPMorgan Chase Bank, N.A., Johannesburg Branch are members of the Johannesburg Securities Exchange and are regulated by the Financial
Services Board. Taiwan: J.P. Morgan Securities (Taiwan) Limited is a participant of the Taiwan Stock Exchange (company-type) and regulated
by the Taiwan Securities and Futures Bureau. Material relating to equity securities is issued and distributed in Taiwan by J.P. Morgan Securities
(Taiwan) Limited, subject to the license scope and the applicable laws and the regulations in Taiwan. According to Paragraph 2, Article 7-1 of
Operational Regulations Governing Securities Firms Recommending Trades in Securities to Customers (as amended or supplemented) and/or
other applicable laws or regulations, please note that the recipient of this material is not permitted to engage in any activities in connection with
the material that may give rise to conflicts of interests, unless otherwise disclosed in the “Important Disclosures” in this material. Thailand:
This material is issued and distributed in Thailand by JPMorgan Securities (Thailand) Ltd., which is a member of the Stock Exchange of
Thailand and is regulated by the Ministry of Finance and the Securities and Exchange Commission, and its registered address is 3rd Floor, 20
North Sathorn Road, Silom, Bangrak, Bangkok 10500. UK: Unless specified to the contrary, research is distributed in the UK by J.P. Morgan
Securities plc (“JPMS plc”) which is a member of the London Stock Exchange and is authorised by the Prudential Regulation Authority and
regulated by the Financial Conduct Authority and the Prudential Regulation Authority. JPMS plc is registered in England & Wales No. 2711006,
Registered Office 25 Bank Street, London, E14 5JP. This material is directed in the UK only to: (a) persons having professional experience in
matters relating to investments falling within article 19(5) of the Financial Services and Markets Act 2000 (Financial Promotion) (Order) 2005
(“the FPO”); (b) persons outlined in article 49 of the FPO (high net worth companies, unincorporated associations or partnerships, the trustees of
high value trusts, etc.); or (c) any persons to whom this communication may otherwise lawfully be made; all such persons being referred to as
"UK relevant persons". This material must not be acted on or relied on by persons who are not UK relevant persons. Any investment or
investment activity to which this material relates is only available to UK relevant persons and will be engaged in only with UK relevant persons.
Research issued by JPMS plc has been prepared in accordance with JPMS plc's policy for prevention and avoidance of conflicts of interest
related to the production of Research which can be found at the following link: J.P. Morgan EMEA - Research Independence Policy. U.S.: J.P.
Morgan Securities LLC (“JPMS”) is a member of the NYSE, FINRA, SIPC, and the NFA. JPMorgan Chase Bank, N.A. is a member of the
FDIC. Material published by non-U.S. affiliates is distributed in the U.S. by JPMS who accepts responsibility for its content.
General: Additional information is available upon request. The information in this material has been obtained from sources believed to be
reliable. While all reasonable care has been taken to ensure that the facts stated in this material are accurate and that the forecasts, opinions and
expectations contained herein are fair and reasonable, JPMorgan Chase & Co. or its affiliates and/or subsidiaries (collectively J.P. Morgan) make
no representations or warranties whatsoever to the completeness or accuracy of the material provided, except with respect to any disclosures
relative to J.P. Morgan and the Research Analyst's involvement with the issuer that is the subject of the material. Accordingly, no reliance should
be placed on the accuracy, fairness or completeness of the information contained in this material. There may be certain discrepancies with data
and/or limited content in this material as a result of calculations, adjustments, translations to different languages, and/or local regulatory
restrictions, as applicable. These discrepancies should not impact the overall investment analysis, views and/or recommendations of the subject
company(ies) that may be discussed in the material. J.P. Morgan accepts no liability whatsoever for any loss arising from any use of this material
or its contents, and neither J.P. Morgan nor any of its respective directors, officers or employees, shall be in any way responsible for the contents
hereof, apart from the liabilities and responsibilities that may be imposed on them by the relevant regulatory authority in the jurisdiction in
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
19
Albert Hung
(886-2) 2725-9875
albert.hung@jpmchase.com
Asia Pacific Equity Research
16 April 2023 J P M O R G A N
question, or the regulatory regime thereunder. Opinions, forecasts or projections contained in this material represent J.P. Morgan's current
opinions or judgment as of the date of the material only and are therefore subject to change without notice. Periodic updates may be provided on
companies/industries based on company-specific developments or announcements, market conditions or any other publicly available
information. There can be no assurance that future results or events will be consistent with any such opinions, forecasts or projections, which
represent only one possible outcome. Furthermore, such opinions, forecasts or projections are subject to certain risks, uncertainties and
assumptions that have not been verified, and future actual results or events could differ materially. The value of, or income from, any
investments referred to in this material may fluctuate and/or be affected by changes in exchange rates. All pricing is indicative as of the close of
market for the securities discussed, unless otherwise stated. Past performance is not indicative of future results. Accordingly, investors may
receive back less than originally invested. This material is not intended as an offer or solicitation for the purchase or sale of any financial
instrument. The opinions and recommendations herein do not take into account individual client circumstances, objectives, or needs and are not
intended as recommendations of particular securities, financial instruments or strategies to particular clients. This material may include views on
structured securities, options, futures and other derivatives. These are complex instruments, may involve a high degree of risk and may be
appropriate investments only for sophisticated investors who are capable of understanding and assuming the risks involved. The recipients of
this material must make their own independent decisions regarding any securities or financial instruments mentioned herein and should seek
advice from such independent financial, legal, tax or other adviser as they deem necessary. J.P. Morgan may trade as a principal on the basis of
the Research Analysts’ views and research, and it may also engage in transactions for its own account or for its clients’ accounts in a manner
inconsistent with the views taken in this material, and J.P. Morgan is under no obligation to ensure that such other communication is brought to
the attention of any recipient of this material. Others within J.P. Morgan, including Strategists, Sales staff and other Research Analysts, may take
views that are inconsistent with those taken in this material. Employees of J.P. Morgan not involved in the preparation of this material may have
investments in the securities (or derivatives of such securities) mentioned in this material and may trade them in ways different from those
discussed in this material. This material is not an advertisement for or marketing of any issuer, its products or services, or its securities in any
jurisdiction.
"Other Disclosures" last revised April 01, 2023.
Copyright 2023 JPMorgan Chase & Co. All rights reserved. This material or any portion hereof may not be reprinted, sold or
redistributed without the written consent of J.P. Morgan. #$J&098$#*P
Completed 15 Apr 2023 04:14 AM HKT Disseminated 15 Apr 2023 04:17 AM HKT
This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.

More Related Content

What's hot

Cloud Computing Roadmap Public Vs Private Vs Hybrid And SaaS Vs PaaS Vs IaaS ...
Cloud Computing Roadmap Public Vs Private Vs Hybrid And SaaS Vs PaaS Vs IaaS ...Cloud Computing Roadmap Public Vs Private Vs Hybrid And SaaS Vs PaaS Vs IaaS ...
Cloud Computing Roadmap Public Vs Private Vs Hybrid And SaaS Vs PaaS Vs IaaS ...SlideTeam
 
LiBRA 09.2021 / 営業向け講演
LiBRA 09.2021 / 営業向け講演LiBRA 09.2021 / 営業向け講演
LiBRA 09.2021 / 営業向け講演Masanori Saito
 
【de:code 2020】 アマダの Azure への取り組みと DevOPS・MLOPS 環境の構築と運用
【de:code 2020】 アマダの Azure への取り組みと DevOPS・MLOPS 環境の構築と運用【de:code 2020】 アマダの Azure への取り組みと DevOPS・MLOPS 環境の構築と運用
【de:code 2020】 アマダの Azure への取り組みと DevOPS・MLOPS 環境の構築と運用日本マイクロソフト株式会社
 
チームトポロジーから学び、 データプラットフォーム組織を考え直した話.pptx
チームトポロジーから学び、 データプラットフォーム組織を考え直した話.pptxチームトポロジーから学び、 データプラットフォーム組織を考え直した話.pptx
チームトポロジーから学び、 データプラットフォーム組織を考え直した話.pptxRakuten Commerce Tech (Rakuten Group, Inc.)
 
Accenture Public Service - The Future of Government Back Office Operations
Accenture Public Service - The Future of Government Back Office OperationsAccenture Public Service - The Future of Government Back Office Operations
Accenture Public Service - The Future of Government Back Office Operationsaccenture
 
reInvent reCap 2022
reInvent reCap 2022reInvent reCap 2022
reInvent reCap 2022CloudHesive
 
Crossing the low-code and pro-code chasm: a platform approach
Crossing the low-code and pro-code chasm: a platform approachCrossing the low-code and pro-code chasm: a platform approach
Crossing the low-code and pro-code chasm: a platform approachAsanka Abeysinghe
 
AWS Private Equity Transformation Advisory
AWS Private Equity Transformation AdvisoryAWS Private Equity Transformation Advisory
AWS Private Equity Transformation AdvisoryTom Laszewski
 
YugaByte DB on Kubernetes - An Introduction
YugaByte DB on Kubernetes - An IntroductionYugaByte DB on Kubernetes - An Introduction
YugaByte DB on Kubernetes - An IntroductionYugabyte
 
Computing at the Edge with AWS Greengrass and Amazon FreeRTOS, ft. Enel (IOT2...
Computing at the Edge with AWS Greengrass and Amazon FreeRTOS, ft. Enel (IOT2...Computing at the Edge with AWS Greengrass and Amazon FreeRTOS, ft. Enel (IOT2...
Computing at the Edge with AWS Greengrass and Amazon FreeRTOS, ft. Enel (IOT2...Amazon Web Services
 
AI Chip Trends and Forecast
AI Chip Trends and ForecastAI Chip Trends and Forecast
AI Chip Trends and ForecastCastLabKAIST
 
Performance Analysis of Apache Spark and Presto in Cloud Environments
Performance Analysis of Apache Spark and Presto in Cloud EnvironmentsPerformance Analysis of Apache Spark and Presto in Cloud Environments
Performance Analysis of Apache Spark and Presto in Cloud EnvironmentsDatabricks
 
The Incredible Ways Shell Uses Artificial Intelligence To Help Transform The ...
The Incredible Ways Shell Uses Artificial Intelligence To Help Transform The ...The Incredible Ways Shell Uses Artificial Intelligence To Help Transform The ...
The Incredible Ways Shell Uses Artificial Intelligence To Help Transform The ...Bernard Marr
 
Democratization - New Wave of Data Science (홍운표 상무, DataRobot) :: AWS Techfor...
Democratization - New Wave of Data Science (홍운표 상무, DataRobot) :: AWS Techfor...Democratization - New Wave of Data Science (홍운표 상무, DataRobot) :: AWS Techfor...
Democratization - New Wave of Data Science (홍운표 상무, DataRobot) :: AWS Techfor...Amazon Web Services Korea
 
Tealium+AWS Analytics サービスで実現する Customer Experience(CX)
Tealium+AWS Analytics サービスで実現する Customer Experience(CX)Tealium+AWS Analytics サービスで実現する Customer Experience(CX)
Tealium+AWS Analytics サービスで実現する Customer Experience(CX)Amazon Web Services Japan
 
組み込みから HPC まで ARM コアで実現するエコシステム
組み込みから HPC まで ARM コアで実現するエコシステム組み込みから HPC まで ARM コアで実現するエコシステム
組み込みから HPC まで ARM コアで実現するエコシステムShinnosuke Furuya
 
High Tech Digital Transformation
High Tech Digital TransformationHigh Tech Digital Transformation
High Tech Digital Transformationaccenture
 

What's hot (20)

Amazon SageMaker Clarify
Amazon SageMaker ClarifyAmazon SageMaker Clarify
Amazon SageMaker Clarify
 
Cloud Computing Roadmap Public Vs Private Vs Hybrid And SaaS Vs PaaS Vs IaaS ...
Cloud Computing Roadmap Public Vs Private Vs Hybrid And SaaS Vs PaaS Vs IaaS ...Cloud Computing Roadmap Public Vs Private Vs Hybrid And SaaS Vs PaaS Vs IaaS ...
Cloud Computing Roadmap Public Vs Private Vs Hybrid And SaaS Vs PaaS Vs IaaS ...
 
LiBRA 09.2021 / 営業向け講演
LiBRA 09.2021 / 営業向け講演LiBRA 09.2021 / 営業向け講演
LiBRA 09.2021 / 営業向け講演
 
【de:code 2020】 アマダの Azure への取り組みと DevOPS・MLOPS 環境の構築と運用
【de:code 2020】 アマダの Azure への取り組みと DevOPS・MLOPS 環境の構築と運用【de:code 2020】 アマダの Azure への取り組みと DevOPS・MLOPS 環境の構築と運用
【de:code 2020】 アマダの Azure への取り組みと DevOPS・MLOPS 環境の構築と運用
 
チームトポロジーから学び、 データプラットフォーム組織を考え直した話.pptx
チームトポロジーから学び、 データプラットフォーム組織を考え直した話.pptxチームトポロジーから学び、 データプラットフォーム組織を考え直した話.pptx
チームトポロジーから学び、 データプラットフォーム組織を考え直した話.pptx
 
Accenture Public Service - The Future of Government Back Office Operations
Accenture Public Service - The Future of Government Back Office OperationsAccenture Public Service - The Future of Government Back Office Operations
Accenture Public Service - The Future of Government Back Office Operations
 
reInvent reCap 2022
reInvent reCap 2022reInvent reCap 2022
reInvent reCap 2022
 
Crossing the low-code and pro-code chasm: a platform approach
Crossing the low-code and pro-code chasm: a platform approachCrossing the low-code and pro-code chasm: a platform approach
Crossing the low-code and pro-code chasm: a platform approach
 
AWS Private Equity Transformation Advisory
AWS Private Equity Transformation AdvisoryAWS Private Equity Transformation Advisory
AWS Private Equity Transformation Advisory
 
YugaByte DB on Kubernetes - An Introduction
YugaByte DB on Kubernetes - An IntroductionYugaByte DB on Kubernetes - An Introduction
YugaByte DB on Kubernetes - An Introduction
 
Computing at the Edge with AWS Greengrass and Amazon FreeRTOS, ft. Enel (IOT2...
Computing at the Edge with AWS Greengrass and Amazon FreeRTOS, ft. Enel (IOT2...Computing at the Edge with AWS Greengrass and Amazon FreeRTOS, ft. Enel (IOT2...
Computing at the Edge with AWS Greengrass and Amazon FreeRTOS, ft. Enel (IOT2...
 
AI Chip Trends and Forecast
AI Chip Trends and ForecastAI Chip Trends and Forecast
AI Chip Trends and Forecast
 
Performance Analysis of Apache Spark and Presto in Cloud Environments
Performance Analysis of Apache Spark and Presto in Cloud EnvironmentsPerformance Analysis of Apache Spark and Presto in Cloud Environments
Performance Analysis of Apache Spark and Presto in Cloud Environments
 
The Incredible Ways Shell Uses Artificial Intelligence To Help Transform The ...
The Incredible Ways Shell Uses Artificial Intelligence To Help Transform The ...The Incredible Ways Shell Uses Artificial Intelligence To Help Transform The ...
The Incredible Ways Shell Uses Artificial Intelligence To Help Transform The ...
 
Democratization - New Wave of Data Science (홍운표 상무, DataRobot) :: AWS Techfor...
Democratization - New Wave of Data Science (홍운표 상무, DataRobot) :: AWS Techfor...Democratization - New Wave of Data Science (홍운표 상무, DataRobot) :: AWS Techfor...
Democratization - New Wave of Data Science (홍운표 상무, DataRobot) :: AWS Techfor...
 
Center of Excellence
Center of Excellence Center of Excellence
Center of Excellence
 
Tealium+AWS Analytics サービスで実現する Customer Experience(CX)
Tealium+AWS Analytics サービスで実現する Customer Experience(CX)Tealium+AWS Analytics サービスで実現する Customer Experience(CX)
Tealium+AWS Analytics サービスで実現する Customer Experience(CX)
 
組み込みから HPC まで ARM コアで実現するエコシステム
組み込みから HPC まで ARM コアで実現するエコシステム組み込みから HPC まで ARM コアで実現するエコシステム
組み込みから HPC まで ARM コアで実現するエコシステム
 
High Tech Digital Transformation
High Tech Digital TransformationHigh Tech Digital Transformation
High Tech Digital Transformation
 
Estrategias de recuperación de información: Hot Sites
Estrategias de recuperación de información: Hot SitesEstrategias de recuperación de información: Hot Sites
Estrategias de recuperación de información: Hot Sites
 

Similar to Server industry report

Vertex Perspectives | AI-optimized Chipsets | Part I
Vertex Perspectives | AI-optimized Chipsets | Part IVertex Perspectives | AI-optimized Chipsets | Part I
Vertex Perspectives | AI-optimized Chipsets | Part IVertex Holdings
 
Vertex perspectives ai optimized chipsets (part i)
Vertex perspectives   ai optimized chipsets (part i)Vertex perspectives   ai optimized chipsets (part i)
Vertex perspectives ai optimized chipsets (part i)Yanai Oron
 
Dell NVIDIA AI Powered Transformation in Financial Services Webinar
Dell NVIDIA AI Powered Transformation in Financial Services WebinarDell NVIDIA AI Powered Transformation in Financial Services Webinar
Dell NVIDIA AI Powered Transformation in Financial Services WebinarBill Wong
 
[Cisco Connect 2018 - Vietnam] Vipul shah intel it transformation an imperat...
[Cisco Connect 2018 - Vietnam] Vipul shah  intel it transformation an imperat...[Cisco Connect 2018 - Vietnam] Vipul shah  intel it transformation an imperat...
[Cisco Connect 2018 - Vietnam] Vipul shah intel it transformation an imperat...Nur Shiqim Chok
 
[Cisco Connect 2018 - Vietnam] It transformation an imperative for driving bu...
[Cisco Connect 2018 - Vietnam] It transformation an imperative for driving bu...[Cisco Connect 2018 - Vietnam] It transformation an imperative for driving bu...
[Cisco Connect 2018 - Vietnam] It transformation an imperative for driving bu...Nur Shiqim Chok
 
Vipul shah intel it transformation an imperative for driving business outcomes
Vipul shah  intel it transformation an imperative for driving business outcomesVipul shah  intel it transformation an imperative for driving business outcomes
Vipul shah intel it transformation an imperative for driving business outcomesNetworkCollaborators
 
Internet of things Emerging Network Technology Assessment Report
Internet of things Emerging Network Technology Assessment ReportInternet of things Emerging Network Technology Assessment Report
Internet of things Emerging Network Technology Assessment ReportHuilian (Irene) Zhang
 
E-Magazine September Issue 2021
E-Magazine September Issue 2021E-Magazine September Issue 2021
E-Magazine September Issue 2021VARINDIA
 
IRJET - Implementation of SDC: Self-Driving Car based on Raspberry Pi
IRJET - Implementation of SDC: Self-Driving Car based on Raspberry PiIRJET - Implementation of SDC: Self-Driving Car based on Raspberry Pi
IRJET - Implementation of SDC: Self-Driving Car based on Raspberry PiIRJET Journal
 
Autonomous Platform with AIML Document Intelligence Capabilities to Handle Se...
Autonomous Platform with AIML Document Intelligence Capabilities to Handle Se...Autonomous Platform with AIML Document Intelligence Capabilities to Handle Se...
Autonomous Platform with AIML Document Intelligence Capabilities to Handle Se...IRJET Journal
 
“Benchmarking vs. Benchmarketing: Why Should You Care?,” a Presentation from ...
“Benchmarking vs. Benchmarketing: Why Should You Care?,” a Presentation from ...“Benchmarking vs. Benchmarketing: Why Should You Care?,” a Presentation from ...
“Benchmarking vs. Benchmarketing: Why Should You Care?,” a Presentation from ...Edge AI and Vision Alliance
 
Cisco Connect 2018 Malaysia - It transformation-an imperative for driving bus...
Cisco Connect 2018 Malaysia - It transformation-an imperative for driving bus...Cisco Connect 2018 Malaysia - It transformation-an imperative for driving bus...
Cisco Connect 2018 Malaysia - It transformation-an imperative for driving bus...NetworkCollaborators
 
Training Report DRDO.pptx
Training Report DRDO.pptxTraining Report DRDO.pptx
Training Report DRDO.pptxLeoShad1
 
IoT Based Advertising System
IoT Based Advertising SystemIoT Based Advertising System
IoT Based Advertising SystemIRJET Journal
 
Honeypotz inc company deck
Honeypotz inc company deckHoneypotz inc company deck
Honeypotz inc company deckVladimir Lialine
 
IRJET- Share Market Prediction using Deep Learning Approach
IRJET- Share Market Prediction using Deep Learning ApproachIRJET- Share Market Prediction using Deep Learning Approach
IRJET- Share Market Prediction using Deep Learning ApproachIRJET Journal
 
Reducing Total Cost of Ownership
Reducing Total Cost of OwnershipReducing Total Cost of Ownership
Reducing Total Cost of OwnershipBiBoard.Org
 
Microservices: The Future-Proof Framework for IoT
Microservices: The Future-Proof Framework for IoTMicroservices: The Future-Proof Framework for IoT
Microservices: The Future-Proof Framework for IoTCapgemini
 
IRJET - An Overview of Edge Computing
IRJET - An Overview of Edge ComputingIRJET - An Overview of Edge Computing
IRJET - An Overview of Edge ComputingIRJET Journal
 

Similar to Server industry report (20)

Vertex Perspectives | AI-optimized Chipsets | Part I
Vertex Perspectives | AI-optimized Chipsets | Part IVertex Perspectives | AI-optimized Chipsets | Part I
Vertex Perspectives | AI-optimized Chipsets | Part I
 
Vertex perspectives ai optimized chipsets (part i)
Vertex perspectives   ai optimized chipsets (part i)Vertex perspectives   ai optimized chipsets (part i)
Vertex perspectives ai optimized chipsets (part i)
 
Dell NVIDIA AI Powered Transformation in Financial Services Webinar
Dell NVIDIA AI Powered Transformation in Financial Services WebinarDell NVIDIA AI Powered Transformation in Financial Services Webinar
Dell NVIDIA AI Powered Transformation in Financial Services Webinar
 
Future of Big Data
Future of Big DataFuture of Big Data
Future of Big Data
 
[Cisco Connect 2018 - Vietnam] Vipul shah intel it transformation an imperat...
[Cisco Connect 2018 - Vietnam] Vipul shah  intel it transformation an imperat...[Cisco Connect 2018 - Vietnam] Vipul shah  intel it transformation an imperat...
[Cisco Connect 2018 - Vietnam] Vipul shah intel it transformation an imperat...
 
[Cisco Connect 2018 - Vietnam] It transformation an imperative for driving bu...
[Cisco Connect 2018 - Vietnam] It transformation an imperative for driving bu...[Cisco Connect 2018 - Vietnam] It transformation an imperative for driving bu...
[Cisco Connect 2018 - Vietnam] It transformation an imperative for driving bu...
 
Vipul shah intel it transformation an imperative for driving business outcomes
Vipul shah  intel it transformation an imperative for driving business outcomesVipul shah  intel it transformation an imperative for driving business outcomes
Vipul shah intel it transformation an imperative for driving business outcomes
 
Internet of things Emerging Network Technology Assessment Report
Internet of things Emerging Network Technology Assessment ReportInternet of things Emerging Network Technology Assessment Report
Internet of things Emerging Network Technology Assessment Report
 
E-Magazine September Issue 2021
E-Magazine September Issue 2021E-Magazine September Issue 2021
E-Magazine September Issue 2021
 
IRJET - Implementation of SDC: Self-Driving Car based on Raspberry Pi
IRJET - Implementation of SDC: Self-Driving Car based on Raspberry PiIRJET - Implementation of SDC: Self-Driving Car based on Raspberry Pi
IRJET - Implementation of SDC: Self-Driving Car based on Raspberry Pi
 
Autonomous Platform with AIML Document Intelligence Capabilities to Handle Se...
Autonomous Platform with AIML Document Intelligence Capabilities to Handle Se...Autonomous Platform with AIML Document Intelligence Capabilities to Handle Se...
Autonomous Platform with AIML Document Intelligence Capabilities to Handle Se...
 
“Benchmarking vs. Benchmarketing: Why Should You Care?,” a Presentation from ...
“Benchmarking vs. Benchmarketing: Why Should You Care?,” a Presentation from ...“Benchmarking vs. Benchmarketing: Why Should You Care?,” a Presentation from ...
“Benchmarking vs. Benchmarketing: Why Should You Care?,” a Presentation from ...
 
Cisco Connect 2018 Malaysia - It transformation-an imperative for driving bus...
Cisco Connect 2018 Malaysia - It transformation-an imperative for driving bus...Cisco Connect 2018 Malaysia - It transformation-an imperative for driving bus...
Cisco Connect 2018 Malaysia - It transformation-an imperative for driving bus...
 
Training Report DRDO.pptx
Training Report DRDO.pptxTraining Report DRDO.pptx
Training Report DRDO.pptx
 
IoT Based Advertising System
IoT Based Advertising SystemIoT Based Advertising System
IoT Based Advertising System
 
Honeypotz inc company deck
Honeypotz inc company deckHoneypotz inc company deck
Honeypotz inc company deck
 
IRJET- Share Market Prediction using Deep Learning Approach
IRJET- Share Market Prediction using Deep Learning ApproachIRJET- Share Market Prediction using Deep Learning Approach
IRJET- Share Market Prediction using Deep Learning Approach
 
Reducing Total Cost of Ownership
Reducing Total Cost of OwnershipReducing Total Cost of Ownership
Reducing Total Cost of Ownership
 
Microservices: The Future-Proof Framework for IoT
Microservices: The Future-Proof Framework for IoTMicroservices: The Future-Proof Framework for IoT
Microservices: The Future-Proof Framework for IoT
 
IRJET - An Overview of Edge Computing
IRJET - An Overview of Edge ComputingIRJET - An Overview of Edge Computing
IRJET - An Overview of Edge Computing
 

Recently uploaded

Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024The Digital Insurer
 
Advantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessAdvantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessPixlogix Infotech
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...Martijn de Jong
 
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...Neo4j
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Enterprise Knowledge
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking MenDelhi Call girls
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUK Journal
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountPuma Security, LLC
 
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxFactors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxKatpro Technologies
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024The Digital Insurer
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxMalak Abu Hammad
 
Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)wesley chun
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?Antenna Manufacturer Coco
 
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEarley Information Science
 
CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Servicegiselly40
 
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024The Digital Insurer
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking MenDelhi Call girls
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationMichael W. Hawkins
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Miguel Araújo
 

Recently uploaded (20)

Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024
 
Advantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessAdvantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your Business
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...
 
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path Mount
 
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxFactors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptx
 
Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?
 
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
 
CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Service
 
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day Presentation
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
 

Server industry report

  • 1. Asia Pacific Equity Research 16 April 2023 J P M O R G A N www.jpmorganmarkets.com Correction (first published 15 April 2023) (See disclosures for details) Technology - Hardware Albert Hung AC (886-2) 2725-9875 albert.hung@jpmchase.com Bloomberg JPMA AHUNG <GO> J.P. Morgan Securities (Taiwan) Limited/ J.P. Morgan Securities (Asia Pacific) Limited/ J.P. Morgan Broking (Hong Kong) Limited Gokul Hariharan (852) 2800-8564 gokul.hariharan@jpmorgan.com J.P. Morgan Securities (Asia Pacific) Limited/ J.P. Morgan Broking (Hong Kong) Limited JJ Park (82-2) 758-5717 jj.park@jpmorgan.com J.P. Morgan Securities (Far East) Limited, Seoul Branch Jerry Tsai (886-2) 2725-9867 jerry.tsai@jpmorgan.com J.P. Morgan Securities (Taiwan) Limited Anthony Leng (886-2) 2725-9240 anthony.leng@jpmorgan.com J.P. Morgan Securities (Taiwan) Limited Robert Hsu (886-2) 2725-9864 robert.hsu@jpmorgan.com J.P. Morgan Securities (Taiwan) Limited Jennifer Hsieh (886-2) 2725-9868 jennifer.hsieh@jpmorgan.com J.P. Morgan Securities (Taiwan) Limited Sangsik Lee (82-2) 758 5146 sangsik.lee@jpmorgan.com J.P. Morgan Securities (Far East) Limited, Seoul Branch Following our deep-dive note on AI semis in 2018 (link), we update our AI server shipment forecast and provide a BoM cost analysis of Nvidia HGX H100/A100 servers. We estimate 3% AI server shipment mix in 2022 and forecast a 42% shipment CAGR in the next 5 years driven by increasing investment by ISPs in machine learning, monetization in inference applications, and rising adoption in AI cloud service platforms. We estimate the costs of AI servers to be around 15x/32x (A100/H100 server, respectively) higher than that of regular servers, driven mainly by silicon value increase (especially GPUs and memory) and higher hardware specs requirement. In the Asian tech space, we identify TSMC, SK Hynix, Unimicron, ASPEED, Wistron, Quanta, Delta and Sunonwealth as the major beneficiaries of exponential growth of AI servers. • How to define AI servers? We define AI servers as servers with GPUs and AI ASIC (ex: Google’s TPU). We believe most of the AI activities remain at the training stage now and inference applications are still limited. Given Nvidia’s lion’s share in the GPU server market (85%-90% market share, according to IDC), we use Nvidia’s GPU server shipment to derive our current AI server volumes and forecasts. We estimate ~3% of total servers to be AI related, with the higher-end SKUs possibly representing only one-third of that. • Inference is the key to drive volume upsides. Despite recent market hype on AI, we have seen limited order increase in AI servers. We attribute the limited volume upsides to the reuse of AI servers to train different algorithms and limited monetization in AI applications. Still, we forecast AI server market to grow by 3.5x in 5 years with accelerating investment in machine learning and mature monetization in inference applications. We believe the rising AI server trend, along with market focus shifting to 2H23 demand recovery, is likely to drive valuation re-rating on the related stocks despite limited revenue contribution now. • Datacenter GPUs drive leading edge node migration, memory also a key beneficiary. Semiconductor comprises 90%+ of total AI server BoM cost vs. 65-70% of regular servers, driven mainly by high-end CPUs, incremental GPUs and rising requirement for memory (~4x/5x DRAM/NAND content vs. regular servers). The high silicon costs have incentivized hyperscalers to develop in-house datacenter ASICs, in our view. The heterogeneous computingtrendbodeswellforTSMC,inourview.Besides,wealsoseehigher BMC content in AI servers. • Rising power consumption and more complex system integration for AI servers. We see ODM, PCB, power supply, and heat dissipation as major beneficiaries of AI servers in the hardware space. Server ODMs benefit from themorecomplexdesignandsystemintegration.PCBmakersenjoyincreasing layer counts in AI server PCB. The high-end CPUs and incremental GPUs consumemorepowerandthisleadstocontentgrowthinpowersupplyandheat dissipation modules. See page 15 for analyst certification and important disclosures, including non-US analyst disclosures. J.P. Morgan does and seeks to do business with companies covered in its research reports. As a result, investors should be aware that the firm may have a conflict of interest that could affect the objectivity of this report. Investors should consider this report as only a single factor in making their investment decision. AI Servers Deconstructing the BoM and understanding potential upside for Asia Tech hardware This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 2. 2 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N Key charts and tables Figure 1: AI server ecosystem and key drive factors Source: J.P. Morgan. This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 3. 3 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N Table 1: AI related exposure in our tech coverage Company Nature of involvement % of revenue likely in 2023 Comments TSMC Foundry + backend for AI (CoWoS) 2-3% 100% market share for GPU for Gaming and AI as well as Mellanox DPUs, CoWoS for AI Chips Unimicron AI chip substrates 1-2% Unimicron is the secondary supply for the substrates used in the AI chips. SK Hynix HBM or GDDR6 DRAM Mid-HSD% Sole supplier of HBM to NVIDIA. 60-70% M/S in HBM market ASPEED AI server BMC 1% Delta AI server power supply and fans Limited Has great potential in power supply and fans given larger power consumption in AI server Sunonwealth AI server fans 1-2% Has great potential in fans given larger power consumption in AI server FII GPU server/module ODMs 5-10% Wistron GPU server/module ODMs 2-3% Wiwynn GPU server ODMs Teens % 50% of current project pipeline (in terms of project number) are AI related. Inventec GPU server mainboard ODMs 5-10% Mainly MSFT, Google and small contribution from Amazon AI projects. Quanta GPU server ODMs 5-10% Key supplier of MSFT AI servers. Source: Company data, J.P. Morgan estimates. What are AI servers? Our definition of AI servers is the servers with GPU and AI ASIC (TPU). We believe the majority of AI server activities are in the training phase, while there are limited applications in the inference. As Nvidia’s GPU has dominant market shares (85%-90% market share, according to IDC) in AI training due to the multi-cores structure, we derived AI server volumes from Nvidia’s GPU shipment. Figure 2: Block diagram of a heterogeneous compute implementation GPU Source: Gartner, J.P. Morgan. (Link). *Note: ASIC includes Google’s TPU, Amazon’s Inferentia etc. AI server shipment still small, but revenue contribution reached teens % There were 3.2mn datacenter GPUs in 2022, of which ~35% was Nvidia’s A100/V100 GPU. If we assume each AI server has 8 GPUs, the implied GPU server shipment was 400k units in last year. We estimate 90% of AI servers to be GPU servers. This implies total AI server shipment to be 440k in 2022, which was 3.3% of the total server market. In a simplified sense, the process of AI / Machine Learning can be divided into 2 main steps – (1) Training, for the system to learn and perfect a model or algorithm from massive datasets, and (2) Inference, for the system to apply the model in a real-life scenario or use case like facial recognition, speech recognition. This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 4. 4 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N According to IDC, servers that cost US$25k and above comprised ~2% /~11% of total server shipment/value in 2022. While AI server shipment contribution remains limited, we estimate teens % revenue contribution of AI servers given the multiple times of ASP vs. regular servers. Table 2: AI server penetration estimate 2022 Datacenter GPU shipment (mn) 3.2 Nvidia A100/V100 shipment 1.1 A100/V100 shipment mix 34% GPU per servers 8 GPU server volume (mn) 0.40 % of GPU for AI applications 100% GPU servers as % of AI servers 90% AI server volumes (mn) 0.44 Global server shipment 13.5 AI server as % of total mix 3.3% Source: IDC, Gartner, J.P. Morgan estimates. Accelerating AI server market growth We forecast a 42% AI server shipment CAGR from 2022-27, buoyed by accelerating investment in machine learning, proliferation of inference applications and higher rate of adoption of AI cloud service. Consequently, we expect the AI server penetration rate to increase from 3% in 2022 to 15% in 2027. Figure 3: AI server shipment forecast 0.9% 1.3% 2.3% 3.3% 5.3% 7.5% 9.8% 12.4% 15.0% 0% 2% 4% 6% 8% 10% 12% 14% 16% - 0.5 1.0 1.5 2.0 2.5 2019 2020 2021 2022 2023E 2024E 2025E 2026E 2027E AI server volumes (in mn units) AI server % mix Source: IDC, J.P. Morgan estimates. Increasing complexity in model training This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 5. 5 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N According to OpenAI, the total computing workload used in AI training has been doubling every 2 years starting from 1959, and the doubling time has accelerated to 3.4 months since 2012. Looking forward, the Research Lab believes that the doubling time is likely to speed up further given the increasing trend of algorithmic innovations (i.e. increasing genres of AI-specific chips) and improving cost burden (better affordability of hardware chips for AI training). Take generative large language model (LLM) training as an example. Total computing flops requirements have increased to 3,640 training petaflop/s per day in GPT-3 175B model (launched in 2H20), ~10x more than 382 petaflop/s in T5-11B model (launched in 2H19). Figure 4: Compute requirement of Large Language Model training Source: OpenAI (link). How to translate required training parameters into GPU consumption? Nvidia provides various computing power levels under different learning structures for a single GPU. Take A100 as an example. A single 4-GPU based DGX A100 server could generate 1.25 PetaFLOP per second under Tensor Float 32 structure. As the required total training compute of Chat GPT-3 is 3.14 * 10^23 FLOPs, this implies that it will take ~300 4-GPU A100 server units to keep the training time within 30 days, on our estimates. Of note, the required training compute of each model is correlated with the number of parameters and training tokens. The sharp increase in the parameters of new language models implies higher computing power consumption. Floating point operations per second (FLOPS) is a commonly used performance indicator of machine learning hardware, due to the prevalence of using floating point, instead of integer, in deep learning. Of note, GigaFLOPs= 10^9, TeraFLOPs= 10^12, PetaFLOPs= 10^15. This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 6. 6 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N Table 3: GPU consumption estimates in AI training (ChatGPT 3.0) Note Single A100 card computing power 312 TeraFLOP/s; under TF32 4-GPUs A100 server machine computing power 1.25 PetaFLOP/s redundancy rates 67% A single 4-GPUs A100 server computing power per day 35,583 PetaFLOP/day GPT3 175B model training requirements 314,496,000 PetaFLOP/day Required days to train GPT3 model on single A100 server 8,838 Days How many 4-GPUs A100 server machine to keep training time in 1 month 295 servers Source: J.P. Morgan estimates. More complete AI ecosystem attracts new investment AI is playing a critical role in the world. There are more use cases relying on AI algorithms and machine learning to save time and costs across industries. People are more frequently using ChatGPT to collect data and make better decisions in a shorter time frame. Businesses are leveraging AI algorithms to automate operational processes to improve efficiency and save costs. According to a McKinsey & Company survey, AI adoption in enterprises has more than doubled in 2022 (50%) vs. 2017 (20%). The rise of underlying AI demand will lead to a more complete AI ecosystem, in our view. During Nvidia’s GTC investor day, the company indicated strong growth in the number of developers, CUDA downloads, AI startups, and GPU-accelerated applications. Similarly, Intel saw 85%+ install base increase in its open accelerated computing platform during the investor webinar. Given that server capacity and computing power are the key factors of AI infrastructure, we believe the pent-up AI demand and emerging AI applications will lead to a fast growth of AI server spending. According to IDC, AI server infrastructure will grow at a 17% CAGR in 2021-26. Figure 5: AI demand growth Source: Nvidia. Inference is the key for volume upside This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 7. 7 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N We are still in the early stage of AI, with most applications focused on machine learning and AI training. As hyperscalers could leverage the same servers to train different algorithms under various time frames, this implies limited volume upsides in the training phase for the same customers. Of note, we estimate the OpenAI organisation to have 2-3k GPU servers while it only takes several hundred AI servers to train a single model in one month. Besides, training is a cost factor for hyperscalers so stringent investment in AI learning is important to keep their margins stable. But, if there were heightened competition in the existing business areas, Internet service providers could prioritize market shares over profitability and invest heavily on front-end research. For example, we believe Google would be likely to accelerate its investments in search engine related areas to defend against competition from MSFT’s GPT-powered Bing. In the medium to near term, we expect more internet service providers to join AI research competition and drive front-end investments in AI servers for training. In the medium to long term, we believe inference plays a more important role in AI server volume scale. Take ChatGPT-3 as an example in Table 4. The GPU server consumption could grow by multiple times if the number of users or the frequency of queries increases. Besides, the higher number of parameters in each generation could also increase the FLOPs consumption proportionally. As the rising traction of inference applications will incur meaningful costs, the key watch point is whether hyperscalers are able to monetize the product and support the expansion. Table 4: Generative AI running cost - Inference (ChatGPT 3.0) Scenario 1 Scenario 2 Scenario 3 Note Chat GPT users (mn) 100 200 300 Achieved 100 mn users in two months Monthly visit times 3.9 4.5 5.2 13mn daily users Query each time 4 8 12 Words each Query 400 500 600 Input+ output Generated words each month (mn) 624,000 3,588,000 11,140,740 Generated words per second 240,741 1,384,259 4,298,125 Implied tokens per second 180,556 1,038,194 3,223,594 1 word = 0.75 token Chat GPT Parameters (bn) 175 175 175 A100 computing power (TFLOPs) 624 Under FP16 or INT8 Required second per token by single A100 card 0.00056 GPU UTRs 50% Average word outputs per second on Chat GPT 5.6 Required A100 cards 1,134 6,522 20,251 Nvidia A100 cost 11,000 Required monthly running costs (USD mn) 12 72 223 Source: J.P. Morgan estimates. Key assumptions of the exercise: • Monthly visit times: We assume 13mn daily users in Scenario 1. • Words in each query: This include the input words and generated words. • Implied tokens per second: We assume 1 English word = 0.75 token. • A100 computing power: 624 TFLOPs under FP16 or INT8 structure. • Required second per token by single A100 card: (2* 175bn parameters) /A100 computing power (TFLOPs). • GPU UTRs: GPU cannot run at peak levels all the time. Also need to build extra capacity as ChatGPT users could be more concentrated in certain regions. This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 8. 8 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N • Average word outputs per second on ChatGPT: The responsive time/ word has been largely consistent during our test. • Required A100 cards: We assume OpenAI to keep responsive time the same. • Nvidia A100 cost: We do not include other hardware costs such as server power, chassis, etc. Have we seen upsides in AI server shipment recently? We have seen increasing market hype on AI topics given the positive feedback on ChatGPT and generative AI models. Although some hyperscalers have recently accelerated their investments in machine learning and AI training, the magnitude of upward revision on server units appeared milder than expected. Our research indicates limited AI server volume upside while there is supply chain bottleneck in TSMC’s CoWoS HBM3 due to lower yields. This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 9. 9 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N BoM cost comparison between AI and regular servers In the following table, we estimate the component costs of regular servers and AI servers. The key difference of AI servers vs. regular servers is GPU (or accelerators), which account for 70%+ of AI server BoM cost. The silicon content increase also leads to higher requirements of memory/storage, networking transmission speed, power consumption and heat dissipation. Overall, we estimate AI server BoM cost to be 15x/32x higher than that of regular servers. Table 5: Server BoM cost analysis: Regular server vs. GPU/AI server Content value % of Total BoM Content value % of Total BoM Content value % of Total BoM CPU 2,166 29% 13,900 12% 21,420 9% GPU 0 0% 80,000 71% 200,000 83% CPU DIMM (DDR5) 1,380 18% 4,600 4% 4,600 2% Storage SSD 1,365 18% 6,825 6% 6,825 3% Network Cards (NIC) 155 2% 1,000 1% 1,000 0% Chassis Costs 20 0% 40 0% 40 0% Motherboard: Dual Sockets 300 4% 360 0% 360 0% Power Supply 300 4% 1,200 1% 1,200 0% Storage Backplane 83 1% 83 0% 83 0% Drive Caddies 57 1% 57 0% 57 0% Fans 75 1% 270 0% 270 0% Heat dissipation module excl. fans (heat pipe) 30 0% 100 0% 100 0% Internal Cables 20 0% 20 0% 20 0% Riser Cards 20 0% 20 0% 20 0% Sheet Metal Case 100 1% 200 0% 200 0% PCB 325 4% 650 1% 650 0% Assembly Labor and Test 495 7% 1,485 1% 1,485 1% Markup 689 9% 2,067 2% 2,067 1% Total Cost 7,580 100% 112,877 100% 240,397 100% AI server BoM vs. Regular server BoM 14.9x 31.7x Regular servers GPU/AI servers (A100x8) GPU/AI servers (H100x8) Source: Company data, J.P.Morgan estimates. Figure 6: Key component BoM breakdown - regular server CPU, 29% CPU DIMM, 18% NAND storage, 18% Others, 35% Source: J.P. Morgan estimates. Figure 7: Key component BoM breakdown - H100 server GPU, 83% CPU, 9% CPU DIMM, 2% NAND storage, 3% Others, 3% Source: J.P. Morgan estimates. GPU: The key BoM cost boost, good for leading-edge Foundry vendors A single AI server usually includes 2/4/8 GPUs for parallel processing to accelerate the computing. In the BoM cost analysis, we assume 8 GPUs in a single AI server and estimate US$10k/25k for a A100/H100 module. Nvidia has been improving the computing power of datacenter GPUs, resulting in meaningful price upticks but lower cost per computing power. We expect such GPU This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 10. 10 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N price trend to continue in the next few generations. Figure 8: Nvidia datacenter GPU computing power comparison In TFLOPs (Tera floating point operations per second) 10.6 15.7 19.5 60.0 0 10 20 30 40 50 60 70 P100 V100 A100 H100 Source: Company data, J.P. Morgan. Figure 9: Nvidia’s datacenter GPU prices and cost per compute US$k,US$/TFLOPs 9 11 10 25 0.9 0.7 0.5 0.4 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 0 5 10 15 20 25 30 P100 V100 A100 H100 Source: Company data, J.P.Morgan calculations. Prices refer to SXM model. CPU: Higher specs requirement but not the key silicon Dual socket CPU configuration comprises 80%+ of current servers. As GPUs and AI ASIC (TPU) are key drivers to accelerating processing, the number of CPUs do not increase in AI servers. However, the CPU specs of AI server are much higher than the regular ones. According to Nvidia, the default CPUs in DGX A100 and DGX H100 are AMD EPYC 7742 (Rome) and Intel 4th gen Xeon 8480C (EagleStream) processors. The prices of both CPUs are 5-10x higher than mainstream server CPUs. Memory and storage: Meaningful content increase to facilitate AI workload The DRAM content is around 600 GB per regular server while the default DRAM specs of Nvidia’s HGX/DGX series is 2 TB. There is also 40GB/80GB GDDR per GPU. Assuming 8 GPUs in a single AI server, total DRAM content could be 2 TB+ 320GB/640GB GDDR. Besides, we also see higher specs of DRAM in AI servers such as DDR5 adoption in CPU DIMM and high bandwidth memory (HBM) for GDDR. NAND content also increases in AI servers due to higher requirements to store the data set. We assume 20 TB of average NAND storage per AI server, while Nvidia’s HGX/ DGX series support 30 TB NVMe SSD storage and mainstream regular server NAND content is 4 TB now. Our Korean memory team (led by analyst JJ Park) estimates that AI server contribution was ~4% of total memory revenue in 2022 and it will increase to ~9%-12% from FY24- FY27E. This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 11. 11 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N Table 6: AI server contribution to total memory revenue 2022 2023E 2024E 2025E 2026E 2027E DRAM revenue (US$mn) 77,768.9 41,529.8 58,119.6 63,931.5 70,324.7 77,357.1 NAND revenue (US$mn) 47,079.1 34,254.8 46,801.3 53,119.5 60,290.6 68,429.9 Total MM (US$mn) 124,848.0 75,784.6 104,920.9 117,051.0 130,615.3 145,787.0 Total AI server revenue (DRAM/NAND) 4,780.9 6,517.6 9,048.3 11,821.7 14,883.9 18,031.6 % of Memory Market 4% 9% 9% 10% 11% 12% DRAM CPU AI Server demand (8Gb, M) 907 1,393 2,176 3,084 4,209 5,525 AI server shipment (M) 0.4 0.7 1.0 1.4 1.8 2.2 Server density (GB/system) 2,048 2,048 2,150 2,258 2,371 2,489 DDR5 ASP (US$/GB) 2.0 1.9 1.8 1.7 1.6 1.5 (A) DRAM Revenue catered to CPU 1,814.5 2,646.5 3,926.9 5,288.0 6,857.3 8,550.2 GPU AI server demand (8Gb, M) 191 277 408 578 789 1,036 HBM server density (GB/system) 480 480 504 529 556 583 GPU server shipment (M) 0.4 0.6 0.8 1.1 1.4 1.8 HBM ASP (US$/GB) 3.0 2.7 2.4 2.2 2.0 1.8 (B) DRAM Revenue catered to GPU 574.1 749.2 991.2 1,264.6 1,553.5 1,835.1 (A) + (B) 2,388.7 3,395.8 4,918.1 6,552.6 8,410.8 10,385.3 % of DRAM revenue 3.1% 8.2% 8.5% 10.2% 12.0% 13.4% NAND (Storage SSD) AI server demand (8Gb, M) 7,974 11,562 16,997 24,093 32,886 43,163 Storage density (GB/system) 20,000 20,000 21,000 22,050 23,153 24,310 GPU server shipment (M) 0.4 0.6 0.8 1.1 1.4 1.8 ASP 0.3 0.3 0.2 0.2 0.2 0.2 Revenue (US$mn) 2,392.2 3,121.8 4,130.2 5,269.1 6,473.0 7,646.3 % of NAND revenue 5.1% 9.1% 8.8% 9.9% 10.7% 11.2% Source: iSuppli, Gartner, WSTS, J.P. Morgan Korean memory team estimates. PCB: Higher layer counts and lower yields drive ASP upticks We estimate 10-12 layers of regular server PCB while AI servers require higher end PCB including 18-20 layers. The higher number of PCB layers implies not only more content value but also higher difficulty in production yields. Consequently, we estimate 50-100% ASP upticks of AI server PCB vs. regular server PCB. ODM: More complex design and system integration While the general ODM feedback suggests similar margin for AI servers, we believe ODM margins should be diluted by the higher GPU costs. Still, the more complex configuration design, longer testing time, and pricing premiums for niche models will likely drive higher profit dollars. We believe several server ODMs are considering to change the pricing model from “buy and sell” to “consign” in AI servers. In this case, the AI server price could be reduced by 60-70% while margins could be higher. This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 12. 12 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N Figure 10: GPU module (A100) Source: Nvidia. Figure 11: GPU server (DGX H100) Source: Nvidia. Power supply: A single GPU chip consumes similar power of a regular server Processing chip is the key component of power consumption in servers. The TDP (Thermal Design Power) of single server CPU is around 300W and a two socket regular server could consume 1200-1600W. The TDP of GPU ranges from 300W to 700W and it is around 50W per Smart NIC card. Therefore, most regular servers require 1+1 1.2k- 1.8kW server power supply while AI server power supply requirement could range from 2 to 4 *3kW. Figure 12: Nvidia’s datacenter GPU power consumption in TDP Watt 250 300 400 700 0 100 200 300 400 500 600 700 800 P100 V100 A100 H100 Source: Company data. Heat dissipation: Air cooling still the majority, liquid cooling the next trend Air cooling (fans + heat pipe/ vapor chamber) is still the mainstream heat dissipation solution for servers now. Nvidia’s A100 servers have much higher total design power and require more advanced heat dissipation. Our research indicates prices are 3x-5x for This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 13. 13 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N AI heat dissipation solutions vs. regular server solutions. Currently some ISPs adopt immersion liquid cooling as a transitional solution, which is costly but has better heat dissipation performance vs. air cooling. We believe solution providers have been researching on a more economical liquid cooling (cold-plate) solution to tackle the much higher heat generation of AI servers (such as H100 server). Figure 13: Immersion liquid cooling (two phase) Source: Wiwynn. Figure 14: Cold-plate liquid cooling system Source: Wiwynn. Networking: NVlink and NVSwitch It takes multiple AI servers to train a single algorithm so the data transmission between GPUs in different AI servers is important to reduce the latency. Nvidia has launched NVlink to improve the communication between GPUs within a single AI server and NVSwtich to connect the various AI rack servers. Consequently, we expect the rising AI server mix to drive the networking upgrade in datacenter switch and volume upsides of Smart NIC/DPU. Figure 15: Nvidia’s NVLink Source: Company data. Figure 16: Nvidia’s NVSwitch Source: Company data. AI beneficiaries in Asia tech In the Asian tech space, we identify key beneficiaries under the AI explosion trend, including TSMC (key datacenter GPU foundry), SK Hynix (key HBM3 supplier), Unimicron (AI chips substrate), ASPEED (AI server BMC content growth), Wistron (GPU server subsystem supplier), Quanta (key GPU server ODM), Delta and This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 14. 14 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N Sunonwealth (increasing power supply and fan contents in AI server). Table 7: AI related exposure in our tech coverage Company Nature of involvement % of revenue likely in 2023 Comments TSMC Foundry + backend for AI (CoWoS) 2-3% 100% market share for GPU for Gaming and AI as well as Mellanox DPUs, CoWoS for AI Chips Unimicron AI chip substrates 1-2% Unimicron is the secondary supply for the substrates used in the AI chips. SK Hynix HBM or GDDR6 DRAM Mid-HSD% Sole supplier of HBM to NVIDIA. 60-70% M/S in HBM market ASPEED AI server BMC 1% Delta AI server power supply and fans Limited Has great potential in power supply and fans given larger power consumption in AI server Sunonwealth AI server fans 1-2% Has great potential in fans given larger power consumption in AI server FII GPU server/module ODMs 5-10% Wistron GPU server/module ODMs 2-3% Wiwynn GPU server ODMs Teens % 50% of current project pipeline (in terms of project number) are AI related. Inventec GPU server mainboard ODMs 5-10% Mainly MSFT, Google and small contribution from Amazon AI projects. Quanta GPU server ODMs 5-10% Key supplier of MSFT AI servers. Source: Company data, J.P. Morgan estimates. This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 15. 15 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N Correction: The following corrections have been made in the report: 1) Page 7, first paragraph, the table reference has been corrected; 2) Page 9 text, ''xPUs'' has changed to ''GPUs and AI ASIC (TPU)''; 3) Page 11, the text on AI server power supply requirement has been corrected; 4) Page 8, the text on AI server BoM cost has been corrected; 5) Figure 1, Figure 7 and Table 5 have been updated to fix inadvertent errors. Companies Discussed in This Report (all prices in this report as of market close on 14 April 2023) ASPEED Technology Inc.(5274.TWO/NT$2,755.00/OW), Delta Electronics, Inc.(2308.TW/NT$314.50/OW), Quanta Computer Inc.(2382.TW/NT$81.80/N), SK hynix(000660.KS/W89,300/OW), Sunonwealth(2421.TW/NT$52.90/OW), TSMC(2330.TW/ NT$516.00/OW), Unimicron(3037.TW/NT$139.00/OW), Wistron Corporation(3231.TW/NT$43.95/N) Analyst Certification: The Research Analyst(s) denoted by an “AC” on the cover of this report certifies (or, where multiple Research Analysts are primarily responsible for this report, the Research Analyst denoted by an “AC” on the cover or within the document individually certifies, with respect to each security or issuer that the Research Analyst covers in this research) that: (1) all of the views expressed in this report accurately reflect the Research Analyst’s personal views about any and all of the subject securities or issuers; and (2) no part of any of the Research Analyst's compensation was, is, or will be directly or indirectly related to the specific recommendations or views expressed by the Research Analyst(s) in this report. For all Korea-based Research Analysts listed on the front cover, if applicable, they also certify, as per KOFIA requirements, that the Research Analyst’s analysis was made in good faith and that the views reflect the Research Analyst’s own opinion, without undue influence or intervention. All authors named within this report are Research Analysts who produce independent research unless otherwise specified. In Europe, Sector Specialists (Sales and Trading) may be shown on this report as contacts but are not authors of the report or part of the Research Department. Important Disclosures Market Maker/ Liquidity Provider: J.P. Morgan is a market maker and/or liquidity provider in the financial instruments of/related to SK hynix. Client: J.P. Morgan currently has, or had within the past 12 months, the following entity(ies) as clients: SK hynix. Client/Non-Investment Banking, Securities-Related: J.P. Morgan currently has, or had within the past 12 months, the following entity(ies) as clients, and the services provided were non-investment-banking, securities-related: SK hynix. Potential Investment Banking Compensation: J.P. Morgan expects to receive, or intends to seek, compensation for investment banking services in the next three months from SK hynix. Non-Investment Banking Compensation Received: J.P. Morgan has received compensation in the past 12 months for products or services other than investment banking from SK hynix. Debt Position: J.P. Morgan may hold a position in the debt securities of SK hynix, if any. Gartner: All statements in this report attributable to Gartner represent J.P. Morgan's interpretation of data opinion or viewpoints published as part of a syndicated subscription service by Gartner, Inc., and have not been reviewed by Gartner. Each Gartner publication speaks as of its original publication date (and not as of the date of this report). The opinions expressed in Gartner publications are not representations of fact, and are subject to change without notice. Company-Specific Disclosures: Important disclosures, including price charts and credit opinion history tables, are available for compendium reports and all J.P. Morgan–covered companies, and certain non-covered companies, by visitinghttps://www.jpmm.com/research/disclosures, calling 1-800-477-0406, or e-mailing research.disclosure.inquiries@jpmorgan.com with your request. Explanation of Equity Research Ratings, Designations and Analyst(s) Coverage Universe: J.P. Morgan uses the following rating system: Overweight [Over the next six to twelve months, we expect this stock will outperform the average total return of the stocks in the analyst’s (or the analyst’s team’s) coverage universe.] Neutral [Over the next six to twelve months, we expect this stock will perform in line with the average total return of the stocks in the analyst’s (or the analyst’s team’s) coverage universe.] Underweight [Over the next six to twelve months, we expect this stock will underperform the average total return of the stocks in the analyst’s (or the analyst’s team’s) coverage universe.] Not Rated (NR): J.P. Morgan has removed the rating and, if applicable, the price target, for this stock because of either a lack of a sufficient fundamental basis or for legal, regulatory or policy reasons. The previous rating and, if applicable, the price target, no longer should be relied upon. An NR designation is not a recommendation or a rating. In our Asia (ex-Australia and ex-India) and U.K. small- and mid-cap equity research, each stock’s expected total return is compared to the expected total return of a benchmark country market index, not to those analysts’ coverage universe. If it does not appear in the Important Disclosures section of this report, the certifying analyst’s coverage universe can be found on J.P. Morgan’s research website, https://www.jpmorganmarkets.com. Coverage Universe: Hung, Albert: ASPEED Technology Inc. (5274.TWO), ASUSTek Computer (2357.TW), Chindata (CD), Compal Electronics, Inc. (2324.TW), Inventec (2356.TW), Lenovo Group Limited (0992) (0992.HK), Micro-Star International Co., Ltd. (2377.TW), Pegatron Corp (4938.TW), Quanta Computer Inc. (2382.TW), VNET Group (VNET), Wistron Corporation (3231.TW), Wiwynn Corp (6669.TW) This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 16. 16 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N J.P. Morgan Equity Research Ratings Distribution, as of April 01, 2023 Overweight (buy) Neutral (hold) Underweight (sell) J.P. Morgan Global Equity Research Coverage* 47% 38% 15% IB clients** 47% 44% 34% JPMS Equity Research Coverage* 46% 41% 13% IB clients** 66% 65% 53% *Please note that the percentages might not add to 100% because of rounding. **Percentage of subject companies within each of the "buy," "hold" and "sell" categories for which J.P. Morgan has provided investment banking services within the previous 12 months. For purposes only of FINRA ratings distribution rules, our Overweight rating falls into a buy rating category; our Neutral rating falls into a hold rating category; and our Underweight rating falls into a sell rating category. Please note that stocks with an NR designation are not included in the table above. This information is current as of the end of the most recent calendar quarter. Equity Valuation and Risks: For valuation methodology and risks associated with covered companies or price targets for covered companies, please see the most recent company-specific research report at http://www.jpmorganmarkets.com, contact the primary analyst or your J.P. Morgan representative, or email research.disclosure.inquiries@jpmorgan.com. For material information about the proprietary models used, please see the Summary of Financials in company-specific research reports and the Company Tearsheets, which are available to download on the company pages of our client website, http://www.jpmorganmarkets.com. This report also sets out within it the material underlying assumptions used. A history of J.P. Morgan investment recommendations disseminated during the preceding 12 months can be accessed on the Research & Commentary page of http://www.jpmorganmarkets.com where you can also search by analyst name, sector or financial instrument. Analysts' Compensation:The research analysts responsible for the preparation of this report receive compensation based upon various factors, including the quality and accuracy of research, client feedback, competitive factors, and overall firm revenues. Registration of non-US Analysts: Unless otherwise noted, the non-US analysts listed on the front of this report are employees of non-US affiliates of J.P. Morgan Securities LLC, may not be registered as research analysts under FINRA rules, may not be associated persons of J.P. Morgan Securities LLC, and may not be subject to FINRA Rule 2241 or 2242 restrictions on communications with covered companies, public appearances, and trading securities held by a research analyst account. Other Disclosures J.P. Morgan is a marketing name for investment banking businesses of JPMorgan Chase & Co. and its subsidiaries and affiliates worldwide. UK MIFID FICC research unbundling exemption: UK clients should refer to UK MIFID Research Unbundling exemption for details of JPMorgan’s implementation of the FICC research exemption and guidance on relevant FICC research categorisation. All research material made available to clients are simultaneously available on our client website, J.P. Morgan Markets, unless specifically permitted by relevant laws. Not all research content is redistributed, e-mailed or made available to third-party aggregators. For all research material available on a particular stock, please contact your sales representative. Any long form nomenclature for references to China; Hong Kong; Taiwan; and Macau within this research material are Mainland China; Hong Kong SAR (China); Taiwan (China); and Macau SAR (China). J.P. Morgan Research may, from time to time, write on issuers or securities targeted by economic or financial sanctions imposed or administered by the governmental authorities of the U.S., EU, UK or other relevant jurisdictions (Sanctioned Securities). Nothing in this report is intended to be read or construed as encouraging, facilitating, promoting or otherwise approving investment or dealing in such Sanctioned Securities. Clients should be aware of their own legal and compliance obligations when making investment decisions. Any digital or crypto assets discussed in this research report are subject to a rapidly changing regulatory landscape. For relevant regulatory advisories on crypto assets, including bitcoin and ether, please see https://www.jpmorgan.com/disclosures/cryptoasset-disclosure. The author(s) of this research report may not be licensed to carry on regulated activities in your jurisdiction and, if not licensed, do not hold themselves out as being able to do so. Exchange-Traded Funds (ETFs): J.P. Morgan Securities LLC (“JPMS”) acts as authorized participant for substantially all U.S.-listed ETFs. To the extent that any ETFs are mentioned in this report, JPMS may earn commissions and transaction-based compensation in connection with the distribution of those ETF shares and may earn fees for performing other trade-related services, such as securities lending to short sellers of the ETF shares. JPMS may also perform services for the ETFs themselves, including acting as a broker or dealer to the ETFs. In addition, affiliates of JPMS may perform services for the ETFs, including trust, custodial, administration, lending, index calculation and/or maintenance and other services. This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 17. 17 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N Options and Futures related research: If the information contained herein regards options- or futures-related research, such information is available only to persons who have received the proper options or futures risk disclosure documents. Please contact your J.P. Morgan Representative or visit https://www.theocc.com/components/docs/riskstoc.pdf for a copy of the Option Clearing Corporation's Characteristics and Risks of Standardized Options or http://www.finra.org/sites/default/files/Security_Futures_Risk_Disclosure_Statement_2018.pdf for a copy of the Security Futures Risk Disclosure Statement. Changes to Interbank Offered Rates (IBORs) and other benchmark rates: Certain interest rate benchmarks are, or may in the future become, subject to ongoing international, national and other regulatory guidance, reform and proposals for reform. For more information, please consult: https://www.jpmorgan.com/global/disclosures/interbank_offered_rates Private Bank Clients: Where you are receiving research as a client of the private banking businesses offered by JPMorgan Chase & Co. and its subsidiaries (“J.P. Morgan Private Bank”), research is provided to you by J.P. Morgan Private Bank and not by any other division of J.P. Morgan, including, but not limited to, the J.P. Morgan Corporate and Investment Bank and its Global Research division. Legal entity responsible for the production and distribution of research: The legal entity identified below the name of the Reg AC Research Analyst who authored this material is the legal entity responsible for the production of this research. Where multiple Reg AC Research Analysts authored this material with different legal entities identified below their names, these legal entities are jointly responsible for the production of this research. Research Analysts from various J.P. Morgan affiliates may have contributed to the production of this material but may not be licensed to carry out regulated activities in your jurisdiction (and do not hold themselves out as being able to do so). Unless otherwise stated below, this material has been distributed by the legal entity responsible for production. If you have any queries, please contact the relevant Research Analyst in your jurisdiction or the entity in your jurisdiction that has distributed this research material. Legal Entities Disclosures and Country-/Region-Specific Disclosures: Argentina: JPMorgan Chase Bank N.A Sucursal Buenos Aires is regulated by Banco Central de la República Argentina (“BCRA”- Central Bank of Argentina) and Comisión Nacional de Valores (“CNV”- Argentinian Securities Commission” - ALYC y AN Integral N°51). Australia: J.P. Morgan Securities Australia Limited (“JPMSAL”) (ABN 61 003 245 234/AFS Licence No: 238066) is regulated by the Australian Securities and Investments Commission and is a Market, Clearing and Settlement Participant of ASX Limited and CHI-X. This material is issued and distributed in Australia by or on behalf of JPMSAL only to "wholesale clients" (as defined in section 761G of the Corporations Act 2001). A list of all financial products covered can be found by visiting https://www.jpmm.com/research/disclosures. J.P. Morgan seeks to cover companies of relevance to the domestic and international investor base across all Global Industry Classification Standard (GICS) sectors, as well as across a range of market capitalisation sizes. If applicable, in the course of conducting public side due diligence on the subject company(ies), the Research Analyst team may at times perform such diligence through corporate engagements such as site visits, discussions with company representatives, management presentations, etc. Research issued by JPMSAL has been prepared in accordance with J.P. Morgan Australia’s Research Independence Policy which can be found at the following link: J.P. Morgan Australia - Research Independence Policy. Brazil: Banco J.P. Morgan S.A. is regulated by the Comissao de Valores Mobiliarios (CVM) and by the Central Bank of Brazil. Ombudsman J.P. Morgan: 0800-7700847 / ouvidoria.jp.morgan@jpmorgan.com. Canada: J.P. Morgan Securities Canada Inc. is a registered investment dealer, regulated by the Investment Industry Regulatory Organization of Canada and the Ontario Securities Commission and is the participating member on Canadian exchanges. This material is distributed in Canada by or on behalf of J.P.Morgan Securities Canada Inc. Chile: Inversiones J.P. Morgan Limitada is an unregulated entity incorporated in Chile. China: J.P. Morgan Securities (China) Company Limited has been approved by CSRC to conduct the securities investment consultancy business. Dubai International Financial Centre (DIFC): JPMorgan Chase Bank, N.A., Dubai Branch is regulated by the Dubai Financial Services Authority (DFSA) and its registered address is Dubai International Financial Centre - The Gate, West Wing, Level 3 and 9 PO Box 506551, Dubai, UAE. This material has been distributed by JP Morgan Chase Bank, N.A., Dubai Branch to persons regarded as professional clients or market counterparties as defined under the DFSA rules. European Economic Area (EEA): Unless specified to the contrary, research is distributed in the EEA by J.P. Morgan SE (“JPM SE”), which is subject to prudential supervision by the European Central Bank (“ECB”) in cooperation with BaFin and Deutsche Bundesbank in Germany. JPM SE is a company headquartered in Frankfurt with registered address at TaunusTurm, Taunustor 1, Frankfurt am Main, 60310, Germany. The material has been distributed in the EEA to persons regarded as professional investors (or equivalent) pursuant to Art. 4 para. 1 no. 10 and Annex II of MiFID II and its respective implementation in their home jurisdictions (“EEA professional investors”). This material must not be acted on or relied on by persons who are not EEA professional investors. Any investment or investment activity to which this material relates is only available to EEA relevant persons and will be engaged in only with EEA relevant persons. Hong Kong: J.P. Morgan Securities (Asia Pacific) Limited (CE number AAJ321) is regulated by the Hong Kong Monetary Authority and the Securities and Futures Commission in Hong Kong, and J.P. Morgan Broking (Hong Kong) Limited (CE number AAB027) is regulated by the Securities and Futures Commission in Hong Kong. JP Morgan Chase Bank, N.A., Hong Kong Branch (CE Number AAL996) is regulated by the Hong Kong Monetary Authority and the Securities and Futures Commission, is organized under the laws of the United States with limited liability. Where the distribution of this material is a regulated activity in Hong Kong, the material is distributed in Hong Kong by or through J.P. Morgan Securities (Asia Pacific) Limited and/or J.P. Morgan Broking (Hong Kong) Limited. India: J.P. Morgan India Private Limited (Corporate Identity Number - U67120MH1992FTC068724), having its registered office at J.P. Morgan Tower, Off. C.S.T. Road, Kalina, Santacruz - East, Mumbai – 400098, is registered with the Securities and Exchange Board of India (SEBI) as a ‘Research Analyst’ having registration number INH000001873. J.P. Morgan India Private Limited is also registered with SEBI as a member of the National Stock Exchange of India Limited and the Bombay Stock Exchange Limited (SEBI Registration Number – INZ000239730) and as a Merchant Banker (SEBI Registration Number - MB/INM000002970). Telephone: 91-22-6157 3000, Facsimile: 91-22-6157 3990 and Website: http://www.jpmipl.com. JPMorgan Chase Bank, N.A. - Mumbai Branch is licensed by the Reserve Bank of India (RBI) (Licence No. 53/ Licence No. BY.4/94; SEBI - IN/CUS/014/ CDSL : IN-DP-CDSL-444-2008/ IN-DP-NSDL-285- 2008/ INBI00000984/ INE231311239) as a Scheduled Commercial Bank in India, which is its primary license allowing it to carry on Banking business in India and other activities, which a Bank branch in India are permitted to undertake. For non-local research material, this material is This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 18. 18 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N not distributed in India by J.P. Morgan India Private Limited. Indonesia: PT J.P. Morgan Sekuritas Indonesia is a member of the Indonesia Stock Exchange and is registered and supervised by the Otoritas Jasa Keuangan (OJK). Korea: J.P. Morgan Securities (Far East) Limited, Seoul Branch, is a member of the Korea Exchange (KRX). JPMorgan Chase Bank, N.A., Seoul Branch, is licensed as a branch office of foreign bank (JPMorgan Chase Bank, N.A.) in Korea. Both entities are regulated by the Financial Services Commission (FSC) and the Financial Supervisory Service (FSS). For non-macro research material, the material is distributed in Korea by or through J.P. Morgan Securities (Far East) Limited, Seoul Branch. Japan: JPMorgan Securities Japan Co., Ltd. and JPMorgan Chase Bank, N.A., Tokyo Branch are regulated by the Financial Services Agency in Japan. Malaysia: This material is issued and distributed in Malaysia by JPMorgan Securities (Malaysia) Sdn Bhd (18146- X), which is a Participating Organization of Bursa Malaysia Berhad and holds a Capital Markets Services License issued by the Securities Commission in Malaysia. Mexico: J.P. Morgan Casa de Bolsa, S.A. de C.V.and J.P. Morgan Grupo Financiero are members of the Mexican Stock Exchange and are authorized to act as a broker dealer by the National Banking and Securities Exchange Commission. New Zealand: This material is issued and distributed by JPMSAL in New Zealand only to "wholesale clients" (as defined in the Financial Markets Conduct Act 2013). JPMSAL is registered as a Financial Service Provider under the Financial Service providers (Registration and Dispute Resolution) Act of 2008. Pakistan: J. P. Morgan Pakistan Broking (Pvt.) Ltd is a member of the Karachi Stock Exchange and regulated by the Securities and Exchange Commission of Pakistan. Philippines: J.P. Morgan Securities Philippines Inc. is a Trading Participant of the Philippine Stock Exchange and a member of the Securities Clearing Corporation of the Philippines and the Securities Investor Protection Fund. It is regulated by the Securities and Exchange Commission. Russia: CB J.P. Morgan Bank International LLC is regulated by the Central Bank of Russia. Singapore: This material is issued and distributed in Singapore by or through J.P. Morgan Securities Singapore Private Limited (JPMSS) [MCI (P) 060/08/2022 and Co. Reg. No.: 199405335R], which is a member of the Singapore Exchange Securities Trading Limited, and/or JPMorgan Chase Bank, N.A., Singapore branch (JPMCB Singapore), both of which are regulated by the Monetary Authority of Singapore. This material is issued and distributed in Singapore only to accredited investors, expert investors and institutional investors, as defined in Section 4A of the Securities and Futures Act, Cap. 289 (SFA). This material is not intended to be issued or distributed to any retail investors or any other investors that do not fall into the classes of “accredited investors,” “expert investors” or “institutional investors,” as defined under Section 4A of the SFA. Recipients of this material in Singapore are to contact JPMSS or JPMCB Singapore in respect of any matters arising from, or in connection with, the material. As at the date of this material, JPMSS is a designated market maker for certain structured warrants listed on the Singapore Exchange where the underlying securities may be the securities discussed in this material. Arising from its role as a designated market maker for such structured warrants, JPMSS may conduct hedging activities in respect of such underlying securities and hold or have an interest in such underlying securities as a result. The updated list of structured warrants for which JPMSS acts as designated market maker may be found on the website of the Singapore Exchange Limited: http://www.sgx.com. South Africa: J.P. Morgan Equities South Africa Proprietary Limited and JPMorgan Chase Bank, N.A., Johannesburg Branch are members of the Johannesburg Securities Exchange and are regulated by the Financial Services Board. Taiwan: J.P. Morgan Securities (Taiwan) Limited is a participant of the Taiwan Stock Exchange (company-type) and regulated by the Taiwan Securities and Futures Bureau. Material relating to equity securities is issued and distributed in Taiwan by J.P. Morgan Securities (Taiwan) Limited, subject to the license scope and the applicable laws and the regulations in Taiwan. According to Paragraph 2, Article 7-1 of Operational Regulations Governing Securities Firms Recommending Trades in Securities to Customers (as amended or supplemented) and/or other applicable laws or regulations, please note that the recipient of this material is not permitted to engage in any activities in connection with the material that may give rise to conflicts of interests, unless otherwise disclosed in the “Important Disclosures” in this material. Thailand: This material is issued and distributed in Thailand by JPMorgan Securities (Thailand) Ltd., which is a member of the Stock Exchange of Thailand and is regulated by the Ministry of Finance and the Securities and Exchange Commission, and its registered address is 3rd Floor, 20 North Sathorn Road, Silom, Bangrak, Bangkok 10500. UK: Unless specified to the contrary, research is distributed in the UK by J.P. Morgan Securities plc (“JPMS plc”) which is a member of the London Stock Exchange and is authorised by the Prudential Regulation Authority and regulated by the Financial Conduct Authority and the Prudential Regulation Authority. JPMS plc is registered in England & Wales No. 2711006, Registered Office 25 Bank Street, London, E14 5JP. This material is directed in the UK only to: (a) persons having professional experience in matters relating to investments falling within article 19(5) of the Financial Services and Markets Act 2000 (Financial Promotion) (Order) 2005 (“the FPO”); (b) persons outlined in article 49 of the FPO (high net worth companies, unincorporated associations or partnerships, the trustees of high value trusts, etc.); or (c) any persons to whom this communication may otherwise lawfully be made; all such persons being referred to as "UK relevant persons". This material must not be acted on or relied on by persons who are not UK relevant persons. Any investment or investment activity to which this material relates is only available to UK relevant persons and will be engaged in only with UK relevant persons. Research issued by JPMS plc has been prepared in accordance with JPMS plc's policy for prevention and avoidance of conflicts of interest related to the production of Research which can be found at the following link: J.P. Morgan EMEA - Research Independence Policy. U.S.: J.P. Morgan Securities LLC (“JPMS”) is a member of the NYSE, FINRA, SIPC, and the NFA. JPMorgan Chase Bank, N.A. is a member of the FDIC. Material published by non-U.S. affiliates is distributed in the U.S. by JPMS who accepts responsibility for its content. General: Additional information is available upon request. The information in this material has been obtained from sources believed to be reliable. While all reasonable care has been taken to ensure that the facts stated in this material are accurate and that the forecasts, opinions and expectations contained herein are fair and reasonable, JPMorgan Chase & Co. or its affiliates and/or subsidiaries (collectively J.P. Morgan) make no representations or warranties whatsoever to the completeness or accuracy of the material provided, except with respect to any disclosures relative to J.P. Morgan and the Research Analyst's involvement with the issuer that is the subject of the material. Accordingly, no reliance should be placed on the accuracy, fairness or completeness of the information contained in this material. There may be certain discrepancies with data and/or limited content in this material as a result of calculations, adjustments, translations to different languages, and/or local regulatory restrictions, as applicable. These discrepancies should not impact the overall investment analysis, views and/or recommendations of the subject company(ies) that may be discussed in the material. J.P. Morgan accepts no liability whatsoever for any loss arising from any use of this material or its contents, and neither J.P. Morgan nor any of its respective directors, officers or employees, shall be in any way responsible for the contents hereof, apart from the liabilities and responsibilities that may be imposed on them by the relevant regulatory authority in the jurisdiction in This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.
  • 19. 19 Albert Hung (886-2) 2725-9875 albert.hung@jpmchase.com Asia Pacific Equity Research 16 April 2023 J P M O R G A N question, or the regulatory regime thereunder. Opinions, forecasts or projections contained in this material represent J.P. Morgan's current opinions or judgment as of the date of the material only and are therefore subject to change without notice. Periodic updates may be provided on companies/industries based on company-specific developments or announcements, market conditions or any other publicly available information. There can be no assurance that future results or events will be consistent with any such opinions, forecasts or projections, which represent only one possible outcome. Furthermore, such opinions, forecasts or projections are subject to certain risks, uncertainties and assumptions that have not been verified, and future actual results or events could differ materially. The value of, or income from, any investments referred to in this material may fluctuate and/or be affected by changes in exchange rates. All pricing is indicative as of the close of market for the securities discussed, unless otherwise stated. Past performance is not indicative of future results. Accordingly, investors may receive back less than originally invested. This material is not intended as an offer or solicitation for the purchase or sale of any financial instrument. The opinions and recommendations herein do not take into account individual client circumstances, objectives, or needs and are not intended as recommendations of particular securities, financial instruments or strategies to particular clients. This material may include views on structured securities, options, futures and other derivatives. These are complex instruments, may involve a high degree of risk and may be appropriate investments only for sophisticated investors who are capable of understanding and assuming the risks involved. The recipients of this material must make their own independent decisions regarding any securities or financial instruments mentioned herein and should seek advice from such independent financial, legal, tax or other adviser as they deem necessary. J.P. Morgan may trade as a principal on the basis of the Research Analysts’ views and research, and it may also engage in transactions for its own account or for its clients’ accounts in a manner inconsistent with the views taken in this material, and J.P. Morgan is under no obligation to ensure that such other communication is brought to the attention of any recipient of this material. Others within J.P. Morgan, including Strategists, Sales staff and other Research Analysts, may take views that are inconsistent with those taken in this material. Employees of J.P. Morgan not involved in the preparation of this material may have investments in the securities (or derivatives of such securities) mentioned in this material and may trade them in ways different from those discussed in this material. This material is not an advertisement for or marketing of any issuer, its products or services, or its securities in any jurisdiction. "Other Disclosures" last revised April 01, 2023. Copyright 2023 JPMorgan Chase & Co. All rights reserved. This material or any portion hereof may not be reprinted, sold or redistributed without the written consent of J.P. Morgan. #$J&098$#*P Completed 15 Apr 2023 04:14 AM HKT Disseminated 15 Apr 2023 04:17 AM HKT This document is being provided for the exclusive use of anthony.wc.liao@jpmorgan.com & clients of J.P. Morgan.