CNN
Convolutional Neural Network
Traditional NN & CNN
-์™„์ „์—ฐ๊ฒฐ๊ณ„์ธต (Affine ๊ณ„์ธต)
-CNN
โ€˜ํ•ฉ์„ฑ๊ณฑ ๊ณ„์ธต(conv)โ€™๊ณผ โ€˜ํ’€๋ง ๊ณ„์ธต(Pooling)โ€™ ์ถ”๊ฐ€
์ถœ๋ ฅ์— ๊ฐ€๊นŒ์šด ์ธต์—์„œ๋Š” ์ง€๊ธˆ๊นŒ์ง€์˜ NN ๊ตฌ์„ฑ ๊ทธ๋Œ€๋กœ ์‚ฌ์šฉ
โ–ถ ๊ฐ„๋‹จํ•˜๊ฒŒ, CNN์€ ์‹ ๊ฒฝ๋ง์— ๊ธฐ์กด์˜ ํ•„ํ„ฐ ๊ธฐ์ˆ ์„ ๋ณ‘ํ•ฉํ•˜์—ฌ ์‹ ๊ฒฝ๋ง์ด 2์ฐจ์› ์˜์ƒ์„ ์ž˜ ์Šต๋“ํ•  ์ˆ˜ ์žˆ๋„๋ก ์ตœ์ ํ™” ์‹œ
ํ‚จ ์•Œ๊ณ ๋ฆฌ์ฆ˜์ด๋‹ค.
โ–  Problems of Fully-connected Layer?
โ€˜๋ฐ์ดํ„ฐ์˜ ํ˜•์ƒ์ด ๋ฌด์‹œ๋œ๋‹คโ€™
์ด๋ฏธ์ง€์˜ ๊ฒฝ์šฐ, ์„ธ๋กœ/๊ฐ€๋กœ/์ฑ„๋„(์ƒ‰์ƒ)๋กœ ๊ตฌ์„ฑ๋œ 3์ฐจ์› ๋ฐ์ดํ„ฐ์ด๋‹ค. ๊ทธ๋Ÿฌ๋‚˜ ์™„์ „์—ฐ๊ฒฐ ๊ณ„์ธต์— ์ž…๋ ฅํ•  ๋•Œ๋Š” 3์ฐจ์›์˜ ๋ฐ์ดํ„ฐ๋ฅผ ํ‰ํ‰ํ•œ 1์ฐจ
์› ๋ฐ์ดํ„ฐ(gray scale)๋กœ ํ‰๋ฉดํ™” ํ•ด์ฃผ์–ด์•ผ ํ•œ๋‹ค. ์ด ๊ณผ์ •์—์„œ ๊ณต๊ฐ„ ์ •๋ณด๊ฐ€ ์†์‹ค๋  ์ˆ˜๋ฐ–์— ์—†๊ณ , ๊ฒฐ๊ณผ์ ์œผ๋กœ ์ด๋ฏธ์ง€ ๊ณต๊ฐ„ ์ •๋ณด ์œ ์‹ค๋กœ ์ธ
ํ•œ ์ •๋ณด ๋ถ€์กฑ์œผ๋กœ ์ธ๊ณต ์‹ ๊ฒฝ๋ง์ด ํŠน์ง•์„ ์ถ”์ถœ ๋ฐ ํ•™์Šต์ด ๋น„ํšจ์œจ์ ์ด๊ณ  ์ •ํ™•๋„๋ฅผ ๋†’์ด๋Š”๋ฐ ํ•œ๊ณ„๊ฐ€ ์กด์žฌํ•œ๋‹ค.
์ฆ‰, ์™„์ „์—ฐ๊ฒฐ ๊ณ„์ธต์€ ํ˜•์ƒ์„ ๋ฌด์‹œํ•˜๊ณ  ๋ชจ๋“  ์ž…๋ ฅ๋ฐ์ดํ„ฐ๋ฅผ ๋™๋“ฑํ•œ ๋‰ด๋Ÿฐ(๊ฐ™์€ ์ฐจ์›์˜ ๋‰ด๋Ÿฐ)์œผ๋กœ ์ทจ๊ธ‰ํ•˜์—ฌ ํ˜•์ƒ์— ๋‹ด๊ธด ์ •๋ณด๋ฅผ ์‚ด๋ฆด ์ˆ˜ ์—†
๋‹ค.
์ด๋ฏธ์ง€์˜ ๊ณต๊ฐ„ ์ •๋ณด๋ฅผ ์œ ์ง€ํ•œ ์ƒํƒœ๋กœ ํ•™์Šต์ด ๊ฐ€๋Šฅํ•œ ๋ชจ๋ธ์ด ๋ฐ”๋กœ CNN(Convolutional Neural Network)์ž…๋‹ˆ๋‹ค.
Convolutional Neural Network
โ–  Diff. between CNN and Affine(Fully-connected Layer)
ํ•ฉ์„ฑ๊ณฑ ๊ณ„์ธต์€ ํ˜•์ƒ์„ ์œ ์ง€ํ•œ๋‹ค.
์ด๋ฏธ์ง€๋ฅผ 3์ฐจ์› ๋ฐ์ดํ„ฐ๋กœ ์ž…๋ ฅ ๋ฐ›์œผ๋ฉฐ, ๋งˆ์ฐฌ๊ฐ€์ง€๋กœ ๋‹ค์Œ ๊ณ„์ธต์—๋„ 3์ฐจ์› ๋ฐ์ดํ„ฐ๋กœ ์ „๋‹ฌํ•œ๋‹ค. ๋”ฐ๋ผ์„œ CNN์—์„œ๋Š” ์ด
๋ฏธ์ง€์ฒ˜๋Ÿผ ํ˜•์ƒ์„ ๊ฐ€์ง„ ๋ฐ์ดํ„ฐ๋ฅผ ์ œ๋Œ€๋กœ ์ดํ•ดํ•  ๊ฐ€๋Šฅ์„ฑ์ด ๋†’๋‹ค.
โ€ข๊ฐ ๋ ˆ์ด์–ด์˜ ์ž…์ถœ๋ ฅ ๋ฐ์ดํ„ฐ์˜ ํ˜•์ƒ ์œ ์ง€
โ€ข์ด๋ฏธ์ง€์˜ ๊ณต๊ฐ„ ์ •๋ณด๋ฅผ ์œ ์ง€ํ•˜๋ฉด์„œ ์ธ์ ‘ ์ด๋ฏธ์ง€์™€์˜ ํŠน์ง•์„ ํšจ๊ณผ์ ์œผ๋กœ ์ธ์‹
โ€ข๋ณต์ˆ˜์˜ ํ•„ํ„ฐ๋กœ ์ด๋ฏธ์ง€์˜ ํŠน์ง• ์ถ”์ถœ ๋ฐ ํ•™์Šต
โ€ข์ถ”์ถœํ•œ ์ด๋ฏธ์ง€์˜ ํŠน์ง•์„ ๋ชจ์œผ๊ณ  ๊ฐ•ํ™”ํ•˜๋Š” Pooling ๋ ˆ์ด์–ด
โ€ขํ•„ํ„ฐ๋ฅผ ๊ณต์œ  ํŒŒ๋ผ๋ฏธํ„ฐ๋กœ ์‚ฌ์šฉํ•˜๊ธฐ ๋•Œ๋ฌธ์—, ์ผ๋ฐ˜ ์ธ๊ณต ์‹ ๊ฒฝ๋ง๊ณผ ๋น„๊ตํ•˜์—ฌ ํ•™์Šต ํŒŒ๋ผ๋ฏธํ„ฐ๊ฐ€ ๋งค์šฐ ์ ์Œ
โ–  CNNโ€™s Features
โ€ขLocality
CNN์€ local ์ •๋ณด๋ฅผ ํ™œ์šฉํ•œ๋‹ค. ๊ณต๊ฐ„์ ์œผ๋กœ ์ธ์ ‘ํ•œ ์‹ ํ˜ธ๋“ค์— ๋Œ€ํ•œ correlation ๊ด€๊ณ„๋ฅผ ๋น„์„ ํ˜• ํ•„ํ„ฐ๋ฅผ ์ ์šฉํ•˜์—ฌ ์ถ”์ถœํ•ด ๋‚ด๋Š”๋ฐ, ์ด๋Ÿฌํ•œ
ํ•„ํ„ฐ๋ฅผ ์—ฌ๋Ÿฌ ๊ฐœ ์ ์šฉํ•˜๋ฉด ๋‹ค์–‘ํ•œ local ํŠน์ง•์„ ์ถ”์ถœํ•ด ๋‚ผ ์ˆ˜ ์žˆ๊ฒŒ ๋œ๋‹ค.
โ€ขShared Weight
๋™์ผํ•œ ๊ณ„์ˆ˜๋ฅผ ๊ฐ–๋Š” filter๋ฅผ ์˜์ƒ ์ „์ฒด์— ๋ฐ˜๋ณต์ ์œผ๋กœ ์ ์šฉํ•จ์œผ๋กœ์จ ๋ณ€์ˆ˜์˜ ์ˆ˜๋ฅผ ํš๊ธฐ์ ์œผ๋กœ ์ค„์ผ ์ˆ˜ ์žˆ์œผ๋ฉฐ, topology๋ณ€ํ™”์— ๋ฌด๊ด€ํ•œ ํ•ญ์ƒ์„ฑ
์–ป์„ ์ˆ˜ ์žˆ๊ฒŒ๋œ๋‹ค.
Convolution
โ–  Convolution
์ž…๋ ฅ ์˜์ƒ์œผ๋กœ๋ถ€ํ„ฐ Convolution(filter)๋ฅผ ํ†ตํ•ด, feature map์„
์ƒ์„ฑ
- Filter(=Kernel) & Stride
์ด๋ฏธ์ง€์˜ ํŠน์ง•์„ ์ฐพ์•„๋‚ด๊ธฐ ์œ„ํ•œ ๊ณต์šฉ ํŒŒ๋ผ๋ฏธํ„ฐ๋กœ, ์ผ๋ฐ˜์ ์œผ๋กœ (4, 4)์ด๋‚˜ (3, 3)๊ณผ ๊ฐ™์€ ์ •์‚ฌ๊ฐ ํ–‰๋ ฌ๋กœ ์ •์˜๋œ๋‹ค.
CNN์—์„œ ํ•™์Šต์˜ ๋Œ€์ƒ์€ ํ•„ํ„ฐ ํŒŒ๋ผ๋ฏธํ„ฐ์ด๋ฉฐ, ํ•ฉ์„ฑ๊ณฑ ์—ฐ์‚ฐ์€ ํ•„ํ„ฐ์˜ ์œˆ๋„์šฐ๋ฅผ ์ง€์ •๋œ ๊ฐ„๊ฒฉ(stride)์œผ๋กœ ์ˆœํšŒํ•˜๋ฉฐ ์ฑ„๋„๋ณ„๋กœ(์ปฌ๋Ÿฌ์˜ ๊ฒฝ์šฐ
3๊ฐœ) ์ž…๋ ฅ๋ฐ์ดํ„ฐ์— ์ ์šฉํ•˜์—ฌ Feature Map์„ ๋งŒ๋“ ๋‹ค.
์ฃผ์˜ )์ž…๋ ฅ ๋ฐ์ดํ„ฐ์˜ ์ฑ„๋„ ์ˆ˜์™€ ํ•„ํ„ฐ์˜ ์ฑ„๋„ ์ˆ˜๋Š” ๊ฐ™์•„์•ผ
Convolution
โ–  Filter
์ž…๋ ฅ ์˜์ƒ์œผ๋กœ๋ถ€ํ„ฐ Convolution(filter)๋ฅผ ํ†ตํ•ด, feature map์„
์ƒ์„ฑ
Filter๋Š” ํ•ด๋‹น ํŠน์ง•์„ ๋‘๋“œ๋Ÿฌ์ง€๊ฒŒํ•˜๊ฑฐ๋‚˜, ๊ทธ ํŠน์ง•์ด ๋ฐ์ดํ„ฐ์— ์žˆ๋Š”์ง€ ์—†๋Š”์ง€๋ฅผ ๊ฒ€์ถœํ•ด์ฃผ๋Š” ํ•จ์ˆ˜์ด
๋‹ค.
์˜ˆ1) ํŠน์ง•์„ ๋‘๋“œ๋Ÿฌ์ง€๊ฒŒ ํ•˜๋Š” ํ•„ํ„ฐ
์˜ˆ2) ํ•ด๋‹น ํŠน์ง• ๊ฒ€์ถœ ํ•„ํ„ฐ
๊ณก์„ ์„ ๊ฒ€์ถœํ•˜๋Š” ํ•„ํ„ฐ
์ง์„  ๋ถ€๋ถ„์„ ์ ์šฉํ•˜๋ฉด?
๊ฒฐ๊ณผ๊ฐ’ 0์— ์ˆ˜๋ ด
์ฆ‰, ํ•„ํ„ฐ๋Š” ์ž…๋ ฅ ๋ฐ›์€ ๋ฐ์ดํ„ฐ์—์„œ ๊ทธ ํŠน์„ฑ์„ ๊ฐ€์ง€๊ณ  ์žˆ์œผ๋ฉด ๊ฒฐ๊ณผ ๊ฐ’์ด ํฐ ๊ฐ’์ด ๋‚˜์˜ค๊ณ , ๊ฐ€์ง€๊ณ  ์žˆ์ง€ ์•Š์œผ๋ฉด 0์— ๊ฐ€๊นŒ์šด ๊ฐ’์ด ๋‚˜
์˜ค๊ฒŒ ๋˜์–ด ํ…Œ์ดํ„ฐ๊ฐ€ ๊ทธ ํŠน์„ฑ์„ ๊ฐ€์ง€๊ณ  ์žˆ๋Š”์ง€ ์—†๋Š”์ง€์˜ ์—ฌ๋ถ€๋ฅผ ์•Œ ์ˆ˜ ์žˆ๊ฒŒ ํ•ด์ค€๋‹ค.
Convolution
โ–  multiple filter
์ž…๋ ฅ ์˜์ƒ์œผ๋กœ๋ถ€ํ„ฐ Convolution(filter)๋ฅผ ํ†ตํ•ด, feature map์„
์ƒ์„ฑ
์ž…๋ ฅ ๊ฐ’์—๋Š” ์—ฌ๋Ÿฌ ๊ฐœ์˜ ํŠน์ง•์ด ์žˆ์–ด, ์—ฌ๋Ÿฌ ๊ฐœ์˜ ๋‹ค์ค‘ ํ•„ํ„ฐ ๊ฐ’์„ ๋‹ค์Œ๊ณผ ๊ฐ™์ด ์ ์šฉํ•˜๊ฒŒ ๋œ๋‹ค.
[Figure: the input data is convolved with a vertical-line ('|') filter and a horizontal-line ('ㅡ') filter, each producing its own feature map]
Convolution
โ–  Padding
์ž…๋ ฅ ์˜์ƒ์œผ๋กœ๋ถ€ํ„ฐ Convolution(filter)๋ฅผ ํ†ตํ•ด, feature map์„
์ƒ์„ฑ
ํ•ฉ์„ฑ๊ณฑ ์—ฐ์‚ฐ์˜ ํŒจ๋”ฉ ์ฒ˜๋ฆฌ : ์ž…๋ ฅ ๋ฐ์ดํ„ฐ ์ฃผ์œ„์— 0์„ ์ฑ„์šด๋‹ค.
Convolution ๋ ˆ์ด์–ด์—์„œ Filter์™€ Stride์˜ ์ž‘์šฉ์œผ๋กœ Feature Map์˜ ํฌ๊ธฐ๋Š” ์ž…๋ ฅ๋ฐ์ดํ„ฐ๋ณด๋‹ค ์ž‘์•„์ง„๋‹ค. ํ•ฉ์„ฑ๊ณฑ ์—ฐ์‚ฐ์„ ๊ฑฐ์น  ๋•Œ
๋งˆ๋‹ค ํฌ๊ธฐ๊ฐ€ ์ž‘์•„์ง€๋ฉด ์–ด๋Š ์‹œ์ ์—์„œ๋Š” ํฌ๊ธฐ๊ฐ€ 1์ด ๋˜์–ด๋ฒ„๋ ค ๋” ์ด์ƒ ํ•ฉ์„ฑ๊ณฑ์„ ์ง„ํ–‰ํ•  ์ˆ˜ ์—†๊ฒŒ ๋˜๋Š”๋ฐ, CNN ๋„คํŠธ์›Œํฌ๋Š” ํ•˜๋‚˜์˜
ํ•„ํ„ฐ ๋ ˆ์ด์–ด๊ฐ€ ์•„๋‹ˆ๋ผ ์—ฌ๋Ÿฌ ๋‹จ๊ณ„์— ๊ฑธ์ณ์„œ ๊ณ„์† ํ•„ํ„ฐ๋ฅผ ์—ฐ์†์ ์œผ๋กœ ์ ์šฉํ•˜์—ฌ ํŠน์ง•์„ ์ถ”์ถœํ•˜๋Š” ๊ฒƒ์„ ์ตœ์ ํ™” ํ•ด ๋‚˜๊ฐ€๋Š”๋ฐ, ํ•„ํ„ฐ
์ ์šฉ ํ›„ ๊ฒฐ๊ณผ ๊ฐ’์ด ์ž‘์•„์ง€๊ฒŒ ๋˜๋ฉด ์ฒ˜์Œ์— ๋น„ํ•ด์„œ ํŠน์ง•์ด ๋งŽ์ด ์œ ์‹ค๋  ์ˆ˜๊ฐ€ ์žˆ๋‹ค. ์ถฉ๋ถ„ํžˆ ํŠน์ง•์ด ์ถ”์ถœ๋˜๊ธฐ ์ด์ „์— ๊ฒฐ๊ณผ ๊ฐ’์ด ์ž‘์•„
์ง€๋ฉด ํŠน์ง•์ด ์œ ์‹ค๋˜๊ธฐ ๋•Œ๋ฌธ์—, ์ด๋ฅผ ๋ฐฉ์ง€ํ•˜๊ธฐ ์œ„ํ•œ ๋ฐฉ๋ฒ•์œผ๋กœ ์ถœ๋ ฅ ํฌ๊ธฐ๋ฅผ ์กฐ์ •ํ•  ๋ชฉ์ ์œผ๋กœ Padding์„ ์ง„ํ–‰ํ•œ๋‹ค.
Convolution
โ–  Output size
์ž…๋ ฅ ์˜์ƒ์œผ๋กœ๋ถ€ํ„ฐ Convolution(filter)๋ฅผ ํ†ตํ•ด, feature map์„
์ƒ์„ฑ
ํ•ฉ์„ฑ๊ณฑ ์—ฐ์‚ฐ์˜ ์ฒ˜๋ฆฌ ํ๋ฆ„
๋ฐฐ์น˜์ฒ˜๋ฆฌ
Input data
๋†’์ด : H
ํญ : W
ํ•„ํ„ฐ ๋†’์ด : FH
ํ•„ํ„ฐ ํญ : FW
Stride ํฌ๊ธฐ : S
ํŒจ๋”ฉ ์‚ฌ์ด์ฆˆ: P
Convolution
โ–  Activation Function
์ž…๋ ฅ ์˜์ƒ์œผ๋กœ๋ถ€ํ„ฐ Convolution(filter)๋ฅผ ํ†ตํ•ด, feature map์„
์ƒ์„ฑ
ํ•„ํ„ฐ๋“ค์„ ํ†ตํ•ด์„œ Feature map์ด ์ถ”์ถœ๋˜์—ˆ์œผ๋ฉด, ์ด Feature map์— Activation
function์„ ์ ์šฉํ•˜๊ฒŒ ๋œ๋‹ค. Activation function์˜ ๊ฐœ๋…์„ ์„ค๋ช…ํ•˜๋ฉด, ์œ„์˜ ์ฅ ๊ทธ
๋ฆผ์—์„œ ๊ณก์„  ๊ฐ’์˜ ํŠน์ง•์ด ๋“ค์–ด๊ฐ€ ์žˆ๋Š”์ง€ ์•ˆ ๋“ค์–ด๊ฐ€ ์žˆ๋Š”์ง€์˜ ํ•„ํ„ฐ๋ฅผ ํ†ตํ•ด์„œ ์ถ”
์ถœํ•œ ๊ฐ’์ด ๋“ค์–ด๊ฐ€ ์žˆ๋Š” ์˜ˆ์—์„œ๋Š” 6000, ์•ˆ ๋“ค์–ด๊ฐ€ ์žˆ๋Š” ์˜ˆ์—์„œ๋Š” 0 ์œผ๋กœ ๋‚˜์™”๋‹ค.
์ด ๊ฐ’์ด ์ •๋Ÿ‰์ ์ธ ๊ฐ’์œผ๋กœ ๋‚˜์˜ค๊ธฐ ๋•Œ๋ฌธ์—, ๊ทธ ํŠน์ง•์ด โ€œ์žˆ๋‹ค ์—†๋‹คโ€์˜ ๋น„์„ ํ˜•
๊ฐ’์œผ๋กœ ๋ฐ”๊ฟ” ์ฃผ๋Š” ๊ณผ์ •์ด ํ•„์š”ํ•œ๋ฐ, ์ด ๊ฒƒ์ด ๋ฐ”๋กœ Activation ํ•จ์ˆ˜์ด๋‹ค.
ReLu ํ•จ์ˆ˜๋ฅผ ์ด์šฉํ•˜๋Š” ์ด์œ ๋Š” ๋‰ด๋Ÿด ๋„คํŠธ์›Œํฌ์—์„œ ์‹ ๊ฒฝ๋ง์ด ๊นŠ์–ด์งˆ ์ˆ˜๋ก ํ•™์Šต์ด
์–ด๋ ต๊ธฐ ๋•Œ๋ฌธ์—, ์ „์ฒด ๋ ˆ์ด์–ด๋ฅผ ํ•œ๋ฒˆ ๊ณ„์‚ฐํ•œ ํ›„, ๊ทธ ๊ณ„์‚ฐ ๊ฐ’์„ ์žฌ ํ™œ์šฉํ•˜์—ฌ ๋‹ค์‹œ
๊ณ„์‚ฐํ•˜๋Š” Back propagation์ด๋ผ๋Š” ๋ฐฉ๋ฒ•์„ ์‚ฌ์šฉํ•˜๋Š”๋ฐ, sigmoid ํ•จ์ˆ˜๋ฅผ
activation ํ•จ์ˆ˜๋กœ ์‚ฌ์šฉํ•  ๊ฒฝ์šฐ, ๋ ˆ์ด์–ด๊ฐ€ ๊นŠ์–ด์ง€๋ฉด ์ด Back propagation์ด
์ œ๋Œ€๋กœ ์ž‘๋™์„ ํ•˜์ง€ ์•Š๊ธฐ ๋•Œ๋ฌธ์—,(๊ฐ’์„ ๋’ค์—์„œ ์•ž์œผ๋กœ ์ „๋‹ฌํ• ๋•Œ ํฌ์„์ด ๋˜๋Š”
ํ˜„์ƒ. ์ด๋ฅผ Gradient Vanishing ์ด๋ผ๊ณ  ํ•œ๋‹ค.) ReLu๋ผ๋Š” ํ•จ์ˆ˜๋ฅผ ์‚ฌ์šฉํ•œ๋‹ค.
Pooling ์ถœ๋ ฅ ๋ฐ์ดํ„ฐ์˜ ํฌ๊ธฐ๋ฅผ ์ค„์ด๊ฑฐ๋‚˜ ํŠน์ • ๋ฐ์ดํ„ฐ๋ฅผ ๊ฐ•์กฐํ•˜๊ฒŒ ๋˜๋Š”
๋ ˆ์ด์–ด
- Pooling
ํ’€๋ง ๋ ˆ์ด์–ด๋Š” ์ปจ๋ณผ๋ฃจ์…˜ ๋ ˆ์ด์–ด์˜ ์ถœ๋ ฅ ๋ฐ์ดํ„ฐ(Activation Map)๋ฅผ ์ž…๋ ฅ์œผ๋กœ ๋ฐ›์•„์„œ ์ถœ๋ ฅ ๋ฐ์ดํ„ฐ์˜ ํฌ๊ธฐ๋ฅผ ์ค„์ด๊ฑฐ๋‚˜ ํŠน์ • ๋ฐ์ดํ„ฐ๋ฅผ ๊ฐ•
์กฐํ•˜๋Š” ์šฉ๋„๋กœ ์‚ฌ์šฉํ•œ๋‹ค. ํ’€๋ง ๋ ˆ์ด์–ด๋ฅผ ์ฒ˜๋ฆฌํ•˜๋Š” ๋ฐฉ๋ฒ•์œผ๋กœ๋Š” Max Pooling๊ณผ Average Pooling, Min Pooling์ด ์žˆ๋‹ค.
์ผ๋ฐ˜์ ์œผ๋กœ ๋ฐ์ดํ„ฐํ’€๋ง์˜ ์œˆ๋„์šฐ ํฌ๊ธฐ์™€ ์ŠคํŠธ๋ผ์ด๋“œ๋Š” ๊ฐ™์€ ๊ฐ’์œผ๋กœ ์„ค์ •ํ•˜์—ฌ ๋ชจ๋“  ์›์†Œ๊ฐ€ ํ•œ ๋ฒˆ์”ฉ ์ฒ˜๋ฆฌ๋˜๋„๋ก ์„ค์ •ํ•œ๋‹ค.
- ํŠน์ง•
1. ํ•™์Šตํ•ด์•ผ ํ•  ๋งค๊ฐœ๋ณ€์ˆ˜๊ฐ€ ์—†๋‹ค.
2. ์ฑ„๋„๋งˆ๋‹ค ๋…๋ฆฝ์ ์œผ๋กœ ๊ณ„์‚ฐํ•˜๊ธฐ ๋•Œ๋ฌธ์—, ์ฑ„๋„ ์ˆ˜๊ฐ€ ๋ณ€ํ•˜์ง€ ์•Š๋Š”๋‹ค.
3. ์ž…๋ ฅ์˜ ๋ณ€ํ™”์— ์˜ํ–ฅ์„ ์ ๊ฒŒ ๋ฐ›๋Š”๋‹ค.
4. ํ†ต๊ณผํ•˜๋ฉด ํ–‰๋ ฌ์˜ ํฌ๊ธฐ๊ฐ€ ๊ฐ์†Œํ•œ๋‹ค
CNN
โ–  Local feature > global feature
์ƒˆ๋กœ ์ƒ๊ธด ์ด๋ฏธ์ง€๋Š” ์› ์ด๋ฏธ์ง€์—์„œ ๋ถ€๋ถ„ ๋ถ€
๋ถ„์˜ ํŠน์ง•์„ ๋ฝ‘์•„๋‚ธ ๊ฒƒ ์ด๋‹ค.(local feature).
์ด๋ ‡๊ฒŒ ๋ถ€๋ถ„ ๋ถ€๋ถ„์—์„œ ํŠน์ง•์„ ์•Œ ์ˆ˜ ์žˆ๊ธฐ ๋•Œ
๋ฌธ์— ์ „์ฒด ์ด๋ฏธ์ง€์—์„œ ํŠน์ง•์„ ๋ถ„์„ํ•˜๋Š” ๊ฒƒ
๋ณด๋‹ค ์ข€ ๋” ์ข‹์€ ํ•™์Šตํšจ๊ณผ๋ฅผ ๋‚ผ ์ˆ˜ ์žˆ๋Š” ๊ฒƒ
์ด CNN์ด๋‹ค. ์ด๋ ‡๊ฒŒ Convolution์„ ์ ์šฉํ•˜
๋ฉด์„œ ์˜์ƒ ํ•™์Šต ์ชฝ์˜ ์„ฑ๋Šฅ์€ ๋น„์•ฝ์ ์œผ๋กœ ํ–ฅ
์ƒ๋˜์—ˆ๋‹ค.
1. local feature๊ฐ€ global feature๊ฐ€ ๋˜๋Š”
๊ณผ์ •
2. CNN๊ณผ์ •
CNN-Dropout
Summary
Summary
Summary-Structure
โ–  Convolutional Layer
โ€ข์ž…๋ ฅ ๋ฐ์ดํ„ฐ์— ํ•„ํ„ฐ๋ฅผ ์ ์šฉ ํ›„ ํ™œ์„ฑํ™” ํ•จ์ˆ˜๋ฅผ ๋ฐ˜์˜ํ•˜๋Š” ํ•„์ˆ˜ ์š”์†Œ
โ€ข์ž…๋ ฅ ์˜์ƒ์œผ๋กœ๋ถ€ํ„ฐ Convolution(filter)๋ฅผ ํ†ตํ•ด, feature map์„ ์ƒ์„ฑ
โ€ข์—ฌ๋Ÿฌ ๊ฐœ์˜ ๋‹ค๋ฅธ ํŠน์ง•์„ ์ถ”์ถœํ•˜๊ณ  ์‹ถ๋‹ค๋ฉด, ์—ฌ๋Ÿฌ ๊ฐœ์˜ convolution kernel๋ฅผ ์‚ฌ์šฉ
โ–  Sub-sampling(Pooling)
โ€ข๊ฐ€์žฅ ๊ฐ•ํ•œ ์‹ ํ˜ธ๋งŒ ์ „๋‹ฌํ•˜๋Š” ๋ฐฉ์‹์„ ์ฑ„ํƒํ•˜์—ฌ ๊ฐ€์žฅ ํฐ ๊ฐ’์„ ์„ ํƒํ•˜๋Š” ๋ฐฉ๋ฒ•์ธ max-pooling์„
์ฃผ๋กœ ์‚ฌ์šฉ
โ€ข์ด๋™์ด๋‚˜ ๋ณ€ํ˜• ๋“ฑ์— ๋ฌด๊ด€ํ•œ ํ•™์Šต ๊ฒฐ๊ณผ๋ฅผ ๋ณด์ด๊ธฐ ์œ„ํ•ด์„œ Convolution+Sub-sampling ๊ณผ์ •์„
์—ฌ๋Ÿฌ ๋ฒˆ ๊ฑฐ์ณ ๋Œ€ํ‘œํ•  ์ˆ˜ ์žˆ๋Š” ํŠน์ง•์„ ์–ป๋Š” ๊ฒƒ์ด ์ค‘์š”
Summary
Thank you
