International Journal of Embedded Systems and Applications (IJESA), Vol. 9, No. 1, March 2019
DOI: 10.5121/ijesa.2019.9103
A VLSI Architecture Realisation of a Wireless Endoscopy System
Kavitha Veerappan¹ and Ramu Pitchaikkannu²
¹Department of Electronics and Communication Engineering, M.A.M College of Engineering and Technology, Trichy, India
²Department of Electronics and Communication Engineering, Jaya Institute of Technology, Thiruvallur, India
ABSTRACT
The main objective is to design a VLSI architecture to realize a wireless endoscopy system. The system is used in the medical field to record images of the digestive system. It transfers the image data over an RF link so as to avoid the pain and irritation to the digestive tract that the cables of conventional endoscopes can cause. The proposed system consists of an RF transceiver and a CMOS color image sensor. The CMOS color image sensor, interfaced with an FPGA, captures the images, and the RF transceiver sends them wirelessly. Real-time images captured by the sensor are compressed and then transmitted by the RF transceiver. Image compression is used because it reduces the communication bandwidth and saves power in both transmission and reception. Thus, after an image is captured by the CMOS color image sensor, it is compressed according to the JPEG standard. After high-quality lossy compression, the image is transmitted by the wireless RF transceiver, which is chosen for its low-power operation.
KEYWORDS
Image compression, JPEG-LS, CMOS, RF & FPGA
1. INTRODUCTION
Endoscopy means looking inside, and refers to looking inside the body for medical reasons using an endoscope, an instrument used to examine the interior of a hollow organ or cavity of the body. Unlike conventional methods, a wireless endoscopy system transfers the image data over an RF link so as to avoid the pain and irritation to the digestive tract that the cables of conventional endoscopes can cause. Wireless endoscopy is used to capture and store images of the digestive system for use in medicine. The system contains a tiny camera that takes pictures of the inside of the gastrointestinal tract. Its primary use is to examine areas of the small intestine that cannot be seen by conventional types of endoscopy.
The wireless endoscopy system comprises three units: in the first, the real-time image is captured by the camera; in the second, the captured image is compressed; and in the third, the compressed image is transmitted. The image is captured by the CMOS color image sensor, and the image capture unit buffers the raw pixel values. The captured image is then compressed; to implement this functionality, the JPEG image compression technique is used in the present work. Image compression is the best way to decrease the communication bandwidth and save transmitting power. A lossless or high-quality lossy compression algorithm is implemented in the image compression unit. After compression, the compressed image is transmitted wirelessly by the RF transceiver.
The work proposed in [1] describes a wireless endoscopy system with a low-power ASIC. The paper presents a wireless endoscopic system composed of a CMOS color image sensor, an RF transceiver and a low-power controlling and processing ASIC. The system consists of a capsule, a workstation and a portable data recorder. Images are captured by the capsule inside the human body and transferred wirelessly to the portable data recorder outside the body. The data recorder serves as a bridge between the workstation and the capsule, so that doctors can monitor in real time when the data recorder is connected to the workstation through a USB connector. The capsule receives control commands from the workstation through the data recorder, with a bidirectional communication protocol implemented in the system.
The work in [2] targets the steep memory-access problem of the wireless endoscopy capsule. Given the soft frame-rate requirement, reducing the per-pixel activity lowers the overall power consumption. Gray code is utilized to reduce switching activities and thus reduce the power consumption further.
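The power saving from Gray code comes from the fact that exactly one bit toggles between consecutive codes, whereas a binary counter may toggle several bus lines at once. A minimal Python sketch (illustrative only; the paper's design is hardware, not software) compares the switching activity of the two encodings:

```python
def bin_to_gray(n: int) -> int:
    """Standard binary-to-Gray conversion: XOR the value with itself shifted right."""
    return n ^ (n >> 1)

def toggles(a: int, b: int) -> int:
    """Number of bit positions that differ, i.e. bus lines that switch."""
    return bin(a ^ b).count("1")

# Total bus switching activity for an 8-bit counter sweeping 0..255.
binary_activity = sum(toggles(i, i + 1) for i in range(255))
gray_activity = sum(toggles(bin_to_gray(i), bin_to_gray(i + 1)) for i in range(255))
print(binary_activity, gray_activity)  # 502 vs 255: Gray code toggles one bit per step
```

Halving the toggle count on a wide address or data bus translates directly into lower dynamic power, which is the effect [2] exploits.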
The work proposed in [3] presents an image compression algorithm suitable for a wireless capsule endoscopy system. Traditional lossy image compression techniques are not chosen because of the power limitation and small size. In the proposed system, the discrete cosine transform (DCT) is used together with an entropy Huffman coder for the encoding and decoding process; the DCT is chosen for its low power consumption and low complexity. The algorithm provides lossless compression as well as high-quality lossy compression.
The work proposed in [4] is an efficient wireless power transmission system for wireless endoscopy. The system includes a transmitter and a receiver with a power management circuit (PMC). The transmitter encloses a driver circuit and a class-E amplifier, which serves as the power provider for the entire system. High transmission efficiency is obtained in all three stages, the transmitter, the magnetic-field link and the receiver, each of which is designed to be optimal.
2. METHODOLOGY
The wireless endoscopy system consists of three units: in the first, the image is captured by the camera; in the second, the captured image is compressed; and in the third, the compressed image is transmitted. The image is captured by the CMOS color image sensor, and the raw pixel values are buffered by the image capture unit. The captured image is then compressed; to implement this functionality, the JPEG image compression technique is used in the present work. A lossless or high-quality lossy compression algorithm is implemented in the image compression unit. After compression, the compressed image is transmitted wirelessly by the RF transceiver.
Fig. 1 Block Diagram of Wireless Endoscopy System
The camera, which captures the images, is interfaced with the FPGA. After interfacing the camera with the FPGA, the captured image is stored in SRAM as pixel values. Once the pixel values of the image are obtained, the image is compressed using the JPEG technique. Image compression is applied to the original image so that the communication bandwidth is decreased and the transmitting power is saved. After the image compression process, the compressed image is transmitted using the RF transceiver.
2.1. CAMERA INTERFACING UNIT
In this project, a CMOS color image sensor is used for capturing images. Following [2], the CMOS color image sensor is chosen for its low power consumption.
Fig. 2 OV6920 Omnivision CMOS color image sensor
The OV6920 is a 1/18-inch optical-format CMOS image sensor that incorporates a high level of functionality and very low power consumption in an ultra-small footprint package. This makes it useful in small disposable cameras for medical imaging applications such as diagnostic and intubation systems. It belongs to a family of products based on the most advanced CMOS mixed-signal technology and integrates the image array, signal processing, timing and control circuitry on a single chip. It is ideal for applications requiring a small footprint, low power and low cost.
2.2. CMOS COLOR IMAGE SENSOR INTERFACING
The camera is chosen according to our needs and must now be interfaced with the FPGA. After interfacing the camera with the FPGA, the real-time pictures captured by the camera are stored in the SRAM of the FPGA. The picture below depicts the camera interfacing with the FPGA.
Fig 3: Photo of wireless endoscope capsule prototype
The module that interfaces the CMOS color image sensor with the FPGA is programmed in Verilog HDL. From the picture, we can infer that the images captured by the color sensor are stored in the SRAM of the FPGA and displayed on the monitor. While the camera interfacing module is simulated, the images captured by the sensor are stored in SRAM simultaneously. The image captured by the sensor is first stored in a buffer and read out frame by frame. The data are converted from YUV422 to YUV444 and then into RGB format, and the resulting pixel values are stored in SRAM. The image can be viewed when the FPGA is connected to a monitor.
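The conversion chain just described (YUV422 → YUV444 → RGB) can be modelled in software. The Python fragment below is an illustrative sketch only, assuming a UYVY-style 4:2:2 packing and full-range BT.601 coefficients; it is not the Verilog implementation used on the FPGA:

```python
def yuv422_to_yuv444(packed):
    """Expand 4:2:2 data (two Y samples share one Cb/Cr pair) to 4:4:4."""
    pixels = []
    for y0, cb, y1, cr in packed:
        pixels.append((y0, cb, cr))
        pixels.append((y1, cb, cr))  # the shared chroma pair is replicated
    return pixels

def ycbcr_to_rgb(y, cb, cr):
    """Full-range BT.601 YCbCr -> RGB with clamping to 0..255."""
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return tuple(max(0, min(255, round(v))) for v in (r, g, b))
```

In hardware the same arithmetic is typically done with fixed-point multipliers, but the data flow (chroma replication followed by the color-space matrix) is the one the text describes.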
2.3. IMAGE COMPRESSION
The captured image is compressed using the DCT (Discrete Cosine Transform). The resulting coefficients of the integer version of the DCT are represented as VLI (variable-length integer) and entropy coded using a Huffman coder. Application of the integer DCT not only enables a low-complexity hardware implementation with very limited power consumption but also permits the algorithm to provide lossless as well as lossy compression.
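For illustration, JPEG's variable-length-integer representation stores each nonzero coefficient as a size category plus that many amplitude bits, with negative values encoded in one's complement. A small Python sketch of this mapping (a software model, not the paper's hardware coder):

```python
def vli_encode(value: int):
    """Return (size category, amplitude bit-string) for a DCT coefficient,
    following the JPEG convention: negatives use the one's-complement pattern."""
    if value == 0:
        return 0, ""
    size = abs(value).bit_length()          # number of amplitude bits
    amplitude = value if value > 0 else value + (1 << size) - 1
    return size, format(amplitude, f"0{size}b")
```

A Huffman code for the size category is then emitted, followed by the raw amplitude bits, which is why small coefficients cost few bits.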
JPEG is the most commonly used method of lossy compression for digital photography. The JPEG encoder block diagram is shown below. The main steps in JPEG encoding are: RGB to YCbCr conversion, 2-D DCT of 8×8 blocks, quantization, zig-zag scan of 8×8 blocks, differential DC encoding with run-length encoding (RLE) for the AC components, and entropy encoding.
Fig 4: JPEG Encoder Block Diagram
In this section, all the steps of JPEG Encoding will be discussed in detail.
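To make the transform, quantization and zig-zag steps concrete, here is a small software-only Python model using the standard JPEG luminance quantization table; it illustrates the arithmetic of those steps, not the integer-DCT Verilog implementation used in this work:

```python
import math

N = 8
# Standard JPEG luminance quantization table (Annex K of the standard, quality ~50).
Q_LUMA = [
    [16, 11, 10, 16, 24, 40, 51, 61],
    [12, 12, 14, 19, 26, 58, 60, 55],
    [14, 13, 16, 24, 40, 57, 69, 56],
    [14, 17, 22, 29, 51, 87, 80, 62],
    [18, 22, 37, 56, 68, 109, 103, 77],
    [24, 35, 55, 64, 81, 104, 113, 92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103, 99],
]

def dct2(block):
    """2-D DCT-II of an 8x8 (level-shifted) block."""
    c = lambda k: math.sqrt(0.5) if k == 0 else 1.0
    return [[0.25 * c(u) * c(v) * sum(
                block[x][y]
                * math.cos((2 * x + 1) * u * math.pi / 16)
                * math.cos((2 * y + 1) * v * math.pi / 16)
                for x in range(N) for y in range(N))
             for v in range(N)] for u in range(N)]

def quantize(coeffs, table):
    """Divide each coefficient by the table entry and round to an integer."""
    return [[round(coeffs[u][v] / table[u][v]) for v in range(N)] for u in range(N)]

def zigzag(block):
    """Read the 8x8 block along anti-diagonals in JPEG zig-zag order."""
    order = sorted(((u, v) for u in range(N) for v in range(N)),
                   key=lambda p: (p[0] + p[1],
                                  p[0] if (p[0] + p[1]) % 2 else p[1]))
    return [block[u][v] for u, v in order]

# A flat block: after the -128 level shift only the DC coefficient survives.
flat = [[80 - 128] * N for _ in range(N)]
stream = zigzag(quantize(dct2(flat), Q_LUMA))
print(stream[0], stream[1:4])  # DC = round(8 * -48 / 16) = -24, ACs all zero
```

The long zero runs that the zig-zag scan produces are exactly what the subsequent RLE and entropy-coding stages exploit.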
2.4. WIRELESS TRANSMISSION
Using LinkSprite software and ZigBee, the images are transmitted wirelessly.
Fig 5: Microcam Serial JPEG Camera
2.5. IMAGE LOADED INTO SRAM
The acquired image is read in MATLAB with the function imread. The image is then segmented into 8×8 blocks and the pixel values of the segmented image are obtained. These pixel values are written to a .MIF or .HEX file so that they can be stored in the RAM of the FPGA. A counter is also designed to generate the addresses at which the pixel values are stored in the RAM; the number of addresses generated matches the number of pixel values.
Quartus II allows the creation of Memory Initialization Files (.MIF) and .HEX files. The pixel values of the image to be stored are written to a .MIF or .HEX file, and the RAM is initialized with this file. The file can be created with a chosen size, i.e., the number of words and the word width. It is loaded directly into the RAM generated with the SRAM MegaWizard function, which produces an SRAM module described in VHDL and instantiated through a port map. One SRAM allows storage of a maximum of 65,536 words with a maximum word size of 256 bits, so a .MIF or .HEX file of up to 65,536 words, each up to 256 bits wide, can be generated.
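The .MIF generation step can be scripted. The following Python sketch writes pixel values in the Quartus II Memory Initialization File format described above; since the paper performs this step from MATLAB, the script is only an illustrative equivalent:

```python
def write_mif(path, words, width=8):
    """Write a list of pixel values as a Quartus II .MIF file (hex radix)."""
    lines = [
        f"WIDTH={width};",
        f"DEPTH={len(words)};",
        "ADDRESS_RADIX=HEX;",
        "DATA_RADIX=HEX;",
        "",
        "CONTENT BEGIN",
    ]
    digits = (width + 3) // 4  # hex digits needed per data word
    for addr, value in enumerate(words):  # mirrors the address counter in the text
        lines.append(f"    {addr:X} : {value:0{digits}X};")
    lines.append("END;")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

# Three 8-bit pixel values at addresses 0..2.
write_mif("pixels.mif", [0, 127, 255])
```

The resulting file can be attached to the RAM megafunction as its initialization file so that the pixel data are present in SRAM at configuration time.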
Fig. 6 .MIF FILE
The pixel values of the image are stored in the SRAM of the FPGA; this is programmed in VHDL. The acquired image is read in MATLAB with the function imread and segmented into 8×8 blocks, whose pixel values are obtained. These pixel values are written to a .MIF or .HEX file for storage in the RAM of the FPGA. A counter is designed to generate addresses, according to the number of pixel values, at which the values of the image are stored in the SRAM.
2.6. RESULTS
2.6.1. IMAGE LOADED INTO SRAM
The project has been explained in detail above; the results obtained are depicted below. As discussed, the captured image was first taken into MATLAB. The original image was segmented into 8×8 blocks, the pixel values were obtained, and those pixel values were stored in SRAM. The storage of these pixel values in SRAM is shown below.
Fig. 7 RTL VIEW OF IMAGE LOADED INTO SRAM
Fig. 8 PIXEL VALUES LOADED INTO SRAM
2.6.2. OV6920 Interfacing with FPGA
The camera is interfaced with the FPGA. The picture below shows how the camera is interfaced in real time.
Fig.9 OV6920 INTERFACING WITH FPGA
2.6.3. IMAGE CAPTURED STORED IN SRAM
The CMOS color image sensor is interfaced with the FPGA. After interfacing, the image captured by the camera is loaded into SRAM. The diagrams below depict the image values being loaded first in YUV format and then converted into RGB format. The RTL view of the OV6920 camera interfacing with the FPGA is shown below.
Fig.10 RTL View of OV6920 interfacing with FPGA
Fig. 11 Image values in YUV format are loaded
Fig. 12 Image values in RGB format are loaded
2.6.4. IMAGE COMPRESSION
These images are compressed with the JPEG technique, and the compressed results are shown below. First, a still image is compressed to the JPEG standard using MATLAB.
Fig. 13 Compression technique implemented in MATLAB
Then, the compression module is implemented in Verilog HDL and integrated with the camera interfacing module. The simulation results below show the image compression technique implemented in the Quartus tool.
Fig. 14 RTL View of image compression module added
Fig. 15 Compressed image values in YUV format are loaded
Fig. 16 Compressed image values in RGB format are loaded
3. CONCLUSIONS
The VLSI architecture of a wireless endoscopy system can be designed and realized on an Altera Cyclone II board. At present, the camera interfacing and image compression units have been implemented and tested. The system clearly demonstrates the interfacing of the camera with the FPGA and the compression techniques used to compress the image in a lossless or high-quality lossy manner. Wireless transmission of the image has also been carried out.
REFERENCES
[1] Xinkai Chen, Xiaoyu Zhang, Linwei Zhang, Xiaowen Li, Nan Qi, Hanjun Jiang, and Zhihua Wang, "A Wireless Capsule Endoscope System With Low-Power Controlling and Processing ASIC", IEEE Transactions on Biomedical Circuits and Systems, pp. 11-22, 2012.
[2] Milin Zhang, Amine Bermark, Xiaowen Li, Zhihua Wang, "A Low Power CMOS Image Sensor Design for Wireless Endoscopy Capsule", IEEE Biomedical Circuits and Systems, pp. 397-400, 2012.
[3] Pawel Tureza, Mariusz Duplaga, "Low Power Image Compression for Wireless Capsule Endoscopy", IEEE Transactions on Imaging Systems and Techniques, vol. 59, no. 3, pp. 1-4, 2010.
[4] Yu Mao, Liang Feng, Yuhua Cheng, "An Efficient Wireless Power Transmission System for the Capsule Endoscopy Application", IEEE Mechatronics and Automation, pp. 221-224, 2011.
[5] Yingke Gu, Xiang Xie, Guolin Li, Tianjia Sun, Qiang Zhang, Ziqiang Wang, Zhihua Wang, "A New System Design of the Multi-View Micro-Ball Endoscopy System", 32nd Annual International Conference of the IEEE EMBS, Buenos Aires, Argentina, August 31 - September 4, pp. 141-147, 2010.
[6] Stephen P. Woods and Timothy G. Constandinou, "Wireless Capsule Endoscope for Targeted Drug Delivery: Mechanics and Design Considerations", IEEE Transactions on Biomedical Engineering, vol. 60, no. 4, pp. 35-39, 2013.
AUTHORS
Kavitha Veerappan is pursuing a Ph.D. at Anna University. She completed her Master of Engineering (VLSI Design) at the College of Engineering, Guindy, and received her B.E. degree (ECE) from Anna University. Her areas of interest are digital and analog VLSI.

Ramu Pitchaikkannu is pursuing a Ph.D. at Anna University. He completed his Master of Engineering (Communication Systems) at A.C. Tech, Karaikudi, and received his B.E. degree (ECE) from Anna University. His area of interest is VLSI.