Hi everyone! We are team lOrAwAt, and we decided to take up the FOS Group challenge.
The objective was to transmit and receive a 1 MB JPEG image over raw LoRa. Sounds easy, right?
Well, it isn't. LoRa is a long-range, low-power radio technology that can transfer only small chunks of data at a time: anything over about 130 bytes per packet is too big. So an image has to be divided into many small packets, with a high chance that some of them will arrive corrupted or be lost entirely.
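To make the chunking concrete, here is a minimal sketch of splitting an image into LoRa-sized packets. The 130-byte ceiling comes from the challenge; the 4-byte header layout (2-byte sequence number plus 2-byte total count) is our own assumption for illustration, not the team's exact format.

```python
MAX_LORA_PAYLOAD = 130  # bytes per packet, per the challenge constraint
HEADER_SIZE = 4         # assumed: 2-byte sequence number + 2-byte total count
CHUNK_SIZE = MAX_LORA_PAYLOAD - HEADER_SIZE

def split_into_packets(data: bytes) -> list:
    """Split raw bytes into numbered packets small enough for LoRa."""
    chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
    total = len(chunks)
    return [
        seq.to_bytes(2, "big") + total.to_bytes(2, "big") + chunk
        for seq, chunk in enumerate(chunks)
    ]

packets = split_into_packets(b"\x00" * 1_000_000)  # a ~1 MB image
print(len(packets))  # 7937 packets for 1 MB at 126 data bytes each
```

Even before any protocol overhead, a 1 MB image means close to eight thousand packets in the air, which is why losses and corruption are not an edge case but the normal situation.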
So why would we want to tackle such a challenging problem? Technology that lets us send images wirelessly over long distances can solve many real problems; one example is remotely monitoring the growth of crops.
Before we tell you how we did it, we want to show you that we did it!
So, that's the architecture diagram. First we take a picture with a mobile phone, which is sent to the Flask backend, where it is compressed and divided into chunks small enough for LoRa. Next, the Pycom sender calls that backend, fetches the preprocessed data, and transmits it to the Pycom receiver. Once the Pycom receiver confirms that all the data has arrived, it sends the reassembled photo back to the backend, so we can check that it renders correctly in the mobile app.
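The backend's role in this pipeline could be sketched roughly like this. The route names (`/upload`, `/chunk/<seq>`) and the chunk size are our assumptions for illustration; the actual endpoints may differ.

```python
# Hypothetical sketch of the Flask preprocessing backend described above:
# accept a photo, split it into LoRa-sized chunks, and serve them one by
# one to the Pycom sender. Route names and sizes are assumptions.
import base64
from flask import Flask, jsonify, request

app = Flask(__name__)
CHUNK_SIZE = 126  # leaves room for a small header within the ~130-byte limit

@app.route("/upload", methods=["POST"])
def upload():
    """Receive the phone photo and hold its chunks for the Pycom sender."""
    data = request.get_data()
    app.config["chunks"] = [
        data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)
    ]
    return jsonify({"total": len(app.config["chunks"])})

@app.route("/chunk/<int:seq>")
def chunk(seq):
    """Serve one preprocessed chunk, base64-encoded, by sequence number."""
    return base64.b64encode(app.config["chunks"][seq])
```

In this sketch the Pycom sender simply polls `/chunk/0`, `/chunk/1`, and so on over Wi-Fi until it has pushed every chunk out over LoRa.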
In the previous slide we mentioned image compression. Here you can see how an example image looks before and after compression: the quality loss is not so bad, and it speeds up the transfer by a lot. The uncompressed image would take about 20 minutes to send, while the compressed one takes only 2-3 minutes.
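A quick back-of-envelope calculation shows where those numbers come from. The per-packet airtime below is an assumption (real LoRa airtime depends on spreading factor and bandwidth), chosen so the result lines up with the observed ~20 minutes for the uncompressed image.

```python
PACKET_BYTES = 130   # max bytes per LoRa packet, per the challenge
AIRTIME_S = 0.15     # assumed airtime per packet incl. gaps (varies with SF/BW)

def transfer_minutes(image_bytes: int) -> float:
    """Estimated end-to-end transfer time for an image of the given size."""
    packets = -(-image_bytes // PACKET_BYTES)  # ceiling division
    return packets * AIRTIME_S / 60

print(round(transfer_minutes(1_000_000), 1))  # ~1 MB uncompressed: roughly 19 min
print(round(transfer_minutes(100_000), 1))    # ~100 KB compressed: roughly 2 min
```

So shrinking the image by an order of magnitude shrinks the transfer time by the same factor, which matches the 20-minute versus 2-3-minute figures above.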
Here is how we built our protocol. We decided on a UDP-style protocol with additional confirmations, which let us store packets in order and verify that they were not corrupted in transit. After all packets have been sent, we calculate which ones were lost, so we can request them again from the sender, making sure we receive the entire image!
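The scheme above can be sketched in a few lines: number each packet, attach a checksum, drop anything that fails verification, and after the first pass ask the sender to retransmit the missing sequence numbers. The packet layout and the CRC32 choice are our assumptions for illustration.

```python
# Minimal sketch of the confirmation scheme: numbered packets with a CRC32
# checksum, and a receiver that reports missing sequence numbers so the
# sender can retransmit them.
import zlib

def make_packet(seq: int, payload: bytes) -> bytes:
    """2-byte sequence number + 4-byte CRC32 + payload."""
    return seq.to_bytes(2, "big") + zlib.crc32(payload).to_bytes(4, "big") + payload

def parse_packet(packet: bytes):
    """Return (seq, payload), or None if the checksum does not match."""
    seq = int.from_bytes(packet[:2], "big")
    crc = int.from_bytes(packet[2:6], "big")
    payload = packet[6:]
    return (seq, payload) if zlib.crc32(payload) == crc else None

def missing_sequences(received: dict, total: int) -> list:
    """Sequence numbers the receiver still needs after a pass."""
    return [seq for seq in range(total) if seq not in received]

# Simulate a lossy first pass: packet 1 is lost, packet 3 arrives corrupted.
chunks = [b"AA", b"BB", b"CC", b"DD"]
received = {}
for seq, chunk in enumerate(chunks):
    if seq == 1:
        continue  # lost in transit
    pkt = make_packet(seq, chunk)
    if seq == 3:
        pkt = pkt[:-1] + b"X"  # corrupted in transit
    result = parse_packet(pkt)
    if result:
        received[result[0]] = result[1]

print(missing_sequences(received, len(chunks)))  # [1, 3] — packets to request again
```

The receiver keeps looping (report missing, receive retransmissions) until `missing_sequences` comes back empty, at which point the chunks can be concatenated in sequence order to rebuild the image.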