I am capturing images from one Pi 3 using the standard PiCamera Python bindings.
I then send each image to another Pi 3 over a WiFi mesh network created with batman-adv.
The code I've found for the socket part first sends the image length, then the actual byte stream.
This results in exactly 20 images in 10 seconds, i.e. 2 frames per second, which is a terrible frame rate.
I am confident that if I can skip the "handshake", so to speak, I can increase the capture rate, but how can I save/process an image without knowing the content length?
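For context, the "handshake" here is just a four-byte length prefix per frame, so the header itself is unlikely to be the bottleneck. A minimal, self-contained sketch of that framing over an in-memory socket pair (the `send_frame`/`recv_frame` names are mine, not from the picamera recipe):

```python
import socket
import struct

def send_frame(wfile, data):
    # Prefix each frame with its length as a little-endian 32-bit unsigned int
    wfile.write(struct.pack('<L', len(data)))
    wfile.write(data)
    wfile.flush()

def recv_frame(rfile):
    # Read the 4-byte length header, then exactly that many payload bytes
    header = rfile.read(struct.calcsize('<L'))
    (length,) = struct.unpack('<L', header)
    if length == 0:
        return None  # sender's end-of-stream marker
    return rfile.read(length)

if __name__ == '__main__':
    a, b = socket.socketpair()
    wfile, rfile = a.makefile('wb'), b.makefile('rb')
    send_frame(wfile, b'\xff\xd8 fake jpeg \xff\xd9')
    send_frame(wfile, b'')  # zero length signals "done"
    print(recv_frame(rfile))
    print(recv_frame(rfile))
```

The overhead per frame is 4 bytes plus one extra flush, which at a few frames per second is negligible next to the JPEG payload itself.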
I tried halving the image resolution, but that didn't result in any improvement.
I also printed the image length from one run and then used the maximum to statically read that many bytes on the next run, but that resulted in unviewable images; presumably I read beyond one image's boundary, so each file contained more or less than one actual image.
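Reading a fixed byte count fails because JPEG sizes vary per frame. One way to drop the length header entirely is to frame on the JPEG markers instead: every JPEG starts with the SOI marker `FF D8` and ends with the EOI marker `FF D9`, so the receiver can split the raw stream on `FF D9`. A sketch (the `split_jpegs` helper is mine; note `FF D9` can in rare cases appear inside entropy-coded data, so this is a heuristic, not a guarantee):

```python
def split_jpegs(buffer):
    """Split a byte buffer into complete JPEG frames on the EOI marker.

    Returns (frames, remainder), where remainder holds the trailing
    bytes of a not-yet-complete frame.
    """
    frames = []
    while True:
        end = buffer.find(b'\xff\xd9')
        if end == -1:
            return frames, buffer
        frames.append(buffer[:end + 2])  # keep the EOI marker with the frame
        buffer = buffer[end + 2:]

# Example: two frames arrive concatenated, plus the start of a third
stream = b'\xff\xd8AAA\xff\xd9' + b'\xff\xd8BBB\xff\xd9' + b'\xff\xd8CC'
frames, rest = split_jpegs(stream)
print(len(frames))  # 2 complete frames
print(rest)         # b'\xff\xd8CC' still buffered
```

The receiver would append each socket read to a buffer, call the helper, and save any complete frames it returns.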
The code below is client.py:
import io
import socket
import struct
import time
import picamera

client_socket = socket.socket()
client_socket.connect(('192.168.123.3', 6666))
connection = client_socket.makefile('wb')
try:
    with picamera.PiCamera() as camera:
        camera.resolution = (640, 480)
        # Start a preview and let the camera warm up for 2 seconds
        camera.start_preview()
        time.sleep(2)
        # Note the start time and construct a stream to hold image data
        # temporarily (we could write it directly to connection but in this
        # case we want to find out the size of each capture first to keep
        # our protocol simple)
        start = time.time()
        stream = io.BytesIO()
        for foo in camera.capture_continuous(stream, 'jpeg'):
            # Write the length of the capture to the stream and flush to
            # ensure it actually gets sent
            connection.write(struct.pack('<L', stream.tell()))
            connection.flush()
            # Rewind the stream and send the image data over the wire
            stream.seek(0)
            connection.write(stream.read())
            # If we've been capturing for more than 10 seconds, quit
            if time.time() - start > 10:
                break
            # Reset the stream for the next capture
            stream.seek(0)
            stream.truncate()
        # Write a length of zero to the stream to signal we're done
        connection.write(struct.pack('<L', 0))
finally:
    connection.close()
    client_socket.close()
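One thing the loop above does is serialize capture and network I/O: each `connection.write` blocks the loop before the next capture can start. Decoupling them with a background sender thread is one option; below is a sketch with a synthetic frame source standing in for picamera (the queue-based design and names are mine, not part of the recipe):

```python
import queue
import socket
import struct
import threading

def sender(wfile, frame_queue):
    # Drain frames from the queue and write them length-prefixed;
    # a None sentinel ends the stream with a zero-length header.
    while True:
        frame = frame_queue.get()
        if frame is None:
            wfile.write(struct.pack('<L', 0))
            wfile.flush()
            return
        wfile.write(struct.pack('<L', len(frame)))
        wfile.write(frame)
        wfile.flush()

if __name__ == '__main__':
    a, b = socket.socketpair()
    wfile, rfile = a.makefile('wb'), b.makefile('rb')
    frames = queue.Queue()
    t = threading.Thread(target=sender, args=(wfile, frames))
    t.start()
    # The capture loop only enqueues; it never waits on the network
    for payload in (b'frame-1', b'frame-2'):
        frames.put(payload)
    frames.put(None)
    t.join()
    while True:
        (length,) = struct.unpack('<L', rfile.read(4))
        if length == 0:
            break
        print(rfile.read(length))
```

In the real client, the capture loop would `frames.put(stream.getvalue())` after each capture and immediately reset the stream, so a slow WiFi write never stalls the camera.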
The code for server.py:
import io
import socket
import struct
from PIL import Image

server_socket = socket.socket()
server_socket.bind(('192.168.123.3', 6666))
server_socket.listen(0)
imagecounter = 1
connection = server_socket.accept()[0].makefile('rb')
try:
    while True:
        # Read the length of the image as a 32-bit unsigned int. If the
        # length is zero, quit the loop
        image_len = struct.unpack('<L', connection.read(struct.calcsize('<L')))[0]
        if not image_len:
            break
        # Construct a stream to hold the image data and read the image
        # data from the connection
        image_stream = io.BytesIO()
        image_stream.write(connection.read(image_len))
        # Rewind the stream, save it as a file
        image_stream.seek(0)
        with open('image%s.jpg' % imagecounter, 'wb') as img:
            img.write(image_stream.read())
        imagecounter += 1
finally:
    connection.close()
    server_socket.close()
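A caveat worth knowing if you ever drop the `makefile('rb')` wrapper and call `recv()` on the socket directly: a single `recv(n)` may return fewer than `n` bytes, which silently breaks any fixed-length framing. The file wrapper hides this by blocking until the requested count arrives; with raw sockets you would need a helper like this (the name `recv_exact` is mine):

```python
import socket

def recv_exact(sock, n):
    # Loop until exactly n bytes have arrived; a raw recv() may return less
    chunks = []
    remaining = n
    while remaining > 0:
        chunk = sock.recv(remaining)
        if not chunk:
            raise ConnectionError('socket closed mid-frame')
        chunks.append(chunk)
        remaining -= len(chunk)
    return b''.join(chunks)

if __name__ == '__main__':
    a, b = socket.socketpair()
    a.sendall(b'hello world')
    print(recv_exact(b, 5))  # b'hello'
```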
The mesh/ad-hoc network is a hard requirement; I'd like to use it, and the reasons are beyond the scope of this question.
I am wondering:
- How can I cut out the image length handshake and still save/process the image correctly?
- Is there a better approach to speeding up this transfer?
My target is 10 to 15 images per second, or more if possible.