I'm making a robot with a Raspberry Pi. My goal is:
- Code a server on the Raspberry Pi using Python that captures the image from the camera and sends the data via socket
- Code a client on the PC using Java that reads the data from the socket and shows it in a Java Swing window
So I can see in real time what the Raspberry Pi camera sees.
I tried the following code for my project, but I still get high latency. My Python code is:
import socket
import picamera

ADDRESS = '192.168.1.XXX'  # My address - SERVER
PORT = 2000  # Port to listen on (non-privileged ports are > 1023)

server = socket.socket()
server.bind((ADDRESS, PORT))
server.listen(0)
connection, addr = server.accept()  # Accept one connection
params = (1280, 720, 48, 25000000)  # width, height, fps, bitrate
client = connection.makefile('wb')
print('Connected to', addr)

camera = picamera.PiCamera()
camera.rotation = 180
camera.resolution = (int(params[0]), int(params[1]))
camera.framerate = int(params[2])  # 48

try:
    camera.start_recording(client, format='h264', intra_period=0, quality=0,
                           bitrate=int(params[3]))
    print("Recording started...")
    camera.wait_recording(60)
finally:
    camera.stop_recording()
    print("Recording stopped...")
    client.close()
    server.close()
    camera.stop_preview()
    print("Connection closed")
My Java code is the following:
private final int WIDTH = 1280, HEIGHT = 720;
private final int FPS = 48;
private final int BITRATE = 25000000;

public InputStream connect(String address, int port) throws IOException {
    socket = new Socket(InetAddress.getByName(address), port);
    return socket.getInputStream();
}
public void openWindow(VideoPane video) {
    EventQueue.invokeLater(new Runnable() {
        @Override
        public void run() {
            try {
                UIManager.setLookAndFeel(UIManager.getSystemLookAndFeelClassName());
            } catch (Exception ex) {
                ex.printStackTrace();
            }
            JFrame frame = new JFrame("Testing");
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.add(video);
            frame.setPreferredSize(new Dimension(WIDTH, HEIGHT)); // must be set before pack()
            frame.pack();
            frame.setLocationRelativeTo(null);
            frame.setVisible(true);
        }
    });
}
public void avvia() throws IOException {
    VideoPane video = new VideoPane(WIDTH, HEIGHT);
    openWindow(video);
    InputStream in = connect(serverAddress, serverPort);
    streamToImageView("h264", FPS, BITRATE * 2, "ultrafast", 0, in, video);
    try {
        socket.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
public void streamToImageView(final String format, final double frameRate, final int bitrate,
        final String preset, final int numBuffers, InputStream input, VideoPane video) {
    try {
        final FrameGrabber grabber = new FFmpegFrameGrabber(input);
        final Java2DFrameConverter converter = new Java2DFrameConverter();
        grabber.setFrameRate(frameRate);
        grabber.setFormat(format);
        grabber.setVideoBitrate(bitrate);
        grabber.setVideoOption("preset", preset);
        grabber.setNumBuffers(numBuffers);
        grabber.setMaxDelay(10);
        grabber.start();
        Frame frame;
        BufferedImage bufferedImage;
        while (!stop) {
            frame = grabber.grab();
            if (frame != null) {
                bufferedImage = converter.convert(frame);
                if (bufferedImage != null) {
                    video.updateFrame(bufferedImage);
                }
            } // if frame
        } // while
        grabber.close();
    } // try
    catch (IOException e) {
        e.printStackTrace();
    }
} // streamToImageView
public class VideoPane extends JPanel {

    private BufferedImage currentFrame;

    public VideoPane(int width, int height) {
        this.setPreferredSize(new Dimension(width, height));
        // javax.swing.Timer takes a delay in milliseconds,
        // so repaint at roughly the stream's frame rate:
        Timer timer = new Timer(1000 / FPS, new ActionListener() {
            @Override
            public void actionPerformed(ActionEvent e) {
                repaint();
            }
        });
        timer.start();
    }

    public void updateFrame(BufferedImage frame) {
        this.currentFrame = frame;
    }

    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        if (currentFrame != null) {
            Graphics2D g2d = (Graphics2D) g.create();
            int x = (getWidth() - currentFrame.getWidth()) / 2;
            int y = (getHeight() - currentFrame.getHeight()) / 2;
            g2d.drawImage(currentFrame, x, y, this);
            g2d.dispose();
        }
    } // paintComponent
} // VideoPane
I notice that the picamera library encodes the video stream to H.264 before sending it to the socket. I think the latency may be due to this computational overhead. I'd like to try another format, but then I'm unable to decode the stream on the receiving side. I also tried to use UDP, but FFmpegFrameGrabber needs an InputStream, and a UDP socket only provides the raw bytes of each received packet. The other possible problem may be a connection delay, so I tried connecting the Raspberry Pi via Ethernet. This video shows the latency when the Raspberry Pi is connected to the router via Ethernet and the PC via wireless: LatencyVideo
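To be concrete about the UDP problem: this is the kind of adapter I imagine would be needed, a hypothetical DatagramInputStream (my own untested sketch, the class name is mine) that blocks on DatagramSocket.receive() and exposes each packet's payload through the InputStream interface that FFmpegFrameGrabber expects. It assumes packets arrive in order, which plain UDP does not guarantee, so I'm not sure it would give a decodable H.264 stream in practice.

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;

// Sketch: adapts a DatagramSocket to the InputStream API.
// Each call to fill() blocks until the next UDP packet arrives,
// then read()/read(byte[],int,int) consume that packet's payload.
public class DatagramInputStream extends InputStream {
    private final DatagramSocket socket;
    private final byte[] buffer = new byte[65507]; // max UDP payload size
    private int pos = 0, len = 0;

    public DatagramInputStream(DatagramSocket socket) {
        this.socket = socket;
    }

    // Block until the next packet arrives, then expose its bytes.
    private void fill() throws IOException {
        DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
        socket.receive(packet);
        pos = 0;
        len = packet.getLength();
    }

    @Override
    public int read() throws IOException {
        if (pos >= len) fill();
        return buffer[pos++] & 0xFF;
    }

    @Override
    public int read(byte[] b, int off, int n) throws IOException {
        if (pos >= len) fill();
        int count = Math.min(n, len - pos);
        System.arraycopy(buffer, pos, b, off, count);
        pos += count;
        return count;
    }
}
```

With something like this, the grabber could in principle be created as new FFmpegFrameGrabber(new DatagramInputStream(udpSocket)), but I don't know whether packet loss or reordering would break the decoder.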
Do you have any ideas for reducing this latency? Thanks