I've been working on the backend. I've set up the PostgreSQL database and the ORM (a bit of a hassle, since using TypeScript with it is confusing and tedious). Next I have to build a scheduler for flights, and I still need to figure out the video streaming.
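The flight scheduler isn't built yet, so here's only a minimal sketch of the shape it might take: an in-memory queue ordered by start time. All names (`ScheduledFlight`, `FlightScheduler`) are my own placeholders, not from the project.

```typescript
// Hypothetical sketch of a flight scheduler: a queue kept sorted by start time.
interface ScheduledFlight {
  id: string;
  startAt: Date;  // when the flight should begin
  path: string;   // identifier of the stored flight path to replay
}

class FlightScheduler {
  private flights: ScheduledFlight[] = [];

  // Insert a flight, keeping the queue sorted by start time.
  add(flight: ScheduledFlight): void {
    this.flights.push(flight);
    this.flights.sort((a, b) => a.startAt.getTime() - b.startAt.getTime());
  }

  // Remove and return every flight due at or before `now`.
  due(now: Date = new Date()): ScheduledFlight[] {
    const ready = this.flights.filter(f => f.startAt <= now);
    this.flights = this.flights.filter(f => f.startAt > now);
    return ready;
  }

  // Peek at the next upcoming flight without removing it.
  next(): ScheduledFlight | undefined {
    return this.flights[0];
  }
}
```

In a real version, `due()` would be driven by a timer and the queue persisted to the flights table in Postgres instead of living in memory.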
I've been working on the backend, so there's nothing visual to share; I'll just repeat the image of the frontend that I shared yesterday. I have built the realtime API using WebSockets: it sends commands from the client through the server to the drone, and relays information (height, temperature, etc.) from the drone back through the server to the client. For the backend, I still need to finish authentication and integrate a database (most likely PostgreSQL) to store information about flights. I will also integrate an object-storage API (something S3-compatible; I'm thinking of Cloudflare R2 since it's pretty accessible) to store recorded videos. This should ensure the server doesn't run out of storage and that the data is safely backed up.
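To give a feel for the relay without the WebSocket plumbing, here's a sketch of the two translation steps it performs. The message shapes and the telemetry line format are illustrative assumptions, not the project's actual wire format.

```typescript
// Assumed message shapes for the client<->server leg of the relay.
type ClientCommand = { type: "command"; name: string; args?: string[] };
type DroneTelemetry = { type: "telemetry"; height: number; temperature: number };

// Client -> drone: turn a structured command into the plain-text
// string the drone expects, e.g. { name: "forward", args: ["50"] } -> "forward 50".
function toDroneCommand(msg: ClientCommand): string {
  return [msg.name, ...(msg.args ?? [])].join(" ");
}

// Drone -> client: parse a raw telemetry line (assumed here to look
// like "h:32;temp:61") into a structured message for the browser.
function parseTelemetry(raw: string): DroneTelemetry {
  const fields = Object.fromEntries(
    raw.split(";").filter(Boolean).map(p => p.split(":") as [string, string])
  );
  return {
    type: "telemetry",
    height: Number(fields["h"]),
    temperature: Number(fields["temp"]),
  };
}
```

The server's WebSocket handlers would just call these in each direction: parse incoming client JSON, forward `toDroneCommand` output over UDP, and broadcast `parseTelemetry` output to connected clients.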
I still haven't figured out the video streaming part, so I'm focusing on the frontend for now, since that's something I'm not really good at. I'm hoping this will help me solve the issue, as I'll come back to it with a clear mind. This is the login screen: a simple screen prompting for a password. It makes a request to the backend, which checks the password against a previously set bcrypt-hashed admin password. If it's correct, the backend assigns a JWT as a session cookie, which is then used to authenticate all other requests. I have to make sure this part is secure, since it's meant to be exposed to the internet so users can access it remotely.
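The auth flow described above can be sketched roughly as below. The project uses bcrypt and a JWT library; to keep this sketch dependency-free I've swapped in Node's built-in scrypt for the password hash and a hand-rolled HMAC-signed token in place of a real JWT. The flow is the same, but this is an illustration, not the project's actual code.

```typescript
import { scryptSync, randomBytes, timingSafeEqual, createHmac } from "node:crypto";

// Hash the admin password once at setup time.
// (The project uses bcrypt; scrypt stands in here to avoid an npm dependency.)
function hashPassword(password: string): string {
  const salt = randomBytes(16).toString("hex");
  const hash = scryptSync(password, salt, 32).toString("hex");
  return `${salt}:${hash}`;
}

// Check a login attempt against the stored hash, in constant time.
function verifyPassword(password: string, stored: string): boolean {
  const [salt, hash] = stored.split(":");
  const candidate = scryptSync(password, salt, 32).toString("hex");
  return timingSafeEqual(Buffer.from(hash, "hex"), Buffer.from(candidate, "hex"));
}

// On success, mint a signed session token to set as the cookie.
// (Stand-in for jwt.sign; a real JWT also carries a header and expiry.)
function issueToken(secret: string): string {
  const payload = Buffer.from(
    JSON.stringify({ role: "admin", iat: Date.now() })
  ).toString("base64url");
  const sig = createHmac("sha256", secret).update(payload).digest("base64url");
  return `${payload}.${sig}`;
}

// Every subsequent request verifies the cookie's signature.
function verifyToken(token: string, secret: string): boolean {
  const [payload, sig] = token.split(".");
  const expected = createHmac("sha256", secret).update(payload).digest("base64url");
  return timingSafeEqual(Buffer.from(sig), Buffer.from(expected));
}
```

Since this is internet-facing, the real version should also rate-limit login attempts and set the cookie with `HttpOnly`, `Secure`, and `SameSite` attributes.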
I have successfully connected to the drone using the SDK that I had ported to TS earlier. I can send commands to the drone and receive its video feed. Now I will start building the backend API so the client can send commands to the drone. First, I have to figure out how to stream the video feed from the drone (h264, yuv420p) to the client. I've been trying to get ffmpeg to transcode the feed into an HLS stream so it can be played in the browser, but so far without success. I can confirm the feed itself is valid, since it plays fine in ffplay (as seen in the attached screenshot). I will continue to debug this.
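For reference, the kind of invocation I'm attempting looks roughly like this: pipe the raw h264 feed into ffmpeg's stdin and have it emit short HLS segments. The flags here are my working guess, not a confirmed-working command (this is exactly the part that's still failing).

```typescript
// Build the ffmpeg argument list for raw-h264-to-HLS. Flags are a best guess.
function hlsArgs(outDir: string): string[] {
  return [
    "-i", "pipe:0",                  // raw h264 from the drone arrives on stdin
    "-c:v", "copy",                  // feed is already h264, so avoid re-encoding
    "-f", "hls",
    "-hls_time", "1",                // short segments to keep latency down
    "-hls_list_size", "3",           // keep only a few segments in the playlist
    "-hls_flags", "delete_segments", // don't let old segments pile up on disk
    `${outDir}/stream.m3u8`,
  ];
}

// Usage (requires ffmpeg on PATH; `droneVideoStream` is a hypothetical
// readable stream carrying the drone's video):
// import { spawn } from "node:child_process";
// const ffmpeg = spawn("ffmpeg", hlsArgs("./public/hls"));
// droneVideoStream.pipe(ffmpeg.stdin);
```

If `-c:v copy` turns out to be the problem (e.g. missing keyframe/SPS data at segment boundaries), re-encoding with `-c:v libx264` would be the next thing to try, at the cost of CPU.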
I started building the backend and frontend, but first I needed a library to interact with the drone. I found one, but realized it was written in plain JS with no type definitions. Since I wanted to use TypeScript in my project, I felt I could really benefit from having type definitions, so I forked the library and converted it to TypeScript. The fork is open source and is available here.
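To illustrate what the conversion buys: in the plain-JS original, every command was an untyped function, so nothing stopped you from passing the wrong arguments. With TypeScript, each command gets a signature the compiler enforces. The names below are illustrative, not the fork's actual API.

```typescript
// Illustrative typed surface for a drone client (not the fork's real API).
type DroneResponse = "ok" | "error";

interface DroneClient {
  connect(): Promise<void>;
  takeoff(): Promise<DroneResponse>;
  land(): Promise<DroneResponse>;
  // distance in cm; the compiler now rejects e.g. forward("fast")
  forward(cm: number): Promise<DroneResponse>;
}

// A stub implementation, just to show the interface in use:
const fake: DroneClient = {
  async connect() {},
  async takeoff() { return "ok"; },
  async land() { return "ok"; },
  async forward(cm: number) { return cm > 0 ? "ok" : "error"; },
};
```

The bulk of the conversion work is exactly this: writing down the types the JS code was already assuming implicitly, so misuse fails at compile time instead of mid-flight.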
I am starting off by testing the drone. When I first tried flying it, I got some sort of sensor error, which was worrying; fortunately, it disappeared after I calibrated the drone. I managed to fly it and everything worked as expected, but there are a few issues I'm still trying to fix. There seems to be a lot of interference, even when flying at a short distance, so the drone sometimes fails to respond to commands and the video feed is choppy. Since the drone has no internal storage, the video is recorded on the device that controls it, so if the feed is choppy or heavily compressed, the final recording will be too. Fortunately, that doesn't matter much for this project, and the drone can still perform its intended function. Next up, I will use the SDK to replicate the flight path programmatically. (The choppiness in the attached video is real, not an editing artifact.)