I recently found some USB devices on eBay (Epiphan VGA2USB LR) that could take VGA as input and present it as a webcam. Given that I was keen on the idea of never needing to lug out a VGA monitor again, and that there was claimed Linux support, I took the risk and bought the whole job lot for about £20 (25 USD).
When they arrived, I plugged one in expecting it to show up as a USB UVC device. It did not. Was I missing something?
After looking through the vendor’s site, I discovered that a custom driver was required for the devices to work. Since I normally live the easy life on Linux, never needing to pull in drivers because the distribution kernel already ships them, this was a reasonably novel concept.
Sadly, it seems driver support for the devices in question ended at Linux 4.9, meaning none of my systems would run this device any more (Debian 10 [Linux 4.19] or the latest LTS Ubuntu [Linux 5.0]).
But surely this was something I could patch myself, right? Surely the package files were actually just a DKMS package that built the driver from source on demand, like a lot of out-of-tree drivers out there…
Sadly, this was not the case.
Inside the package is just a pre-compiled binary called vga2usb.o. I started doing some basic investigation into how hard it might be to reverse engineer, and found some interesting string table entries:
$ strings vga2usb.ko | grep 'v2uco' | sort | uniq
v2ucom_autofirmware
v2ucom_autofirmware_ezusb
v2ucom_autofirmware_fpga
Is this device actually an FPGA-on-a-stick? And what would the process of getting something like that running even look like?
Another amusing, and mildly alarming, find was a set of strings referring to DSA key parameters. This made me wonder whether there was private key material inside this driver, and what it might be protecting:
$ strings vga2usb.ko | grep 'epiphan' | sort | uniq
epiphan_dsa_G
epiphan_dsa_P
epiphan_dsa_Q
To observe the driver in its normal operating environment, I made a Debian 9 VM (the last supported release) and used KVM USB passthrough to give it direct access to the device. I then installed the driver and confirmed that it worked.
After that, I wanted to see what the wire protocol looked like. I was hoping that the device sent raw (or close to raw) frames over the wire, as this would make writing a userspace version of the driver easier.
To do this, I loaded the usbmon module on the VM’s host machine and used Wireshark to take a packet capture of the USB traffic to and from the device, both during startup and whilst capturing video.
I found that on startup the driver sent a large number of small packets to the device before the device could capture anything. I assumed this meant that the device was, as speculated above, an FPGA-based platform with no persistent storage: every time the device was plugged in, its firmware had to be “bitstreamed” in by the driver itself.
I confirmed this by opening one of the units up:
- ISL98002CRZ-170: acting as an analog-to-digital converter for the VGA signals
- XC6SLX16: Xilinx Spartan 6 FPGA
- 64 MB of DDR3 RAM
- CY7C68013A: USB controller / frontend for the device
Given that to “boot” this device I needed the bitstream to send to it, I got to work on the pre-compiled binaries to try to extract the bitstream/firmware. After running binwalk -x and watching it find a few zlib-compressed objects, I wrote a script to search them for a known hex sequence, picking 3 bytes from the pcap that I knew were part of the bitstreaming process:
$ bash scan.sh "03 3f 55"
trying 0.elf
trying 30020
trying 30020.zlib
trying 30020.zlib.decompressed
...
trying 84BB0
trying 84BB0.zlib
trying 84BB0.zlib.decompressed
trying AA240
trying AA240.zlib
trying AA240.zlib.decompressed
000288d0 07 2f 03 3f 55 50 7d 7c 00 00 00 00 00 00 00 00 |./.?UP}|........|
trying C6860
trying C6860.zlib
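For the curious, the scanning logic is nothing special. A rough Go equivalent of the same idea (a sketch only; the directory name assumes binwalk’s usual `_<file>.extracted` output layout):

```go
// Walk the files binwalk extracted and report the offset of a known
// byte sequence in each, roughly what scan.sh did.
package main

import (
	"bytes"
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	needle := []byte{0x03, 0x3f, 0x55} // bytes lifted from the bitstream pcap

	filepath.Walk("_vga2usb.ko.extracted", func(path string, info os.FileInfo, err error) error {
		if err != nil || info.IsDir() {
			return err
		}
		fmt.Println("trying", filepath.Base(path))
		data, err := os.ReadFile(path)
		if err != nil {
			return nil // unreadable file, move on
		}
		if off := bytes.Index(data, needle); off >= 0 {
			fmt.Printf("hit at offset %08x in %s\n", off, path)
		}
		return nil
	})
}
```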
After decompressing the AA240.zlib file, I found there was not enough data in it to be the full bitstream, so I instead went down the route of extracting the firmware from the USB packet capture.
I found that while both tshark and tcpdump can read USB packets inside pcap files, each would only dump some of the information in the capture. Since each program had different parts of the puzzle, I wrote a small program to unify the output of both into Go structs, so they could be replayed back to the device.
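The structs and the replay loop looked something like this (a simplified sketch rather than the actual code from the repo; it uses github.com/google/gousb, and the vendor/product IDs below are placeholders):

```go
package main

import (
	"log"

	"github.com/google/gousb"
)

// controlPacket is one USB control transfer recovered from the pcap,
// merged from the fields tshark and tcpdump each managed to decode.
type controlPacket struct {
	RequestType uint8  // bmRequestType
	Request     uint8  // bRequest
	Value       uint16 // wValue
	Index       uint16 // wIndex
	Data        []byte // payload, if any
}

// bootPackets is filled in from the capture by the unifying tool.
var bootPackets []controlPacket

func main() {
	ctx := gousb.NewContext()
	defer ctx.Close()

	dev, err := ctx.OpenDeviceWithVIDPID(0x5555, 0x1234) // placeholder IDs
	if err != nil || dev == nil {
		log.Fatalf("device not found: %v", err)
	}
	defer dev.Close()

	// Replay the captured control transfers, in order, back at the device.
	for _, p := range bootPackets {
		if _, err := dev.Control(p.RequestType, p.Request, p.Value, p.Index, p.Data); err != nil {
			log.Fatalf("control transfer failed: %v", err)
		}
	}
}
```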
At this point I noticed that bootstrapping comes in two stages: first the USB controller, and then the FPGA itself.
For at least a few days I was stuck on an issue where the whole bitstream would appear to upload, but the device would not start up, even though the packet captures from the real driver and my userspace one looked identical.
This was eventually solved by combing through the pcap, paying attention to how long each packet took to get a response, and noticing a large difference in one particular packet’s timing:
It turned out a manually entered typo caused a USB control write to go to the wrong area of the device. Serves me right for entering a value by hand…
Regardless, I now had a green blinking LED on the device! A massive achievement!
Since it was relatively trivial to replicate the packets that appeared to start the data streaming, I was able to open a USB bulk transfer endpoint and have data dumping to disk in no time!
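The bulk capture loop is barely more code. Continuing the gousb sketch from above (the endpoint number here is a guess based on the capture, and it also needs the os import):

```go
// dumpBulk claims the default interface, opens a bulk IN endpoint and
// appends everything it reads to a file on disk.
func dumpBulk(dev *gousb.Device) error {
	intf, done, err := dev.DefaultInterface()
	if err != nil {
		return err
	}
	defer done()

	ep, err := intf.InEndpoint(2) // endpoint number is an assumption
	if err != nil {
		return err
	}

	out, err := os.Create("frames.bin")
	if err != nil {
		return err
	}
	defer out.Close()

	buf := make([]byte, 64*1024)
	for {
		n, err := ep.Read(buf)
		if err != nil {
			return err
		}
		if _, err := out.Write(buf[:n]); err != nil {
			return err
		}
	}
}
```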
This is where the real challenge started, because on analysis the data was not encoded in any obvious way.
To start with, I used perf to get a general view of what the driver’s stack traces looked like while it was running:
Whilst I made progress with being able to hook functions that had frame data in them, I still didn’t get any closer to figuring out the encoding of the image data itself.
I did try the NSA’s Ghidra to get a better idea of what was going on inside the real driver:
While Ghidra is incredible (this was my first time using it rather than IDA Pro), it still wasn’t quite enough to reasonably help me understand the driver. I needed another path of investigation if I was going to reverse engineer this.
I decided to provision a Windows 7 VM and check if the Windows driver was doing anything different. Along the way I noticed there was an SDK for the devices, and one of its tools ended up being of particular interest:
PS> ls
Directory: epiphan_sdk-3.30.3.0007\epiphan\bin
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 10/26/2019 10:57 AM 528384 frmgrab.dll
-a--- 10/27/2019 5:41 PM 1449548 out.aw
-a--- 10/26/2019 10:57 AM 245760 v2u.exe
-a--- 10/26/2019 10:57 AM 94208 v2u_avi.exe
-a--- 10/26/2019 10:57 AM 102400 v2u_dec.exe
-a--- 10/26/2019 10:57 AM 106496 v2u_dshow.exe
-a--- 10/26/2019 10:57 AM 176128 v2u_ds_decoder.ax
-a--- 10/26/2019 10:57 AM 90112 v2u_edid.exe
-a--- 10/26/2019 10:57 AM 73728 v2u_kvm.exe
-a--- 10/26/2019 10:57 AM 77824 v2u_libdec.dll
PS> .\v2u_dec.exe
Usage:
v2u_dec <number of frames> [format] [compression level] <filename>
- sets compression level [1..5],
- captures and saves compressed frames to a file
v2u_dec x [format] <filename>
- decompresses frames from the file to separate BMP files
This tool lets you fire “one shot” captures and, judging by the source, doesn’t apply compression to the frames, so that the output can be processed on a faster machine later. This was practically perfect. I replicated the USB packet sequence to obtain these uncompressed blobs, and the byte counts matched roughly 3 bytes (RGB) per pixel (for an 800×600 input, that works out to 800 × 600 × 3 = 1,440,000 bytes per frame).
Initial processing of these images (just taking the output and writing it as RGB pixels) resulted in something roughly inspired by the input I was giving to the device over VGA:
After some more debugging with a hex editor, I discovered there was some kind of marker every 1028 bytes. It took a slightly embarrassing amount of time to write a watertight filter for it; on the other hand, I ended up producing some modern art in the process.
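The filter itself ended up being simple once the cadence was clear. A sketch of the idea, assuming 1024 payload bytes followed by a 4 byte marker (which is what “every 1028 bytes” suggests; the real alignment took some fiddling to get right):

```go
// stripMarkers drops the in-band markers from the raw bulk stream,
// keeping only pixel data. The payload/marker split is an assumption.
func stripMarkers(in []byte) []byte {
	const payload, marker = 1024, 4
	out := make([]byte, 0, len(in))
	for off := 0; off < len(in); off += payload + marker {
		end := off + payload
		if end > len(in) {
			end = len(in)
		}
		out = append(out, in[off:end]...)
	}
	return out
}
```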
After realising that the tilt/shear in the image was caused by me skipping and carrying over a pixel on every line (x=799 != x=800), I finally ended up with an image that was almost spot on, apart from the colour:
Initially I thought this might be a calibration issue, caused by taking my sample data while the VGA input was stuck on a solid colour. To fix this, I built a new test image to try to smoke these issues out; in hindsight I should have used something like a Philips PM5544 test card.
After loading this image onto the VGA-producing laptop, I ended up with the following output:
At this point I had a flashback to some 3D rendering/shader work I did long ago. This looked a lot like YUV colour.
I ended up reading up on YUV, and remembered that during my reverse engineering of the official kernel driver, setting a breakpoint on a function called v2ucom_convertI420toBGR24 would hang the system without any way to resume. So maybe the input was I420 encoded (of -pix_fmt yuv420p fame) and the expected output was blue, green, and red as 8 bit bytes?
After using Go’s built-in YCbCrToRGB, the image suddenly looked much closer to the original.
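The conversion itself is only a few lines, since image/color does the heavy lifting. A sketch for a single frame (I420 is a full resolution Y plane followed by quarter resolution Cb and Cr planes):

```go
// i420ToRGB converts one I420 frame into packed 8 bit RGB using the
// standard library's YCbCr conversion (needs the image/color import).
func i420ToRGB(frame []byte, w, h int) []byte {
	yPlane := frame[:w*h]
	cbPlane := frame[w*h : w*h+(w*h)/4]
	crPlane := frame[w*h+(w*h)/4:]

	rgb := make([]byte, w*h*3)
	for y := 0; y < h; y++ {
		for x := 0; x < w; x++ {
			c := (y/2)*(w/2) + x/2 // chroma planes are subsampled 2x2
			r, g, b := color.YCbCrToRGB(yPlane[y*w+x], cbPlane[c], crPlane[c])
			o := (y*w + x) * 3
			rgb[o], rgb[o+1], rgb[o+2] = r, g, b
		}
	}
	return rgb
}
```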
We did it! Despite the “WIP” quality, we were able to do 7 FPS. Honestly, that was good enough for me, since my use for these is as an emergency VGA screen more than anything else.
So now we know this device well enough to explain how to operate it from a cold boot:

1. Upload the EZ-USB firmware to the CY7C68013A USB controller.
2. Bitstream the Spartan 6 FPGA with the firmware extracted from the packet capture.
3. Replay the control transfers that kick off capture.
4. Read raw frames from the bulk endpoint, stripping the marker that appears every 1028 bytes.
5. Convert each I420 frame to RGB.
To make use as easy as possible, I ended up rigging up a small web server inside the driver, so it’s super easy to use in a rush. Thanks to the MediaRecorder API in browsers, this also gives an easy way to record the output of the screen to a video file.
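The serving side is nothing exotic either. One way to do it (a sketch rather than the repo’s exact code; frames() stands in for wherever decoded frames come from) is to stream JPEGs as multipart/x-mixed-replace, which browsers display natively:

```go
// streamHandler pushes frames to the browser as an MJPEG stream.
// frames() is a hypothetical stand-in for the capture pipeline and is
// assumed to deliver decoded image.Image values over a channel.
func streamHandler(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "multipart/x-mixed-replace; boundary=frame")
	flusher, _ := w.(http.Flusher)
	for img := range frames() {
		fmt.Fprint(w, "--frame\r\nContent-Type: image/jpeg\r\n\r\n")
		if err := jpeg.Encode(w, img, nil); err != nil {
			return
		}
		fmt.Fprint(w, "\r\n")
		if flusher != nil {
			flusher.Flush()
		}
	}
}
```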
As I’m sure a lot of people can relate to regarding experimental code, I can’t say I’m proud of the code quality. But it’s in a state where it works well enough for me to use.
You can find the code for this (and pre-built versions for Linux and OSX) on GitHub: https://github.com/benjojo/userspace-vga2usb/
Even if this is never used by anyone else, this was a hell of a roller coaster through USB protocol details, kernel debugging/module reverse engineering, and general video format decoding! If you liked this kind of stuff, you may like the rest of the blog. If you want to stay up to date with what I do next, you can use my blog’s RSS feed or follow me on Twitter.
Until next time!
Related Posts:
Ludicrously cheap HDMI capture for Linux (2016)
Teaching a cheap ethernet switch new tricks (2019)