What follows is the story of how I fixed not one but two different flawed Altera USB Blaster clones that never worked correctly after I bought them.
I recently built a few Time Sleuths for HDMI input lag testing. Time Sleuth is a neat little open-source project that allows you to measure the time it takes for your TV or monitor to display an image. The way it works is that an Altera/Intel MAX 10 FPGA creates a video signal that is mostly a black screen with some blinking white boxes. It feeds that image to a TI TFP410 DVI transmitter and also watches for a pulse from a photodetector. It measures the time between when the white box blinked on and when the photodetector saw it, then overlays the result, along with min/max/average stats, on the generated video signal. Pretty cool! If you’re not into soldering, you can also buy them fully assembled.
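To make the measurement idea concrete, here's a rough C model of the bookkeeping involved. The real Time Sleuth does all of this inside the FPGA, so the names and sample values below are purely illustrative:

```c
/* Illustrative C model of the lag measurement bookkeeping. The real
 * Time Sleuth implements this in FPGA logic; the names and sample
 * values here are made up for demonstration. */
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

typedef struct {
    uint32_t min_us;
    uint32_t max_us;
    uint32_t count;
    uint64_t total_us;
} lag_stats;

/* Called once per measurement with the time between the white box
 * turning on and the photodetector registering the flash. */
static void record_lag(lag_stats *s, uint32_t lag_us)
{
    if (s->count == 0 || lag_us < s->min_us) s->min_us = lag_us;
    if (lag_us > s->max_us) s->max_us = lag_us;
    s->total_us += lag_us;
    s->count++;
}

int main(void)
{
    lag_stats stats = {0};
    /* Fake photodetector measurements in microseconds. */
    const uint32_t samples[] = {16500, 16700, 17100, 16800};

    for (size_t i = 0; i < sizeof samples / sizeof samples[0]; i++)
        record_lag(&stats, samples[i]);

    printf("min %u us  max %u us  avg %llu us  (%u samples)\n",
           (unsigned)stats.min_us, (unsigned)stats.max_us,
           (unsigned long long)(stats.total_us / stats.count),
           (unsigned)stats.count);
    return 0;
}
```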
I decided to check my Windows Update history, and sure enough, my computer was affected:
The truth is that I’m not using BitLocker, so the update doesn’t really matter for me. I hadn’t even noticed that it was failing repeatedly. Apparently it had also failed to install on January 11th. Knowing that my computer would keep failing to install this update over and over bothered me, though. It just felt wrong.
I previously blogged about a strange problem I’ve been experiencing where my internet upload speed tests fail to reach the expected gigabit speed that my ISP provides — but only in Windows 10. I have a quick update on where I ended up with this problem, and a collection of further test results that point the finger at Windows 10’s TCP stack.
I didn’t mention this in my previous post, but the Seattle server I’ve been using for all of my testing (and seeing the problem with) is run by my ISP. The main thing I figured out, and I’m embarrassed I didn’t try this earlier, is that I do get my full upstream speed in Windows 10 on some (but not all) other nearby test servers, one of which is also run by my ISP.
I’m going to start this post off by saying that I feel incredibly spoiled to even be bringing up a problem like this, given that I started out decades ago on a 33.6k dialup modem and today’s cable and DSL connections still have relatively low upstream bandwidth. But it’s still a technically interesting issue that I think is worth bringing up to a larger audience. The gist of what’s going on is this: I’m lucky enough to live in an area where symmetrical gigabit fiber internet is available, affordable, and maintained by an awesome ISP. The problem is that although my downstream speed is fine, I’m not seeing my full upstream bandwidth during speed tests when I’m using Windows 10. I’m “only” getting around 300 to 400 megabits instead of the 940-ish I should expect from a wired gigabit Ethernet test.
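As an aside, that "940-ish" figure is roughly what TCP goodput works out to on gigabit Ethernet once framing and header overhead are subtracted. Here's the back-of-the-envelope arithmetic, assuming a standard 1500-byte MTU and TCP timestamps enabled:

```c
/* Back-of-the-envelope TCP goodput on gigabit Ethernet. Assumes a
 * 1500-byte MTU, 20-byte IP and 20-byte TCP headers plus 12 bytes of
 * TCP timestamp options, and 38 bytes of Ethernet framing overhead
 * (preamble, header, FCS, interframe gap). */
#include <stdio.h>

int main(void)
{
    const double link_mbps = 1000.0;
    const double mtu       = 1500.0;
    const double payload   = mtu - 20 - 20 - 12;  /* 1448 data bytes/packet */
    const double on_wire   = mtu + 38;            /* 1538 bytes on the wire */

    printf("~%.0f Mbit/s of usable throughput\n",
           link_mbps * payload / on_wire);
    return 0;
}
```

That works out to roughly 941 Mbit/s of actual data, which is why wired gigabit speed tests top out just shy of 950.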
On a daily basis, I work on firmware for an embedded device that uses the Bridgetek FT800. It’s a nifty chip that takes commands over SPI/I2C and turns them into an image displayed on an LCD. It’s very useful for displaying user interfaces with simple microcontrollers. Bridgetek is actually a spinoff company from FTDI, and this kind of solution seems right up their alley: take something complicated like USB or a display controller, and create a simpler interface for dealing with it, such as UART/SPI/I2C.
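To give a flavor of what that looks like from the host side, here's a rough sketch of a single memory write to the FT800 over SPI. The spi_* helpers are stand-ins for whatever the real board support code provides, and the address encoding should be double-checked against the datasheet:

```c
/* Sketch of a single FT800 memory write from a host microcontroller.
 * spi_select/spi_write_byte/spi_deselect are placeholders standing in
 * for real board-specific SPI code; here they just print the bytes. */
#include <stdio.h>
#include <stdint.h>

static void spi_select(void)          { /* assert chip select */ }
static void spi_deselect(void)        { /* release chip select */ }
static void spi_write_byte(uint8_t b) { printf("SPI -> 0x%02X\n", b); }

/* FT800 memory write transaction: a 3-byte address whose top two bits
 * are 0b10, followed by the payload, least significant byte first.
 * (Verify the encoding against the datasheet before trusting this.) */
static void ft800_wr32(uint32_t addr, uint32_t value)
{
    spi_select();
    spi_write_byte(0x80 | ((addr >> 16) & 0x3F));
    spi_write_byte((addr >> 8) & 0xFF);
    spi_write_byte(addr & 0xFF);
    for (int i = 0; i < 4; i++)
        spi_write_byte((value >> (8 * i)) & 0xFF);
    spi_deselect();
}

int main(void)
{
    /* Write one made-up 32-bit word to the start of graphics RAM. */
    ft800_wr32(0x000000, 0x12345678);
    return 0;
}
```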
One thing that’s usually important about user interfaces is the ability to display text. The FT800 has a very basic capability for handling fonts. It’s not really much more than the ability to deal with sets of 127 sprites that each comprise a “font”. As a developer, if you want to use fonts aside from the (very limited) stock ones that come bundled in the FT800’s ROM, you have to create bitmap images that you upload into the 256 KB of available display RAM.
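To put the 256 KB limit in perspective, here's some rough arithmetic. The glyph dimensions and pixel format below are assumptions for illustration only, but they show how quickly a handful of full character sets would eat the available RAM:

```c
/* Rough RAM budget for custom FT800 fonts. The 16x32 glyph size and
 * 4-bit-per-pixel format are illustrative assumptions, not the values
 * our product actually uses. */
#include <stdio.h>

int main(void)
{
    const int glyph_w = 16, glyph_h = 32;   /* pixels (assumed) */
    const int bits_per_pixel = 4;           /* grayscale format (assumed) */
    const int glyphs_per_font = 127;
    const int display_ram = 256 * 1024;

    int bytes_per_glyph = glyph_w * glyph_h * bits_per_pixel / 8;
    int bytes_per_font  = bytes_per_glyph * glyphs_per_font;

    printf("%d bytes per glyph, %d bytes per 127-glyph font\n",
           bytes_per_glyph, bytes_per_font);
    printf("at most %d such fonts fit in display RAM\n",
           display_ram / bytes_per_font);
    return 0;
}
```

Under those assumptions, a single 127-glyph font is around 32 KB, so only about eight of them fit even before anything else needs that RAM.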
Several years ago, we had to deal with converting the user interface to display in a bunch of different languages, including Chinese. Most of the new languages we added weren’t a big problem, because we could just create a couple of fonts containing all of the special accented characters we needed and be done with it. Chinese, though, was a bigger challenge. There are thousands of different characters, so putting every character for every string into the limited display RAM is impossible. My coworker at the time came up with a clever script that automatically rasterized the font glyphs and created groups of different 127-character fonts for each displayed screen in the user interface. Every time you changed screens, the new set of fonts for that screen would be loaded into the display RAM.
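A stripped-down sketch of that grouping idea is below. The original was a build-time script that also rasterized each glyph; this hypothetical C version only shows how one screen's text gets reduced to unique characters and split into 127-glyph fonts:

```c
/* Simplified sketch of grouping one screen's unique characters into
 * 127-glyph FT800 "fonts". Hypothetical code: the real tool was a
 * build-time script that also rasterized every glyph. */
#include <stdio.h>
#include <wchar.h>

#define GLYPHS_PER_FONT 127
#define MAX_UNIQUE      1024

int main(void)
{
    /* All of the text shown on one hypothetical screen. */
    const wchar_t *screen_text = L"温度设置 Temperature Setup 确认 取消";

    /* Collect the unique characters used on this screen. */
    wchar_t unique[MAX_UNIQUE];
    size_t n_unique = 0;
    for (const wchar_t *p = screen_text; *p != L'\0'; p++) {
        if (n_unique < MAX_UNIQUE && !wmemchr(unique, *p, n_unique))
            unique[n_unique++] = *p;
    }

    /* Each group of up to 127 characters becomes one font that gets
     * rasterized and loaded into display RAM when the screen is shown. */
    size_t n_fonts = (n_unique + GLYPHS_PER_FONT - 1) / GLYPHS_PER_FONT;
    printf("%zu unique characters -> %zu font(s) for this screen\n",
           n_unique, n_fonts);
    return 0;
}
```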
A while ago, I put 16 GB of RAM into one of my computers. The computer is using a Foxconn P55MX motherboard with a Core i5 750. It’s old and could probably be replaced, but it still works for what I need.
Here’s the interesting part. This motherboard doesn’t officially support 16 GB of RAM. The specs on the page I linked indicate that it supports a maximum of 8 GB. It only has 2 slots, so I suspected the stated 8 GB maximum was just because 8 GB sticks weren’t common back when this motherboard first came out, not because the hardware couldn’t handle them. I decided to try anyway. In a lot of cases, motherboards do support more RAM than the manufacturer officially claims to support.
I made sure the BIOS was completely updated (version 946F1P06) and put in my two 8 GB sticks. Then I booted into my Ubuntu 16.04 install and everything worked perfectly. I decided my theory was correct, that the motherboard really does support more RAM than the documentation claims, and forgot about it. I enjoyed having all the extra RAM to work with and was happy that my gamble had paid off.
Then, a few months later, I tried to boot into Windows 10. I mostly use this computer in Linux. I only occasionally need to boot into Windows to check something out. That’s when the fun really started.