Sep. 17, 2017

DIY TV remote controller, part 2: electronics and software test

After some research, I decided to use the Arduino Nano for this project and got a half dozen units along with some miscellaneous components. They were bizarrely cheap - something like $3 apiece on average. The clone Nanos I bought came with the CH340 USB-to-serial chip, which required a driver installation from the vendor, but once I set the IDE to the right COM port everything worked fine.

Think I'm sold on Arduinos in general - the in-circuit code update makes it so easy to prototype and debug. And now I'm thinking of buying a cheap laptop for future projects, so that I'm not tethered to my desktop.

My setup here uses a transistor to drive the infrared LED. An Arduino pin can source up to 40 mA, but the infrared LED I'm using takes 70 - 100 mA in pulsed mode. A single underdriven LED could probably work well enough, but I'm planning to use several LEDs in the final device, so that meant switching the LED current from a separate supply. The circuit I was following called for an NPN transistor, but I substituted an N-channel MOSFET I had lying around, the IRFD120.

The dominant infrared library for Arduino is IRremote, and there are tons of tutorials and examples for it. Parts of its code appear to be modeled on the LIRC project. That's where I ran into a snag.

IRremote's sendSharp() function doesn't work for all devices.

The Sharp codes from the LIRC listing didn't seem to work at all: volume up / mute / down were supposed to be 0x40a2, 0x43a2 and 0x42a2. I verified that the infrared LED was emitting something by looking at it through a smartphone camera, but I didn't have an oscilloscope to compare the original and constructed signals, and nothing was happening on the TV.

Troubleshooting with LIRC

Last year I bought the Vero 2, a Linux-based streaming device running OSMC and Kodi, and though it came with a radio remote controller it also had an infrared receiver and ran LIRC. After ssh'ing into the device, I killed the lircd daemon and ran irrecord -d /dev/lirc0 myconf.conf to inspect what my TV's remote controller was sending.

Using irrecord, I recorded the actual Sharp remote and my breadboard prototype's signals. The two conf files irrecord generated should have been identical, but they weren't. Something was wrong.

begin remote

  name  sharp.conf
  bits           15
  eps            30
  aeps          100

  one           250  1828
  zero          250   790
  ptrail        249
  gap          66772
  toggle_bit_mask 0x0

      begin codes
          volup                    0x40A2 0x435D
          voldown                  0x42A2 0x415D
          mute                     0x43A2 0x405D
          pwr                      0x41A2 0x425D
      end codes

end remote
begin remote

  name  dinotest5-arduino.conf
  bits           15
  flags SPACE_ENC
  eps            30
  aeps          100

  one           211  1844
  zero          211   837
  ptrail        206
  gap          40808
  toggle_bit_mask 0x0

      begin codes
          a                        0x7442 0x77BD
          b                        0x7442 0x77BD
          c                        0x3842 0x3BBD
      end codes

end remote

Irrecord verified that I was looking at the correct Sharp codes, matching LIRC's documentation of a similar device. My prototype, however, was off in several places. The gap value difference made me suspect my Nano's timing (wrong oscillator? configuration error?), but a separate blinking-LED test confirmed that its one-second blinks still matched my stopwatch count after ten minutes. So that wasn't it.

I started suspecting IRremote's implementation of sendSharp(). The first thing I noticed in the generated LIRC configuration of my actual remote was that the toggle bit mask differed from what the documentation led me to expect.

IRrecord and raw signals to the rescue

Not having an oscilloscope, I ordered a 38 kHz infrared receiver that I'd originally thought I wouldn't need. Once I connected it and ran the IRrecord example to spy on the raw IR signals, everything started working. Well, almost: the serial monitor in the Arduino IDE showed activity, which was encouraging. Arduino's serial monitor and overall tooling are incredibly useful for development.

With my setup, I had to make a couple of alterations to the sample program - replace pin numbers, change the input mode to INPUT_PULLUP, and flip LOW and HIGH in the buttonState logic. But then it worked, and worked great. The program could record signals from several of my other devices and replay them successfully. All except the Sharp TV - I believe its protocol requires the signal to be re-sent inverted, so that will take an additional step or two.
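The code pairs in the irrecord capture earlier support that inversion theory: each button's two codes differ by exactly the lower 10 bits (0x40A2 XOR 0x435D = 0x03FF), consistent with a Sharp-style frame where the address bits stay fixed and the payload is complemented in the second transmission. A minimal sketch of that relationship, assuming this 15-bit layout (5 address bits plus a 10-bit inverted payload):

```python
# Sketch: derive the second (inverted) frame of a Sharp-style IR code.
# Assumption: 15-bit frames, 5-bit address in the top bits, and a 10-bit
# payload (command + check bits) that is complemented in frame two.
PAYLOAD_MASK = 0x03FF  # lower 10 bits

def companion_frame(code: int) -> int:
    """Return the inverted companion frame for a 15-bit code."""
    return code ^ PAYLOAD_MASK

# The pairs captured by irrecord from the real remote:
assert companion_frame(0x40A2) == 0x435D  # volup
assert companion_frame(0x42A2) == 0x415D  # voldown
assert companion_frame(0x43A2) == 0x405D  # mute
assert companion_frame(0x41A2) == 0x425D  # pwr
```

All four recorded button pairs satisfy the same XOR relation, which is a good sign the capture is internally consistent.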

Next steps: power management and button matrix

Specifically, I want to put the Nano to sleep and connect a pushbutton matrix circuit that can wake it via an interrupt pin. Most button matrix examples I've seen weren't designed with an interrupt in mind. There are dedicated ICs that can raise an interrupt signal, but the Nano has enough free I/O pins in my circuit for 25 - 30 buttons, and I'm thinking I'll need about 20 (a 4 x 5 matrix, using 4 + 5 I/O pins).
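The pin arithmetic works because a row/column matrix yields rows × cols buttons while consuming only rows + cols pins; a quick sanity check of the 4 + 5 figure above:

```python
def matrix_capacity(rows: int, cols: int):
    """Buttons supported and I/O pins consumed by a row/column button matrix."""
    return rows * cols, rows + cols

# A 4 x 5 matrix: 20 buttons on just 9 I/O pins.
buttons, pins = matrix_capacity(4, 5)
assert (buttons, pins) == (20, 9)
```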

Cheap Thrill

Before I wrote any code to send meaningful signals, I tested the infrared LED with my smartphone camera and recorded this. It's startling how bright infrared LEDs are, given that we can't see at that wavelength.

Being a nerd, I found this oddly satisfying to watch, and it gave me some hints as to how to position the IR LEDs in the remote.

Aug. 22, 2017

The day I realized I wasn't using 121 remote controller buttons

This year I decided to get back into electronics, and for my first project I chose to build a remote controller.

The reason was annoyance at having to juggle this many remotes for a handful of functions.  For the preliminary problem identification, I photoshopped out the buttons I wasn't using on each remote.  Consolidating the cursor cluster (left, right, up, down, OK) brought the count from 137 down to 16.

Keep it simple, right?

Technically I'd still need more buttons than that - a device selector, and maybe some strays I haven't thought of yet (a channel preset?) - but it's still roughly an 8:1 button reduction if I pull the project off.  Some of my earliest renderings of the remote were comical, but as I kept doodling, I started drifting toward more angular designs.


The buttons on this rough sketch are color-coded by which device uses them most.  The volume buttons belong to the soundbar (gah, it isn't HDMI-ARC-compatible); with that sole exception, every remaining device (TV, Fios box, OSMC Vero 2) uses the cursor extensively.  So this remote will need a device selector somewhere, and I pondered putting it in as a 4-state sliding switch on the side.  One complication: the TV's Netflix button (the feature came with the TV) currently jumps straight to that app, and that's convenient to keep.


Eventually I want to get back into 3D printing, but for now I've decided to just design the controller and have it printed by Shapeways or Sculpteo. I'm sure preparing the design will be a puzzle all by itself, and I'm just now revisiting OpenSCAD and brushing up on it.


For the brains of the device, I'm turning to Arduino microcontrollers.  I've previously used PICs, but what I've come to appreciate about Arduinos is how quickly they can take code updates by design.  Tinkerer-friendly.  So far I'm eyeballing the Arduino Nano, or at least a common knockoff.

Power System

This alone is going to be a puzzle.  By my rough estimates, an Arduino running at full power will drain a small battery in hours (seven, maybe?).  Second, it needs 5 volts to run, which means stacking batteries (AAA, AA, a 9V - too big? - or coin cells), or combining them with a DC/DC boost converter like the LM2623.

Power optimizations will be needed, since changing batteries daily or having to charge the remote is impractical. Putting the chip into a deep sleep mode should help, with button input as the wakeup trigger.  Beyond that, reducing the crystal frequency should bring power usage down further.  The downside is that existing code libraries (and there are a lot of them) might misbehave, since they tend to assume a particular operating frequency.
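Back-of-the-envelope math shows why the sleep strategy matters so much. The currents and capacity below are hypothetical placeholders (real figures depend on the board, regulator, and battery), but the shape of the calculation holds:

```python
def battery_life_hours(capacity_mah: float, active_ma: float,
                       sleep_ma: float, active_fraction: float) -> float:
    """Estimated runtime for a device that duty-cycles between an active
    current draw and a sleep current draw."""
    avg_ma = active_fraction * active_ma + (1 - active_fraction) * sleep_ma
    return capacity_mah / avg_ma

# Always-on at an assumed 25 mA from an assumed 200 mAh cell: 8 hours.
assert battery_life_hours(200, 25, 25, 1.0) == 8.0

# Asleep 99.9% of the time at an assumed 0.1 mA: on the order of months.
print(battery_life_hours(200, 25, 0.1, 0.001))
```

The exact numbers aren't the point; the ratio is. Cutting average draw by two orders of magnitude turns hours into months.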

This isn't a unique project, but what I like about it is that it's very finite.


How we built the VOA Election Map in 2016

Despite working for a large international news organization, our digital budget is never quite what it ought to be.  Even purchasing AP election map interactives or API access is always a challenge.  So when it comes to election maps, often enough we roll our own.

It's not just a matter of budget; there are also some very specialized circumstances.  We can't apply the same technology and interface from Western elections to places like Nigeria or Iran, and our priorities are to support 40-something languages while keeping text to a minimum.

Every four years I get an opportunity to explore new technologies for an election map.  Several elections ago (in my defense, before my time here), it was static images updated in Photoshop and sent hourly to 40-something translators.  The election after that, we developed a database-driven SVG solution powered by RaphaelJS.  At that time, D3.js hadn't picked up steam yet, mobile traffic was low, IE6 was still dominant, and RaphaelJS had a VML fallback for it.

But this past year, I was excited to pursue something new.  IE6 usage had dropped off a cliff, and WebGL support had finally gotten to a point where you could expect it to work on mobile devices and most desktop browsers.


The picture above shows a 3D election map prototype I built from scratch using three.js.  There was some potential to do fun things like extrude states, giving them visual weight by population (and therefore number of electoral votes).  You could pan and zoom the camera anywhere you liked, even invert the map.

States could animate as you explored previous elections, literally flipping visually as their votes flipped from one party's dominance to the other.  It could have been an engaging way for reporters to explain the complexity of the U.S. election system.  Take it for a spin below.

  • Right-Click and drag: pans the map
  • Left-Click and drag: changes the camera angle
  • Mousewheel: zooms in and out

A quick walk-through: click on different years, and click 'show voting power' to make the states pop out.


Unfortunately, the closer we got to the election, the more reservations we had about the map, so we defaulted to the SVG version shown up top with the reporters.

The list of reasons is long: performance, download size, data transfer, development time for mobile-responsive compatibility, and general browser rendering compatibility.  Sadly, even the potential interactivity was confusing to users.  As standard as 3D camera controls are to nerds, we're still a very limited cross-section of users, and regular users today still struggle with drag and drop despite it being around for decades.  I'm convinced a virtual camera gimbal would have been a challenge.

The barrier to entry for WebGL-inexperienced developers was a bit disorienting (for example, an event as logical as clicking a U.S. state required unprojecting the mouse position into the 3D scene, and importing the map coordinates was a map-projection chore).  Thankfully, there were tons of code examples out there, and even an amateur could wade through it.

WebGL just wasn't ready for the 2016 U.S. election, but I'm hopeful that by the next one support will be more widespread, and that libraries such as three.js and BabylonJS will get leaner, more scene-oriented, and more event-driven.

Nov. 04, 2016

2016 Presidential Election Ballot Order, by State

While digging through the AP elections API feed, I noticed it contains a BallotOrder attribute for each candidate, so I wanted to explore it a bit.  The table below shows the order in which the candidates' names are printed from state to state, focusing on the two big names.

Some states have a lot of candidates running for president (Colorado has 22), but every state placed Clinton and Trump within the top 8 slots.  Clinton's name shows up before Trump's on 29 ballots, and Trump's is listed before Clinton's 22 times.

State   Order on Ballot (Clinton / Trump only, within the top 8 slots)   Total Presidential Candidates
AK Clinton Trump 6
AL Clinton Trump 4
AR Clinton Trump 8
AZ Trump Clinton 4
CA Clinton Trump 5
CO Clinton Trump 22
CT Clinton Trump 4
DC Trump Clinton 4
DE Clinton Trump 4
FL Trump Clinton 6
GA Trump Clinton 3
HI Clinton Trump 5
IA Trump Clinton 10
ID Clinton Trump 8
IL Clinton Trump 4
IN Clinton Trump 3
KS Clinton Trump 4
KY Trump Clinton 6
LA Clinton Trump 13
MA Clinton Trump 4
MD Trump Clinton 4
ME Clinton Trump 4
MI Trump Clinton 6
MN Trump Clinton 9
MO Clinton Trump 5
MS Clinton Trump 7
MT Clinton Trump 5
NC Trump Clinton 3
ND Clinton Trump 6
NE Trump Clinton 4
NH Clinton Trump 5
NJ Clinton Trump 9
NM Trump Clinton 8
NV Clinton Trump 6
NY Clinton Trump 8
OH Clinton Trump 5
OK Trump Clinton 3
OR Trump Clinton 4
PA Clinton Trump 5
RI Trump Clinton 5
SC Clinton Trump 7
SD Trump Clinton 4
TN Trump Clinton 7
TX Trump Clinton 4
UT Trump Clinton 10
VA Clinton Trump 5
VT Clinton Trump 6
WA Clinton Trump 7
WI Trump Clinton 7
WV Trump Clinton 5
WY Trump Clinton 6


Oct. 24, 2016

Minecraft Square Moderator Application Age Distribution

Back in 2010 I ran a moderately popular Minecraft server called Minecraft Square.  Its niche was fast hardware with tons of RAM, in the days when such servers didn't affordably exist.  As hosting hardware improved across the industry and my attention span diminished, I sunset and shut down the server.

However, I kept various server logs, and parsing them from time to time is somewhat interesting.  Back then we allowed any player to apply to become a moderator, asking for some information so a panel could review them as a candidate. Below is a graph of applicant age distribution, collected from 2010 to 2011.


The sample size is approximately 345, and the tallest bar, at age 14, represents 43 applications.
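The distribution itself is just a frequency count over self-reported ages; a minimal sketch, with made-up ages standing in for the real log data:

```python
from collections import Counter

# Placeholder ages -- illustrative only, not the actual applicant data.
ages = [14, 14, 13, 15, 14, 12, 16, 14, 13]

histogram = Counter(ages)
assert histogram.most_common(1)[0] == (14, 4)  # 14 is the tallest bar here too
```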

The applicant ages were self-reported.  Much later, some players admitted they had inflated their age to improve their chances through perceived maturity, which confirmed what we occasionally suspected at the time. The shape of the bell curve suggests many kids did the same, since the vast majority of applicants were 11-16.

There's a steep dropoff at age 17; my speculation is that 17-year-olds rounded themselves up to 18, the symbolic age of being seen as a trustworthy adult.

Feb. 10, 2015

Levenshtein distance between 10 million usernames and their passwords

Mark Burnett, a security researcher, recently released a collection of 10 million passwords along with their usernames. My question: how different are those 10 million usernames from their passwords?  With a tiny bit of time, I performed a simple analysis of the Levenshtein distance between them and composed the graph below.

In other words: if people in this dataset used their username as a password (e.g. user dino, password dino) but changed it a little (password dino1), how many insertions, deletions, or substitutions did they make?  See for yourself.

A distance of 0 means the username and password are identical (in the graph below, 213,133 passwords are the same as their usernames).  A distance of 1 means one character was added, deleted, or changed. And so on...
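For reference, the distance can be computed with the standard dynamic-programming recurrence; a minimal implementation (the actual analysis presumably used something like this or an off-the-shelf library):

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of insertions, deletions, and substitutions
    needed to turn string a into string b."""
    prev = list(range(len(b) + 1))  # distances from "" to prefixes of b
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

assert levenshtein("dino", "dino") == 0    # identical pair from the example
assert levenshtein("dino", "dino1") == 1   # one character inserted
```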

Posts on this blog solely represent my personal opinions and technical experience.

© 2009-2019 Edin (Dino) Beslagic