Witch Engineering 2022

Lucas J. Ross
11 min read · Dec 27, 2022

Friends who know my spouse and me know that Halloween and the surrounding season bring us much joy. For me, it has a lot to do with living in central Texas and leaving the oppressively hot summer behind. The season brings a freedom from air-conditioned spaces that invariably sparks my creative energies.

In 2020, the coronavirus pandemic brought anxieties that put a twist on how my creativity manifested. From this, together with fond memories of childhood trick-or-treating, a tradition I call the “spooky hackathon” was born.

Dark Sorcerer of Treats: “Behold as I conjure forth a delicious morsel… of DoOoOoOm”

My day job typically consists of backend server software development. It’s all pretty abstract, and nothing I build for work can be held in hand or heard, smelled, or tasted. I figured building an animatronic would be a nice diversion because it’s so “close to the metal.” It’s a little like the difference between cooking a meal yourself and authoring one of those life stories for a recipe on the Internet. The smell of burning solder and plastic is a little sickening, but it’s also part of a very sensory, intimate engineering experience. In enterprise software, the fastest build-and-deploy processes take minutes, the reward is no more than a checkmark on your screen, and you’re forced to find your dopamine hit elsewhere.

Floor piano, 2021

For Halloween 2021 I decided to tone down the creepiness a bit and make a big floor piano. I incorporated a game: the piano plays a musical interval, and when a trick-or-treater repeats it by stepping on the matching keys, treats are dispensed (from the candy conveyor I made for the previous year’s display). Visit this Instagram post for videos. (For some reason Medium no longer lets you embed Instagram posts in an article.)

My Halloween 2022 project was more ambitious. The rest of this article is devoted to showcasing its development.

The Weejee Board

Visit this Instagram post for videos of my “weejee” in action.

I think I got the inspiration for this from a friend who mentioned Ouija™ boards at a party. Yes, “Ouija” is a registered trademark of a board game company, and I honestly haven’t made much effort to avoid infringing on it beyond using the spelling “weejee.” Anyway, I figured it would be fun to have a weejee board with a planchette (the pointy piece that players traditionally move around with their hands) that moves around by itself to answer questions. As an experienced software developer I had no doubt that this was doable. Robot vacuums and voice-controlled speakers came to mind as encouraging proof that the necessary pieces already exist. It would be a haunted hybrid Alexa-Roomba. I started the project at the beginning of October 2022.

Implementation

One strategy I adopted for this project, which I believe was critical to its success, was defining constraints early on, both to simplify the implementation and to optimize the end result for the user (trick-or-treater) experience. Robotics is a complex field with which I was relatively unfamiliar when I started, so it was important to identify the simplest solutions to various problems to ensure I’d have an enjoyable product within my timeline of four weeks. Also complicating the matter: it’s hard to keep staring at a computer screen after a full day of doing so for my day job.

Not the final design

This initial design served me well to the end with a few notable exceptions. I wrote a Spring Boot server in Kotlin, not Java, because Kotlin is a better language. Check out my source code if you like.

I used the Arduino microcontroller platform because it’s a classic, popular choice and I had already become familiar with it through other projects (prior years’ Halloween displays and some home automation tinkering). C++ isn’t my best language, but Arduino doesn’t demand particularly fancy programming technique. Practically speaking, what really sets it apart from programming for modern general-purpose computers is being limited to a single execution thread and to kilobytes of RAM and program space.

Zumo tank with ankle-destroying plow attachment

The basic platform for the robot is the Zumo. It’s about four inches wide and provides headers for a classic-form-factor Arduino. One major challenge with the Zumo was that a lot of the Arduino I/O pins are used by the Zumo driver shield. As I’ll show later, I added a second Arduino for functionality that the Zumo-driving Arduino couldn’t accommodate. It was also tricky to get a feel for how activating a motor with a “speed” (a signed integer representing the degree of power applied to the motor) would correspond to actual velocities; independent factors, including the friction of the treads on the surface and the amount of juice left in the batteries, make it difficult to predict how far the robot will travel in a given period of time. I ran some experiments in advance to define constants that map speeds to straight-line velocities, and the bot takes measurements after each period of movement to figure out where it really ended up.
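
To give a feel for what that ends up looking like, here’s a sketch using Pololu’s ZumoMotors library. The speed-to-velocity constant is a placeholder, the kind of number those advance experiments produce; don’t take it as my actual calibration.

```cpp
// Sketch of distance-based driving with the Pololu ZumoMotors library.
// CM_PER_SEC_AT_CRUISE is a made-up placeholder; the real value has to
// be measured for your own treads, surface, and battery level.
#include <ZumoMotors.h>

ZumoMotors motors;

const int CRUISE_SPEED = 200;             // Zumo speeds are signed, -400..400
const float CM_PER_SEC_AT_CRUISE = 15.0;  // placeholder: measure this yourself

void driveStraightCm(float cm) {
  unsigned long ms = (unsigned long)(cm / CM_PER_SEC_AT_CRUISE * 1000.0);
  motors.setSpeeds(CRUISE_SPEED, CRUISE_SPEED);
  delay(ms);        // blocking here for brevity; my real code avoids delay()
  motors.setSpeeds(0, 0);
  // Afterward the bot re-measures its position with the sonar, since
  // friction and battery level make this estimate drift.
}
```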

Location & orientation. An early idea was to install a camera on the underside of the planchette pointing downward, so that a microcontroller could take pictures of the board at various times to determine both location and orientation. This seemed possible because the board surface would be covered mostly with complete sets of numbers and letters as well as distinct drawings, meaning a picture of any given (say) 2cm by 2cm square on the board could be reliably matched to a board location without any other reference. I eventually realized, however, that the techniques required for this belong to computer vision, one of the most challenging areas of computer science. I spent a few evenings playing around with the most popular open-source CV library, OpenCV, but found the learning curve steep enough to make it doubtful I could figure it out within the timeline.

I needed another option, and when I brought it up with my spouse, she noted that on Criminal Minds, the detectives often locate things by triangulating. That inspired me to look for solutions that could provide distances between the planchette and the edges of the board. The most obvious option was the popular ultrasonic sensor: they’re inexpensive and provide good-enough precision (± a centimeter or two). Some other options I ruled out were laser and infrared distance sensors; they have a minimum functional distance of 10 or 20cm, and the board is only around 60cm from top to bottom, which would leave the planchette “blind” within much of the board area. (Shorter reflections are too fast for common light-based sensors to measure, and better sensors are extremely expensive.)
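
If you haven’t used these sensors: a single reading is a trigger pulse followed by timing the echo. This is the standard HC-SR04-style dance with placeholder pin numbers, not code lifted from my repo.

```cpp
// One reading from an HC-SR04-style ultrasonic sensor. Pin numbers
// are arbitrary placeholders.
const int TRIG_PIN = 4;
const int ECHO_PIN = 5;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);    // a 10 µs pulse starts a measurement
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  unsigned long us = pulseIn(ECHO_PIN, HIGH, 30000UL);  // timeout = no echo
  if (us == 0) return -1.0;        // caller treats this as a failed reading
  return us * 0.01715;             // sound travels ~0.0343 cm/µs, round trip
}

void loop() {
  Serial.println(readDistanceCm());
  delay(60);                       // let stray echoes die down between pings
}
```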

To give the ultrasonic pulses something to bounce off of, I put walls around the board, just high enough to encompass the vertical span of ultrasonic transmitters & receivers placed around the robot. I figured three or four of them would provide enough data for determining location.

In addition, determining orientation is facilitated by a magnetometer (an electronic compass) built into the Zumo shield. The program obtains a zero heading for the board’s x axis by measuring the direction of Earth’s magnetic north and subtracting an operator-supplied offset for the orientation of the board itself.
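
In case it’s useful, the idea looks roughly like this with the Pololu LSM303 library that pairs with the Zumo shield. The offset value is invented, and I’ve omitted the min/max calibration pass that real compass code needs.

```cpp
// Sketch of a board-relative heading from the Zumo shield's LSM303
// compass, using Pololu's LSM303 library.
#include <Wire.h>
#include <LSM303.h>

LSM303 compass;
const float BOARD_OFFSET_DEG = 37.0;  // placeholder: supplied by the operator

void setup() {
  Wire.begin();
  compass.init();
  compass.enableDefault();
}

float boardHeading() {
  compass.read();
  float h = compass.heading() - BOARD_OFFSET_DEG;
  if (h < 0) h += 360.0;              // normalize to [0, 360)
  return h;
}
```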

At least two annoying problems came up with the ultrasonic sensors. When the planchette was oriented at angles approaching the diagonals (several degrees off from either axis), pulses would bounce from one wall onto another before getting back to the robot, causing wildly inaccurate readings (Figure 1).

Figure 1: Confused robot

Another issue is that within the middle area of the board, even with four sensors on the robot, location can’t be determined if the planchette is in a more-or-less diagonal orientation (Figure 2). Adding more sensors to fill the gap seemed like an overly complicated path to go down, and anyway, I only had four sensors on hand. This necessitated some further constraints on how the planchette uses sonar to determine location.

Figure 2: Same distances with same orientation: true location is indeterminate

I noticed that a pulse sent perpendicular to a wall virtually never results in a multiple-bounce echo. So my solution to both problems was to constrain the robot to determine its location only while oriented toward the top of the board. (Any orientation aligned with the x or y axis would work, but the top orientation fits the desired user experience; there’s no reason the planchette should ever end up pointing downward.) After any stage of moving around the board, it turns to face the top of the board, takes a little time to measure location, and then turns to where it needs to go next. I used two pairs of sensors, the two sensors in each pair facing opposite directions. That way each pair provides two measurements that usually sum to the distance between opposite walls. When they don’t, because a sensor didn’t hear its echo or some other measurement error occurred, the robot settles for whichever measurement is smaller (closer measurements are more accurate). This strategy has the additional advantage of allowing position to be calculated without inverse trigonometric functions like atan2, which are a lot of work for an 8-bit Arduino CPU.
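
Here’s a simplified sketch of the per-axis math. The geometry constants are placeholders and the real code in my repo differs, but the core is: if the pair agrees, average; if not, trust the shorter reading.

```cpp
#include <math.h>

const float SENSOR_SPAN_CM = 10.0;  // placeholder: gap between opposing sensor faces
const float TOLERANCE_CM   = 4.0;   // placeholder: max disagreement we'll accept

// lowCm/highCm: readings toward the lower/upper wall on one axis;
// a negative value means that sensor missed its echo entirely.
// spanCm: wall-to-wall distance along the axis. Returns the robot
// center's coordinate, or -1 if neither sensor heard anything.
float axisPosition(float lowCm, float highCm, float spanCm) {
  const float HALF = SENSOR_SPAN_CM / 2.0;
  bool lowOk = lowCm >= 0, highOk = highCm >= 0;
  if (!lowOk && !highOk) return -1.0;
  float fromLow  = lowCm + HALF;            // estimate using the lower wall
  float fromHigh = spanCm - highCm - HALF;  // estimate using the upper wall
  if (lowOk && highOk && fabs(fromLow - fromHigh) <= TOLERANCE_CM)
    return (fromLow + fromHigh) / 2.0;      // the pair agrees: average
  // Disagreement or a missed echo: trust the shorter, more accurate reading.
  if (!highOk || (lowOk && lowCm < highCm)) return fromLow;
  return fromHigh;
}
```

With the planchette facing the top of the board, one pair gives x against the side walls and the other gives y against the top and bottom walls.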

Also worth noting: it’s important to avoid crosstalk among the sensors (one sensor’s receiver picking up the echo of a different sensor’s transmitter). To that end, I used the common convention of having the sensors send their pulses about 20 milliseconds apart. Incidentally, this helps save on input pins: since the echoes from the four sensors arrive at different times, they can all share a single input pin instead of occupying four. (For anyone thinking of duplicating this, note that diodes should be placed after the sensor outputs to avoid reverse current flows.)
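
In code, the staggering is just a loop with a pause between pings. Pin numbers are placeholders, and the trigger pins are assumed to have been configured in setup().

```cpp
const int TRIG_PINS[4] = {4, 5, 6, 7};
const int SHARED_ECHO_PIN = 8;   // all four echo outputs, diode-OR'd together

void pingAll(float distancesCm[4]) {
  for (int i = 0; i < 4; i++) {
    digitalWrite(TRIG_PINS[i], LOW);
    delayMicroseconds(2);
    digitalWrite(TRIG_PINS[i], HIGH);   // fire exactly one sensor at a time
    delayMicroseconds(10);
    digitalWrite(TRIG_PINS[i], LOW);
    unsigned long us = pulseIn(SHARED_ECHO_PIN, HIGH, 30000UL);
    distancesCm[i] = (us == 0) ? -1.0 : us * 0.01715;
    delay(20);  // stagger pings so one echo can't bleed into the next window
  }
}
```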

Location-finding was a doozy, but things went pretty smoothly from there.

System architecture. No single Arduino could serve all of the system’s functions, so I used two, each with different capabilities, in addition to an external application that serves as the “brains” behind the whole experience. Figure 3 illustrates the functions performed by each component and how they communicate. One thing I realized, to my discomfort, was that this ended up looking like a traditional organizational hierarchy, i.e., the structure in which all my corporate trauma has originated: a chief executive gives commands to middle management, which in turn gives commands to workers who perform physical labor. But these are computers we’re talking about, not people, so I’m not too worried about internalizing my design and becoming like some particular managers I’ve reported to.

Figure 3

The Weejee Service needs far more computational power than an Arduino can provide, as it performs bidirectional streaming of audio to Amazon’s Transcribe service in exchange for transcriptions. It does some hacky pattern matching on those transcriptions to identify question types (who, what, when, where…). Another constraint placed on the system was that it wouldn’t try to be ChatGPT or anything like it: it just picks a semantically logical but usually nonsensical number or word from a list based on the question type. This had pretty entertaining results, but, like computer vision, NLP (natural language processing) is an area I still look forward to digging into.
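
To give a flavor of what “hacky pattern matching” means here, a sketch of the idea follows. The real service is Kotlin; I’ve written this in C++ to match the other snippets in this article, and the answer lists are invented.

```cpp
#include <cstdlib>
#include <string>
#include <vector>

// Returns a canned answer for a transcribed question. The matching is
// deliberately naive: the first rule whose keyword appears anywhere in
// the transcript wins.
std::string pickAnswer(const std::string& transcript) {
  static const std::vector<std::string> whoAnswers  = {"gary", "mabel", "zuul"};
  static const std::vector<std::string> whenAnswers = {"midnight", "soon", "1693"};
  auto contains = [&](const char* word) {
    return transcript.find(word) != std::string::npos;
  };
  if (contains("who"))  return whoAnswers[std::rand() % whoAnswers.size()];
  if (contains("when")) return whenAnswers[std::rand() % whenAnswers.size()];
  if (contains("will") || contains("are") || contains("is"))
    return (std::rand() % 2) ? "yes" : "no";  // treat as a yes/no question
  return "maybe";                             // unrecognized question type
}
```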

“POI” in the diagram stands for “point of interest” and refers abstractly to a letter or number contained in an answer (or a yes/no answer). This intermediate concept serves to separate concerns. The Weejee Service tells the net/display MCU (microcontroller unit) what answers to give, the net/display MCU splits those up into individual POIs and tells the Zumo MCU what POIs it needs to visit, and the Zumo MCU queues those POIs up and translates them into x/y coordinates to which it will physically go.
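
To make that concrete, here’s a sketch of the Zumo MCU’s side of the job. Everything in it is invented for illustration (the real board layout needs hand-tuned coordinates, and the real code is in my repo), but it shows the two pieces: mapping a POI to x/y, and queueing POIs as they arrive.

```cpp
struct Point { float x; float y; };   // board coordinates in cm

// Placeholder layout: letters in two rows, digits in one. A real
// Ouija-style letter arc would need a hand-tuned table instead.
Point poiToXY(char poi) {
  if (poi >= 'a' && poi <= 'z') {
    int i = poi - 'a';
    return { 8.0f + (i % 13) * 6.0f, (i < 13) ? 40.0f : 30.0f };
  }
  if (poi >= '0' && poi <= '9')
    return { 12.0f + (poi - '0') * 7.0f, 20.0f };
  return { 45.0f, 50.0f };            // unknown POI: park mid-board
}

// A tiny ring buffer so the Zumo MCU can accept POIs over serial
// faster than it can physically visit them.
const int QUEUE_MAX = 32;
char poiQueue[QUEUE_MAX];
int qHead = 0, qTail = 0;

void enqueuePoi(char poi) {
  int next = (qTail + 1) % QUEUE_MAX;
  if (next != qHead) {                // if full, drop the POI
    poiQueue[qTail] = poi;
    qTail = next;
  }
}
```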

I designed all parts of the system to react to commands over network/serial interfaces and perform tasks asynchronously. This is partly to conserve energy that would otherwise be consumed by frequently polling other components for state or tasks, considering that the Arduinos, motors, and LCD together draw power from four AA batteries. It also avoids blocking operations, which aren’t great for a single-threaded, interrupt-reliant processing model. For instance, when the net/display MCU sends a POI to the Zumo MCU, it doesn’t wait around for the Zumo to finish its various movements. The Zumo MCU sends a signal when it’s done spelling something out, and the net/display MCU reacts by displaying a smiley face.
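
On the Arduino side, that style amounts to a state machine stepped from loop() instead of a chain of delay() calls. A minimal sketch, with the actual movement and sonar work stubbed out as comments:

```cpp
// loop() always returns quickly: it checks for new serial commands and
// advances the state machine one small step per pass.
enum State { IDLE, TURNING_TO_TOP, MEASURING, DRIVING };
State state = IDLE;
char pendingPoi = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() > 0) {
    pendingPoi = Serial.read();   // a new POI can arrive at any time
    state = TURNING_TO_TOP;
  }
  switch (state) {
    case TURNING_TO_TOP:
      // step the turn a little; once facing the top wall:
      state = MEASURING;
      break;
    case MEASURING:
      // run one round of sonar pings; once a position fix is in hand:
      state = DRIVING;
      break;
    case DRIVING:
      // creep toward the POI's coordinates; on arrival, notify the
      // net/display MCU that this POI is done:
      Serial.write('!');          // placeholder "done" signal
      state = IDLE;
      break;
    case IDLE:
      break;
  }
}
```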

“Enjoy your prophecy!”

Other notables

  • I don’t have a schematic for the connections between Arduinos and components. It was all wired up on the fly. The code has some macros for pin assignments, and I’m happy to answer any specific questions the reader may have about how things were connected.
  • If someone says “hey Ouija!” before the bot is done spelling something out, it’ll stop what it’s doing and go back to the “what’s your question?” state. I thought it best to be as responsive to the user as possible.
  • I incorporated this LCD. The font used in the LCD is stored on the net/display MCU as a set of bitmaps, which I generated with the lcd-image-converter tool.
  • The serial communication between Arduinos is a little buggy and sometimes POIs are skipped (“spiderman” might end up being spelled “spderma”). This is most likely a programming error (maybe the Zumo MCU needs to be more patient with the serial input).
  • The board and planchette top are made of wood left over from a bathroom renovation. I spray-painted base coats of light brown, and members of my family and I painted everything on the board with paintbrushes. Probably not the most efficient solution (with a Cricut machine we could have made stencils for the letters), but we had fun with it.
  • The title of this article was inspired by a colleague at Favor, to whom I mentioned this project, calling me a “witch engineer.” I appreciated that and I’m owning it.

Conclusion

I might do a write-up about last year’s project, but I consider Halloween projects prior to this one to be “practice” projects and probably less interesting.

Thanks for reading and happy Halloween!
