Since the title of the post is self-explanatory about what I am going to build, I would like to know how many Element14 readers remember the tool that inspired the name of this project: yacc, an acronym for "yet another compiler compiler".
But let's focus on the build: the goal is to create a replica of the flux capacitor, the device that makes time travel possible. The flux capacitor consists of a box with three small, flashing incandescent lamps arranged as a "Y", located above and behind the passenger seat of the time machine. Here is a frame from the movie where the flux capacitor is visible behind Einstein, just before the first time displacement (Credit https://backtothefuture.fandom.com/wiki/Flux_capacitor)
There are thousands of projects that create a replica of the flux capacitor, but all of them use a string of LEDs to simulate the flow of energy. I would like to follow a completely different approach: all the visual effects will be recreated by means of a Raspberry Pi board equipped with a 7-inch, 800x480 screen.
So, the plan is
- Make a video that recreates the visual effects seen in the film
- Play the video continuously on the Raspberry Pi using VLC. I will also add some physical buttons to change the video and step the volume up and down
- Find some recycled objects to reproduce the incandescent lamps. Fun fact: the original flux capacitor props used high-pressure gas relays as the active components, which, while operating, created the steady pulsing seen when the time circuits are turned on
- Find a box to host all the components
1. Make a video
When it comes to video editing, my tool of choice is VSDC, a free yet very powerful video editor. I guess it would be incredibly boring to go through all the steps I followed to create the video.
Suffice it to say that I first created an animation of the energy flow, then cut-and-pasted this component three times to create the three flows that converge toward the center. For the lightning effect, I downloaded a clip from Pixabay. The clip had a chroma key background, which I easily removed thanks to the powerful tools included in VSDC.
This is the final result
2. Play the video
To play the video, I am going to use VLC, which is preinstalled in all the Raspberry Pi images. The nice thing here is that I can control the player by means of some physical buttons. To achieve this, I installed a Pimoroni Automation HAT Mini on the back of the Raspberry Pi. The Automation HAT Mini has three digital inputs, which are connected to three switches with the following functions:
- cycle through the videos
- step volume up
- step volume down
To install the libraries that control the Pimoroni Automation HAT Mini, just run the following commands (refer to the Pimoroni GitHub repository for the other install options)
curl https://get.pimoroni.com/automationhat | bash
sudo apt-get install pimoroni
This will install some useful examples in Pimoroni/automationhat-mini/examples.
Now, we want to control the VLC player from the Python script. To do this, VLC is launched with the following commands:
os.system("killall vlc")
os.system("DISPLAY=:0 vlc -f --intf rc --rc-host localhost:4212 &")
time.sleep(2.0)
These lines kill any previous VLC instance, launch the VLC application in fullscreen mode (option -f), and start a CLI interface server so that I can send commands through a local network connection.
We can now send commands to the VLC player using
nc localhost 4212
This command opens an interactive console. Type "help" to print all the supported commands.
Since we want to automate the execution of these commands, we can pass nc a file containing the list of commands to execute in sequence
nc localhost 4212 < commands.txt
where commands.txt is a text file that contains valid commands. For example, to play a video you need to create a file with the following lines
add fluxcapacitor.avi
play
repeat on
quit
The first command adds the given file to the playlist, and the second line actually starts the video. Then "repeat on" puts the video in a loop. Finally, the last line exits the interactive console. Here is the full Python code to play a video file:
def initPlaylist():
    global files
    global fileIdx
    # Write the VLC rc commands to a temporary file...
    with open("/tmp/cmd.txt", 'w') as tf:
        tf.write("add Videos/"+files[fileIdx]+"\n")
        tf.write("repeat on\n")
        tf.write("play\n")
        tf.write("quit\n")
    # ...and pipe it to VLC's rc interface through nc
    os.system("nc localhost 4212 < /tmp/cmd.txt")
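As an alternative to shelling out to nc, the same command block could be sent to VLC's rc interface directly over a TCP socket from Python. This is just a sketch of the idea, not part of the original script; the helper names buildPlayCommands and sendVlcCommands are mine:

```python
import socket

def buildPlayCommands(filename):
    # Build the same command sequence that initPlaylist() writes to /tmp/cmd.txt
    return "add Videos/" + filename + "\nrepeat on\nplay\nquit\n"

def sendVlcCommands(commands, host="localhost", port=4212):
    # Connect to VLC's rc interface and push the whole command block at once
    with socket.create_connection((host, port), timeout=5) as s:
        s.sendall(commands.encode())

# Usage (requires VLC running with --intf rc --rc-host localhost:4212):
# sendVlcCommands(buildPlayCommands("fluxcapacitor.mp4"))
```

This removes the dependency on nc and the temporary file, at the cost of a few extra lines of Python.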
The files variable is an array with all the files stored in the Videos folder. The array is populated at script startup by the following lines:
files = [f for f in os.listdir("/home/admin/Videos/") if f.endswith(".mp4")]
files = sorted(files)
print("Available files ",files)
fileIdx is the index of the video to play. I had to implement some slightly weird logic because the VLC CLI uses a sort of "autoincrement" index that changes every time a new video is added to the playlist: a video is not identified by its position in the playlist, but by a value that is incremented on every add. So, to change the video and activate loop mode, I have to
- delete the previous video. I need to keep track of the "autoincrement" index - this is what the variable playIdx is for
- add the new video
- activate the loop mode
- exit the CLI
Here is the full code to select the next video to play:
def playNext():
    global playIdx
    global fileIdx
    with open("/tmp/cmd.txt", 'w') as tf:
        tf.write("delete "+str(playIdx)+"\n")   # remove the current video from the playlist
        playIdx = playIdx+1                     # the next "add" gets this autoincrement index
        fileIdx = (fileIdx+1) % len(files)
        print("Playing file "+files[fileIdx])
        tf.write("add Videos/"+files[fileIdx]+"\n")
        tf.write("repeat on\n")
        tf.write("quit\n")
    os.system("nc localhost 4212 < /tmp/cmd.txt")
3. Reading the buttons
The YACF has three buttons, placed on the box's left side:
- Green button: volume up
- White button: select next video
- Blue button: volume down
The status of the three buttons is continuously read inside the script's main loop and, when one of the buttons is pressed, the corresponding action is executed:
print("Started")
while True:
    if automationhat.input[2].is_on():
        # volume up
        print("Volume up")
        volumeUp()
    if automationhat.input[0].is_on():
        # volume down
        print("Volume down")
        volumeDown()
    if automationhat.input[1].is_on():
        # play next video
        playNext()
    time.sleep(1.0)
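The volumeUp() and volumeDown() helpers called in the loop are not shown in the listing above. Here is a minimal sketch of how they could look, reusing the same nc-based delivery as initPlaylist() and playNext(); it assumes VLC's rc commands volup and voldown, and the helpers volumeCommand() and sendCommands() are my own names:

```python
import os

def volumeCommand(direction, steps=1):
    # Build the rc command block: "volup N" / "voldown N" step the
    # volume, "quit" closes the interactive console
    return "{} {}\nquit\n".format(direction, steps)

def sendCommands(commands):
    # Write the commands to a temporary file and pipe it to VLC through nc,
    # just like the other helpers in the script
    with open("/tmp/cmd.txt", 'w') as tf:
        tf.write(commands)
    os.system("nc localhost 4212 < /tmp/cmd.txt")

def volumeUp():
    sendCommands(volumeCommand("volup"))

def volumeDown():
    sendCommands(volumeCommand("voldown"))
```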
4. Autostart the script
To launch the script at boot, I created a systemd service: a file with the content shown below, saved as flux.service into the default folder for system services, namely /etc/systemd/system.
[Unit]
Description=Flux Capacitor
[Service]
ExecStart=/usr/bin/python3 flux.py
WorkingDirectory=/home/pi/
StandardOutput=inherit
StandardError=inherit
Restart=always
User=pi
[Install]
WantedBy=multi-user.target
To enable the service, run the following commands
sudo systemctl daemon-reload
sudo systemctl enable flux.service
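Note that enable only makes the service start at the next boot. To start it immediately and check that it is running, the standard systemctl commands can be used:

```shell
# Start the service right away, without rebooting
sudo systemctl start flux.service
# Check that it is active
systemctl status flux.service
# Follow the script's print() output in the journal
journalctl -u flux.service -f
```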
5. Recreate the appearance of the flux capacitor
My goal is to recreate the appearance of the flux capacitor by means of recycled materials. I took three transparent pens to recreate the three "channels" where the energy flows. The three circular objects at the ends of the three legs of the "Y" are made from small candles. Finally, the three connectors have been 3D printed, and the cables are just the probes of an old multimeter.
6. Place all the components in a box
Finally, all the components have been placed in a cardboard box that used to contain an evaluation kit for some Quectel devices. The box fits the 7-inch display perfectly and, besides that, has a nice magnetic lock. First, I cut a hole in the front cover to make the Raspberry Pi's screen and the other components visible.
Then, the box was completely sprayed with black paint.
7. Final result
Source code and videos are available on my GitHub.