JohnPennington (software_developer):


LOCATION = [Seattle, Washington]

CONTACT = john.penningt1@gmail.com

GITHUB = github.com/beetlebox-dev

LINKEDIN = linkedin.com/in/john-penningt1

CV = beetlebox.dev/static/PenningtonCV.pdf



Go to the Beetlebox App Hub

# Or read more about Beetlebox Apps below:



Core (game):

MAIN_LANGUAGE = JavaScript

DESCRIPTION = An online game using the HTML <canvas> element and the Web Audio API. It is responsive, compatible with desktop and mobile web browsers, and can be controlled with the arrow keys, the mouse, or Touch Events. I also enjoyed doing all of the graphic and sound design for this project.
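As a rough illustration of the kind of unified input handling described above, here is a minimal JavaScript sketch; the element lookup, state object, and steering logic are illustrative assumptions, not Core's actual code.

    // Route arrow keys, mouse clicks, and Touch Events into one input state.
    const canvas = document.querySelector('canvas');
    const input = { dx: 0, dy: 0 };

    window.addEventListener('keydown', (e) => {
      if (e.key === 'ArrowLeft') input.dx = -1;
      if (e.key === 'ArrowRight') input.dx = 1;
      if (e.key === 'ArrowUp') input.dy = -1;
      if (e.key === 'ArrowDown') input.dy = 1;
    });
    // Simplification: any key release stops the motion.
    window.addEventListener('keyup', () => { input.dx = 0; input.dy = 0; });

    // Mouse clicks and touches steer toward the pressed point.
    function steerToward(x, y) {
      const rect = canvas.getBoundingClientRect();
      input.dx = Math.sign(x - rect.left - rect.width / 2);
      input.dy = Math.sign(y - rect.top - rect.height / 2);
    }
    canvas.addEventListener('mousedown', (e) => steerToward(e.clientX, e.clientY));
    canvas.addEventListener('touchstart', (e) => {
      steerToward(e.touches[0].clientX, e.touches[0].clientY);
      e.preventDefault(); // Keep the touch from also scrolling the page.
    });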

View source code on GitHub

Play Core



WORD=>PLAY (language_analysis):

MAIN_LANGUAGE = Python

DESCRIPTION = WORD=>PLAY finds paths that connect random words entered by the user. I sourced words, definitions, and pointers from WordNet, an English lexical database from Princeton University. My code searches through the database, following synonym-like relationships from word to word until reaching the desired target. URL routing is handled with Flask. For the web design, I used the Jinja template engine to inject custom CSS and style each path based on its length.
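The app itself is written in Python, but the path-finding idea can be sketched in a few lines of JavaScript; below is a minimal breadth-first search over a toy synonym map, with the `neighbors` data standing in for the WordNet-derived pointers (illustrative only, not the project's actual code or data).

    // Breadth-first search from a start word to a target word over
    // synonym-like links. `neighbors` maps each word to related words.
    function findPath(start, target, neighbors) {
      const visited = new Set([start]);
      const queue = [[start]];
      while (queue.length > 0) {
        const path = queue.shift();
        const word = path[path.length - 1];
        if (word === target) return path;
        for (const next of neighbors[word] || []) {
          if (!visited.has(next)) {
            visited.add(next);
            queue.push([...path, next]);
          }
        }
      }
      return null; // No connecting path found.
    }

    // Toy example:
    const neighbors = {
      cold: ['cool', 'frigid'],
      cool: ['calm'],
      calm: ['still'],
    };
    console.log(findPath('cold', 'still', neighbors)); // ['cold', 'cool', 'calm', 'still']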

View source code on GitHub

Go to WORD=>PLAY



wordPATH (language_puzzle):

MAIN_LANGUAGE = JavaScript

DESCRIPTION = The algorithms in WORD=>PLAY are transformed here to create a puzzle game. Players are given a start word and make a series of binary choices while trying to reach the target word. React is used for the client-side framework. Google Cloud Jobs and Scheduler are used together server-side to generate a new puzzle daily.
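As a rough sketch of how a precomputed path could become a series of binary choices, here is a small JavaScript example; the decoy list and object fields are assumptions for illustration, not the app's actual data or code.

    // Turn a word path into steps where the player picks between the correct
    // next word and a decoy.
    function buildPuzzle(path, decoyWords) {
      return path.slice(0, -1).map((word, i) => {
        const correct = path[i + 1];
        const decoy = decoyWords[Math.floor(Math.random() * decoyWords.length)];
        // Randomize the order so the correct answer isn't always on the same side.
        const options = Math.random() < 0.5 ? [correct, decoy] : [decoy, correct];
        return { from: word, options, correct };
      });
    }

    // Toy example, reusing the path from the WORD=>PLAY sketch above:
    console.log(buildPuzzle(['cold', 'cool', 'calm', 'still'], ['loud', 'sharp', 'bright']));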

View web service source code on GitHub

View puzzle generator source code on GitHub

Play wordPATH



MIRA3 (image_animation):

MAIN_LANGUAGE = JavaScript

DESCRIPTION = MIRA3 accepts images submitted by the user and moves the shapes within the image, creating distorted still frames and moving animations similar to cellular automata. The algorithm creates a duotone image based on pixel lightness, choosing a lightness threshold that minimizes the statistical variance of the pixel lightness values on each side of the threshold. The duotone image is used to identify shapes and edges to move.

All image processing happens client-side, with the server simply fulfilling the initial page request. Web Workers process the image so that each calculated frame becomes available immediately. The web design is responsive to any screen size and is optimized for both desktop and mobile.
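To make the thresholding idea concrete, here is a minimal JavaScript sketch that picks the lightness threshold minimizing the summed variance of the two sides; reading "minimizes the variance on each side" as minimizing that sum is my interpretation, and the brute-force search and Web Worker wiring below are illustrative, not MIRA3's actual code.

    // Pick the 0-255 lightness threshold that minimizes the combined variance
    // of the pixels below and above it.
    function variance(values) {
      if (values.length === 0) return 0;
      const mean = values.reduce((a, b) => a + b, 0) / values.length;
      return values.reduce((a, b) => a + (b - mean) ** 2, 0) / values.length;
    }

    function bestThreshold(lightnessValues) {
      let best = 128;
      let bestScore = Infinity;
      for (let t = 1; t < 255; t++) {
        const dark = lightnessValues.filter((v) => v < t);
        const light = lightnessValues.filter((v) => v >= t);
        const score = variance(dark) + variance(light);
        if (score < bestScore) {
          bestScore = score;
          best = t;
        }
      }
      return best;
    }

    // Hand the pixels to a Web Worker so each computed frame can be drawn as
    // soon as it is posted back ('frame-worker.js' is a hypothetical file).
    const canvas = document.querySelector('canvas');
    const ctx = canvas.getContext('2d');
    const worker = new Worker('frame-worker.js');
    worker.onmessage = (event) => ctx.putImageData(event.data, 0, 0);
    worker.postMessage(ctx.getImageData(0, 0, canvas.width, canvas.height));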

YouTube Demo

View source code on GitHub

Go to MIRA3



SoundX (ferry_route_optimization):

MAIN_LANGUAGE = Python

DESCRIPTION = I often travel to and from Kitsap County, Washington, and it's sometimes complicated to decide whether to take the Seattle/Bainbridge or the Edmonds/Kingston ferry route (depending on the destination, traffic, ferry schedules, etc.). So I wrote code that figures this all out for me!

It uses the Google Routes API to determine drive times to and from each ferry terminal, and the WSDOT API to coordinate ferry departures and show relevant ferry alerts. The web design is responsive, with desktop and mobile in mind. I also enjoyed programming a loading animation for this project using the HTML <canvas> element.
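The app is written in Python, but the core comparison can be sketched briefly; the JavaScript below uses made-up drive times, crossings, and departure lists to show the idea of picking the route with the earliest arrival (illustrative only; the real data comes from the Google Routes and WSDOT APIs).

    // For each route: drive to the terminal, wait for the next sailing, cross,
    // then drive from the far terminal; keep the earliest arrival.
    function nextDeparture(departures, arriveAtTerminal) {
      return departures.find((d) => d >= arriveAtTerminal) ?? Infinity;
    }

    function bestRoute(now, routes) {
      let best = null;
      for (const r of routes) {
        const atTerminal = now + r.driveToTerminalMin;
        const sails = nextDeparture(r.departuresMin, atTerminal);
        const arrival = sails + r.crossingMin + r.driveFromTerminalMin;
        if (!best || arrival < best.arrival) best = { name: r.name, arrival };
      }
      return best;
    }

    // Toy example (times are minutes since midnight):
    console.log(bestRoute(600, [
      { name: 'Seattle/Bainbridge', driveToTerminalMin: 25, departuresMin: [630, 690], crossingMin: 35, driveFromTerminalMin: 20 },
      { name: 'Edmonds/Kingston', driveToTerminalMin: 40, departuresMin: [650, 710], crossingMin: 30, driveFromTerminalMin: 15 },
    ]));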

View source code on GitHub

Go to SoundX



HARMio (real_time_melody_harmonization):

MAIN_LANGUAGE = JavaScript

DESCRIPTION = For this project, the user can play any melody, and it will be harmonized in real time. Sound is generated by a 4-voice triangle-wave synthesizer built with the Web Audio API. I created a database of possible chords for any given note. The chords are indexed by tonal center and are chosen based on proximity to the last implied tonal center. I use a random, log-based probability function that weights nearby tonal centers to occur more frequently, with more distantly related chords appearing occasionally. The colors within the app change along with the implied tonal center.
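Here is a minimal JavaScript sketch of proximity-weighted chord selection in the spirit of the description; the chord table, the distance measure, and the 1/log weighting are assumptions for illustration, not HARMio's actual data or formula.

    // Chords grouped by tonal center (pitch class of the center, toy data).
    const chordsByCenter = {
      0: ['C', 'Am', 'F'],
      2: ['D', 'Bm', 'G'],
      5: ['F', 'Dm', 'Bb'],
      7: ['G', 'Em', 'C'],
    };

    // Distance between two pitch classes, wrapping around the octave.
    function centerDistance(a, b) {
      const d = Math.abs(a - b) % 12;
      return Math.min(d, 12 - d);
    }

    // Weighted random pick: nearby tonal centers get much larger weights, so
    // distant ones are chosen only occasionally.
    function pickCenter(lastCenter) {
      const centers = Object.keys(chordsByCenter).map(Number);
      const weights = centers.map((c) => 1 / Math.log2(2 + centerDistance(c, lastCenter)));
      const total = weights.reduce((a, b) => a + b, 0);
      let r = Math.random() * total;
      for (let i = 0; i < centers.length; i++) {
        r -= weights[i];
        if (r <= 0) return centers[i];
      }
      return centers[centers.length - 1];
    }

    const center = pickCenter(0);
    const options = chordsByCenter[center];
    console.log(center, options[Math.floor(Math.random() * options.length)]);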

View source code on GitHub

Go to HARMio



Quark (animation):

MAIN_LANGUAGE = JavaScript

DESCRIPTION = I initially designed this animation using the tkinter GUI package (view the original tkinter source code here), then converted it to JavaScript for the web. Each time a click or key press is detected, the shapes are assigned a random hue and motion.
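A minimal JavaScript sketch of the interaction described, with the shapes, sizes, and canvas setup as illustrative assumptions rather than Quark's actual code:

    const canvas = document.querySelector('canvas');
    const ctx = canvas.getContext('2d');
    const shapes = Array.from({ length: 24 }, () => ({
      x: Math.random() * canvas.width,
      y: Math.random() * canvas.height,
    }));
    let hue = 200;
    let motion = { dx: 0, dy: 0 };

    // Each click or key press picks a new random hue and motion vector.
    function randomize() {
      hue = Math.floor(Math.random() * 360);
      motion = { dx: (Math.random() - 0.5) * 4, dy: (Math.random() - 0.5) * 4 };
    }
    window.addEventListener('click', randomize);
    window.addEventListener('keydown', randomize);

    (function animate() {
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      ctx.fillStyle = `hsl(${hue}, 80%, 60%)`;
      for (const s of shapes) {
        s.x = (s.x + motion.dx + canvas.width) % canvas.width; // Wrap at the edges.
        s.y = (s.y + motion.dy + canvas.height) % canvas.height;
        ctx.fillRect(s.x, s.y, 10, 10);
      }
      requestAnimationFrame(animate);
    })();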

View source code on GitHub

Go to Quark



BeetleboxMusic (piano+computer):

MAIN_LANGUAGE = Pure Data

GENRE = [piano, electronic, experimental]

DESCRIPTION = This is my piano-centered electronic music project. The human piano part is enhanced by electronic sounds, homemade software synthesizers, and computer accompaniment. Live performances include many improvised, spontaneous sections. The electronics are interactive and respond to unplanned inputs, forming a duet between piano and computer. The audio coding is done in Pure Data, an interactive audio programming language. I also did all of the graphic design and video production.

YouTube

SoundCloud



© 2021-2024 Johnathan Pennington | All rights reserved.