Hi, I'm Calvin Kinateder.

Software Engineer

I'm a software engineer focused on building applications that are well designed from the inside out. Whatever I'm working on, I aim to serve the developer and the user equally. I received my B.S. and M.S. in Computer Science from the University of Cincinnati in 2025.

Experience

SRC, Inc.

Software Engineer

Jun 2025 – Present

Supporting the Electronic Warfare division at AFRL, Wright Patterson Air Force Base.

Node.js · Python · MATLAB · Docker · Jenkins

Siemens

Software Engineer Co-op

Jan 2022 – Aug 2024

Researched cutting-edge technologies in the academic literature to evaluate their commercial applications

Python · PyTorch · TensorFlow · Docker · Bash

Skyward LTD

Computer Engineer Co-op

Jun 2021 – Aug 2021

Implemented real-time inference for anomaly detection on incoming aircraft ADS-B transmissions

C++ · Golang · TensorFlow · TensorRT · Docker

CoverMyMeds

Software Development Intern

Jun 2019 – Aug 2019

Created monitoring applications to observe production and development environments

Python · HTML · JavaScript · Ruby on Rails · Agile

Projects

flower-dp

Fully functional boilerplate for differentially private federated learning

Python · PyTorch · TensorFlow · Docker

bacta

Python library for backtesting algorithmic trading strategies

Python · NumPy · Docker

google-photos-exif-merger

Fixes the broken EXIF data in Google Photos exports

Python · EXIF · Web

stegosaurus-midi

Embedded MIDI controller built on the RP2040 + Arduino platform for audio device manipulation

C++ · JavaScript · WebMIDI · UNIX

Master's Thesis

A Novel Approach To Implementing Knowledge Distillation In Tsetlin Machines

Abstract

The Tsetlin Machine (TM) is a propositional-logic-based model that uses conjunctive clauses to learn patterns from data. As with typical neural networks, the performance of a Tsetlin Machine is largely dependent on its parameter count, with a larger number of parameters producing higher accuracy but slower execution. Knowledge distillation in neural networks transfers information from an already-trained teacher model to a smaller student model to increase accuracy in the student without increasing execution time. We propose a novel approach to implementing knowledge distillation in Tsetlin Machines by utilizing the probability distributions of each output sample in the teacher to provide additional context to the student. Additionally, we propose a novel clause-transfer algorithm that weighs the importance of each clause in the teacher and initializes the student with only the most essential data. We find that our algorithm can significantly improve performance in the student model without negatively impacting latency in the tested domains of image recognition and text classification.
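The teacher-probability idea above is in the spirit of classic soft-label distillation. The following NumPy sketch is illustrative only: the thesis's actual algorithm operates on TM clause outputs, and the temperature `T`, blend weight `alpha`, and function names here are hypothetical choices, not taken from the thesis.

```python
import numpy as np

def soften(scores, T=2.0):
    """Temperature-scaled softmax: higher T spreads probability mass
    across classes, exposing the teacher's relative confidences."""
    z = scores / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))  # stabilized exp
    return e / e.sum(axis=-1, keepdims=True)

def distillation_targets(teacher_scores, hard_labels, num_classes,
                         alpha=0.5, T=2.0):
    """Blend the teacher's softened per-sample distribution with the
    one-hot ground truth; the student trains against this mixture."""
    soft = soften(teacher_scores, T)            # teacher context
    hard = np.eye(num_classes)[hard_labels]     # one-hot labels
    return alpha * soft + (1.0 - alpha) * hard

# Example: one sample, three classes, true label 0
targets = distillation_targets(np.array([[2.0, 0.0, -1.0]]),
                               np.array([0]), num_classes=3)
```

Each target row remains a valid probability distribution (rows sum to 1), so the student sees both the correct class and how the teacher ranked the alternatives.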

Read Full Thesis

Skills

Python · C++ · C · Java · TypeScript · Node.js · Golang · Git · Bash · Linux · MySQL · MATLAB · PyTorch · CUDA · TensorRT · TensorFlow · API Development · Computer Vision · Docker · Algorithms · Data Structures · Federated Learning · Differential Privacy · Circuitry · CSG · LLMs · Math Logic

Get In Touch